Mar 19 09:22:23 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 19 09:22:23 crc restorecon[4693]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 19 09:22:23 crc restorecon[4693]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 19 09:22:23 crc restorecon[4693]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc 
restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 09:22:23 crc restorecon[4693]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 09:22:23 crc restorecon[4693]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 09:22:23 crc restorecon[4693]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 09:22:23 crc 
restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 
09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 
crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 19 09:22:23 crc restorecon[4693]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 09:22:23 crc restorecon[4693]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 09:22:23 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc 
restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 09:22:24 crc restorecon[4693]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 
crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc 
restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc 
restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc 
restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 09:22:24 crc 
restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 09:22:24 crc restorecon[4693]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 09:22:24 crc restorecon[4693]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 09:22:24 crc restorecon[4693]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 19 09:22:26 crc kubenswrapper[4835]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 19 09:22:26 crc kubenswrapper[4835]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 19 09:22:26 crc kubenswrapper[4835]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 19 09:22:26 crc kubenswrapper[4835]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 19 09:22:26 crc kubenswrapper[4835]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 19 09:22:26 crc kubenswrapper[4835]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.029528 4835 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.047953 4835 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048001 4835 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048012 4835 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048021 4835 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048030 4835 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048040 4835 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048048 4835 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048056 4835 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048064 4835 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 
09:22:26.048074 4835 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048089 4835 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048100 4835 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048112 4835 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048124 4835 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048133 4835 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048144 4835 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048155 4835 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048166 4835 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048176 4835 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048187 4835 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048196 4835 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048204 4835 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048213 4835 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 
09:22:26.048225 4835 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048239 4835 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048252 4835 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048263 4835 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048273 4835 feature_gate.go:330] unrecognized feature gate: Example Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048283 4835 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048292 4835 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048305 4835 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048315 4835 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048324 4835 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048334 4835 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048362 4835 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048372 4835 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048382 4835 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048392 4835 feature_gate.go:330] unrecognized feature gate: 
VSphereMultiVCenters Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048401 4835 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048447 4835 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048459 4835 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048471 4835 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048482 4835 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048493 4835 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048506 4835 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048523 4835 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048535 4835 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048546 4835 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048556 4835 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048566 4835 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048576 4835 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048589 4835 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048602 4835 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048614 4835 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048624 4835 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048632 4835 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048641 4835 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048650 4835 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048658 4835 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048667 4835 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048675 4835 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048683 4835 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048691 4835 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048698 4835 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048706 4835 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048714 4835 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048721 4835 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048729 
4835 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048776 4835 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048784 4835 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.048792 4835 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.048937 4835 flags.go:64] FLAG: --address="0.0.0.0" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.048954 4835 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.048970 4835 flags.go:64] FLAG: --anonymous-auth="true" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.048981 4835 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.048994 4835 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049004 4835 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049017 4835 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049028 4835 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049037 4835 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049047 4835 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049056 4835 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049067 4835 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 19 
09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049077 4835 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049085 4835 flags.go:64] FLAG: --cgroup-root="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049094 4835 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049105 4835 flags.go:64] FLAG: --client-ca-file="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049116 4835 flags.go:64] FLAG: --cloud-config="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049128 4835 flags.go:64] FLAG: --cloud-provider="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049139 4835 flags.go:64] FLAG: --cluster-dns="[]" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049152 4835 flags.go:64] FLAG: --cluster-domain="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049164 4835 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049176 4835 flags.go:64] FLAG: --config-dir="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049189 4835 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049202 4835 flags.go:64] FLAG: --container-log-max-files="5" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049230 4835 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049242 4835 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049253 4835 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049265 4835 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049276 4835 flags.go:64] FLAG: --contention-profiling="false" Mar 19 09:22:26 crc 
kubenswrapper[4835]: I0319 09:22:26.049288 4835 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049299 4835 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049310 4835 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049321 4835 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049335 4835 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049347 4835 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049357 4835 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049370 4835 flags.go:64] FLAG: --enable-load-reader="false" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049381 4835 flags.go:64] FLAG: --enable-server="true" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049393 4835 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049409 4835 flags.go:64] FLAG: --event-burst="100" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049421 4835 flags.go:64] FLAG: --event-qps="50" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049432 4835 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049444 4835 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049455 4835 flags.go:64] FLAG: --eviction-hard="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049470 4835 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049481 4835 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 19 09:22:26 crc 
kubenswrapper[4835]: I0319 09:22:26.049493 4835 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049506 4835 flags.go:64] FLAG: --eviction-soft="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049516 4835 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049525 4835 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049534 4835 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049543 4835 flags.go:64] FLAG: --experimental-mounter-path="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049552 4835 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049561 4835 flags.go:64] FLAG: --fail-swap-on="true" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049570 4835 flags.go:64] FLAG: --feature-gates="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049581 4835 flags.go:64] FLAG: --file-check-frequency="20s" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049590 4835 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049599 4835 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049608 4835 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049618 4835 flags.go:64] FLAG: --healthz-port="10248" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049627 4835 flags.go:64] FLAG: --help="false" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049636 4835 flags.go:64] FLAG: --hostname-override="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049644 4835 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 19 09:22:26 crc 
kubenswrapper[4835]: I0319 09:22:26.049654 4835 flags.go:64] FLAG: --http-check-frequency="20s" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049663 4835 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049672 4835 flags.go:64] FLAG: --image-credential-provider-config="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049681 4835 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049689 4835 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049698 4835 flags.go:64] FLAG: --image-service-endpoint="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049707 4835 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049716 4835 flags.go:64] FLAG: --kube-api-burst="100" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049726 4835 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049735 4835 flags.go:64] FLAG: --kube-api-qps="50" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049778 4835 flags.go:64] FLAG: --kube-reserved="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049787 4835 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049796 4835 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049806 4835 flags.go:64] FLAG: --kubelet-cgroups="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049815 4835 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049823 4835 flags.go:64] FLAG: --lock-file="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049832 4835 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 19 
09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049841 4835 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049850 4835 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049863 4835 flags.go:64] FLAG: --log-json-split-stream="false" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049873 4835 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049883 4835 flags.go:64] FLAG: --log-text-split-stream="false" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049892 4835 flags.go:64] FLAG: --logging-format="text" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049901 4835 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049911 4835 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049919 4835 flags.go:64] FLAG: --manifest-url="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049928 4835 flags.go:64] FLAG: --manifest-url-header="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049940 4835 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049949 4835 flags.go:64] FLAG: --max-open-files="1000000" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049960 4835 flags.go:64] FLAG: --max-pods="110" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049969 4835 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049978 4835 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049987 4835 flags.go:64] FLAG: --memory-manager-policy="None" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.049997 4835 flags.go:64] FLAG: 
--minimum-container-ttl-duration="6m0s" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050006 4835 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050015 4835 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050024 4835 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050045 4835 flags.go:64] FLAG: --node-status-max-images="50" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050055 4835 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050064 4835 flags.go:64] FLAG: --oom-score-adj="-999" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050073 4835 flags.go:64] FLAG: --pod-cidr="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050082 4835 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050094 4835 flags.go:64] FLAG: --pod-manifest-path="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050103 4835 flags.go:64] FLAG: --pod-max-pids="-1" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050113 4835 flags.go:64] FLAG: --pods-per-core="0" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050122 4835 flags.go:64] FLAG: --port="10250" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050131 4835 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050140 4835 flags.go:64] FLAG: --provider-id="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050149 4835 flags.go:64] FLAG: --qos-reserved="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050158 4835 
flags.go:64] FLAG: --read-only-port="10255" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050167 4835 flags.go:64] FLAG: --register-node="true" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050176 4835 flags.go:64] FLAG: --register-schedulable="true" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050184 4835 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050200 4835 flags.go:64] FLAG: --registry-burst="10" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050209 4835 flags.go:64] FLAG: --registry-qps="5" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050218 4835 flags.go:64] FLAG: --reserved-cpus="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050228 4835 flags.go:64] FLAG: --reserved-memory="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050239 4835 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050249 4835 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050258 4835 flags.go:64] FLAG: --rotate-certificates="false" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050267 4835 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050275 4835 flags.go:64] FLAG: --runonce="false" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050284 4835 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050294 4835 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050303 4835 flags.go:64] FLAG: --seccomp-default="false" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050312 4835 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050321 4835 
flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050330 4835 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050339 4835 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050349 4835 flags.go:64] FLAG: --storage-driver-password="root" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050363 4835 flags.go:64] FLAG: --storage-driver-secure="false" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050372 4835 flags.go:64] FLAG: --storage-driver-table="stats" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050382 4835 flags.go:64] FLAG: --storage-driver-user="root" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050391 4835 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050400 4835 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050409 4835 flags.go:64] FLAG: --system-cgroups="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050418 4835 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050433 4835 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050443 4835 flags.go:64] FLAG: --tls-cert-file="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050454 4835 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050469 4835 flags.go:64] FLAG: --tls-min-version="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050480 4835 flags.go:64] FLAG: --tls-private-key-file="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050491 4835 flags.go:64] FLAG: --topology-manager-policy="none" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 
09:22:26.050503 4835 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050515 4835 flags.go:64] FLAG: --topology-manager-scope="container" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050526 4835 flags.go:64] FLAG: --v="2" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050553 4835 flags.go:64] FLAG: --version="false" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050567 4835 flags.go:64] FLAG: --vmodule="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050581 4835 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.050592 4835 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.050836 4835 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.050848 4835 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.050858 4835 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.050867 4835 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.050876 4835 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.050886 4835 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.050894 4835 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.050902 4835 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.050910 4835 feature_gate.go:330] 
unrecognized feature gate: MultiArchInstallGCP Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.050919 4835 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.050929 4835 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.050937 4835 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.050948 4835 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.050956 4835 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.050966 4835 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.050976 4835 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.050985 4835 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.050993 4835 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051001 4835 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051009 4835 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051017 4835 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051025 4835 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051034 4835 feature_gate.go:330] unrecognized 
feature gate: PinnedImages Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051043 4835 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051051 4835 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051059 4835 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051068 4835 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051075 4835 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051083 4835 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051092 4835 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051101 4835 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051112 4835 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051123 4835 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051136 4835 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051147 4835 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051157 4835 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051167 4835 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051177 4835 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051191 4835 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051203 4835 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051215 4835 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051227 4835 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051237 4835 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051248 4835 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051263 4835 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051275 4835 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051285 4835 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051295 4835 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor 
Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051305 4835 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051315 4835 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051328 4835 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051340 4835 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051371 4835 feature_gate.go:330] unrecognized feature gate: Example Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051381 4835 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051392 4835 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051402 4835 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051412 4835 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051421 4835 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051431 4835 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051441 4835 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051449 4835 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051457 4835 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 
09:22:26.051465 4835 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051473 4835 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051485 4835 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051492 4835 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051500 4835 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051508 4835 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051516 4835 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051524 4835 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.051531 4835 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.051545 4835 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.067730 4835 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.067813 4835 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" 
Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.067954 4835 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.067967 4835 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.067978 4835 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.067987 4835 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.067995 4835 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068004 4835 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068015 4835 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068026 4835 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068038 4835 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068050 4835 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068059 4835 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068068 4835 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068076 4835 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068085 4835 feature_gate.go:330] unrecognized feature gate: Example Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068094 4835 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068103 4835 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068111 4835 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068120 4835 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068128 4835 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068141 4835 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068153 4835 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068162 4835 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068172 4835 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068181 4835 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068189 4835 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068198 4835 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068207 4835 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068215 4835 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068223 4835 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068232 4835 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068241 4835 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068250 4835 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068259 4835 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068267 4835 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 
09:22:26.068278 4835 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068286 4835 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068295 4835 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068304 4835 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068312 4835 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068320 4835 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068329 4835 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068340 4835 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068349 4835 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068358 4835 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068366 4835 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068377 4835 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068388 4835 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068398 4835 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068407 4835 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068416 4835 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068425 4835 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068434 4835 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068444 4835 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068453 4835 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068462 4835 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068470 4835 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068479 4835 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068487 4835 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068496 4835 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068504 4835 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068512 4835 feature_gate.go:330] 
unrecognized feature gate: SetEIPForNLBIngressController Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068521 4835 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068530 4835 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068539 4835 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068547 4835 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068556 4835 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068564 4835 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068572 4835 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068581 4835 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068589 4835 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068600 4835 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.068615 4835 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true 
VolumeAttributesClass:false]} Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068876 4835 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068892 4835 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068903 4835 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068912 4835 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068920 4835 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068930 4835 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068938 4835 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068946 4835 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068957 4835 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068965 4835 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068973 4835 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068982 4835 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068990 4835 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.068999 4835 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 
09:22:26.069007 4835 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069015 4835 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069025 4835 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069036 4835 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069046 4835 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069056 4835 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069064 4835 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069073 4835 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069082 4835 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069093 4835 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069104 4835 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069113 4835 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069123 4835 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069132 4835 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069143 4835 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069154 4835 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069164 4835 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069173 4835 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069182 4835 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069191 4835 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069201 4835 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069210 4835 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069218 4835 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069227 4835 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 09:22:26 crc 
kubenswrapper[4835]: W0319 09:22:26.069235 4835 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069243 4835 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069251 4835 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069260 4835 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069269 4835 feature_gate.go:330] unrecognized feature gate: Example Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069277 4835 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069285 4835 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069293 4835 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069301 4835 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069313 4835 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069323 4835 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069332 4835 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069340 4835 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069348 4835 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069357 4835 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069365 4835 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069374 4835 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069382 4835 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069392 4835 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069401 4835 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069409 4835 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069418 4835 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069426 4835 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069434 4835 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069443 
4835 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069452 4835 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069461 4835 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069470 4835 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069478 4835 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069486 4835 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069495 4835 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069504 4835 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.069516 4835 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.069530 4835 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.069826 4835 server.go:940] "Client rotation is on, will bootstrap in background" Mar 19 09:22:26 crc kubenswrapper[4835]: E0319 09:22:26.115549 4835 bootstrap.go:266] "Unhandled Error" err="part of the 
existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.129088 4835 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.129277 4835 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.137719 4835 server.go:997] "Starting client certificate rotation" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.137785 4835 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.138062 4835 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.176118 4835 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.178818 4835 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 19 09:22:26 crc kubenswrapper[4835]: E0319 09:22:26.181474 4835 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.116:6443: connect: connection refused" logger="UnhandledError" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.201554 4835 log.go:25] "Validated CRI v1 runtime API" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.251483 4835 log.go:25] "Validated CRI v1 image API" 
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.254538 4835 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.265071 4835 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-19-09-17-24-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.265133 4835 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.291524 4835 manager.go:217] Machine: {Timestamp:2026-03-19 09:22:26.288903872 +0000 UTC m=+1.137502529 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:018fc9bf-6313-48f6-b70c-1716ce86e066 BootID:d455f31a-96a2-4159-bc94-bb9403ca471c Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 
DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:fd:f1:62 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:fd:f1:62 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:a3:7f:a6 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:ac:3a:9f Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:8b:54:7d Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:f1:87:5d Speed:-1 Mtu:1496} {Name:eth10 MacAddress:72:62:3c:e5:5d:14 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:9e:9d:cb:bb:07:92 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 
Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] 
SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.291944 4835 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.292134 4835 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.294594 4835 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.295111 4835 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.295171 4835 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.295539 4835 topology_manager.go:138] "Creating topology manager with none policy"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.295558 4835 container_manager_linux.go:303] "Creating device plugin manager"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.296134 4835 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.296169 4835 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.296384 4835 state_mem.go:36] "Initialized new in-memory state store"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.296521 4835 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.303940 4835 kubelet.go:418] "Attempting to sync node with API server"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.303978 4835 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.304024 4835 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.304044 4835 kubelet.go:324] "Adding apiserver pod source"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.304064 4835 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.310530 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.116:6443: connect: connection refused
Mar 19 09:22:26 crc kubenswrapper[4835]: E0319 09:22:26.310614 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.116:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.311338 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.116:6443: connect: connection refused
Mar 19 09:22:26 crc kubenswrapper[4835]: E0319 09:22:26.311389 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.116:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.314236 4835 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.315314 4835 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.317361 4835 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.322279 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.322369 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.322389 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.322406 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.322429 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.322443 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.322459 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.322480 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.322497 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.322513 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.322531 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.322543 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.325267 4835 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.326155 4835 server.go:1280] "Started kubelet"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.326385 4835 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.327240 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.116:6443: connect: connection refused
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.326680 4835 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 19 09:22:26 crc systemd[1]: Started Kubernetes Kubelet.
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.335241 4835 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.337717 4835 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.337800 4835 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.337925 4835 server.go:460] "Adding debug handlers to kubelet server"
Mar 19 09:22:26 crc kubenswrapper[4835]: E0319 09:22:26.338374 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.339156 4835 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.339196 4835 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.339221 4835 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.340029 4835 factory.go:55] Registering systemd factory
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.340052 4835 factory.go:221] Registration of the systemd container factory successfully
Mar 19 09:22:26 crc kubenswrapper[4835]: E0319 09:22:26.340076 4835 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.116:6443: connect: connection refused" interval="200ms"
Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.340080 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.116:6443: connect: connection refused
Mar 19 09:22:26 crc kubenswrapper[4835]: E0319 09:22:26.340182 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.116:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.348081 4835 factory.go:153] Registering CRI-O factory
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.348444 4835 factory.go:221] Registration of the crio container factory successfully
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.348805 4835 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.349033 4835 factory.go:103] Registering Raw factory
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.349242 4835 manager.go:1196] Started watching for new ooms in manager
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.350530 4835 manager.go:319] Starting recovery of all containers
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355128 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355224 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355244 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355261 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355276 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355292 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355309 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355324 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355341 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355358 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355374 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355390 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355408 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355427 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355444 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355461 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355477 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355493 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355511 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355528 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355544 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355559 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355575 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355592 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355610 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355626 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355645 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355664 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355704 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355721 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355760 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355779 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: E0319 09:22:26.352873 4835 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.116:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e33b2d0849970 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:26.326100336 +0000 UTC m=+1.174698973,LastTimestamp:2026-03-19 09:22:26.326100336 +0000 UTC m=+1.174698973,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355815 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355833 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355847 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355862 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355878 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355894 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355910 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355927 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355942 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355959 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.355998 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.356016 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.356031 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.356046 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.356063 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.356079 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.356094 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.356152 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.356171 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.356190 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.356214 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.356232 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.356249 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.356267 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.356285 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.356301 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.356317 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.356333 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.356349 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.356365 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.356381 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.360610 4835 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.360650 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.360665 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.360679 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.360693 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.360711 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.360725 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.360767 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.360787 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.360804 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.360820 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.360837 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.360854 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.360870 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.360886 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.360907 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.360921 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.360937 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.360951 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.360963 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"
volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.360975 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361012 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361026 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361042 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361055 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361069 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" 
seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361081 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361095 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361107 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361119 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361132 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361145 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 19 09:22:26 
crc kubenswrapper[4835]: I0319 09:22:26.361158 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361172 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361185 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361199 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361211 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361225 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361239 4835 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361251 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361264 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361278 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361296 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361311 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361325 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361338 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361353 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361366 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361381 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361395 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361409 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361423 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361437 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361451 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361464 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361475 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361486 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361499 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361513 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361526 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361539 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361552 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361565 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361577 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361589 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361602 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361614 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361626 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361638 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361652 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361665 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361676 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361689 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361706 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361718 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" 
seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361754 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361772 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361789 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361804 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361816 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361829 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 
09:22:26.361842 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361854 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361867 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361880 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361894 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361922 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361934 4835 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361953 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361965 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361977 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.361992 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362004 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362015 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362028 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362040 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362051 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362065 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362078 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362089 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362103 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362118 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362130 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362144 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362157 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362173 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362186 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362199 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362211 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362224 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362238 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362253 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362266 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362279 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362290 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362303 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362318 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362331 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362342 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362355 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362367 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362379 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362392 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362406 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362418 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362430 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362445 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362458 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362471 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362483 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362496 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362508 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362520 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362535 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362546 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362558 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362569 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362585 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362597 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362611 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362623 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362636 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362649 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362661 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362676 4835 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362688 4835 reconstruct.go:97] "Volume reconstruction finished"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.362696 4835 reconciler.go:26] "Reconciler: start to sync state"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.383194 4835 manager.go:324] Recovery completed
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.393022 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.394279 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.394321 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.394330 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.395456 4835 cpu_manager.go:225] "Starting CPU manager" policy="none"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.395472 4835 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.395509 4835 state_mem.go:36] "Initialized new in-memory state store"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.398894 4835 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.400400 4835 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.400451 4835 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.400661 4835 kubelet.go:2335] "Starting kubelet main sync loop"
Mar 19 09:22:26 crc kubenswrapper[4835]: E0319 09:22:26.400713 4835 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 19 09:22:26 crc kubenswrapper[4835]: W0319 09:22:26.422996 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.116:6443: connect: connection refused
Mar 19 09:22:26 crc kubenswrapper[4835]: E0319 09:22:26.423043 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.116:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:22:26 crc kubenswrapper[4835]: E0319 09:22:26.438473 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.475390 4835 policy_none.go:49] "None policy: Start"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.476674 4835 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.476706 4835 state_mem.go:35] "Initializing new in-memory state store"
Mar 19 09:22:26 crc kubenswrapper[4835]: E0319 09:22:26.501192 4835 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Mar 19 09:22:26 crc kubenswrapper[4835]: E0319 09:22:26.539117 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 09:22:26 crc kubenswrapper[4835]: E0319 09:22:26.540901 4835 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.116:6443: connect: connection refused" interval="400ms"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.587699 4835 manager.go:334] "Starting Device Plugin manager"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.588008 4835 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.588037 4835 server.go:79] "Starting device plugin registration server"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.588522 4835 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.589059 4835 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.592805 4835 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.593195 4835 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.593366 4835 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 19 09:22:26 crc kubenswrapper[4835]: E0319 09:22:26.600846 4835 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.692812 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.694766 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.694816 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.694830 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.694865 4835 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: E0319 09:22:26.695395 4835 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.116:6443: connect: connection refused" node="crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.701356 4835 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.701458 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.702785 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.702822 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.702839 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.703002 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.703282 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.703338 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.703712 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.703763 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.703775 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.703934 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.704271 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.704400 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.704440 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.704496 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.704513 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.705451 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.705560 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.705587 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.705703 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.705755 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.705768 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.705920 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.706134 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.706185 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.707700 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.707705 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.707730 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.707769 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.707734 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.707969 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.708400 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.708473 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.709121 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.710811 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.710960 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.711063 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.711329 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.711454 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.712179 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.712206 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.712218 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.712322 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.712364 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.712389 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.766791 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.767042 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.767174 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.767296 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.767413 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.767545 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.767704 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.767816 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.767856 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.767961 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.767999 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.768029 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.768066 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.768098 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.768128 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.869996 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.870063 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.870103 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.870139 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.870168 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.870233 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.870265 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.870304 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.870345 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.870397 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.870426 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.870454 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.870482 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.870510 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.870538 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.870845 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.870914 4835 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.870981 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.870994 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.870920 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.871088 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.871052 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.871013 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.871117 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.871123 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.871071 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.871025 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.871204 4835 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.871026 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.871239 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.896500 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.898011 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.898046 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.898056 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:22:26 crc kubenswrapper[4835]: I0319 09:22:26.898088 4835 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 09:22:26 crc kubenswrapper[4835]: E0319 09:22:26.898545 4835 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.116:6443: connect: connection refused" node="crc" Mar 19 09:22:26 crc kubenswrapper[4835]: E0319 09:22:26.941524 4835 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.116:6443: connect: connection refused" interval="800ms" Mar 19 09:22:27 crc kubenswrapper[4835]: I0319 09:22:27.035601 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 09:22:27 crc kubenswrapper[4835]: I0319 09:22:27.060064 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 19 09:22:27 crc kubenswrapper[4835]: I0319 09:22:27.084449 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 09:22:27 crc kubenswrapper[4835]: W0319 09:22:27.085913 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-b12af52532fd090dce1e07768bc51ac730d4ee55b800b447c67c58cf09519131 WatchSource:0}: Error finding container b12af52532fd090dce1e07768bc51ac730d4ee55b800b447c67c58cf09519131: Status 404 returned error can't find the container with id b12af52532fd090dce1e07768bc51ac730d4ee55b800b447c67c58cf09519131 Mar 19 09:22:27 crc kubenswrapper[4835]: W0319 09:22:27.098104 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-657e453b7b3997380485eef69e83a7be5b10299cd1c653476d3dbffabaf3e4a8 WatchSource:0}: Error finding container 657e453b7b3997380485eef69e83a7be5b10299cd1c653476d3dbffabaf3e4a8: Status 404 returned error can't find the 
container with id 657e453b7b3997380485eef69e83a7be5b10299cd1c653476d3dbffabaf3e4a8 Mar 19 09:22:27 crc kubenswrapper[4835]: I0319 09:22:27.102885 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 09:22:27 crc kubenswrapper[4835]: I0319 09:22:27.110952 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 09:22:27 crc kubenswrapper[4835]: W0319 09:22:27.111221 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-27c3a7839082acaff8b493856162335c2491081354e134a4aa7a80f8e5c376c5 WatchSource:0}: Error finding container 27c3a7839082acaff8b493856162335c2491081354e134a4aa7a80f8e5c376c5: Status 404 returned error can't find the container with id 27c3a7839082acaff8b493856162335c2491081354e134a4aa7a80f8e5c376c5 Mar 19 09:22:27 crc kubenswrapper[4835]: W0319 09:22:27.145397 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-bc4832dc7fe6122b35df3a45aa7aa3e555b74110941802c77590e3684f08c873 WatchSource:0}: Error finding container bc4832dc7fe6122b35df3a45aa7aa3e555b74110941802c77590e3684f08c873: Status 404 returned error can't find the container with id bc4832dc7fe6122b35df3a45aa7aa3e555b74110941802c77590e3684f08c873 Mar 19 09:22:27 crc kubenswrapper[4835]: W0319 09:22:27.149118 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-e3d102b541ac39415b908736c2bfb9db55c160b0add3d98567de3e181d5d9576 WatchSource:0}: Error finding container e3d102b541ac39415b908736c2bfb9db55c160b0add3d98567de3e181d5d9576: Status 404 returned error 
can't find the container with id e3d102b541ac39415b908736c2bfb9db55c160b0add3d98567de3e181d5d9576 Mar 19 09:22:27 crc kubenswrapper[4835]: W0319 09:22:27.217214 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.116:6443: connect: connection refused Mar 19 09:22:27 crc kubenswrapper[4835]: E0319 09:22:27.217297 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.116:6443: connect: connection refused" logger="UnhandledError" Mar 19 09:22:27 crc kubenswrapper[4835]: W0319 09:22:27.235101 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.116:6443: connect: connection refused Mar 19 09:22:27 crc kubenswrapper[4835]: E0319 09:22:27.235184 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.116:6443: connect: connection refused" logger="UnhandledError" Mar 19 09:22:27 crc kubenswrapper[4835]: I0319 09:22:27.299055 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:22:27 crc kubenswrapper[4835]: I0319 09:22:27.300541 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:22:27 crc kubenswrapper[4835]: I0319 09:22:27.300588 4835 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:22:27 crc kubenswrapper[4835]: I0319 09:22:27.300603 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:22:27 crc kubenswrapper[4835]: I0319 09:22:27.300634 4835 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 09:22:27 crc kubenswrapper[4835]: E0319 09:22:27.301159 4835 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.116:6443: connect: connection refused" node="crc" Mar 19 09:22:27 crc kubenswrapper[4835]: I0319 09:22:27.328326 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.116:6443: connect: connection refused Mar 19 09:22:27 crc kubenswrapper[4835]: I0319 09:22:27.404358 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"657e453b7b3997380485eef69e83a7be5b10299cd1c653476d3dbffabaf3e4a8"} Mar 19 09:22:27 crc kubenswrapper[4835]: I0319 09:22:27.406410 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"b12af52532fd090dce1e07768bc51ac730d4ee55b800b447c67c58cf09519131"} Mar 19 09:22:27 crc kubenswrapper[4835]: I0319 09:22:27.408632 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"bc4832dc7fe6122b35df3a45aa7aa3e555b74110941802c77590e3684f08c873"} Mar 19 09:22:27 crc kubenswrapper[4835]: I0319 09:22:27.409904 
4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e3d102b541ac39415b908736c2bfb9db55c160b0add3d98567de3e181d5d9576"} Mar 19 09:22:27 crc kubenswrapper[4835]: I0319 09:22:27.411185 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"27c3a7839082acaff8b493856162335c2491081354e134a4aa7a80f8e5c376c5"} Mar 19 09:22:27 crc kubenswrapper[4835]: W0319 09:22:27.616064 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.116:6443: connect: connection refused Mar 19 09:22:27 crc kubenswrapper[4835]: E0319 09:22:27.616182 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.116:6443: connect: connection refused" logger="UnhandledError" Mar 19 09:22:27 crc kubenswrapper[4835]: W0319 09:22:27.731965 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.116:6443: connect: connection refused Mar 19 09:22:27 crc kubenswrapper[4835]: E0319 09:22:27.732050 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.116:6443: connect: 
connection refused" logger="UnhandledError" Mar 19 09:22:27 crc kubenswrapper[4835]: E0319 09:22:27.742338 4835 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.116:6443: connect: connection refused" interval="1.6s" Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.101859 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.105623 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.105687 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.105701 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.105759 4835 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 09:22:28 crc kubenswrapper[4835]: E0319 09:22:28.106243 4835 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.116:6443: connect: connection refused" node="crc" Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.204324 4835 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 09:22:28 crc kubenswrapper[4835]: E0319 09:22:28.205247 4835 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial 
tcp 38.129.56.116:6443: connect: connection refused" logger="UnhandledError" Mar 19 09:22:28 crc kubenswrapper[4835]: E0319 09:22:28.280802 4835 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.116:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e33b2d0849970 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:26.326100336 +0000 UTC m=+1.174698973,LastTimestamp:2026-03-19 09:22:26.326100336 +0000 UTC m=+1.174698973,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.328729 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.116:6443: connect: connection refused Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.416984 4835 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="bab4d0b73b0c41b76ac6a88912497076d2461a2459812fad2cff346465ecc601" exitCode=0 Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.417104 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"bab4d0b73b0c41b76ac6a88912497076d2461a2459812fad2cff346465ecc601"} Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.417109 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 
09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.418270 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.418318 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.418340 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.420662 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"08173aab5836c52b21e70067b1208d91ef8bdd7e02fbb72cb533d41616176913"} Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.420716 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"55c2419d07ea971ebef8f942b93109e8d28ae31053fd3fc750d478868fa36ab9"} Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.420766 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"be66eb2db774c7aed1b45b02b72e262c1d086c0c6ed9edd85d466aa0f15f1582"} Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.422129 4835 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a" exitCode=0 Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.422199 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a"} Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.422493 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.425371 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.425443 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.425465 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.426994 4835 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa" exitCode=0 Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.427083 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa"} Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.427164 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.428312 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.428363 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.428382 4835 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.428996 4835 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="f343c367c83726e1b54c7bb5e38c87226a39801dfb5d589b7eb9f5e5ac8c8162" exitCode=0 Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.429166 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"f343c367c83726e1b54c7bb5e38c87226a39801dfb5d589b7eb9f5e5ac8c8162"} Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.429233 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.430003 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.430881 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.430924 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.430944 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.430968 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.430999 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:22:28 crc kubenswrapper[4835]: I0319 09:22:28.431017 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Mar 19 09:22:29 crc kubenswrapper[4835]: I0319 09:22:29.328410 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.116:6443: connect: connection refused
Mar 19 09:22:29 crc kubenswrapper[4835]: E0319 09:22:29.343237 4835 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.116:6443: connect: connection refused" interval="3.2s"
Mar 19 09:22:29 crc kubenswrapper[4835]: I0319 09:22:29.439364 4835 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d" exitCode=0
Mar 19 09:22:29 crc kubenswrapper[4835]: I0319 09:22:29.439430 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d"}
Mar 19 09:22:29 crc kubenswrapper[4835]: I0319 09:22:29.439549 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:29 crc kubenswrapper[4835]: I0319 09:22:29.440430 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:29 crc kubenswrapper[4835]: I0319 09:22:29.440458 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:29 crc kubenswrapper[4835]: I0319 09:22:29.440469 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:29 crc kubenswrapper[4835]: I0319 09:22:29.444087 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:29 crc kubenswrapper[4835]: I0319 09:22:29.444076 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"1f10145a7c5746baaa9a782ce65b84c18ba6ce86c26d70a8d8a20334124e2f2b"}
Mar 19 09:22:29 crc kubenswrapper[4835]: I0319 09:22:29.444922 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:29 crc kubenswrapper[4835]: I0319 09:22:29.444949 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:29 crc kubenswrapper[4835]: I0319 09:22:29.444960 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:29 crc kubenswrapper[4835]: I0319 09:22:29.446776 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"26a71640d37ede9ec29825c63e78597c050f401286f28ea8a65a9bce2648dfe0"}
Mar 19 09:22:29 crc kubenswrapper[4835]: I0319 09:22:29.446815 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"cab8198f822fe1a49d71f653e46afde1be6de32846204199d14a1211c44bfe1a"}
Mar 19 09:22:29 crc kubenswrapper[4835]: I0319 09:22:29.446831 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d768283c8f73e9c949813ba1a7bca13dcc619b413a1e21ec6328fb3e34be8c0c"}
Mar 19 09:22:29 crc kubenswrapper[4835]: I0319 09:22:29.446817 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:29 crc kubenswrapper[4835]: I0319 09:22:29.454090 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:29 crc kubenswrapper[4835]: I0319 09:22:29.454130 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:29 crc kubenswrapper[4835]: I0319 09:22:29.454144 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:29 crc kubenswrapper[4835]: I0319 09:22:29.459066 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"00fb7c70aad1d13cf3432d05f17c527f900404c5f1a389c7174b400fab9f3fcd"}
Mar 19 09:22:29 crc kubenswrapper[4835]: I0319 09:22:29.459194 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:29 crc kubenswrapper[4835]: I0319 09:22:29.462285 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:29 crc kubenswrapper[4835]: I0319 09:22:29.462327 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:29 crc kubenswrapper[4835]: I0319 09:22:29.462351 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:29 crc kubenswrapper[4835]: I0319 09:22:29.465495 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd"}
Mar 19 09:22:29 crc kubenswrapper[4835]: I0319 09:22:29.465536 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d"}
Mar 19 09:22:29 crc kubenswrapper[4835]: I0319 09:22:29.465558 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438"}
Mar 19 09:22:29 crc kubenswrapper[4835]: I0319 09:22:29.465576 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be"}
Mar 19 09:22:29 crc kubenswrapper[4835]: I0319 09:22:29.706480 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:29 crc kubenswrapper[4835]: I0319 09:22:29.707947 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:29 crc kubenswrapper[4835]: I0319 09:22:29.707988 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:29 crc kubenswrapper[4835]: I0319 09:22:29.708006 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:29 crc kubenswrapper[4835]: I0319 09:22:29.708037 4835 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 19 09:22:29 crc kubenswrapper[4835]: E0319 09:22:29.708406 4835 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.116:6443: connect: connection refused" node="crc"
Mar 19 09:22:29 crc kubenswrapper[4835]: W0319 09:22:29.787989 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.116:6443: connect: connection refused
Mar 19 09:22:29 crc kubenswrapper[4835]: E0319 09:22:29.788096 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.116:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:22:29 crc kubenswrapper[4835]: W0319 09:22:29.874075 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.116:6443: connect: connection refused
Mar 19 09:22:29 crc kubenswrapper[4835]: E0319 09:22:29.874163 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.116:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:22:30 crc kubenswrapper[4835]: I0319 09:22:30.470733 4835 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5" exitCode=0
Mar 19 09:22:30 crc kubenswrapper[4835]: I0319 09:22:30.470868 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5"}
Mar 19 09:22:30 crc kubenswrapper[4835]: I0319 09:22:30.470944 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:30 crc kubenswrapper[4835]: I0319 09:22:30.473011 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:30 crc kubenswrapper[4835]: I0319 09:22:30.473214 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:30 crc kubenswrapper[4835]: I0319 09:22:30.473242 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:30 crc kubenswrapper[4835]: I0319 09:22:30.476550 4835 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 19 09:22:30 crc kubenswrapper[4835]: I0319 09:22:30.476605 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:30 crc kubenswrapper[4835]: I0319 09:22:30.476643 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:30 crc kubenswrapper[4835]: I0319 09:22:30.476552 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"64b253028c179a8ac3249b467c963e3bcbb8ad96300610219ed64501f22aca23"}
Mar 19 09:22:30 crc kubenswrapper[4835]: I0319 09:22:30.476643 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:30 crc kubenswrapper[4835]: I0319 09:22:30.477252 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:30 crc kubenswrapper[4835]: I0319 09:22:30.478827 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:30 crc kubenswrapper[4835]: I0319 09:22:30.478861 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:30 crc kubenswrapper[4835]: I0319 09:22:30.478878 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:30 crc kubenswrapper[4835]: I0319 09:22:30.478883 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:30 crc kubenswrapper[4835]: I0319 09:22:30.478916 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:30 crc kubenswrapper[4835]: I0319 09:22:30.478934 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:30 crc kubenswrapper[4835]: I0319 09:22:30.479029 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:30 crc kubenswrapper[4835]: I0319 09:22:30.479046 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:30 crc kubenswrapper[4835]: I0319 09:22:30.479060 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:30 crc kubenswrapper[4835]: I0319 09:22:30.479558 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:30 crc kubenswrapper[4835]: I0319 09:22:30.479706 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:30 crc kubenswrapper[4835]: I0319 09:22:30.479874 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:31 crc kubenswrapper[4835]: I0319 09:22:31.483204 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec"}
Mar 19 09:22:31 crc kubenswrapper[4835]: I0319 09:22:31.483296 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 09:22:31 crc kubenswrapper[4835]: I0319 09:22:31.483246 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:31 crc kubenswrapper[4835]: I0319 09:22:31.483333 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0"}
Mar 19 09:22:31 crc kubenswrapper[4835]: I0319 09:22:31.483477 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1"}
Mar 19 09:22:31 crc kubenswrapper[4835]: I0319 09:22:31.484274 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:31 crc kubenswrapper[4835]: I0319 09:22:31.484352 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:31 crc kubenswrapper[4835]: I0319 09:22:31.484380 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:31 crc kubenswrapper[4835]: I0319 09:22:31.571616 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 19 09:22:31 crc kubenswrapper[4835]: I0319 09:22:31.571908 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:31 crc kubenswrapper[4835]: I0319 09:22:31.573291 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:31 crc kubenswrapper[4835]: I0319 09:22:31.573334 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:31 crc kubenswrapper[4835]: I0319 09:22:31.573350 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:31 crc kubenswrapper[4835]: I0319 09:22:31.610884 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 09:22:32 crc kubenswrapper[4835]: I0319 09:22:32.322915 4835 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 19 09:22:32 crc kubenswrapper[4835]: I0319 09:22:32.490806 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e"}
Mar 19 09:22:32 crc kubenswrapper[4835]: I0319 09:22:32.490866 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:32 crc kubenswrapper[4835]: I0319 09:22:32.490895 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f"}
Mar 19 09:22:32 crc kubenswrapper[4835]: I0319 09:22:32.490974 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:32 crc kubenswrapper[4835]: I0319 09:22:32.491707 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:32 crc kubenswrapper[4835]: I0319 09:22:32.491815 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:32 crc kubenswrapper[4835]: I0319 09:22:32.491838 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:32 crc kubenswrapper[4835]: I0319 09:22:32.495459 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:32 crc kubenswrapper[4835]: I0319 09:22:32.495507 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:32 crc kubenswrapper[4835]: I0319 09:22:32.495524 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:32 crc kubenswrapper[4835]: I0319 09:22:32.909515 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:32 crc kubenswrapper[4835]: I0319 09:22:32.911361 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:32 crc kubenswrapper[4835]: I0319 09:22:32.911431 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:32 crc kubenswrapper[4835]: I0319 09:22:32.911454 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:32 crc kubenswrapper[4835]: I0319 09:22:32.911507 4835 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 19 09:22:33 crc kubenswrapper[4835]: I0319 09:22:33.294798 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 09:22:33 crc kubenswrapper[4835]: I0319 09:22:33.493986 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:33 crc kubenswrapper[4835]: I0319 09:22:33.494063 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:33 crc kubenswrapper[4835]: I0319 09:22:33.495670 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:33 crc kubenswrapper[4835]: I0319 09:22:33.495725 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:33 crc kubenswrapper[4835]: I0319 09:22:33.495776 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:33 crc kubenswrapper[4835]: I0319 09:22:33.495896 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:33 crc kubenswrapper[4835]: I0319 09:22:33.495957 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:33 crc kubenswrapper[4835]: I0319 09:22:33.495980 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:34 crc kubenswrapper[4835]: I0319 09:22:34.496920 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:34 crc kubenswrapper[4835]: I0319 09:22:34.498400 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:34 crc kubenswrapper[4835]: I0319 09:22:34.498476 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:34 crc kubenswrapper[4835]: I0319 09:22:34.498504 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:34 crc kubenswrapper[4835]: I0319 09:22:34.945580 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Mar 19 09:22:34 crc kubenswrapper[4835]: I0319 09:22:34.945786 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:34 crc kubenswrapper[4835]: I0319 09:22:34.946881 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:34 crc kubenswrapper[4835]: I0319 09:22:34.946919 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:34 crc kubenswrapper[4835]: I0319 09:22:34.946933 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:35 crc kubenswrapper[4835]: I0319 09:22:35.835171 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 09:22:35 crc kubenswrapper[4835]: I0319 09:22:35.835511 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:35 crc kubenswrapper[4835]: I0319 09:22:35.837253 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:35 crc kubenswrapper[4835]: I0319 09:22:35.837320 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:35 crc kubenswrapper[4835]: I0319 09:22:35.837345 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:36 crc kubenswrapper[4835]: E0319 09:22:36.601069 4835 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 19 09:22:38 crc kubenswrapper[4835]: I0319 09:22:38.161194 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 09:22:38 crc kubenswrapper[4835]: I0319 09:22:38.161409 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:38 crc kubenswrapper[4835]: I0319 09:22:38.163160 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:38 crc kubenswrapper[4835]: I0319 09:22:38.163233 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:38 crc kubenswrapper[4835]: I0319 09:22:38.163257 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:38 crc kubenswrapper[4835]: I0319 09:22:38.333889 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 09:22:38 crc kubenswrapper[4835]: I0319 09:22:38.507388 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:38 crc kubenswrapper[4835]: I0319 09:22:38.509191 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:38 crc kubenswrapper[4835]: I0319 09:22:38.509252 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:38 crc kubenswrapper[4835]: I0319 09:22:38.509277 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:38 crc kubenswrapper[4835]: I0319 09:22:38.835844 4835 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 09:22:38 crc kubenswrapper[4835]: I0319 09:22:38.835979 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 09:22:38 crc kubenswrapper[4835]: I0319 09:22:38.855834 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 09:22:38 crc kubenswrapper[4835]: I0319 09:22:38.862393 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 09:22:39 crc kubenswrapper[4835]: I0319 09:22:39.511195 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:39 crc kubenswrapper[4835]: I0319 09:22:39.514445 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:39 crc kubenswrapper[4835]: I0319 09:22:39.515375 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:39 crc kubenswrapper[4835]: I0319 09:22:39.515407 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:39 crc kubenswrapper[4835]: I0319 09:22:39.518710 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 09:22:40 crc kubenswrapper[4835]: W0319 09:22:40.243410 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Mar 19 09:22:40 crc kubenswrapper[4835]: I0319 09:22:40.243611 4835 trace.go:236] Trace[1349058582]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Mar-2026 09:22:30.241) (total time: 10001ms):
Mar 19 09:22:40 crc kubenswrapper[4835]: Trace[1349058582]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (09:22:40.243)
Mar 19 09:22:40 crc kubenswrapper[4835]: Trace[1349058582]: [10.00172874s] [10.00172874s] END
Mar 19 09:22:40 crc kubenswrapper[4835]: E0319 09:22:40.243656 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 19 09:22:40 crc kubenswrapper[4835]: I0319 09:22:40.329787 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Mar 19 09:22:40 crc kubenswrapper[4835]: I0319 09:22:40.400667 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Mar 19 09:22:40 crc kubenswrapper[4835]: I0319 09:22:40.401234 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:40 crc kubenswrapper[4835]: I0319 09:22:40.403058 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:40 crc kubenswrapper[4835]: I0319 09:22:40.403099 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:40 crc kubenswrapper[4835]: I0319 09:22:40.403111 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:40 crc kubenswrapper[4835]: I0319 09:22:40.449078 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Mar 19 09:22:40 crc kubenswrapper[4835]: I0319 09:22:40.513681 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:40 crc kubenswrapper[4835]: I0319 09:22:40.513794 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:40 crc kubenswrapper[4835]: I0319 09:22:40.514695 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:40 crc kubenswrapper[4835]: I0319 09:22:40.514874 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:40 crc kubenswrapper[4835]: I0319 09:22:40.515059 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:40 crc kubenswrapper[4835]: I0319 09:22:40.515079 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:40 crc kubenswrapper[4835]: I0319 09:22:40.515086 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:40 crc kubenswrapper[4835]: I0319 09:22:40.515480 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:40 crc kubenswrapper[4835]: I0319 09:22:40.533809 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Mar 19 09:22:40 crc kubenswrapper[4835]: W0319 09:22:40.580856 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Mar 19 09:22:40 crc kubenswrapper[4835]: I0319 09:22:40.580949 4835 trace.go:236] Trace[1382303265]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Mar-2026 09:22:30.579) (total time: 10001ms):
Mar 19 09:22:40 crc kubenswrapper[4835]: Trace[1382303265]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (09:22:40.580)
Mar 19 09:22:40 crc kubenswrapper[4835]: Trace[1382303265]: [10.001374439s] [10.001374439s] END
Mar 19 09:22:40 crc kubenswrapper[4835]: E0319 09:22:40.580974 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 19 09:22:41 crc kubenswrapper[4835]: I0319 09:22:41.055546 4835 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Mar 19 09:22:41 crc kubenswrapper[4835]: I0319 09:22:41.055633 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Mar 19 09:22:41 crc kubenswrapper[4835]: E0319 09:22:41.374154 4835 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:41Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e33b2d0849970 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:26.326100336 +0000 UTC m=+1.174698973,LastTimestamp:2026-03-19 09:22:26.326100336 +0000 UTC m=+1.174698973,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 09:22:41 crc kubenswrapper[4835]: E0319 09:22:41.378699 4835 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:41Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 19 09:22:41 crc kubenswrapper[4835]: E0319 09:22:41.383924 4835 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:41Z is after 2026-02-23T05:33:13Z" interval="6.4s"
Mar 19 09:22:41 crc kubenswrapper[4835]: I0319 09:22:41.387801 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:41Z is after 2026-02-23T05:33:13Z
Mar 19 09:22:41 crc kubenswrapper[4835]: E0319 09:22:41.389217 4835 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:41Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 19 09:22:41 crc kubenswrapper[4835]: I0319 09:22:41.390255 4835 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 19 09:22:41 crc kubenswrapper[4835]: I0319 09:22:41.390318 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 19 09:22:41 crc kubenswrapper[4835]: W0319 09:22:41.400013 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:41Z is after 2026-02-23T05:33:13Z
Mar 19 09:22:41 crc kubenswrapper[4835]: E0319 09:22:41.400107 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:41Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 19 09:22:41 crc kubenswrapper[4835]: I0319 09:22:41.401633 4835 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 19 09:22:41 crc kubenswrapper[4835]: I0319 09:22:41.401679 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 19 09:22:41 crc kubenswrapper[4835]: W0319 09:22:41.405332 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:41Z is after 2026-02-23T05:33:13Z
Mar 19 09:22:41 crc kubenswrapper[4835]: E0319 09:22:41.405404 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:41Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 19 09:22:41 crc kubenswrapper[4835]: I0319 09:22:41.517492 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 19 09:22:41 crc kubenswrapper[4835]: I0319 09:22:41.519753 4835 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="64b253028c179a8ac3249b467c963e3bcbb8ad96300610219ed64501f22aca23" exitCode=255
Mar 19 09:22:41 crc kubenswrapper[4835]: I0319 09:22:41.519787 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"64b253028c179a8ac3249b467c963e3bcbb8ad96300610219ed64501f22aca23"}
Mar 19 09:22:41 crc kubenswrapper[4835]: I0319 09:22:41.519906 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:41 crc kubenswrapper[4835]: I0319 09:22:41.519984 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:41 crc kubenswrapper[4835]: I0319 09:22:41.520009 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 09:22:41 crc kubenswrapper[4835]: I0319 09:22:41.520904 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:41 crc kubenswrapper[4835]: I0319 09:22:41.520931 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:22:41 crc kubenswrapper[4835]: I0319 09:22:41.520939 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:22:41 crc kubenswrapper[4835]: I0319 09:22:41.521206 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:22:41 crc kubenswrapper[4835]: I0319 09:22:41.521242 4835 kubelet_node_status.go:724] "Recording event
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:22:41 crc kubenswrapper[4835]: I0319 09:22:41.521255 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:22:41 crc kubenswrapper[4835]: I0319 09:22:41.521237 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:22:41 crc kubenswrapper[4835]: I0319 09:22:41.521404 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:22:41 crc kubenswrapper[4835]: I0319 09:22:41.521435 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:22:41 crc kubenswrapper[4835]: I0319 09:22:41.522126 4835 scope.go:117] "RemoveContainer" containerID="64b253028c179a8ac3249b467c963e3bcbb8ad96300610219ed64501f22aca23" Mar 19 09:22:41 crc kubenswrapper[4835]: I0319 09:22:41.619594 4835 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 19 09:22:41 crc kubenswrapper[4835]: [+]log ok Mar 19 09:22:41 crc kubenswrapper[4835]: [+]etcd ok Mar 19 09:22:41 crc kubenswrapper[4835]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 19 09:22:41 crc kubenswrapper[4835]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 19 09:22:41 crc kubenswrapper[4835]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 19 09:22:41 crc kubenswrapper[4835]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 19 09:22:41 crc kubenswrapper[4835]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 19 09:22:41 crc kubenswrapper[4835]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 19 09:22:41 crc kubenswrapper[4835]: 
[+]poststarthook/generic-apiserver-start-informers ok Mar 19 09:22:41 crc kubenswrapper[4835]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 19 09:22:41 crc kubenswrapper[4835]: [+]poststarthook/priority-and-fairness-filter ok Mar 19 09:22:41 crc kubenswrapper[4835]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 19 09:22:41 crc kubenswrapper[4835]: [+]poststarthook/start-apiextensions-informers ok Mar 19 09:22:41 crc kubenswrapper[4835]: [+]poststarthook/start-apiextensions-controllers ok Mar 19 09:22:41 crc kubenswrapper[4835]: [+]poststarthook/crd-informer-synced ok Mar 19 09:22:41 crc kubenswrapper[4835]: [+]poststarthook/start-system-namespaces-controller ok Mar 19 09:22:41 crc kubenswrapper[4835]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 19 09:22:41 crc kubenswrapper[4835]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 19 09:22:41 crc kubenswrapper[4835]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 19 09:22:41 crc kubenswrapper[4835]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 19 09:22:41 crc kubenswrapper[4835]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 19 09:22:41 crc kubenswrapper[4835]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 19 09:22:41 crc kubenswrapper[4835]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 19 09:22:41 crc kubenswrapper[4835]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 19 09:22:41 crc kubenswrapper[4835]: [+]poststarthook/bootstrap-controller ok Mar 19 09:22:41 crc kubenswrapper[4835]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 19 09:22:41 crc kubenswrapper[4835]: [+]poststarthook/start-kube-aggregator-informers ok Mar 19 09:22:41 crc kubenswrapper[4835]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 19 09:22:41 crc kubenswrapper[4835]: 
[+]poststarthook/apiservice-status-remote-available-controller ok Mar 19 09:22:41 crc kubenswrapper[4835]: [+]poststarthook/apiservice-registration-controller ok Mar 19 09:22:41 crc kubenswrapper[4835]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 19 09:22:41 crc kubenswrapper[4835]: [+]poststarthook/apiservice-discovery-controller ok Mar 19 09:22:41 crc kubenswrapper[4835]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 19 09:22:41 crc kubenswrapper[4835]: [+]autoregister-completion ok Mar 19 09:22:41 crc kubenswrapper[4835]: [+]poststarthook/apiservice-openapi-controller ok Mar 19 09:22:41 crc kubenswrapper[4835]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 19 09:22:41 crc kubenswrapper[4835]: livez check failed Mar 19 09:22:41 crc kubenswrapper[4835]: I0319 09:22:41.619653 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 09:22:42 crc kubenswrapper[4835]: I0319 09:22:42.332038 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:42Z is after 2026-02-23T05:33:13Z Mar 19 09:22:42 crc kubenswrapper[4835]: I0319 09:22:42.523364 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 19 09:22:42 crc kubenswrapper[4835]: I0319 09:22:42.525544 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3e2d5a5ed4759d9904a8184bbafa6d50482b1f6f6e196f1947ae7837057210e2"} Mar 19 09:22:42 crc kubenswrapper[4835]: I0319 09:22:42.525706 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:22:42 crc kubenswrapper[4835]: I0319 09:22:42.526461 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:22:42 crc kubenswrapper[4835]: I0319 09:22:42.526501 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:22:42 crc kubenswrapper[4835]: I0319 09:22:42.526511 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:22:43 crc kubenswrapper[4835]: I0319 09:22:43.333873 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:43Z is after 2026-02-23T05:33:13Z Mar 19 09:22:43 crc kubenswrapper[4835]: I0319 09:22:43.532140 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 19 09:22:43 crc kubenswrapper[4835]: I0319 09:22:43.533040 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 19 09:22:43 crc kubenswrapper[4835]: I0319 09:22:43.536204 4835 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3e2d5a5ed4759d9904a8184bbafa6d50482b1f6f6e196f1947ae7837057210e2" exitCode=255 Mar 19 09:22:43 crc 
kubenswrapper[4835]: I0319 09:22:43.536280 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3e2d5a5ed4759d9904a8184bbafa6d50482b1f6f6e196f1947ae7837057210e2"} Mar 19 09:22:43 crc kubenswrapper[4835]: I0319 09:22:43.536352 4835 scope.go:117] "RemoveContainer" containerID="64b253028c179a8ac3249b467c963e3bcbb8ad96300610219ed64501f22aca23" Mar 19 09:22:43 crc kubenswrapper[4835]: I0319 09:22:43.536618 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:22:43 crc kubenswrapper[4835]: I0319 09:22:43.538647 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:22:43 crc kubenswrapper[4835]: I0319 09:22:43.538694 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:22:43 crc kubenswrapper[4835]: I0319 09:22:43.538711 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:22:43 crc kubenswrapper[4835]: I0319 09:22:43.539796 4835 scope.go:117] "RemoveContainer" containerID="3e2d5a5ed4759d9904a8184bbafa6d50482b1f6f6e196f1947ae7837057210e2" Mar 19 09:22:43 crc kubenswrapper[4835]: E0319 09:22:43.540178 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 09:22:44 crc kubenswrapper[4835]: I0319 09:22:44.334555 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:44Z is after 2026-02-23T05:33:13Z Mar 19 09:22:44 crc kubenswrapper[4835]: I0319 09:22:44.542978 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 19 09:22:44 crc kubenswrapper[4835]: W0319 09:22:44.598566 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:44Z is after 2026-02-23T05:33:13Z Mar 19 09:22:44 crc kubenswrapper[4835]: E0319 09:22:44.598670 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:44Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 09:22:45 crc kubenswrapper[4835]: I0319 09:22:45.332971 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:45Z is after 2026-02-23T05:33:13Z Mar 19 09:22:46 crc kubenswrapper[4835]: I0319 09:22:46.333825 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:46Z is after 2026-02-23T05:33:13Z Mar 19 09:22:46 crc kubenswrapper[4835]: W0319 09:22:46.378481 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:46Z is after 2026-02-23T05:33:13Z Mar 19 09:22:46 crc kubenswrapper[4835]: E0319 09:22:46.378615 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:46Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 09:22:46 crc kubenswrapper[4835]: E0319 09:22:46.602016 4835 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 09:22:46 crc kubenswrapper[4835]: I0319 09:22:46.619815 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 09:22:46 crc kubenswrapper[4835]: I0319 09:22:46.620162 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:22:46 crc kubenswrapper[4835]: I0319 09:22:46.621873 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:22:46 crc kubenswrapper[4835]: I0319 09:22:46.621937 4835 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:22:46 crc kubenswrapper[4835]: I0319 09:22:46.621960 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:22:46 crc kubenswrapper[4835]: I0319 09:22:46.622930 4835 scope.go:117] "RemoveContainer" containerID="3e2d5a5ed4759d9904a8184bbafa6d50482b1f6f6e196f1947ae7837057210e2" Mar 19 09:22:46 crc kubenswrapper[4835]: E0319 09:22:46.623275 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 09:22:46 crc kubenswrapper[4835]: I0319 09:22:46.627869 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 09:22:47 crc kubenswrapper[4835]: I0319 09:22:47.331478 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:47Z is after 2026-02-23T05:33:13Z Mar 19 09:22:47 crc kubenswrapper[4835]: I0319 09:22:47.558550 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:22:47 crc kubenswrapper[4835]: I0319 09:22:47.559991 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:22:47 crc kubenswrapper[4835]: I0319 09:22:47.560058 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:22:47 crc kubenswrapper[4835]: 
I0319 09:22:47.560078 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:22:47 crc kubenswrapper[4835]: I0319 09:22:47.561246 4835 scope.go:117] "RemoveContainer" containerID="3e2d5a5ed4759d9904a8184bbafa6d50482b1f6f6e196f1947ae7837057210e2" Mar 19 09:22:47 crc kubenswrapper[4835]: E0319 09:22:47.561595 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 09:22:47 crc kubenswrapper[4835]: I0319 09:22:47.631189 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 09:22:47 crc kubenswrapper[4835]: I0319 09:22:47.789941 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:22:47 crc kubenswrapper[4835]: E0319 09:22:47.791071 4835 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:47Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 19 09:22:47 crc kubenswrapper[4835]: I0319 09:22:47.792329 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:22:47 crc kubenswrapper[4835]: I0319 09:22:47.792411 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:22:47 crc kubenswrapper[4835]: I0319 09:22:47.792430 4835 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:22:47 crc kubenswrapper[4835]: I0319 09:22:47.792474 4835 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 09:22:47 crc kubenswrapper[4835]: E0319 09:22:47.797441 4835 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:47Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 09:22:48 crc kubenswrapper[4835]: I0319 09:22:48.333412 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:48Z is after 2026-02-23T05:33:13Z Mar 19 09:22:48 crc kubenswrapper[4835]: I0319 09:22:48.561877 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:22:48 crc kubenswrapper[4835]: I0319 09:22:48.563962 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:22:48 crc kubenswrapper[4835]: I0319 09:22:48.564029 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:22:48 crc kubenswrapper[4835]: I0319 09:22:48.564048 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:22:48 crc kubenswrapper[4835]: I0319 09:22:48.565190 4835 scope.go:117] "RemoveContainer" containerID="3e2d5a5ed4759d9904a8184bbafa6d50482b1f6f6e196f1947ae7837057210e2" Mar 19 09:22:48 crc kubenswrapper[4835]: E0319 09:22:48.565519 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 09:22:48 crc kubenswrapper[4835]: I0319 09:22:48.836642 4835 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 09:22:48 crc kubenswrapper[4835]: I0319 09:22:48.836786 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 09:22:49 crc kubenswrapper[4835]: I0319 09:22:49.333052 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:49Z is after 2026-02-23T05:33:13Z Mar 19 09:22:49 crc kubenswrapper[4835]: I0319 09:22:49.763109 4835 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 09:22:49 crc kubenswrapper[4835]: E0319 09:22:49.773292 4835 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate 
signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:49Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 09:22:50 crc kubenswrapper[4835]: I0319 09:22:50.332364 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:50Z is after 2026-02-23T05:33:13Z Mar 19 09:22:50 crc kubenswrapper[4835]: W0319 09:22:50.463596 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:50Z is after 2026-02-23T05:33:13Z Mar 19 09:22:50 crc kubenswrapper[4835]: E0319 09:22:50.464075 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:50Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 09:22:50 crc kubenswrapper[4835]: W0319 09:22:50.581840 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-19T09:22:50Z is after 2026-02-23T05:33:13Z Mar 19 09:22:50 crc kubenswrapper[4835]: E0319 09:22:50.581959 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:50Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 09:22:51 crc kubenswrapper[4835]: I0319 09:22:51.054466 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 09:22:51 crc kubenswrapper[4835]: I0319 09:22:51.054718 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:22:51 crc kubenswrapper[4835]: I0319 09:22:51.056380 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:22:51 crc kubenswrapper[4835]: I0319 09:22:51.056464 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:22:51 crc kubenswrapper[4835]: I0319 09:22:51.056486 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:22:51 crc kubenswrapper[4835]: I0319 09:22:51.057622 4835 scope.go:117] "RemoveContainer" containerID="3e2d5a5ed4759d9904a8184bbafa6d50482b1f6f6e196f1947ae7837057210e2" Mar 19 09:22:51 crc kubenswrapper[4835]: E0319 09:22:51.058022 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 09:22:51 crc kubenswrapper[4835]: I0319 09:22:51.333256 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:51Z is after 2026-02-23T05:33:13Z Mar 19 09:22:51 crc kubenswrapper[4835]: E0319 09:22:51.381280 4835 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:51Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e33b2d0849970 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:26.326100336 +0000 UTC m=+1.174698973,LastTimestamp:2026-03-19 09:22:26.326100336 +0000 UTC m=+1.174698973,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:22:52 crc kubenswrapper[4835]: I0319 09:22:52.333204 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:52Z is after 2026-02-23T05:33:13Z Mar 19 09:22:53 crc kubenswrapper[4835]: I0319 09:22:53.333313 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:53Z is after 2026-02-23T05:33:13Z Mar 19 09:22:54 crc kubenswrapper[4835]: I0319 09:22:54.332783 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:54Z is after 2026-02-23T05:33:13Z Mar 19 09:22:54 crc kubenswrapper[4835]: E0319 09:22:54.797551 4835 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:54Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 19 09:22:54 crc kubenswrapper[4835]: I0319 09:22:54.797605 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:22:54 crc kubenswrapper[4835]: I0319 09:22:54.799256 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:22:54 crc kubenswrapper[4835]: I0319 09:22:54.799325 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:22:54 crc kubenswrapper[4835]: I0319 09:22:54.799351 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:22:54 crc kubenswrapper[4835]: I0319 09:22:54.799395 4835 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 09:22:54 crc kubenswrapper[4835]: E0319 09:22:54.805371 4835 kubelet_node_status.go:99] "Unable to register node 
with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:54Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 09:22:55 crc kubenswrapper[4835]: W0319 09:22:55.318996 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:55Z is after 2026-02-23T05:33:13Z Mar 19 09:22:55 crc kubenswrapper[4835]: E0319 09:22:55.319113 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:55Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 09:22:55 crc kubenswrapper[4835]: I0319 09:22:55.332878 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:55Z is after 2026-02-23T05:33:13Z Mar 19 09:22:56 crc kubenswrapper[4835]: I0319 09:22:56.332030 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:56Z is after 2026-02-23T05:33:13Z Mar 19 09:22:56 crc kubenswrapper[4835]: E0319 
09:22:56.602167 4835 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 09:22:56 crc kubenswrapper[4835]: W0319 09:22:56.834364 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:56Z is after 2026-02-23T05:33:13Z Mar 19 09:22:56 crc kubenswrapper[4835]: E0319 09:22:56.834492 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:56Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 09:22:57 crc kubenswrapper[4835]: I0319 09:22:57.332605 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:57Z is after 2026-02-23T05:33:13Z Mar 19 09:22:58 crc kubenswrapper[4835]: I0319 09:22:58.331561 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:58Z is after 2026-02-23T05:33:13Z Mar 19 09:22:58 crc kubenswrapper[4835]: I0319 09:22:58.835948 4835 patch_prober.go:28] interesting pod/kube-controller-manager-crc 
container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 09:22:58 crc kubenswrapper[4835]: I0319 09:22:58.836182 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 09:22:58 crc kubenswrapper[4835]: I0319 09:22:58.836300 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 09:22:58 crc kubenswrapper[4835]: I0319 09:22:58.836665 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:22:58 crc kubenswrapper[4835]: I0319 09:22:58.838537 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:22:58 crc kubenswrapper[4835]: I0319 09:22:58.838613 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:22:58 crc kubenswrapper[4835]: I0319 09:22:58.838635 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:22:58 crc kubenswrapper[4835]: I0319 09:22:58.839590 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"55c2419d07ea971ebef8f942b93109e8d28ae31053fd3fc750d478868fa36ab9"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container 
cluster-policy-controller failed startup probe, will be restarted" Mar 19 09:22:58 crc kubenswrapper[4835]: I0319 09:22:58.839912 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://55c2419d07ea971ebef8f942b93109e8d28ae31053fd3fc750d478868fa36ab9" gracePeriod=30 Mar 19 09:22:59 crc kubenswrapper[4835]: I0319 09:22:59.333713 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:59Z is after 2026-02-23T05:33:13Z Mar 19 09:22:59 crc kubenswrapper[4835]: I0319 09:22:59.600737 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 19 09:22:59 crc kubenswrapper[4835]: I0319 09:22:59.601535 4835 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="55c2419d07ea971ebef8f942b93109e8d28ae31053fd3fc750d478868fa36ab9" exitCode=255 Mar 19 09:22:59 crc kubenswrapper[4835]: I0319 09:22:59.601603 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"55c2419d07ea971ebef8f942b93109e8d28ae31053fd3fc750d478868fa36ab9"} Mar 19 09:22:59 crc kubenswrapper[4835]: I0319 09:22:59.601643 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fbde07abe8d8fcd4aa07c20eb6bc852eee70bf207f4ff2bab8ab1332261c92c4"} Mar 19 09:22:59 crc kubenswrapper[4835]: I0319 09:22:59.601828 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:22:59 crc kubenswrapper[4835]: I0319 09:22:59.603007 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:22:59 crc kubenswrapper[4835]: I0319 09:22:59.603039 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:22:59 crc kubenswrapper[4835]: I0319 09:22:59.603051 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:23:00 crc kubenswrapper[4835]: I0319 09:23:00.335487 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:23:00Z is after 2026-02-23T05:33:13Z Mar 19 09:23:01 crc kubenswrapper[4835]: I0319 09:23:01.332598 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:23:01Z is after 2026-02-23T05:33:13Z Mar 19 09:23:01 crc kubenswrapper[4835]: E0319 09:23:01.387746 4835 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:23:01Z is after 2026-02-23T05:33:13Z" 
event="&Event{ObjectMeta:{crc.189e33b2d0849970 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:26.326100336 +0000 UTC m=+1.174698973,LastTimestamp:2026-03-19 09:22:26.326100336 +0000 UTC m=+1.174698973,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:01 crc kubenswrapper[4835]: E0319 09:23:01.803369 4835 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:23:01Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 19 09:23:01 crc kubenswrapper[4835]: I0319 09:23:01.806536 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:01 crc kubenswrapper[4835]: I0319 09:23:01.808331 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:23:01 crc kubenswrapper[4835]: I0319 09:23:01.808379 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:23:01 crc kubenswrapper[4835]: I0319 09:23:01.808403 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:23:01 crc kubenswrapper[4835]: I0319 09:23:01.808446 4835 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 09:23:01 crc kubenswrapper[4835]: E0319 09:23:01.813391 4835 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:23:01Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 09:23:02 crc kubenswrapper[4835]: I0319 09:23:02.333000 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:23:02Z is after 2026-02-23T05:33:13Z Mar 19 09:23:03 crc kubenswrapper[4835]: I0319 09:23:03.331812 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:23:03Z is after 2026-02-23T05:33:13Z Mar 19 09:23:04 crc kubenswrapper[4835]: I0319 09:23:04.333603 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:23:04Z is after 2026-02-23T05:33:13Z Mar 19 09:23:04 crc kubenswrapper[4835]: W0319 09:23:04.545234 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:23:04Z is after 2026-02-23T05:33:13Z Mar 19 09:23:04 crc kubenswrapper[4835]: E0319 09:23:04.545347 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to 
watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:23:04Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 09:23:05 crc kubenswrapper[4835]: I0319 09:23:05.333252 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:23:05Z is after 2026-02-23T05:33:13Z Mar 19 09:23:05 crc kubenswrapper[4835]: I0319 09:23:05.401980 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:05 crc kubenswrapper[4835]: I0319 09:23:05.404210 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:23:05 crc kubenswrapper[4835]: I0319 09:23:05.404266 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:23:05 crc kubenswrapper[4835]: I0319 09:23:05.404284 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:23:05 crc kubenswrapper[4835]: I0319 09:23:05.405173 4835 scope.go:117] "RemoveContainer" containerID="3e2d5a5ed4759d9904a8184bbafa6d50482b1f6f6e196f1947ae7837057210e2" Mar 19 09:23:05 crc kubenswrapper[4835]: W0319 09:23:05.608979 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:23:05Z is after 
2026-02-23T05:33:13Z Mar 19 09:23:05 crc kubenswrapper[4835]: E0319 09:23:05.609092 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:23:05Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 09:23:05 crc kubenswrapper[4835]: I0319 09:23:05.835929 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 09:23:05 crc kubenswrapper[4835]: I0319 09:23:05.836152 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:05 crc kubenswrapper[4835]: I0319 09:23:05.837993 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:23:05 crc kubenswrapper[4835]: I0319 09:23:05.838034 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:23:05 crc kubenswrapper[4835]: I0319 09:23:05.838045 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:23:06 crc kubenswrapper[4835]: I0319 09:23:06.206442 4835 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 09:23:06 crc kubenswrapper[4835]: E0319 09:23:06.212712 4835 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T09:23:06Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 09:23:06 crc kubenswrapper[4835]: E0319 09:23:06.214052 4835 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 19 09:23:06 crc kubenswrapper[4835]: I0319 09:23:06.333029 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:23:06Z is after 2026-02-23T05:33:13Z Mar 19 09:23:06 crc kubenswrapper[4835]: E0319 09:23:06.602283 4835 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 09:23:06 crc kubenswrapper[4835]: I0319 09:23:06.624054 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 19 09:23:06 crc kubenswrapper[4835]: I0319 09:23:06.624852 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 19 09:23:06 crc kubenswrapper[4835]: I0319 09:23:06.626994 4835 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ad17e3b6834b269d4341ba6899e04a25a4eaa7408b2932b4f31e3f41299c74ad" exitCode=255 Mar 19 09:23:06 crc kubenswrapper[4835]: I0319 09:23:06.627045 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ad17e3b6834b269d4341ba6899e04a25a4eaa7408b2932b4f31e3f41299c74ad"} Mar 19 09:23:06 crc kubenswrapper[4835]: I0319 09:23:06.627126 4835 scope.go:117] "RemoveContainer" containerID="3e2d5a5ed4759d9904a8184bbafa6d50482b1f6f6e196f1947ae7837057210e2" Mar 19 09:23:06 crc kubenswrapper[4835]: I0319 09:23:06.627335 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:06 crc kubenswrapper[4835]: I0319 09:23:06.628539 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:23:06 crc kubenswrapper[4835]: I0319 09:23:06.628574 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:23:06 crc kubenswrapper[4835]: I0319 09:23:06.628583 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:23:06 crc kubenswrapper[4835]: I0319 09:23:06.629197 4835 scope.go:117] "RemoveContainer" containerID="ad17e3b6834b269d4341ba6899e04a25a4eaa7408b2932b4f31e3f41299c74ad" Mar 19 09:23:06 crc kubenswrapper[4835]: E0319 09:23:06.629352 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 09:23:07 crc kubenswrapper[4835]: I0319 09:23:07.334082 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-19T09:23:07Z is after 2026-02-23T05:33:13Z Mar 19 09:23:07 crc kubenswrapper[4835]: I0319 09:23:07.630076 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 19 09:23:07 crc kubenswrapper[4835]: I0319 09:23:07.630580 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 09:23:07 crc kubenswrapper[4835]: I0319 09:23:07.632571 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:07 crc kubenswrapper[4835]: I0319 09:23:07.633834 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:23:07 crc kubenswrapper[4835]: I0319 09:23:07.633878 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:23:07 crc kubenswrapper[4835]: I0319 09:23:07.633893 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:23:07 crc kubenswrapper[4835]: I0319 09:23:07.634557 4835 scope.go:117] "RemoveContainer" containerID="ad17e3b6834b269d4341ba6899e04a25a4eaa7408b2932b4f31e3f41299c74ad" Mar 19 09:23:07 crc kubenswrapper[4835]: E0319 09:23:07.634945 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 09:23:08 crc kubenswrapper[4835]: I0319 09:23:08.333496 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 09:23:08 crc kubenswrapper[4835]: I0319 09:23:08.334114 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:08 crc kubenswrapper[4835]: I0319 09:23:08.334708 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:23:08Z is after 2026-02-23T05:33:13Z Mar 19 09:23:08 crc kubenswrapper[4835]: I0319 09:23:08.336201 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:23:08 crc kubenswrapper[4835]: I0319 09:23:08.336236 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:23:08 crc kubenswrapper[4835]: I0319 09:23:08.336248 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:23:08 crc kubenswrapper[4835]: E0319 09:23:08.808960 4835 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:23:08Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 19 09:23:08 crc kubenswrapper[4835]: I0319 09:23:08.814377 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:08 crc kubenswrapper[4835]: I0319 09:23:08.816114 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:23:08 crc kubenswrapper[4835]: I0319 09:23:08.816194 4835 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:23:08 crc kubenswrapper[4835]: I0319 09:23:08.816217 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:23:08 crc kubenswrapper[4835]: I0319 09:23:08.816254 4835 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 09:23:08 crc kubenswrapper[4835]: E0319 09:23:08.821846 4835 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:23:08Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 09:23:08 crc kubenswrapper[4835]: I0319 09:23:08.836201 4835 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 09:23:08 crc kubenswrapper[4835]: I0319 09:23:08.836349 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 09:23:09 crc kubenswrapper[4835]: I0319 09:23:09.333185 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-19T09:23:09Z is after 2026-02-23T05:33:13Z Mar 19 09:23:09 crc kubenswrapper[4835]: W0319 09:23:09.898443 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 19 09:23:09 crc kubenswrapper[4835]: E0319 09:23:09.898540 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 19 09:23:10 crc kubenswrapper[4835]: I0319 09:23:10.335105 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:23:10 crc kubenswrapper[4835]: W0319 09:23:10.792220 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 19 09:23:10 crc kubenswrapper[4835]: E0319 09:23:10.792292 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 19 09:23:11 crc kubenswrapper[4835]: I0319 09:23:11.054243 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 09:23:11 crc kubenswrapper[4835]: I0319 09:23:11.054395 4835 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Mar 19 09:23:11 crc kubenswrapper[4835]: I0319 09:23:11.055831 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:23:11 crc kubenswrapper[4835]: I0319 09:23:11.055865 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:23:11 crc kubenswrapper[4835]: I0319 09:23:11.055877 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:23:11 crc kubenswrapper[4835]: I0319 09:23:11.056438 4835 scope.go:117] "RemoveContainer" containerID="ad17e3b6834b269d4341ba6899e04a25a4eaa7408b2932b4f31e3f41299c74ad" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.056674 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 09:23:11 crc kubenswrapper[4835]: I0319 09:23:11.332866 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.396482 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e33b2d0849970 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:26.326100336 +0000 UTC m=+1.174698973,LastTimestamp:2026-03-19 09:22:26.326100336 +0000 UTC m=+1.174698973,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.403181 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e33b2d4957300 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:26.394313472 +0000 UTC m=+1.242912059,LastTimestamp:2026-03-19 09:22:26.394313472 +0000 UTC m=+1.242912059,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.409999 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e33b2d495a7b2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: 
NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:26.394326962 +0000 UTC m=+1.242925549,LastTimestamp:2026-03-19 09:22:26.394326962 +0000 UTC m=+1.242925549,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.416410 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e33b2d495c65d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:26.394334813 +0000 UTC m=+1.242933400,LastTimestamp:2026-03-19 09:22:26.394334813 +0000 UTC m=+1.242933400,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.422807 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e33b2e068caad default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:26.592713389 +0000 UTC m=+1.441311986,LastTimestamp:2026-03-19 09:22:26.592713389 +0000 UTC 
m=+1.441311986,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.429979 4835 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e33b2d4957300\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e33b2d4957300 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:26.394313472 +0000 UTC m=+1.242912059,LastTimestamp:2026-03-19 09:22:26.694798082 +0000 UTC m=+1.543396669,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.436498 4835 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e33b2d495a7b2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e33b2d495a7b2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:26.394326962 +0000 UTC m=+1.242925549,LastTimestamp:2026-03-19 09:22:26.694825962 +0000 UTC m=+1.543424549,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 
19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.443215 4835 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e33b2d495c65d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e33b2d495c65d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:26.394334813 +0000 UTC m=+1.242933400,LastTimestamp:2026-03-19 09:22:26.694835883 +0000 UTC m=+1.543434470,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.449834 4835 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e33b2d4957300\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e33b2d4957300 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:26.394313472 +0000 UTC m=+1.242912059,LastTimestamp:2026-03-19 09:22:26.702807665 +0000 UTC m=+1.551406262,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.456237 4835 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e33b2d495a7b2\" is 
forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e33b2d495a7b2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:26.394326962 +0000 UTC m=+1.242925549,LastTimestamp:2026-03-19 09:22:26.702830246 +0000 UTC m=+1.551428843,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.463181 4835 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e33b2d495c65d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e33b2d495c65d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:26.394334813 +0000 UTC m=+1.242933400,LastTimestamp:2026-03-19 09:22:26.702846126 +0000 UTC m=+1.551444723,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.470174 4835 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e33b2d4957300\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e33b2d4957300 
default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:26.394313472 +0000 UTC m=+1.242912059,LastTimestamp:2026-03-19 09:22:26.703732382 +0000 UTC m=+1.552330969,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.477107 4835 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e33b2d495a7b2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e33b2d495a7b2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:26.394326962 +0000 UTC m=+1.242925549,LastTimestamp:2026-03-19 09:22:26.703771103 +0000 UTC m=+1.552369690,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.484264 4835 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e33b2d495c65d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e33b2d495c65d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:26.394334813 +0000 UTC m=+1.242933400,LastTimestamp:2026-03-19 09:22:26.703781333 +0000 UTC m=+1.552379920,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.491363 4835 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e33b2d4957300\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e33b2d4957300 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:26.394313472 +0000 UTC m=+1.242912059,LastTimestamp:2026-03-19 09:22:26.704452473 +0000 UTC m=+1.553051070,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.497909 4835 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e33b2d495a7b2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e33b2d495a7b2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:26.394326962 +0000 UTC m=+1.242925549,LastTimestamp:2026-03-19 09:22:26.704503885 +0000 UTC m=+1.553102482,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.504721 4835 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e33b2d495c65d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e33b2d495c65d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:26.394334813 +0000 UTC m=+1.242933400,LastTimestamp:2026-03-19 09:22:26.704519175 +0000 UTC m=+1.553117772,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.511388 4835 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e33b2d4957300\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e33b2d4957300 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:26.394313472 +0000 UTC 
m=+1.242912059,LastTimestamp:2026-03-19 09:22:26.705489163 +0000 UTC m=+1.554087790,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.519791 4835 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e33b2d495a7b2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e33b2d495a7b2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:26.394326962 +0000 UTC m=+1.242925549,LastTimestamp:2026-03-19 09:22:26.705577216 +0000 UTC m=+1.554175843,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.527233 4835 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e33b2d495c65d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e33b2d495c65d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:26.394334813 +0000 UTC m=+1.242933400,LastTimestamp:2026-03-19 09:22:26.705601296 +0000 UTC m=+1.554199923,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.534250 4835 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e33b2d4957300\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e33b2d4957300 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:26.394313472 +0000 UTC m=+1.242912059,LastTimestamp:2026-03-19 09:22:26.70573377 +0000 UTC m=+1.554332367,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.541607 4835 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e33b2d495a7b2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e33b2d495a7b2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:26.394326962 +0000 UTC m=+1.242925549,LastTimestamp:2026-03-19 09:22:26.705763391 +0000 UTC m=+1.554361988,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.548721 4835 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e33b2d495c65d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e33b2d495c65d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:26.394334813 +0000 UTC m=+1.242933400,LastTimestamp:2026-03-19 09:22:26.705774081 +0000 UTC m=+1.554372678,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.555865 4835 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e33b2d4957300\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e33b2d4957300 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:26.394313472 +0000 UTC m=+1.242912059,LastTimestamp:2026-03-19 09:22:26.707713768 +0000 UTC m=+1.556312365,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.562466 4835 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e33b2d4957300\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" 
in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e33b2d4957300 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:26.394313472 +0000 UTC m=+1.242912059,LastTimestamp:2026-03-19 09:22:26.707725558 +0000 UTC m=+1.556324185,Count:9,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.570442 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e33b2feb1e6d5 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:27.100821205 +0000 UTC m=+1.949419822,LastTimestamp:2026-03-19 09:22:27.100821205 +0000 UTC m=+1.949419822,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.576598 4835 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e33b2febe1715 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:27.101619989 +0000 UTC m=+1.950218616,LastTimestamp:2026-03-19 09:22:27.101619989 +0000 UTC m=+1.950218616,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.586682 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e33b2ffd859b5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:27.120118197 +0000 UTC m=+1.968716814,LastTimestamp:2026-03-19 09:22:27.120118197 +0000 UTC 
m=+1.968716814,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.593944 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e33b3018e9abd openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:27.148839613 +0000 UTC m=+1.997438240,LastTimestamp:2026-03-19 09:22:27.148839613 +0000 UTC m=+1.997438240,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.600306 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e33b301f523dd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:27.155559389 +0000 UTC m=+2.004158006,LastTimestamp:2026-03-19 09:22:27.155559389 +0000 UTC m=+2.004158006,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.606562 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e33b3251c4ad8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:27.745327832 +0000 UTC m=+2.593926419,LastTimestamp:2026-03-19 09:22:27.745327832 +0000 UTC m=+2.593926419,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.613221 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e33b3251d1352 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:27.745379154 +0000 UTC m=+2.593977741,LastTimestamp:2026-03-19 09:22:27.745379154 +0000 UTC m=+2.593977741,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.619498 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e33b325245e3f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:27.745857087 +0000 UTC m=+2.594455684,LastTimestamp:2026-03-19 09:22:27.745857087 +0000 UTC m=+2.594455684,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.626797 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e33b3252557de openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:27.74592099 +0000 UTC m=+2.594519577,LastTimestamp:2026-03-19 09:22:27.74592099 +0000 UTC m=+2.594519577,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.633812 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e33b3252a41df openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:27.746243039 +0000 UTC m=+2.594841626,LastTimestamp:2026-03-19 09:22:27.746243039 +0000 UTC m=+2.594841626,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.639071 4835 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e33b325ffa297 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:27.760226967 +0000 UTC m=+2.608825554,LastTimestamp:2026-03-19 09:22:27.760226967 +0000 UTC m=+2.608825554,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.645250 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e33b325ffc097 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:27.760234647 +0000 UTC m=+2.608833244,LastTimestamp:2026-03-19 09:22:27.760234647 +0000 UTC m=+2.608833244,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" 
Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.652075 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e33b3260ee0f7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:27.761225975 +0000 UTC m=+2.609824572,LastTimestamp:2026-03-19 09:22:27.761225975 +0000 UTC m=+2.609824572,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.659030 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e33b326116dae openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:27.76139307 +0000 UTC m=+2.609991657,LastTimestamp:2026-03-19 09:22:27.76139307 +0000 UTC m=+2.609991657,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 
09:23:11.665334 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e33b326161af1 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:27.761699569 +0000 UTC m=+2.610298156,LastTimestamp:2026-03-19 09:22:27.761699569 +0000 UTC m=+2.610298156,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.671458 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e33b32622ba17 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:27.762526743 +0000 UTC 
m=+2.611125360,LastTimestamp:2026-03-19 09:22:27.762526743 +0000 UTC m=+2.611125360,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.677991 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e33b336662621 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:28.035380769 +0000 UTC m=+2.883979356,LastTimestamp:2026-03-19 09:22:28.035380769 +0000 UTC m=+2.883979356,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.684033 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e33b33709ec3b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:28.046113851 +0000 UTC m=+2.894712458,LastTimestamp:2026-03-19 09:22:28.046113851 +0000 UTC m=+2.894712458,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.691217 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e33b33718f20c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:28.04709838 +0000 UTC m=+2.895696967,LastTimestamp:2026-03-19 09:22:28.04709838 +0000 UTC m=+2.895696967,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.696604 4835 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e33b349a405d7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:28.358202839 +0000 UTC m=+3.206801426,LastTimestamp:2026-03-19 09:22:28.358202839 +0000 UTC m=+3.206801426,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.702276 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e33b34a72ce25 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:28.371754533 +0000 UTC m=+3.220353120,LastTimestamp:2026-03-19 09:22:28.371754533 +0000 UTC m=+3.220353120,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.706350 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e33b34a90d14d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:28.373721421 +0000 UTC m=+3.222320048,LastTimestamp:2026-03-19 09:22:28.373721421 +0000 UTC m=+3.222320048,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.711700 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e33b34d512de1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:28.419882465 +0000 UTC m=+3.268481062,LastTimestamp:2026-03-19 09:22:28.419882465 +0000 UTC m=+3.268481062,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.719492 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e33b34de5d139 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:28.429623609 +0000 UTC m=+3.278222206,LastTimestamp:2026-03-19 09:22:28.429623609 +0000 UTC m=+3.278222206,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.723157 4835 event.go:359] "Server rejected event 
(will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e33b34dec95ed openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:28.430067181 +0000 UTC m=+3.278665798,LastTimestamp:2026-03-19 09:22:28.430067181 +0000 UTC m=+3.278665798,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.728776 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e33b34e103222 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:28.43240093 +0000 UTC 
m=+3.280999527,LastTimestamp:2026-03-19 09:22:28.43240093 +0000 UTC m=+3.280999527,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.735509 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e33b356a71dbf openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:28.576509375 +0000 UTC m=+3.425107952,LastTimestamp:2026-03-19 09:22:28.576509375 +0000 UTC m=+3.425107952,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.742453 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e33b3574ff492 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:28.587574418 +0000 UTC m=+3.436173015,LastTimestamp:2026-03-19 09:22:28.587574418 +0000 UTC m=+3.436173015,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.749204 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e33b358d05eb3 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:28.612767411 +0000 UTC m=+3.461365998,LastTimestamp:2026-03-19 09:22:28.612767411 +0000 UTC m=+3.461365998,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.756515 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189e33b3597a5e36 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:28.623908406 +0000 UTC m=+3.472507013,LastTimestamp:2026-03-19 09:22:28.623908406 +0000 UTC m=+3.472507013,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.763074 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e33b359d6e4fe openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:28.629972222 +0000 UTC m=+3.478570809,LastTimestamp:2026-03-19 09:22:28.629972222 +0000 UTC m=+3.478570809,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.770423 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e33b359dfca03 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:28.630555139 +0000 UTC m=+3.479153746,LastTimestamp:2026-03-19 09:22:28.630555139 +0000 UTC m=+3.479153746,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.777391 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e33b359ec02d1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:28.631356113 +0000 UTC m=+3.479954700,LastTimestamp:2026-03-19 09:22:28.631356113 +0000 UTC m=+3.479954700,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.784814 
4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e33b35b1a0da7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:28.651150759 +0000 UTC m=+3.499749346,LastTimestamp:2026-03-19 09:22:28.651150759 +0000 UTC m=+3.499749346,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.792449 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e33b35b2b6d5c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:28.652289372 +0000 UTC m=+3.500887959,LastTimestamp:2026-03-19 09:22:28.652289372 +0000 UTC 
m=+3.500887959,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.800099 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e33b35b6a0d82 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:28.656393602 +0000 UTC m=+3.504992189,LastTimestamp:2026-03-19 09:22:28.656393602 +0000 UTC m=+3.504992189,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.804613 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e33b35bc2512a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:28.66217809 +0000 UTC 
m=+3.510776677,LastTimestamp:2026-03-19 09:22:28.66217809 +0000 UTC m=+3.510776677,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.811414 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e33b35d057a2d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:28.683356717 +0000 UTC m=+3.531955304,LastTimestamp:2026-03-19 09:22:28.683356717 +0000 UTC m=+3.531955304,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.816387 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e33b364ded423 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container 
kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:28.815041571 +0000 UTC m=+3.663640178,LastTimestamp:2026-03-19 09:22:28.815041571 +0000 UTC m=+3.663640178,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.823161 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e33b3659ae64f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:28.827366991 +0000 UTC m=+3.675965578,LastTimestamp:2026-03-19 09:22:28.827366991 +0000 UTC m=+3.675965578,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.828073 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e33b365a77c0a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:28.828191754 +0000 UTC m=+3.676790341,LastTimestamp:2026-03-19 09:22:28.828191754 +0000 UTC m=+3.676790341,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.833380 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e33b367407968 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:28.854995304 +0000 UTC m=+3.703593901,LastTimestamp:2026-03-19 09:22:28.854995304 +0000 UTC m=+3.703593901,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.839696 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e33b3682c473b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:28.870448955 +0000 UTC m=+3.719047542,LastTimestamp:2026-03-19 09:22:28.870448955 +0000 UTC m=+3.719047542,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.845283 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e33b3683cd58e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:28.871533966 +0000 UTC m=+3.720132573,LastTimestamp:2026-03-19 09:22:28.871533966 +0000 UTC m=+3.720132573,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.849814 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e33b3715d3469 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:29.024650345 +0000 UTC m=+3.873248952,LastTimestamp:2026-03-19 09:22:29.024650345 +0000 UTC m=+3.873248952,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.854357 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e33b372216e56 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:29.03751023 
+0000 UTC m=+3.886108827,LastTimestamp:2026-03-19 09:22:29.03751023 +0000 UTC m=+3.886108827,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.861392 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e33b373dcc6b0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:29.066565296 +0000 UTC m=+3.915163903,LastTimestamp:2026-03-19 09:22:29.066565296 +0000 UTC m=+3.915163903,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.867848 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e33b374cf4451 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started 
container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:29.082457169 +0000 UTC m=+3.931055766,LastTimestamp:2026-03-19 09:22:29.082457169 +0000 UTC m=+3.931055766,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.872067 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e33b374df1902 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:29.083494658 +0000 UTC m=+3.932093255,LastTimestamp:2026-03-19 09:22:29.083494658 +0000 UTC m=+3.932093255,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.878348 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e33b37ff4ff74 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:29.269479284 +0000 UTC m=+4.118077871,LastTimestamp:2026-03-19 09:22:29.269479284 +0000 UTC m=+4.118077871,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.886352 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e33b380ebc37d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:29.285651325 +0000 UTC m=+4.134249922,LastTimestamp:2026-03-19 09:22:29.285651325 +0000 UTC m=+4.134249922,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.893387 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189e33b380fd8c47 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:29.286816839 +0000 UTC m=+4.135415436,LastTimestamp:2026-03-19 09:22:29.286816839 +0000 UTC m=+4.135415436,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.901792 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e33b38a3a5a36 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:29.441796662 +0000 UTC m=+4.290395259,LastTimestamp:2026-03-19 09:22:29.441796662 +0000 UTC m=+4.290395259,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.905704 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e33b38d70adc1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:29.495688641 +0000 UTC m=+4.344287228,LastTimestamp:2026-03-19 09:22:29.495688641 +0000 UTC m=+4.344287228,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.909947 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e33b38e6ad475 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:29.512082549 +0000 UTC m=+4.360681146,LastTimestamp:2026-03-19 
09:22:29.512082549 +0000 UTC m=+4.360681146,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.917104 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e33b3962bbee2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:29.642165986 +0000 UTC m=+4.490764573,LastTimestamp:2026-03-19 09:22:29.642165986 +0000 UTC m=+4.490764573,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.924500 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e33b3974e2638 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:29.66119788 +0000 UTC m=+4.509796477,LastTimestamp:2026-03-19 09:22:29.66119788 +0000 UTC 
m=+4.509796477,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.932114 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e33b3c7d070c1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:30.475043009 +0000 UTC m=+5.323641626,LastTimestamp:2026-03-19 09:22:30.475043009 +0000 UTC m=+5.323641626,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.940880 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e33b3d7a12e1f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:30.740381215 +0000 UTC 
m=+5.588979812,LastTimestamp:2026-03-19 09:22:30.740381215 +0000 UTC m=+5.588979812,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.947947 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e33b3d840c4ca openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:30.75084001 +0000 UTC m=+5.599438607,LastTimestamp:2026-03-19 09:22:30.75084001 +0000 UTC m=+5.599438607,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.954806 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e33b3d85095db openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:30.751876571 +0000 UTC m=+5.600475168,LastTimestamp:2026-03-19 09:22:30.751876571 +0000 UTC m=+5.600475168,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.961873 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e33b3e5ce5b2b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:30.978222891 +0000 UTC m=+5.826821478,LastTimestamp:2026-03-19 09:22:30.978222891 +0000 UTC m=+5.826821478,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.968829 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e33b3e6bdd7d4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:30.993917908 +0000 UTC 
m=+5.842516495,LastTimestamp:2026-03-19 09:22:30.993917908 +0000 UTC m=+5.842516495,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.975628 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e33b3e6cc1cbf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:30.994853055 +0000 UTC m=+5.843451682,LastTimestamp:2026-03-19 09:22:30.994853055 +0000 UTC m=+5.843451682,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.982882 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e33b3f605d6e7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container 
etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:31.250294503 +0000 UTC m=+6.098893100,LastTimestamp:2026-03-19 09:22:31.250294503 +0000 UTC m=+6.098893100,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.989569 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e33b3f6d5fc8e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:31.26393563 +0000 UTC m=+6.112534227,LastTimestamp:2026-03-19 09:22:31.26393563 +0000 UTC m=+6.112534227,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:11 crc kubenswrapper[4835]: E0319 09:23:11.996707 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e33b3f6e99648 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:31.265220168 +0000 UTC m=+6.113818765,LastTimestamp:2026-03-19 09:22:31.265220168 +0000 UTC m=+6.113818765,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:12 crc kubenswrapper[4835]: E0319 09:23:12.003774 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e33b403533046 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:31.473467462 +0000 UTC m=+6.322066059,LastTimestamp:2026-03-19 09:22:31.473467462 +0000 UTC m=+6.322066059,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:12 crc kubenswrapper[4835]: E0319 09:23:12.010605 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e33b40450aa22 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:31.490079266 +0000 UTC m=+6.338677863,LastTimestamp:2026-03-19 09:22:31.490079266 +0000 UTC m=+6.338677863,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:12 crc kubenswrapper[4835]: E0319 09:23:12.017320 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e33b4045e9840 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:31.490992192 +0000 UTC m=+6.339590789,LastTimestamp:2026-03-19 09:22:31.490992192 +0000 UTC m=+6.339590789,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:12 crc kubenswrapper[4835]: E0319 09:23:12.023515 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e33b4114b51fb openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:31.707832827 +0000 UTC m=+6.556431454,LastTimestamp:2026-03-19 09:22:31.707832827 +0000 UTC m=+6.556431454,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:12 crc kubenswrapper[4835]: E0319 09:23:12.029728 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e33b41251c16c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:31.725031788 +0000 UTC m=+6.573630415,LastTimestamp:2026-03-19 09:22:31.725031788 +0000 UTC m=+6.573630415,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:12 crc kubenswrapper[4835]: E0319 09:23:12.037855 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 19 09:23:12 crc kubenswrapper[4835]: &Event{ObjectMeta:{kube-controller-manager-crc.189e33b5ba29afc0 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 19 09:23:12 crc kubenswrapper[4835]: body: Mar 19 09:23:12 crc kubenswrapper[4835]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:38.835945408 +0000 UTC m=+13.684544035,LastTimestamp:2026-03-19 09:22:38.835945408 +0000 UTC m=+13.684544035,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 09:23:12 crc kubenswrapper[4835]: > Mar 19 09:23:12 crc kubenswrapper[4835]: E0319 09:23:12.043916 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e33b5ba2b10b1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:38.836035761 +0000 UTC m=+13.684634388,LastTimestamp:2026-03-19 09:22:38.836035761 +0000 UTC 
m=+13.684634388,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:12 crc kubenswrapper[4835]: E0319 09:23:12.051057 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 19 09:23:12 crc kubenswrapper[4835]: &Event{ObjectMeta:{kube-apiserver-crc.189e33b63e77121e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Liveness probe error: Get "https://192.168.126.11:17697/healthz": dial tcp 192.168.126.11:17697: connect: connection refused Mar 19 09:23:12 crc kubenswrapper[4835]: body: Mar 19 09:23:12 crc kubenswrapper[4835]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:41.055609374 +0000 UTC m=+15.904207991,LastTimestamp:2026-03-19 09:22:41.055609374 +0000 UTC m=+15.904207991,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 09:23:12 crc kubenswrapper[4835]: > Mar 19 09:23:12 crc kubenswrapper[4835]: E0319 09:23:12.058444 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e33b63e781080 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Liveness probe failed: Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:41.055674496 +0000 UTC m=+15.904273123,LastTimestamp:2026-03-19 09:22:41.055674496 +0000 UTC m=+15.904273123,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:12 crc kubenswrapper[4835]: E0319 09:23:12.065770 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 19 09:23:12 crc kubenswrapper[4835]: &Event{ObjectMeta:{kube-apiserver-crc.189e33b65269ff26 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 19 09:23:12 crc kubenswrapper[4835]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 19 09:23:12 crc kubenswrapper[4835]: Mar 19 09:23:12 crc kubenswrapper[4835]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:41.39029687 +0000 UTC m=+16.238895467,LastTimestamp:2026-03-19 09:22:41.39029687 +0000 UTC 
m=+16.238895467,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 09:23:12 crc kubenswrapper[4835]: > Mar 19 09:23:12 crc kubenswrapper[4835]: E0319 09:23:12.071424 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e33b6526ac2c7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:41.390346951 +0000 UTC m=+16.238945558,LastTimestamp:2026-03-19 09:22:41.390346951 +0000 UTC m=+16.238945558,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:12 crc kubenswrapper[4835]: E0319 09:23:12.078257 4835 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e33b65269ff26\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 19 09:23:12 crc kubenswrapper[4835]: &Event{ObjectMeta:{kube-apiserver-crc.189e33b65269ff26 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe 
error: HTTP probe failed with statuscode: 403 Mar 19 09:23:12 crc kubenswrapper[4835]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 19 09:23:12 crc kubenswrapper[4835]: Mar 19 09:23:12 crc kubenswrapper[4835]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:41.39029687 +0000 UTC m=+16.238895467,LastTimestamp:2026-03-19 09:22:41.401665131 +0000 UTC m=+16.250263718,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 09:23:12 crc kubenswrapper[4835]: > Mar 19 09:23:12 crc kubenswrapper[4835]: E0319 09:23:12.083715 4835 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e33b6526ac2c7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e33b6526ac2c7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:41.390346951 +0000 UTC m=+16.238945558,LastTimestamp:2026-03-19 09:22:41.401700762 +0000 UTC m=+16.250299349,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:12 crc kubenswrapper[4835]: E0319 09:23:12.089093 4835 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-apiserver-crc.189e33b380fd8c47\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e33b380fd8c47 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:29.286816839 +0000 UTC m=+4.135415436,LastTimestamp:2026-03-19 09:22:41.523269442 +0000 UTC m=+16.371868039,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:12 crc kubenswrapper[4835]: E0319 09:23:12.098122 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 19 09:23:12 crc kubenswrapper[4835]: &Event{ObjectMeta:{kube-controller-manager-crc.189e33b80e415e18 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting 
headers) Mar 19 09:23:12 crc kubenswrapper[4835]: body: Mar 19 09:23:12 crc kubenswrapper[4835]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:48.836718104 +0000 UTC m=+23.685316691,LastTimestamp:2026-03-19 09:22:48.836718104 +0000 UTC m=+23.685316691,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 09:23:12 crc kubenswrapper[4835]: > Mar 19 09:23:12 crc kubenswrapper[4835]: E0319 09:23:12.103207 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e33b80e43213d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:48.836833597 +0000 UTC m=+23.685432184,LastTimestamp:2026-03-19 09:22:48.836833597 +0000 UTC m=+23.685432184,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:12 crc kubenswrapper[4835]: E0319 09:23:12.108281 4835 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e33b80e415e18\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event=< Mar 19 09:23:12 crc kubenswrapper[4835]: &Event{ObjectMeta:{kube-controller-manager-crc.189e33b80e415e18 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 19 09:23:12 crc kubenswrapper[4835]: body: Mar 19 09:23:12 crc kubenswrapper[4835]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:48.836718104 +0000 UTC m=+23.685316691,LastTimestamp:2026-03-19 09:22:58.836115162 +0000 UTC m=+33.684713789,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 09:23:12 crc kubenswrapper[4835]: > Mar 19 09:23:12 crc kubenswrapper[4835]: E0319 09:23:12.114044 4835 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e33b80e43213d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e33b80e43213d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while 
waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:48.836833597 +0000 UTC m=+23.685432184,LastTimestamp:2026-03-19 09:22:58.836239786 +0000 UTC m=+33.684838403,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:12 crc kubenswrapper[4835]: E0319 09:23:12.121907 4835 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e33ba627d7d11 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:58.839878929 +0000 UTC m=+33.688477546,LastTimestamp:2026-03-19 09:22:58.839878929 +0000 UTC m=+33.688477546,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:12 crc kubenswrapper[4835]: E0319 09:23:12.128210 4835 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e33b32622ba17\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e33b32622ba17 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:27.762526743 +0000 UTC m=+2.611125360,LastTimestamp:2026-03-19 09:22:58.968837915 +0000 UTC m=+33.817436532,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:12 crc kubenswrapper[4835]: E0319 09:23:12.135833 4835 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e33b336662621\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e33b336662621 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:28.035380769 +0000 UTC m=+2.883979356,LastTimestamp:2026-03-19 09:22:59.214293545 +0000 UTC m=+34.062892172,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:12 crc kubenswrapper[4835]: E0319 09:23:12.140697 4835 event.go:359] "Server 
rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e33b33709ec3b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e33b33709ec3b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:28.046113851 +0000 UTC m=+2.894712458,LastTimestamp:2026-03-19 09:22:59.224890876 +0000 UTC m=+34.073489493,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:12 crc kubenswrapper[4835]: E0319 09:23:12.148309 4835 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e33b80e415e18\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 19 09:23:12 crc kubenswrapper[4835]: &Event{ObjectMeta:{kube-controller-manager-crc.189e33b80e415e18 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting 
headers) Mar 19 09:23:12 crc kubenswrapper[4835]: body: Mar 19 09:23:12 crc kubenswrapper[4835]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:48.836718104 +0000 UTC m=+23.685316691,LastTimestamp:2026-03-19 09:23:08.836313134 +0000 UTC m=+43.684911761,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 09:23:12 crc kubenswrapper[4835]: > Mar 19 09:23:12 crc kubenswrapper[4835]: E0319 09:23:12.153257 4835 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e33b80e43213d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e33b80e43213d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:22:48.836833597 +0000 UTC m=+23.685432184,LastTimestamp:2026-03-19 09:23:08.836392546 +0000 UTC m=+43.684991173,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:23:12 crc kubenswrapper[4835]: I0319 09:23:12.333328 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API 
group "storage.k8s.io" at the cluster scope Mar 19 09:23:13 crc kubenswrapper[4835]: I0319 09:23:13.334463 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:23:14 crc kubenswrapper[4835]: I0319 09:23:14.334212 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:23:15 crc kubenswrapper[4835]: I0319 09:23:15.334674 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:23:15 crc kubenswrapper[4835]: E0319 09:23:15.817358 4835 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 09:23:15 crc kubenswrapper[4835]: I0319 09:23:15.822181 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:15 crc kubenswrapper[4835]: I0319 09:23:15.933512 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:23:15 crc kubenswrapper[4835]: I0319 09:23:15.933596 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:23:15 crc kubenswrapper[4835]: I0319 09:23:15.933625 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:23:15 crc 
kubenswrapper[4835]: I0319 09:23:15.933672 4835 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 09:23:15 crc kubenswrapper[4835]: E0319 09:23:15.942685 4835 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 19 09:23:16 crc kubenswrapper[4835]: I0319 09:23:16.333443 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:23:16 crc kubenswrapper[4835]: E0319 09:23:16.603175 4835 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 09:23:17 crc kubenswrapper[4835]: I0319 09:23:17.335214 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:23:18 crc kubenswrapper[4835]: I0319 09:23:18.335695 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:23:18 crc kubenswrapper[4835]: I0319 09:23:18.750485 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 09:23:18 crc kubenswrapper[4835]: I0319 09:23:18.750620 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:18 crc kubenswrapper[4835]: I0319 09:23:18.751539 4835 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 19 09:23:18 crc kubenswrapper[4835]: I0319 09:23:18.751572 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:23:18 crc kubenswrapper[4835]: I0319 09:23:18.751582 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:23:18 crc kubenswrapper[4835]: I0319 09:23:18.754357 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 09:23:18 crc kubenswrapper[4835]: I0319 09:23:18.942547 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:18 crc kubenswrapper[4835]: I0319 09:23:18.943793 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:23:18 crc kubenswrapper[4835]: I0319 09:23:18.943848 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:23:18 crc kubenswrapper[4835]: I0319 09:23:18.943871 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:23:19 crc kubenswrapper[4835]: I0319 09:23:19.334891 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:23:20 crc kubenswrapper[4835]: I0319 09:23:20.333119 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:23:21 crc kubenswrapper[4835]: I0319 09:23:21.333329 4835 csi_plugin.go:884] Failed to contact API server when 
waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:23:21 crc kubenswrapper[4835]: I0319 09:23:21.578966 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 09:23:21 crc kubenswrapper[4835]: I0319 09:23:21.579236 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:21 crc kubenswrapper[4835]: I0319 09:23:21.580883 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:23:21 crc kubenswrapper[4835]: I0319 09:23:21.580957 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:23:21 crc kubenswrapper[4835]: I0319 09:23:21.580977 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:23:22 crc kubenswrapper[4835]: I0319 09:23:22.333666 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:23:22 crc kubenswrapper[4835]: E0319 09:23:22.823485 4835 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 09:23:22 crc kubenswrapper[4835]: I0319 09:23:22.943799 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:22 crc kubenswrapper[4835]: I0319 09:23:22.945370 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 19 09:23:22 crc kubenswrapper[4835]: I0319 09:23:22.945439 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:23:22 crc kubenswrapper[4835]: I0319 09:23:22.945466 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:23:22 crc kubenswrapper[4835]: I0319 09:23:22.945512 4835 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 09:23:22 crc kubenswrapper[4835]: E0319 09:23:22.953642 4835 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 19 09:23:23 crc kubenswrapper[4835]: I0319 09:23:23.334537 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:23:23 crc kubenswrapper[4835]: I0319 09:23:23.401958 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:23 crc kubenswrapper[4835]: I0319 09:23:23.403421 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:23:23 crc kubenswrapper[4835]: I0319 09:23:23.403484 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:23:23 crc kubenswrapper[4835]: I0319 09:23:23.403518 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:23:23 crc kubenswrapper[4835]: I0319 09:23:23.404417 4835 scope.go:117] "RemoveContainer" containerID="ad17e3b6834b269d4341ba6899e04a25a4eaa7408b2932b4f31e3f41299c74ad" Mar 19 09:23:23 crc kubenswrapper[4835]: E0319 
09:23:23.404695 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 09:23:24 crc kubenswrapper[4835]: I0319 09:23:24.331690 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:23:25 crc kubenswrapper[4835]: I0319 09:23:25.335165 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:23:26 crc kubenswrapper[4835]: I0319 09:23:26.334769 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:23:26 crc kubenswrapper[4835]: E0319 09:23:26.604126 4835 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 09:23:27 crc kubenswrapper[4835]: I0319 09:23:27.336482 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:23:28 crc kubenswrapper[4835]: I0319 09:23:28.335257 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:23:29 crc kubenswrapper[4835]: I0319 09:23:29.336234 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:23:29 crc kubenswrapper[4835]: E0319 09:23:29.831564 4835 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 09:23:29 crc kubenswrapper[4835]: I0319 09:23:29.954559 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:29 crc kubenswrapper[4835]: I0319 09:23:29.956147 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:23:29 crc kubenswrapper[4835]: I0319 09:23:29.956198 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:23:29 crc kubenswrapper[4835]: I0319 09:23:29.956213 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:23:29 crc kubenswrapper[4835]: I0319 09:23:29.956252 4835 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 09:23:29 crc kubenswrapper[4835]: E0319 09:23:29.962775 4835 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 19 09:23:30 crc kubenswrapper[4835]: I0319 09:23:30.336091 4835 csi_plugin.go:884] Failed to contact API server when 
waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:23:31 crc kubenswrapper[4835]: I0319 09:23:31.335818 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:23:32 crc kubenswrapper[4835]: I0319 09:23:32.332231 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:23:33 crc kubenswrapper[4835]: I0319 09:23:33.331784 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:23:34 crc kubenswrapper[4835]: I0319 09:23:34.331269 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:23:34 crc kubenswrapper[4835]: W0319 09:23:34.688838 4835 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 19 09:23:34 crc kubenswrapper[4835]: E0319 09:23:34.689733 4835 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" 
logger="UnhandledError" Mar 19 09:23:35 crc kubenswrapper[4835]: I0319 09:23:35.333301 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:23:36 crc kubenswrapper[4835]: I0319 09:23:36.335059 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:23:36 crc kubenswrapper[4835]: I0319 09:23:36.401394 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:36 crc kubenswrapper[4835]: I0319 09:23:36.402781 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:23:36 crc kubenswrapper[4835]: I0319 09:23:36.402824 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:23:36 crc kubenswrapper[4835]: I0319 09:23:36.402860 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:23:36 crc kubenswrapper[4835]: I0319 09:23:36.403689 4835 scope.go:117] "RemoveContainer" containerID="ad17e3b6834b269d4341ba6899e04a25a4eaa7408b2932b4f31e3f41299c74ad" Mar 19 09:23:36 crc kubenswrapper[4835]: E0319 09:23:36.604950 4835 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 09:23:36 crc kubenswrapper[4835]: E0319 09:23:36.839330 4835 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace 
\"kube-node-lease\"" interval="7s" Mar 19 09:23:36 crc kubenswrapper[4835]: I0319 09:23:36.963734 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:36 crc kubenswrapper[4835]: I0319 09:23:36.965109 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:23:36 crc kubenswrapper[4835]: I0319 09:23:36.965192 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:23:36 crc kubenswrapper[4835]: I0319 09:23:36.965206 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:23:36 crc kubenswrapper[4835]: I0319 09:23:36.965236 4835 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 09:23:36 crc kubenswrapper[4835]: E0319 09:23:36.970463 4835 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 19 09:23:36 crc kubenswrapper[4835]: I0319 09:23:36.991204 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 19 09:23:36 crc kubenswrapper[4835]: I0319 09:23:36.992716 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af"} Mar 19 09:23:36 crc kubenswrapper[4835]: I0319 09:23:36.993179 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:36 crc kubenswrapper[4835]: I0319 09:23:36.994264 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 19 09:23:36 crc kubenswrapper[4835]: I0319 09:23:36.994309 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:23:36 crc kubenswrapper[4835]: I0319 09:23:36.994325 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:23:37 crc kubenswrapper[4835]: I0319 09:23:37.335876 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:23:37 crc kubenswrapper[4835]: I0319 09:23:37.630627 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 09:23:37 crc kubenswrapper[4835]: I0319 09:23:37.998053 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 19 09:23:37 crc kubenswrapper[4835]: I0319 09:23:37.998710 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 19 09:23:38 crc kubenswrapper[4835]: I0319 09:23:38.001618 4835 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af" exitCode=255 Mar 19 09:23:38 crc kubenswrapper[4835]: I0319 09:23:38.001671 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af"} Mar 19 09:23:38 crc kubenswrapper[4835]: I0319 09:23:38.001719 4835 scope.go:117] 
"RemoveContainer" containerID="ad17e3b6834b269d4341ba6899e04a25a4eaa7408b2932b4f31e3f41299c74ad" Mar 19 09:23:38 crc kubenswrapper[4835]: I0319 09:23:38.001790 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:38 crc kubenswrapper[4835]: I0319 09:23:38.002726 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:23:38 crc kubenswrapper[4835]: I0319 09:23:38.002770 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:23:38 crc kubenswrapper[4835]: I0319 09:23:38.002780 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:23:38 crc kubenswrapper[4835]: I0319 09:23:38.003528 4835 scope.go:117] "RemoveContainer" containerID="3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af" Mar 19 09:23:38 crc kubenswrapper[4835]: E0319 09:23:38.003888 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 09:23:38 crc kubenswrapper[4835]: I0319 09:23:38.215271 4835 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 09:23:38 crc kubenswrapper[4835]: I0319 09:23:38.234777 4835 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 19 09:23:38 crc kubenswrapper[4835]: I0319 09:23:38.334078 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User 
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:23:39 crc kubenswrapper[4835]: I0319 09:23:39.007568 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 19 09:23:39 crc kubenswrapper[4835]: I0319 09:23:39.012027 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:39 crc kubenswrapper[4835]: I0319 09:23:39.013399 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:23:39 crc kubenswrapper[4835]: I0319 09:23:39.013464 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:23:39 crc kubenswrapper[4835]: I0319 09:23:39.013481 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:23:39 crc kubenswrapper[4835]: I0319 09:23:39.014291 4835 scope.go:117] "RemoveContainer" containerID="3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af" Mar 19 09:23:39 crc kubenswrapper[4835]: E0319 09:23:39.014562 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 09:23:39 crc kubenswrapper[4835]: I0319 09:23:39.335705 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:23:40 crc 
kubenswrapper[4835]: I0319 09:23:40.346260 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:23:41 crc kubenswrapper[4835]: I0319 09:23:41.054825 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 09:23:41 crc kubenswrapper[4835]: I0319 09:23:41.055022 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:41 crc kubenswrapper[4835]: I0319 09:23:41.056151 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:23:41 crc kubenswrapper[4835]: I0319 09:23:41.056220 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:23:41 crc kubenswrapper[4835]: I0319 09:23:41.056243 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:23:41 crc kubenswrapper[4835]: I0319 09:23:41.057113 4835 scope.go:117] "RemoveContainer" containerID="3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af" Mar 19 09:23:41 crc kubenswrapper[4835]: E0319 09:23:41.057394 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 09:23:41 crc kubenswrapper[4835]: I0319 09:23:41.334306 4835 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: 
User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 09:23:41 crc kubenswrapper[4835]: I0319 09:23:41.441820 4835 csr.go:261] certificate signing request csr-l5lm5 is approved, waiting to be issued Mar 19 09:23:41 crc kubenswrapper[4835]: I0319 09:23:41.453568 4835 csr.go:257] certificate signing request csr-l5lm5 is issued Mar 19 09:23:41 crc kubenswrapper[4835]: I0319 09:23:41.540111 4835 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 19 09:23:42 crc kubenswrapper[4835]: I0319 09:23:42.130418 4835 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 19 09:23:42 crc kubenswrapper[4835]: I0319 09:23:42.455778 4835 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-23 07:05:55.468658713 +0000 UTC Mar 19 09:23:42 crc kubenswrapper[4835]: I0319 09:23:42.456927 4835 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 5973h42m13.01175538s for next certificate rotation Mar 19 09:23:43 crc kubenswrapper[4835]: I0319 09:23:43.970973 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:43 crc kubenswrapper[4835]: I0319 09:23:43.972417 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:23:43 crc kubenswrapper[4835]: I0319 09:23:43.972487 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:23:43 crc kubenswrapper[4835]: I0319 09:23:43.972512 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:23:43 crc kubenswrapper[4835]: I0319 09:23:43.972682 4835 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 09:23:43 crc 
kubenswrapper[4835]: I0319 09:23:43.984437 4835 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 19 09:23:43 crc kubenswrapper[4835]: I0319 09:23:43.984813 4835 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 19 09:23:43 crc kubenswrapper[4835]: E0319 09:23:43.984850 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 19 09:23:43 crc kubenswrapper[4835]: I0319 09:23:43.989177 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:23:43 crc kubenswrapper[4835]: I0319 09:23:43.989239 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:23:43 crc kubenswrapper[4835]: I0319 09:23:43.989261 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:23:43 crc kubenswrapper[4835]: I0319 09:23:43.989290 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:23:43 crc kubenswrapper[4835]: I0319 09:23:43.989313 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:23:43Z","lastTransitionTime":"2026-03-19T09:23:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:23:44 crc kubenswrapper[4835]: E0319 09:23:44.010607 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:23:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:23:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:23:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:23:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:23:44 crc kubenswrapper[4835]: I0319 09:23:44.021156 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:23:44 crc kubenswrapper[4835]: I0319 09:23:44.021214 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:23:44 crc kubenswrapper[4835]: I0319 09:23:44.021232 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:23:44 crc kubenswrapper[4835]: I0319 09:23:44.021255 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:23:44 crc kubenswrapper[4835]: I0319 09:23:44.021272 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:23:44Z","lastTransitionTime":"2026-03-19T09:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:23:44 crc kubenswrapper[4835]: E0319 09:23:44.042407 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:23:44 crc kubenswrapper[4835]: I0319 09:23:44.051690 4835 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 19 09:23:44 crc kubenswrapper[4835]: I0319 09:23:44.055680 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:23:44 crc kubenswrapper[4835]: I0319 09:23:44.055777 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:23:44 crc kubenswrapper[4835]: I0319 09:23:44.055802 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:23:44 crc kubenswrapper[4835]: I0319 09:23:44.055831 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:23:44 crc kubenswrapper[4835]: I0319 09:23:44.055853 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:23:44Z","lastTransitionTime":"2026-03-19T09:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:23:44 crc kubenswrapper[4835]: E0319 09:23:44.073559 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:23:44 crc kubenswrapper[4835]: I0319 09:23:44.077607 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:23:44 crc kubenswrapper[4835]: I0319 09:23:44.077635 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:23:44 crc kubenswrapper[4835]: I0319 09:23:44.077646 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:23:44 crc kubenswrapper[4835]: I0319 09:23:44.077668 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:23:44 crc kubenswrapper[4835]: I0319 09:23:44.077679 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:23:44Z","lastTransitionTime":"2026-03-19T09:23:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:23:44 crc kubenswrapper[4835]: E0319 09:23:44.094566 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:23:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:23:44 crc kubenswrapper[4835]: E0319 09:23:44.094814 4835 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 09:23:44 crc kubenswrapper[4835]: E0319 09:23:44.094851 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:44 crc kubenswrapper[4835]: E0319 09:23:44.195686 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:44 crc kubenswrapper[4835]: E0319 09:23:44.296616 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:44 crc kubenswrapper[4835]: E0319 09:23:44.397409 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:44 crc kubenswrapper[4835]: I0319 09:23:44.402042 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:44 crc kubenswrapper[4835]: I0319 09:23:44.403814 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:23:44 crc kubenswrapper[4835]: I0319 09:23:44.404043 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:23:44 crc kubenswrapper[4835]: I0319 09:23:44.404068 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:23:44 crc kubenswrapper[4835]: E0319 09:23:44.498073 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:44 crc kubenswrapper[4835]: E0319 09:23:44.598620 
4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:44 crc kubenswrapper[4835]: E0319 09:23:44.699377 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:44 crc kubenswrapper[4835]: E0319 09:23:44.800568 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:44 crc kubenswrapper[4835]: E0319 09:23:44.901492 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:45 crc kubenswrapper[4835]: E0319 09:23:45.002024 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:45 crc kubenswrapper[4835]: E0319 09:23:45.102572 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:45 crc kubenswrapper[4835]: E0319 09:23:45.202922 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:45 crc kubenswrapper[4835]: E0319 09:23:45.303262 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:45 crc kubenswrapper[4835]: E0319 09:23:45.403925 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:45 crc kubenswrapper[4835]: E0319 09:23:45.504407 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:45 crc kubenswrapper[4835]: E0319 09:23:45.605410 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:45 crc kubenswrapper[4835]: E0319 09:23:45.705757 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:45 crc 
kubenswrapper[4835]: E0319 09:23:45.806595 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:45 crc kubenswrapper[4835]: E0319 09:23:45.907517 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:46 crc kubenswrapper[4835]: E0319 09:23:46.008132 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:46 crc kubenswrapper[4835]: E0319 09:23:46.109129 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:46 crc kubenswrapper[4835]: E0319 09:23:46.210170 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:46 crc kubenswrapper[4835]: E0319 09:23:46.311861 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:46 crc kubenswrapper[4835]: E0319 09:23:46.413525 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:46 crc kubenswrapper[4835]: E0319 09:23:46.514189 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:46 crc kubenswrapper[4835]: E0319 09:23:46.605663 4835 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 09:23:46 crc kubenswrapper[4835]: E0319 09:23:46.614283 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:46 crc kubenswrapper[4835]: E0319 09:23:46.714615 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:46 crc kubenswrapper[4835]: E0319 09:23:46.814848 4835 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 19 09:23:46 crc kubenswrapper[4835]: E0319 09:23:46.915822 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:47 crc kubenswrapper[4835]: E0319 09:23:47.016206 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:47 crc kubenswrapper[4835]: E0319 09:23:47.116955 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:47 crc kubenswrapper[4835]: E0319 09:23:47.217251 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:47 crc kubenswrapper[4835]: E0319 09:23:47.318104 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:47 crc kubenswrapper[4835]: E0319 09:23:47.418558 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:47 crc kubenswrapper[4835]: E0319 09:23:47.519684 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:47 crc kubenswrapper[4835]: E0319 09:23:47.620822 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:47 crc kubenswrapper[4835]: E0319 09:23:47.721337 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:47 crc kubenswrapper[4835]: E0319 09:23:47.821942 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:47 crc kubenswrapper[4835]: E0319 09:23:47.922087 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:48 crc kubenswrapper[4835]: E0319 09:23:48.023071 4835 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:48 crc kubenswrapper[4835]: E0319 09:23:48.123877 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:48 crc kubenswrapper[4835]: E0319 09:23:48.224504 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:48 crc kubenswrapper[4835]: E0319 09:23:48.325264 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:48 crc kubenswrapper[4835]: E0319 09:23:48.426366 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:48 crc kubenswrapper[4835]: E0319 09:23:48.526544 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:48 crc kubenswrapper[4835]: E0319 09:23:48.627197 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:48 crc kubenswrapper[4835]: E0319 09:23:48.727869 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:48 crc kubenswrapper[4835]: E0319 09:23:48.828766 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:48 crc kubenswrapper[4835]: E0319 09:23:48.928945 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:49 crc kubenswrapper[4835]: E0319 09:23:49.029773 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:49 crc kubenswrapper[4835]: E0319 09:23:49.130548 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:49 crc 
kubenswrapper[4835]: E0319 09:23:49.231779 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:49 crc kubenswrapper[4835]: E0319 09:23:49.332155 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:49 crc kubenswrapper[4835]: E0319 09:23:49.433026 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:49 crc kubenswrapper[4835]: E0319 09:23:49.533989 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:49 crc kubenswrapper[4835]: E0319 09:23:49.634282 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:49 crc kubenswrapper[4835]: E0319 09:23:49.734560 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:49 crc kubenswrapper[4835]: E0319 09:23:49.835688 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:49 crc kubenswrapper[4835]: E0319 09:23:49.936597 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:50 crc kubenswrapper[4835]: E0319 09:23:50.037846 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:50 crc kubenswrapper[4835]: E0319 09:23:50.138993 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:50 crc kubenswrapper[4835]: E0319 09:23:50.239706 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:50 crc kubenswrapper[4835]: E0319 09:23:50.339901 4835 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 19 09:23:50 crc kubenswrapper[4835]: E0319 09:23:50.440635 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:50 crc kubenswrapper[4835]: E0319 09:23:50.541450 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:50 crc kubenswrapper[4835]: E0319 09:23:50.641949 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:50 crc kubenswrapper[4835]: E0319 09:23:50.742381 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:50 crc kubenswrapper[4835]: E0319 09:23:50.842866 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:50 crc kubenswrapper[4835]: E0319 09:23:50.944033 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:51 crc kubenswrapper[4835]: E0319 09:23:51.044301 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:51 crc kubenswrapper[4835]: E0319 09:23:51.145387 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:51 crc kubenswrapper[4835]: E0319 09:23:51.245833 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:51 crc kubenswrapper[4835]: E0319 09:23:51.346498 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:51 crc kubenswrapper[4835]: E0319 09:23:51.447585 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:51 crc kubenswrapper[4835]: E0319 09:23:51.548534 4835 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:51 crc kubenswrapper[4835]: E0319 09:23:51.648903 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:51 crc kubenswrapper[4835]: E0319 09:23:51.749383 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:51 crc kubenswrapper[4835]: E0319 09:23:51.849845 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:51 crc kubenswrapper[4835]: E0319 09:23:51.950224 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:52 crc kubenswrapper[4835]: E0319 09:23:52.050366 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:52 crc kubenswrapper[4835]: E0319 09:23:52.150831 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:52 crc kubenswrapper[4835]: E0319 09:23:52.251255 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:52 crc kubenswrapper[4835]: E0319 09:23:52.351627 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:52 crc kubenswrapper[4835]: E0319 09:23:52.452819 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:52 crc kubenswrapper[4835]: E0319 09:23:52.553832 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:52 crc kubenswrapper[4835]: E0319 09:23:52.653986 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:52 crc kubenswrapper[4835]: E0319 
09:23:52.754515 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:52 crc kubenswrapper[4835]: E0319 09:23:52.854913 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:52 crc kubenswrapper[4835]: E0319 09:23:52.955287 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:53 crc kubenswrapper[4835]: E0319 09:23:53.056190 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:53 crc kubenswrapper[4835]: E0319 09:23:53.157371 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:53 crc kubenswrapper[4835]: E0319 09:23:53.257562 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:53 crc kubenswrapper[4835]: E0319 09:23:53.358627 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:53 crc kubenswrapper[4835]: E0319 09:23:53.458946 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:53 crc kubenswrapper[4835]: E0319 09:23:53.560093 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:53 crc kubenswrapper[4835]: E0319 09:23:53.661205 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:53 crc kubenswrapper[4835]: E0319 09:23:53.762361 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:53 crc kubenswrapper[4835]: E0319 09:23:53.862798 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 
09:23:53 crc kubenswrapper[4835]: E0319 09:23:53.963815 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:54 crc kubenswrapper[4835]: E0319 09:23:54.064433 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:54 crc kubenswrapper[4835]: E0319 09:23:54.165222 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:54 crc kubenswrapper[4835]: E0319 09:23:54.265574 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:54 crc kubenswrapper[4835]: E0319 09:23:54.365726 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:54 crc kubenswrapper[4835]: E0319 09:23:54.371968 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 19 09:23:54 crc kubenswrapper[4835]: I0319 09:23:54.377362 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:23:54 crc kubenswrapper[4835]: I0319 09:23:54.377416 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:23:54 crc kubenswrapper[4835]: I0319 09:23:54.377433 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:23:54 crc kubenswrapper[4835]: I0319 09:23:54.377457 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:23:54 crc kubenswrapper[4835]: I0319 09:23:54.377475 4835 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:23:54Z","lastTransitionTime":"2026-03-19T09:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:23:54 crc kubenswrapper[4835]: E0319 09:23:54.393685 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:23:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:23:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:23:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:23:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:23:54 crc kubenswrapper[4835]: I0319 09:23:54.398920 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:23:54 crc kubenswrapper[4835]: I0319 09:23:54.399001 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:23:54 crc kubenswrapper[4835]: I0319 09:23:54.399027 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:23:54 crc kubenswrapper[4835]: I0319 09:23:54.399056 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:23:54 crc kubenswrapper[4835]: I0319 09:23:54.399080 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:23:54Z","lastTransitionTime":"2026-03-19T09:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:23:54 crc kubenswrapper[4835]: E0319 09:23:54.415160 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:23:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:23:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:23:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:23:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:23:54 crc kubenswrapper[4835]: I0319 09:23:54.420431 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:23:54 crc kubenswrapper[4835]: I0319 09:23:54.420482 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:23:54 crc kubenswrapper[4835]: I0319 09:23:54.420501 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:23:54 crc kubenswrapper[4835]: I0319 09:23:54.420521 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:23:54 crc kubenswrapper[4835]: I0319 09:23:54.420538 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:23:54Z","lastTransitionTime":"2026-03-19T09:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:23:54 crc kubenswrapper[4835]: E0319 09:23:54.436155 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:23:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:23:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:23:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:23:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:23:54 crc kubenswrapper[4835]: I0319 09:23:54.440455 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:23:54 crc kubenswrapper[4835]: I0319 09:23:54.440513 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:23:54 crc kubenswrapper[4835]: I0319 09:23:54.440533 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:23:54 crc kubenswrapper[4835]: I0319 09:23:54.440558 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:23:54 crc kubenswrapper[4835]: I0319 09:23:54.440575 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:23:54Z","lastTransitionTime":"2026-03-19T09:23:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:23:54 crc kubenswrapper[4835]: E0319 09:23:54.456699 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:23:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:23:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:23:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:23:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:23:54 crc kubenswrapper[4835]: E0319 09:23:54.456966 4835 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 09:23:54 crc kubenswrapper[4835]: E0319 09:23:54.466618 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:54 crc kubenswrapper[4835]: E0319 09:23:54.567390 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:54 crc kubenswrapper[4835]: E0319 09:23:54.667824 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:54 crc kubenswrapper[4835]: I0319 09:23:54.671047 4835 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 19 09:23:54 crc kubenswrapper[4835]: E0319 09:23:54.767923 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:54 crc kubenswrapper[4835]: E0319 09:23:54.868050 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:54 crc kubenswrapper[4835]: E0319 09:23:54.968940 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:55 crc kubenswrapper[4835]: E0319 09:23:55.069491 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:55 crc kubenswrapper[4835]: E0319 09:23:55.170320 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:55 crc kubenswrapper[4835]: E0319 09:23:55.271288 4835 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:55 crc kubenswrapper[4835]: E0319 09:23:55.372412 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:55 crc kubenswrapper[4835]: I0319 09:23:55.401505 4835 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 09:23:55 crc kubenswrapper[4835]: I0319 09:23:55.402632 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:23:55 crc kubenswrapper[4835]: I0319 09:23:55.402686 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:23:55 crc kubenswrapper[4835]: I0319 09:23:55.402708 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:23:55 crc kubenswrapper[4835]: I0319 09:23:55.406811 4835 scope.go:117] "RemoveContainer" containerID="3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af" Mar 19 09:23:55 crc kubenswrapper[4835]: E0319 09:23:55.408055 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 09:23:55 crc kubenswrapper[4835]: E0319 09:23:55.473231 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:55 crc kubenswrapper[4835]: E0319 09:23:55.573889 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:55 crc kubenswrapper[4835]: E0319 
09:23:55.674521 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:55 crc kubenswrapper[4835]: E0319 09:23:55.775376 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:55 crc kubenswrapper[4835]: E0319 09:23:55.876446 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:55 crc kubenswrapper[4835]: E0319 09:23:55.977390 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:56 crc kubenswrapper[4835]: E0319 09:23:56.078332 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:56 crc kubenswrapper[4835]: E0319 09:23:56.179042 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:56 crc kubenswrapper[4835]: E0319 09:23:56.280282 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:56 crc kubenswrapper[4835]: E0319 09:23:56.381304 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:56 crc kubenswrapper[4835]: E0319 09:23:56.481813 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:56 crc kubenswrapper[4835]: E0319 09:23:56.582734 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:56 crc kubenswrapper[4835]: E0319 09:23:56.606188 4835 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 09:23:56 crc kubenswrapper[4835]: E0319 09:23:56.684016 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"crc\" not found" Mar 19 09:23:56 crc kubenswrapper[4835]: E0319 09:23:56.784589 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:56 crc kubenswrapper[4835]: E0319 09:23:56.885490 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:56 crc kubenswrapper[4835]: E0319 09:23:56.986044 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:57 crc kubenswrapper[4835]: E0319 09:23:57.086183 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:57 crc kubenswrapper[4835]: E0319 09:23:57.186304 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:57 crc kubenswrapper[4835]: E0319 09:23:57.287343 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:57 crc kubenswrapper[4835]: E0319 09:23:57.388428 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:57 crc kubenswrapper[4835]: E0319 09:23:57.488817 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:57 crc kubenswrapper[4835]: E0319 09:23:57.589335 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:57 crc kubenswrapper[4835]: E0319 09:23:57.690103 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:57 crc kubenswrapper[4835]: E0319 09:23:57.790325 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:57 crc kubenswrapper[4835]: E0319 09:23:57.891072 4835 kubelet_node_status.go:503] "Error getting 
the current node from lister" err="node \"crc\" not found" Mar 19 09:23:57 crc kubenswrapper[4835]: E0319 09:23:57.992213 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:58 crc kubenswrapper[4835]: E0319 09:23:58.092898 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:58 crc kubenswrapper[4835]: E0319 09:23:58.193524 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:58 crc kubenswrapper[4835]: E0319 09:23:58.293681 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:58 crc kubenswrapper[4835]: E0319 09:23:58.394376 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:58 crc kubenswrapper[4835]: E0319 09:23:58.495040 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:58 crc kubenswrapper[4835]: E0319 09:23:58.596174 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:58 crc kubenswrapper[4835]: E0319 09:23:58.697556 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:58 crc kubenswrapper[4835]: E0319 09:23:58.798278 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:58 crc kubenswrapper[4835]: E0319 09:23:58.899414 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:59 crc kubenswrapper[4835]: E0319 09:23:58.999949 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:59 crc kubenswrapper[4835]: E0319 09:23:59.100336 4835 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:59 crc kubenswrapper[4835]: E0319 09:23:59.200917 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:59 crc kubenswrapper[4835]: E0319 09:23:59.302165 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:59 crc kubenswrapper[4835]: E0319 09:23:59.402803 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:59 crc kubenswrapper[4835]: E0319 09:23:59.504146 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:59 crc kubenswrapper[4835]: E0319 09:23:59.605176 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:59 crc kubenswrapper[4835]: E0319 09:23:59.705598 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:59 crc kubenswrapper[4835]: E0319 09:23:59.806070 4835 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 09:23:59 crc kubenswrapper[4835]: I0319 09:23:59.827045 4835 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 19 09:23:59 crc kubenswrapper[4835]: I0319 09:23:59.909544 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:23:59 crc kubenswrapper[4835]: I0319 09:23:59.909978 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:23:59 crc kubenswrapper[4835]: I0319 09:23:59.910053 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:23:59 crc 
kubenswrapper[4835]: I0319 09:23:59.910133 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:23:59 crc kubenswrapper[4835]: I0319 09:23:59.910222 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:23:59Z","lastTransitionTime":"2026-03-19T09:23:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.013971 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.014562 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.015108 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.015217 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.015302 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:00Z","lastTransitionTime":"2026-03-19T09:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.117355 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.117867 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.117945 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.118177 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.118262 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:00Z","lastTransitionTime":"2026-03-19T09:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.220523 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.220572 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.220583 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.220602 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.220617 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:00Z","lastTransitionTime":"2026-03-19T09:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.322558 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.322881 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.323016 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.323137 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.323254 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:00Z","lastTransitionTime":"2026-03-19T09:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.426441 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.426804 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.426919 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.427032 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.427139 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:00Z","lastTransitionTime":"2026-03-19T09:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.529996 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.530302 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.530447 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.530552 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.530643 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:00Z","lastTransitionTime":"2026-03-19T09:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.633932 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.633990 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.634007 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.634033 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.634050 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:00Z","lastTransitionTime":"2026-03-19T09:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.736645 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.736701 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.736720 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.736804 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.736833 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:00Z","lastTransitionTime":"2026-03-19T09:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.752157 4835 apiserver.go:52] "Watching apiserver" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.758238 4835 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.758821 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.759442 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.759713 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:00 crc kubenswrapper[4835]: E0319 09:24:00.759874 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.759966 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:24:00 crc kubenswrapper[4835]: E0319 09:24:00.760053 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.760605 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.760810 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.761239 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:00 crc kubenswrapper[4835]: E0319 09:24:00.761555 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.763585 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.767587 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.768164 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.768185 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.768232 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.768251 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.768126 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.768111 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.768637 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.785807 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.802154 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.822099 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.832852 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.840378 4835 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.840801 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.841003 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.841036 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.841067 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.841095 4835 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:00Z","lastTransitionTime":"2026-03-19T09:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.841823 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.855134 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.860406 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.860474 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" 
(UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.860514 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.860546 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.860582 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.860630 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.860674 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 19 09:24:00 crc 
kubenswrapper[4835]: I0319 09:24:00.860715 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.860777 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.860817 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.860856 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.860896 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.860940 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: 
\"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.860989 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.861023 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.861054 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.861085 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.861119 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 
09:24:00.861151 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.861184 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.861217 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.861252 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.861285 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.861318 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" 
(UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.861349 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.861354 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.861383 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.861580 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.861622 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: 
\"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.861657 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.861695 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.861727 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.861792 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.861828 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: 
\"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.861878 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.861923 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.862007 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.862053 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.862097 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.862147 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.862192 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.862235 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.862282 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.861385 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.862328 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.861615 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.862368 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.862403 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.862436 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 19 09:24:00 crc 
kubenswrapper[4835]: I0319 09:24:00.862469 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.862499 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.862531 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.862561 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.862591 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.862621 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.862652 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.862686 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.862719 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.862224 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.862788 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.862286 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.862523 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.862822 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.862869 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.862904 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.862939 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.862973 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.863006 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.863038 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.863069 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.863101 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.863131 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.863162 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 
09:24:00.863195 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.863239 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.863294 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.863329 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.863360 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.863394 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.863433 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.863465 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.863498 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.863532 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.863565 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.863598 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.863631 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.863663 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.863694 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.863725 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.863790 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.863827 4835 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.863867 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.863902 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.863935 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.863967 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.863998 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" 
(UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.864029 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.864064 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.864099 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.864134 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.865216 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.865333 4835 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.865393 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.865445 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.865484 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.865518 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.865677 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: 
\"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.865728 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.865792 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.865831 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.865867 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.865899 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: 
\"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.865963 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.865999 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.866040 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.866080 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.866125 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.866156 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.866190 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.866235 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.866298 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.866336 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.866374 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.866412 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.866449 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.866493 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.866542 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.866592 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.866641 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.866697 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.866788 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.866835 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.866873 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.866908 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 19 09:24:00 crc 
kubenswrapper[4835]: I0319 09:24:00.866942 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.866977 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.867011 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.867051 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.867104 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.867144 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod 
\"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.867182 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.867224 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.867277 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.867314 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.867350 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.867399 4835 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.867448 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.867524 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.867565 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.867605 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.867643 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod 
\"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.867677 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.867710 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.867794 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.867850 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.867903 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.867944 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.867978 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.868013 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.868050 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.868087 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.868125 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.868163 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.868201 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.868249 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.868301 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.868343 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.868377 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.868412 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.868452 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.868486 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.868528 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.868565 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.868601 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.868638 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.868675 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.868711 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.868784 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.868823 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" 
(UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.868861 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.868901 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.868937 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.868975 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.869013 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 
09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.869049 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.869087 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.869121 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.869161 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.869202 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.869251 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.869297 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.869335 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.869374 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.869412 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.869563 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 
19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.869606 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.869642 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.869678 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.869719 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.869820 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.869883 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.869950 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.870009 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.870056 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.870095 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.870139 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" 
(UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.870177 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.870232 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.870285 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.870324 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 
09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.870364 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.870402 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.870455 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.870493 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.870530 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.870607 4835 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.870634 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.870657 4835 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.870678 4835 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.870700 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.873026 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.874904 4835 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.876006 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.877855 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.862929 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). 
InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.880597 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.863480 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.863549 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.863697 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.863813 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.885332 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.864598 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.864638 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.885359 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.864680 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.864796 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.864932 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.865015 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.865209 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.865729 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.866010 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.866231 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.866272 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.866762 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.866806 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.867099 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.867143 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.867147 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.867197 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.867418 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.867436 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.867478 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.867995 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.868012 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.868010 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.868476 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.885494 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.885647 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.885690 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.868468 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.868531 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.868566 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.868626 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.869003 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.869425 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.869594 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.886350 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.886406 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.886178 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.886537 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.869686 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.869883 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.870071 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.870452 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.870829 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.870300 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.871222 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.871291 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.871410 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.872093 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.872452 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.886725 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.872574 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.872850 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.872959 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.873117 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.873258 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.873367 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.873677 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.873732 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.874051 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.874038 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.874274 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.874423 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.874716 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.874827 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.874838 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.874905 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: E0319 09:24:00.875079 4835 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 19 09:24:00 crc kubenswrapper[4835]: E0319 09:24:00.887037 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 09:24:01.386944751 +0000 UTC m=+96.235543338 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 19 09:24:00 crc kubenswrapper[4835]: E0319 09:24:00.887090 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:24:01.387060624 +0000 UTC m=+96.235659281 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.887113 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: E0319 09:24:00.875253 4835 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 19 09:24:00 crc kubenswrapper[4835]: E0319 09:24:00.887221 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 09:24:01.387213918 +0000 UTC m=+96.235812505 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.875947 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.876255 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.876633 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.876796 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.876993 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.877105 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.877132 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.876932 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.877353 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.877346 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.877524 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.887294 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.877789 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.877718 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.877821 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.878149 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.878236 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.878692 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.878732 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.878892 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.879108 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.879222 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.879891 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.880563 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.881028 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.881277 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.881932 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.882373 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.882458 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.882633 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.882688 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.882901 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.883053 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.883069 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.883184 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.883350 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.883598 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.884229 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.885110 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.885117 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.885144 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.885186 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.885183 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.885989 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.886086 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.886293 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.887322 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.887337 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.887624 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.887676 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.888039 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.891454 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.891640 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.891970 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.892248 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.892457 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.892877 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.893338 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.893535 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.893923 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.893962 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.894584 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.894616 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.894845 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.894311 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.896362 4835 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.895432 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.895906 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.896330 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.897239 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.897520 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.897950 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.898517 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.898810 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: E0319 09:24:00.899146 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 09:24:00 crc kubenswrapper[4835]: E0319 09:24:00.905227 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 09:24:00 crc kubenswrapper[4835]: E0319 09:24:00.905254 4835 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.899695 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.900356 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.900806 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.903857 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.903966 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.904028 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.904796 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: E0319 09:24:00.905367 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 09:24:01.405336673 +0000 UTC m=+96.253935390 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.906159 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.906240 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: E0319 09:24:00.906314 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 09:24:00 crc kubenswrapper[4835]: E0319 09:24:00.906385 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.906418 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: E0319 09:24:00.906450 4835 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:24:00 crc kubenswrapper[4835]: E0319 09:24:00.906691 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 09:24:01.406657501 +0000 UTC m=+96.255256138 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.906924 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.907366 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.908686 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.909068 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.909379 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.909792 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.910449 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.912778 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.914009 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.915971 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.921397 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.927152 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.927018 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.928213 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.933453 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.933462 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.933510 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.933661 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.933932 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.934003 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.934221 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.934422 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.934622 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.934693 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.934840 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.934949 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.935003 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.935232 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.935606 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.936257 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.936485 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.936620 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.936825 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.941308 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.943823 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.943962 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.944052 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.944163 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.944261 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:00Z","lastTransitionTime":"2026-03-19T09:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.948427 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.971449 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.971534 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.971658 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.971680 4835 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.971700 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.971717 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 
09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.971734 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.971791 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.971808 4835 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.971825 4835 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.971841 4835 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.971857 4835 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.971872 4835 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.971888 4835 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.971904 4835 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.971932 4835 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.971948 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.971967 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.971983 4835 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.971998 4835 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972013 4835 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972029 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972044 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972055 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972101 4835 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972226 4835 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972252 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972314 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: 
\"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972340 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972365 4835 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972393 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972315 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972418 4835 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972479 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972496 4835 reconciler_common.go:293] "Volume detached for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972513 4835 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972529 4835 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972545 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972561 4835 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972575 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972590 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972606 4835 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972622 4835 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972637 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972653 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972670 4835 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972688 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972704 4835 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972720 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on 
node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972765 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972783 4835 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972798 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972815 4835 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972831 4835 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972848 4835 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972865 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 19 
09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972881 4835 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972897 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972912 4835 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972927 4835 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972945 4835 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972960 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972975 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.972991 4835 reconciler_common.go:293] "Volume detached for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973011 4835 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973026 4835 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973042 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973058 4835 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973074 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973090 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973108 4835 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973126 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973143 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973161 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973177 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973193 4835 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973209 4835 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973225 4835 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" 
DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973240 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973257 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973273 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973290 4835 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973306 4835 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973320 4835 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973335 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973352 4835 reconciler_common.go:293] "Volume 
detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973367 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973382 4835 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973400 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973415 4835 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973430 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973445 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973460 4835 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973473 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973488 4835 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973507 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973522 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973537 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973555 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973574 4835 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath 
\"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973590 4835 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973605 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973622 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973637 4835 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973654 4835 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973671 4835 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973687 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 19 
09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973702 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973719 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973735 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973780 4835 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973797 4835 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973815 4835 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973830 4835 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc 
kubenswrapper[4835]: I0319 09:24:00.973847 4835 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973862 4835 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973879 4835 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973895 4835 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973913 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973929 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973945 4835 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973960 4835 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973976 4835 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.973992 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974007 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974024 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974041 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974058 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974075 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: 
\"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974092 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974107 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974125 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974140 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974163 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974179 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974194 4835 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974209 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974224 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974239 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974256 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974273 4835 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974289 4835 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974304 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on 
node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974321 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974337 4835 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974352 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974366 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974381 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974395 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974409 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974423 4835 reconciler_common.go:293] "Volume detached 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974436 4835 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974453 4835 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974470 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974487 4835 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974502 4835 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974518 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974531 4835 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" 
Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974546 4835 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974559 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974573 4835 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974594 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974608 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974623 4835 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974634 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974648 4835 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974659 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974671 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974682 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974693 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974704 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974715 4835 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974726 4835 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath 
\"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974760 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974772 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974784 4835 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974796 4835 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974807 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974819 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974830 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath 
\"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974842 4835 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974854 4835 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974866 4835 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974878 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974893 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974905 4835 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974916 4835 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 
09:24:00.974928 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974939 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974962 4835 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974973 4835 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974984 4835 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:00 crc kubenswrapper[4835]: I0319 09:24:00.974997 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.046368 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.046433 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.046445 4835 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.046462 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.046472 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:01Z","lastTransitionTime":"2026-03-19T09:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.084103 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.099131 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 09:24:01 crc kubenswrapper[4835]: E0319 09:24:01.100513 4835 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 09:24:01 crc kubenswrapper[4835]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 19 09:24:01 crc kubenswrapper[4835]: set -o allexport Mar 19 09:24:01 crc kubenswrapper[4835]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 19 09:24:01 crc kubenswrapper[4835]: source /etc/kubernetes/apiserver-url.env Mar 19 09:24:01 crc kubenswrapper[4835]: else Mar 19 09:24:01 crc kubenswrapper[4835]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 19 09:24:01 crc kubenswrapper[4835]: exit 1 Mar 19 09:24:01 crc kubenswrapper[4835]: fi Mar 19 09:24:01 crc kubenswrapper[4835]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 19 09:24:01 crc kubenswrapper[4835]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 09:24:01 crc kubenswrapper[4835]: > logger="UnhandledError" Mar 19 09:24:01 crc kubenswrapper[4835]: E0319 09:24:01.101649 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.110499 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 09:24:01 crc kubenswrapper[4835]: W0319 09:24:01.112588 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-9361ef841b9e76a5886ff44a35c5bb848cef369a16310bfb4531ff36fe8d1be8 WatchSource:0}: Error finding container 9361ef841b9e76a5886ff44a35c5bb848cef369a16310bfb4531ff36fe8d1be8: Status 404 returned error can't find the container with id 9361ef841b9e76a5886ff44a35c5bb848cef369a16310bfb4531ff36fe8d1be8 Mar 19 09:24:01 crc kubenswrapper[4835]: E0319 09:24:01.116969 4835 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 09:24:01 crc kubenswrapper[4835]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 19 09:24:01 crc kubenswrapper[4835]: if [[ -f "/env/_master" ]]; then Mar 19 09:24:01 crc kubenswrapper[4835]: set -o allexport Mar 19 09:24:01 crc kubenswrapper[4835]: source "/env/_master" Mar 19 09:24:01 crc kubenswrapper[4835]: set +o allexport Mar 19 09:24:01 crc kubenswrapper[4835]: fi Mar 19 09:24:01 crc kubenswrapper[4835]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 19 09:24:01 crc kubenswrapper[4835]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 19 09:24:01 crc kubenswrapper[4835]: ho_enable="--enable-hybrid-overlay" Mar 19 09:24:01 crc kubenswrapper[4835]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 19 09:24:01 crc kubenswrapper[4835]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 19 09:24:01 crc kubenswrapper[4835]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 19 09:24:01 crc kubenswrapper[4835]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 19 09:24:01 crc kubenswrapper[4835]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 19 09:24:01 crc kubenswrapper[4835]: --webhook-host=127.0.0.1 \ Mar 19 09:24:01 crc kubenswrapper[4835]: --webhook-port=9743 \ Mar 19 09:24:01 crc kubenswrapper[4835]: ${ho_enable} \ Mar 19 09:24:01 crc kubenswrapper[4835]: --enable-interconnect \ Mar 19 09:24:01 crc kubenswrapper[4835]: --disable-approver \ Mar 19 09:24:01 crc kubenswrapper[4835]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 19 09:24:01 crc kubenswrapper[4835]: --wait-for-kubernetes-api=200s \ Mar 19 09:24:01 crc kubenswrapper[4835]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 19 09:24:01 crc kubenswrapper[4835]: --loglevel="${LOGLEVEL}" Mar 19 09:24:01 crc kubenswrapper[4835]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 09:24:01 crc kubenswrapper[4835]: > logger="UnhandledError" Mar 19 09:24:01 crc kubenswrapper[4835]: E0319 09:24:01.121416 4835 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 09:24:01 crc kubenswrapper[4835]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 19 09:24:01 crc 
kubenswrapper[4835]: if [[ -f "/env/_master" ]]; then Mar 19 09:24:01 crc kubenswrapper[4835]: set -o allexport Mar 19 09:24:01 crc kubenswrapper[4835]: source "/env/_master" Mar 19 09:24:01 crc kubenswrapper[4835]: set +o allexport Mar 19 09:24:01 crc kubenswrapper[4835]: fi Mar 19 09:24:01 crc kubenswrapper[4835]: Mar 19 09:24:01 crc kubenswrapper[4835]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 19 09:24:01 crc kubenswrapper[4835]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 19 09:24:01 crc kubenswrapper[4835]: --disable-webhook \ Mar 19 09:24:01 crc kubenswrapper[4835]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 19 09:24:01 crc kubenswrapper[4835]: --loglevel="${LOGLEVEL}" Mar 19 09:24:01 crc kubenswrapper[4835]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 09:24:01 crc kubenswrapper[4835]: > logger="UnhandledError" Mar 19 09:24:01 crc kubenswrapper[4835]: W0319 09:24:01.122133 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-de7ee12128f7f5b96bef7ebf1ddbb03bb1d4e1c37dc43b4df74326c0de645b2d WatchSource:0}: Error finding container de7ee12128f7f5b96bef7ebf1ddbb03bb1d4e1c37dc43b4df74326c0de645b2d: Status 404 returned error can't find the container with id 
de7ee12128f7f5b96bef7ebf1ddbb03bb1d4e1c37dc43b4df74326c0de645b2d Mar 19 09:24:01 crc kubenswrapper[4835]: E0319 09:24:01.122864 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 19 09:24:01 crc kubenswrapper[4835]: E0319 09:24:01.125320 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 19 09:24:01 crc kubenswrapper[4835]: E0319 09:24:01.126493 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.148584 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.148625 
4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.148638 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.148658 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.148671 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:01Z","lastTransitionTime":"2026-03-19T09:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.251684 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.251780 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.251799 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.251823 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.251842 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:01Z","lastTransitionTime":"2026-03-19T09:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.355103 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.355168 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.355191 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.355220 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.355239 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:01Z","lastTransitionTime":"2026-03-19T09:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.457896 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.457959 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.457977 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.458002 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.458019 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:01Z","lastTransitionTime":"2026-03-19T09:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.479512 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.479620 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:01 crc kubenswrapper[4835]: E0319 09:24:01.479687 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:24:02.479651603 +0000 UTC m=+97.328250220 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:24:01 crc kubenswrapper[4835]: E0319 09:24:01.479727 4835 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.479780 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:01 crc kubenswrapper[4835]: E0319 09:24:01.479844 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 09:24:02.479817008 +0000 UTC m=+97.328415655 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.479880 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.479918 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:24:01 crc kubenswrapper[4835]: E0319 09:24:01.479948 4835 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 09:24:01 crc kubenswrapper[4835]: E0319 09:24:01.480005 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 09:24:02.479991373 +0000 UTC m=+97.328589990 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 09:24:01 crc kubenswrapper[4835]: E0319 09:24:01.480046 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 09:24:01 crc kubenswrapper[4835]: E0319 09:24:01.480068 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 09:24:01 crc kubenswrapper[4835]: E0319 09:24:01.480087 4835 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:24:01 crc kubenswrapper[4835]: E0319 09:24:01.480107 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 09:24:01 crc kubenswrapper[4835]: E0319 09:24:01.480129 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 09:24:01 crc kubenswrapper[4835]: E0319 09:24:01.480149 4835 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:24:01 crc kubenswrapper[4835]: E0319 09:24:01.480133 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 09:24:02.480117806 +0000 UTC m=+97.328716503 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:24:01 crc kubenswrapper[4835]: E0319 09:24:01.480214 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 09:24:02.480201209 +0000 UTC m=+97.328799826 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.560001 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.560052 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.560066 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.560098 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.560106 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:01Z","lastTransitionTime":"2026-03-19T09:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.662039 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.662104 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.662121 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.662147 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.662165 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:01Z","lastTransitionTime":"2026-03-19T09:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.764439 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.764486 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.764500 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.764520 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.764531 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:01Z","lastTransitionTime":"2026-03-19T09:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.866519 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.866592 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.866609 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.866633 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.866656 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:01Z","lastTransitionTime":"2026-03-19T09:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.969857 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.969925 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.969943 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.969971 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:01 crc kubenswrapper[4835]: I0319 09:24:01.969988 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:01Z","lastTransitionTime":"2026-03-19T09:24:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.072568 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.072631 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.072651 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.072678 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.072696 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:02Z","lastTransitionTime":"2026-03-19T09:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.074191 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"de7ee12128f7f5b96bef7ebf1ddbb03bb1d4e1c37dc43b4df74326c0de645b2d"} Mar 19 09:24:02 crc kubenswrapper[4835]: E0319 09:24:02.077312 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 19 09:24:02 crc kubenswrapper[4835]: E0319 09:24:02.078952 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.079141 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"9361ef841b9e76a5886ff44a35c5bb848cef369a16310bfb4531ff36fe8d1be8"} Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.081847 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"8445a0f0f060a61d18db81bf20a7c15cd451c6620a5fb50a36bfcdf2fdf0cf5b"} Mar 19 09:24:02 crc kubenswrapper[4835]: E0319 09:24:02.082298 4835 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 09:24:02 crc kubenswrapper[4835]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 19 09:24:02 crc kubenswrapper[4835]: if [[ -f "/env/_master" ]]; then Mar 19 09:24:02 crc kubenswrapper[4835]: set -o allexport Mar 19 09:24:02 crc kubenswrapper[4835]: source "/env/_master" Mar 19 09:24:02 crc kubenswrapper[4835]: set +o allexport Mar 19 09:24:02 crc kubenswrapper[4835]: fi Mar 19 09:24:02 crc kubenswrapper[4835]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 19 09:24:02 crc kubenswrapper[4835]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 19 09:24:02 crc kubenswrapper[4835]: ho_enable="--enable-hybrid-overlay" Mar 19 09:24:02 crc kubenswrapper[4835]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 19 09:24:02 crc kubenswrapper[4835]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 19 09:24:02 crc kubenswrapper[4835]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 19 09:24:02 crc kubenswrapper[4835]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 19 09:24:02 crc kubenswrapper[4835]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 19 09:24:02 crc kubenswrapper[4835]: --webhook-host=127.0.0.1 \ Mar 19 09:24:02 crc kubenswrapper[4835]: --webhook-port=9743 \ Mar 19 09:24:02 crc kubenswrapper[4835]: ${ho_enable} \ Mar 19 09:24:02 crc kubenswrapper[4835]: --enable-interconnect \ Mar 19 09:24:02 crc kubenswrapper[4835]: --disable-approver \ Mar 19 09:24:02 crc kubenswrapper[4835]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 19 09:24:02 crc kubenswrapper[4835]: --wait-for-kubernetes-api=200s \ Mar 19 09:24:02 crc kubenswrapper[4835]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 19 09:24:02 crc kubenswrapper[4835]: --loglevel="${LOGLEVEL}" Mar 19 09:24:02 crc kubenswrapper[4835]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 09:24:02 crc kubenswrapper[4835]: > logger="UnhandledError" Mar 19 09:24:02 crc kubenswrapper[4835]: E0319 09:24:02.083314 4835 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 09:24:02 crc kubenswrapper[4835]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 19 
09:24:02 crc kubenswrapper[4835]: set -o allexport Mar 19 09:24:02 crc kubenswrapper[4835]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 19 09:24:02 crc kubenswrapper[4835]: source /etc/kubernetes/apiserver-url.env Mar 19 09:24:02 crc kubenswrapper[4835]: else Mar 19 09:24:02 crc kubenswrapper[4835]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 19 09:24:02 crc kubenswrapper[4835]: exit 1 Mar 19 09:24:02 crc kubenswrapper[4835]: fi Mar 19 09:24:02 crc kubenswrapper[4835]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 19 09:24:02 crc kubenswrapper[4835]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a247
3a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 09:24:02 crc kubenswrapper[4835]: > logger="UnhandledError" Mar 19 
09:24:02 crc kubenswrapper[4835]: E0319 09:24:02.084230 4835 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 09:24:02 crc kubenswrapper[4835]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 19 09:24:02 crc kubenswrapper[4835]: if [[ -f "/env/_master" ]]; then Mar 19 09:24:02 crc kubenswrapper[4835]: set -o allexport Mar 19 09:24:02 crc kubenswrapper[4835]: source "/env/_master" Mar 19 09:24:02 crc kubenswrapper[4835]: set +o allexport Mar 19 09:24:02 crc kubenswrapper[4835]: fi Mar 19 09:24:02 crc kubenswrapper[4835]: Mar 19 09:24:02 crc kubenswrapper[4835]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 19 09:24:02 crc kubenswrapper[4835]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 19 09:24:02 crc kubenswrapper[4835]: --disable-webhook \ Mar 19 09:24:02 crc kubenswrapper[4835]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 19 09:24:02 crc kubenswrapper[4835]: --loglevel="${LOGLEVEL}" Mar 19 09:24:02 crc kubenswrapper[4835]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 09:24:02 crc kubenswrapper[4835]: > logger="UnhandledError" Mar 19 09:24:02 crc kubenswrapper[4835]: E0319 09:24:02.085443 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 19 09:24:02 crc kubenswrapper[4835]: E0319 09:24:02.085539 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.091556 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.107824 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.122008 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.139292 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.153911 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.164594 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.176029 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.176087 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.176104 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.176132 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.176148 4835 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:02Z","lastTransitionTime":"2026-03-19T09:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.177298 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.192863 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.211529 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.228587 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.241166 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.255065 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.279329 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.279384 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.279403 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.279427 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.279444 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:02Z","lastTransitionTime":"2026-03-19T09:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.382162 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.382216 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.382231 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.382251 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.382266 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:02Z","lastTransitionTime":"2026-03-19T09:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.401949 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.402032 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.402089 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:02 crc kubenswrapper[4835]: E0319 09:24:02.402224 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:24:02 crc kubenswrapper[4835]: E0319 09:24:02.402560 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:24:02 crc kubenswrapper[4835]: E0319 09:24:02.402641 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.410957 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.414718 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.417217 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.418882 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.421325 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.422555 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.423850 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.425980 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.427244 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.429403 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.430118 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.431471 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.432192 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.432881 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.434021 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.434676 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.435845 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.436358 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.437100 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.438535 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.439499 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.441227 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.442222 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.443872 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.445446 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.446263 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.447854 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.448468 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.449725 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.450335 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.450956 4835 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.451561 4835 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.453574 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.454188 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.455293 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.457231 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.458074 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.459148 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.459967 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.461255 4835 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.461881 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.463097 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.463854 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.465062 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.465636 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.466732 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.467374 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.468756 4835 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.469342 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.470525 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.471113 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.472188 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.472928 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.473487 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.486511 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.486592 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.486618 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.486651 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.486676 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:02Z","lastTransitionTime":"2026-03-19T09:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.490868 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.490966 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:02 crc kubenswrapper[4835]: E0319 09:24:02.491023 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2026-03-19 09:24:04.490989719 +0000 UTC m=+99.339588346 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.491085 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:02 crc kubenswrapper[4835]: E0319 09:24:02.491118 4835 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.491149 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:24:02 crc kubenswrapper[4835]: E0319 09:24:02.491190 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-19 09:24:04.491168584 +0000 UTC m=+99.339767201 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.491219 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:02 crc kubenswrapper[4835]: E0319 09:24:02.491294 4835 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 09:24:02 crc kubenswrapper[4835]: E0319 09:24:02.491322 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 09:24:02 crc kubenswrapper[4835]: E0319 09:24:02.491345 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 09:24:04.491332269 +0000 UTC m=+99.339930896 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 09:24:02 crc kubenswrapper[4835]: E0319 09:24:02.491348 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 09:24:02 crc kubenswrapper[4835]: E0319 09:24:02.491373 4835 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:24:02 crc kubenswrapper[4835]: E0319 09:24:02.491410 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 09:24:04.491399811 +0000 UTC m=+99.339998428 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:24:02 crc kubenswrapper[4835]: E0319 09:24:02.491437 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 09:24:02 crc kubenswrapper[4835]: E0319 09:24:02.491464 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 09:24:02 crc kubenswrapper[4835]: E0319 09:24:02.491488 4835 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:24:02 crc kubenswrapper[4835]: E0319 09:24:02.491559 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 09:24:04.491538295 +0000 UTC m=+99.340137032 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.589460 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.589535 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.589557 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.589580 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.589596 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:02Z","lastTransitionTime":"2026-03-19T09:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.692262 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.692312 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.692321 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.692339 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.692348 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:02Z","lastTransitionTime":"2026-03-19T09:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.795008 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.795053 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.795062 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.795080 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.795091 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:02Z","lastTransitionTime":"2026-03-19T09:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.898022 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.898083 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.898100 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.898126 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:02 crc kubenswrapper[4835]: I0319 09:24:02.898142 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:02Z","lastTransitionTime":"2026-03-19T09:24:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.001855 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.001925 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.001938 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.001955 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.001967 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:03Z","lastTransitionTime":"2026-03-19T09:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.106174 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.106244 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.106266 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.106352 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.106794 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:03Z","lastTransitionTime":"2026-03-19T09:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.209323 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.209347 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.209357 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.209369 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.209380 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:03Z","lastTransitionTime":"2026-03-19T09:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.312320 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.312398 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.312422 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.312459 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.312482 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:03Z","lastTransitionTime":"2026-03-19T09:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.414941 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.414986 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.414998 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.415011 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.415020 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:03Z","lastTransitionTime":"2026-03-19T09:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.518729 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.518839 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.518859 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.518891 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.518913 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:03Z","lastTransitionTime":"2026-03-19T09:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.622302 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.622339 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.622353 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.622371 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.622382 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:03Z","lastTransitionTime":"2026-03-19T09:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.724964 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.725038 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.725056 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.725084 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.725102 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:03Z","lastTransitionTime":"2026-03-19T09:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.828493 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.828552 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.828569 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.828593 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.828611 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:03Z","lastTransitionTime":"2026-03-19T09:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.931399 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.931454 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.931463 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.931481 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:03 crc kubenswrapper[4835]: I0319 09:24:03.931492 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:03Z","lastTransitionTime":"2026-03-19T09:24:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.034820 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.034893 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.034907 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.034948 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.034967 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:04Z","lastTransitionTime":"2026-03-19T09:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.138938 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.139017 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.139030 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.139067 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.139089 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:04Z","lastTransitionTime":"2026-03-19T09:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.243213 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.243284 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.243303 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.243329 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.243348 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:04Z","lastTransitionTime":"2026-03-19T09:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.346676 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.346780 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.346796 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.346816 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.346847 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:04Z","lastTransitionTime":"2026-03-19T09:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.401790 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.401888 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.401956 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:24:04 crc kubenswrapper[4835]: E0319 09:24:04.402012 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:24:04 crc kubenswrapper[4835]: E0319 09:24:04.402185 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:24:04 crc kubenswrapper[4835]: E0319 09:24:04.402360 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.449665 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.449805 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.449832 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.449865 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.449886 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:04Z","lastTransitionTime":"2026-03-19T09:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.509067 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.509210 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.509264 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.509303 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.509344 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:04 crc kubenswrapper[4835]: E0319 09:24:04.509493 4835 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 09:24:04 crc kubenswrapper[4835]: E0319 09:24:04.509588 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 09:24:08.509560195 +0000 UTC m=+103.358158822 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 09:24:04 crc kubenswrapper[4835]: E0319 09:24:04.509970 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 09:24:04 crc kubenswrapper[4835]: E0319 09:24:04.510030 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 09:24:04 crc kubenswrapper[4835]: E0319 09:24:04.510041 4835 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:24:04 crc kubenswrapper[4835]: E0319 09:24:04.509971 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 09:24:04 crc kubenswrapper[4835]: E0319 09:24:04.510108 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 09:24:04 crc kubenswrapper[4835]: E0319 09:24:04.510117 4835 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:24:04 crc kubenswrapper[4835]: E0319 09:24:04.510120 4835 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 09:24:04 crc kubenswrapper[4835]: E0319 09:24:04.510130 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:24:08.51008172 +0000 UTC m=+103.358680347 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:24:04 crc kubenswrapper[4835]: E0319 09:24:04.510270 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 09:24:08.510254695 +0000 UTC m=+103.358853302 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:24:04 crc kubenswrapper[4835]: E0319 09:24:04.510304 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 09:24:08.510293056 +0000 UTC m=+103.358891663 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:24:04 crc kubenswrapper[4835]: E0319 09:24:04.510318 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 09:24:08.510311357 +0000 UTC m=+103.358909954 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.553768 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.553859 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.553882 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.553913 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.553931 4835 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:04Z","lastTransitionTime":"2026-03-19T09:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.657960 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.658032 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.658052 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.658082 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.658101 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:04Z","lastTransitionTime":"2026-03-19T09:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.676094 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.676138 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.676148 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.676166 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.676180 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:04Z","lastTransitionTime":"2026-03-19T09:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:04 crc kubenswrapper[4835]: E0319 09:24:04.686230 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.690969 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.691038 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.691055 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.691082 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.691105 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:04Z","lastTransitionTime":"2026-03-19T09:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:04 crc kubenswrapper[4835]: E0319 09:24:04.704682 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.708812 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.708896 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.708909 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.708930 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.708942 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:04Z","lastTransitionTime":"2026-03-19T09:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:04 crc kubenswrapper[4835]: E0319 09:24:04.731272 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.736078 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.736156 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.736179 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.736212 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.736232 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:04Z","lastTransitionTime":"2026-03-19T09:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:04 crc kubenswrapper[4835]: E0319 09:24:04.747998 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.755003 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.755064 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.755089 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.755120 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.755149 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:04Z","lastTransitionTime":"2026-03-19T09:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:04 crc kubenswrapper[4835]: E0319 09:24:04.772083 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:04 crc kubenswrapper[4835]: E0319 09:24:04.772262 4835 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.774678 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.774791 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.774817 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.774842 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.774865 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:04Z","lastTransitionTime":"2026-03-19T09:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.877461 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.877524 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.877537 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.877561 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.877586 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:04Z","lastTransitionTime":"2026-03-19T09:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.994296 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.994363 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.994387 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.994418 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:04 crc kubenswrapper[4835]: I0319 09:24:04.994439 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:04Z","lastTransitionTime":"2026-03-19T09:24:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.097275 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.097345 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.097363 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.097388 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.097407 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:05Z","lastTransitionTime":"2026-03-19T09:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.199260 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.199325 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.199345 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.199374 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.199398 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:05Z","lastTransitionTime":"2026-03-19T09:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.302384 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.302425 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.302436 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.302453 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.302465 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:05Z","lastTransitionTime":"2026-03-19T09:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.405205 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.405247 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.405259 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.405273 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.405284 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:05Z","lastTransitionTime":"2026-03-19T09:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.508086 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.508153 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.508173 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.508197 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.508215 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:05Z","lastTransitionTime":"2026-03-19T09:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.611124 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.611193 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.611212 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.611238 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.611256 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:05Z","lastTransitionTime":"2026-03-19T09:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.714836 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.714903 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.714922 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.714949 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.714967 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:05Z","lastTransitionTime":"2026-03-19T09:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.723110 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-fg29g"] Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.723424 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-fg29g" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.725716 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.726155 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.726222 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.747285 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.756479 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.773101 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.788613 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.808537 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.818393 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.818457 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.818475 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.818500 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.818518 4835 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:05Z","lastTransitionTime":"2026-03-19T09:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.821216 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-797kf\" (UniqueName: \"kubernetes.io/projected/4d251d68-4fd1-4d04-b960-260b36d78f3f-kube-api-access-797kf\") pod \"node-resolver-fg29g\" (UID: \"4d251d68-4fd1-4d04-b960-260b36d78f3f\") " pod="openshift-dns/node-resolver-fg29g" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.821281 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4d251d68-4fd1-4d04-b960-260b36d78f3f-hosts-file\") pod \"node-resolver-fg29g\" (UID: \"4d251d68-4fd1-4d04-b960-260b36d78f3f\") " pod="openshift-dns/node-resolver-fg29g" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.824370 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.838487 4835 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.921472 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.921546 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.921571 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.921601 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.921621 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:05Z","lastTransitionTime":"2026-03-19T09:24:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.921773 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-797kf\" (UniqueName: \"kubernetes.io/projected/4d251d68-4fd1-4d04-b960-260b36d78f3f-kube-api-access-797kf\") pod \"node-resolver-fg29g\" (UID: \"4d251d68-4fd1-4d04-b960-260b36d78f3f\") " pod="openshift-dns/node-resolver-fg29g" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.921829 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4d251d68-4fd1-4d04-b960-260b36d78f3f-hosts-file\") pod \"node-resolver-fg29g\" (UID: \"4d251d68-4fd1-4d04-b960-260b36d78f3f\") " pod="openshift-dns/node-resolver-fg29g" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.921981 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4d251d68-4fd1-4d04-b960-260b36d78f3f-hosts-file\") pod \"node-resolver-fg29g\" (UID: \"4d251d68-4fd1-4d04-b960-260b36d78f3f\") " pod="openshift-dns/node-resolver-fg29g" Mar 19 09:24:05 crc kubenswrapper[4835]: I0319 09:24:05.951391 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-797kf\" (UniqueName: \"kubernetes.io/projected/4d251d68-4fd1-4d04-b960-260b36d78f3f-kube-api-access-797kf\") pod \"node-resolver-fg29g\" (UID: \"4d251d68-4fd1-4d04-b960-260b36d78f3f\") " pod="openshift-dns/node-resolver-fg29g" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.024078 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.024139 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.024157 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.024187 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.024204 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:06Z","lastTransitionTime":"2026-03-19T09:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.044629 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fg29g" Mar 19 09:24:06 crc kubenswrapper[4835]: W0319 09:24:06.066194 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d251d68_4fd1_4d04_b960_260b36d78f3f.slice/crio-2c6ba51cb46e59d4b5c38e2580287983168d020b45f934ceab28c77ab3449d1e WatchSource:0}: Error finding container 2c6ba51cb46e59d4b5c38e2580287983168d020b45f934ceab28c77ab3449d1e: Status 404 returned error can't find the container with id 2c6ba51cb46e59d4b5c38e2580287983168d020b45f934ceab28c77ab3449d1e Mar 19 09:24:06 crc kubenswrapper[4835]: E0319 09:24:06.073378 4835 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 09:24:06 crc kubenswrapper[4835]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 19 09:24:06 crc kubenswrapper[4835]: set -uo 
pipefail Mar 19 09:24:06 crc kubenswrapper[4835]: Mar 19 09:24:06 crc kubenswrapper[4835]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 19 09:24:06 crc kubenswrapper[4835]: Mar 19 09:24:06 crc kubenswrapper[4835]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 19 09:24:06 crc kubenswrapper[4835]: HOSTS_FILE="/etc/hosts" Mar 19 09:24:06 crc kubenswrapper[4835]: TEMP_FILE="/etc/hosts.tmp" Mar 19 09:24:06 crc kubenswrapper[4835]: Mar 19 09:24:06 crc kubenswrapper[4835]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 19 09:24:06 crc kubenswrapper[4835]: Mar 19 09:24:06 crc kubenswrapper[4835]: # Make a temporary file with the old hosts file's attributes. Mar 19 09:24:06 crc kubenswrapper[4835]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 19 09:24:06 crc kubenswrapper[4835]: echo "Failed to preserve hosts file. Exiting." Mar 19 09:24:06 crc kubenswrapper[4835]: exit 1 Mar 19 09:24:06 crc kubenswrapper[4835]: fi Mar 19 09:24:06 crc kubenswrapper[4835]: Mar 19 09:24:06 crc kubenswrapper[4835]: while true; do Mar 19 09:24:06 crc kubenswrapper[4835]: declare -A svc_ips Mar 19 09:24:06 crc kubenswrapper[4835]: for svc in "${services[@]}"; do Mar 19 09:24:06 crc kubenswrapper[4835]: # Fetch service IP from cluster dns if present. We make several tries Mar 19 09:24:06 crc kubenswrapper[4835]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 19 09:24:06 crc kubenswrapper[4835]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 19 09:24:06 crc kubenswrapper[4835]: # support UDP loadbalancers and require reaching DNS through TCP. 
Mar 19 09:24:06 crc kubenswrapper[4835]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 19 09:24:06 crc kubenswrapper[4835]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 19 09:24:06 crc kubenswrapper[4835]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 19 09:24:06 crc kubenswrapper[4835]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 19 09:24:06 crc kubenswrapper[4835]: for i in ${!cmds[*]} Mar 19 09:24:06 crc kubenswrapper[4835]: do Mar 19 09:24:06 crc kubenswrapper[4835]: ips=($(eval "${cmds[i]}")) Mar 19 09:24:06 crc kubenswrapper[4835]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 19 09:24:06 crc kubenswrapper[4835]: svc_ips["${svc}"]="${ips[@]}" Mar 19 09:24:06 crc kubenswrapper[4835]: break Mar 19 09:24:06 crc kubenswrapper[4835]: fi Mar 19 09:24:06 crc kubenswrapper[4835]: done Mar 19 09:24:06 crc kubenswrapper[4835]: done Mar 19 09:24:06 crc kubenswrapper[4835]: Mar 19 09:24:06 crc kubenswrapper[4835]: # Update /etc/hosts only if we get valid service IPs Mar 19 09:24:06 crc kubenswrapper[4835]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 19 09:24:06 crc kubenswrapper[4835]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 19 09:24:06 crc kubenswrapper[4835]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 19 09:24:06 crc kubenswrapper[4835]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 19 09:24:06 crc kubenswrapper[4835]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 19 09:24:06 crc kubenswrapper[4835]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 19 09:24:06 crc kubenswrapper[4835]: sleep 60 & wait Mar 19 09:24:06 crc kubenswrapper[4835]: continue Mar 19 09:24:06 crc kubenswrapper[4835]: fi Mar 19 09:24:06 crc kubenswrapper[4835]: Mar 19 09:24:06 crc kubenswrapper[4835]: # Append resolver entries for services Mar 19 09:24:06 crc kubenswrapper[4835]: rc=0 Mar 19 09:24:06 crc kubenswrapper[4835]: for svc in "${!svc_ips[@]}"; do Mar 19 09:24:06 crc kubenswrapper[4835]: for ip in ${svc_ips[${svc}]}; do Mar 19 09:24:06 crc kubenswrapper[4835]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 19 09:24:06 crc kubenswrapper[4835]: done Mar 19 09:24:06 crc kubenswrapper[4835]: done Mar 19 09:24:06 crc kubenswrapper[4835]: if [[ $rc -ne 0 ]]; then Mar 19 09:24:06 crc kubenswrapper[4835]: sleep 60 & wait Mar 19 09:24:06 crc kubenswrapper[4835]: continue Mar 19 09:24:06 crc kubenswrapper[4835]: fi Mar 19 09:24:06 crc kubenswrapper[4835]: Mar 19 09:24:06 crc kubenswrapper[4835]: Mar 19 09:24:06 crc kubenswrapper[4835]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 19 09:24:06 crc kubenswrapper[4835]: # Replace /etc/hosts with our modified version if needed Mar 19 09:24:06 crc kubenswrapper[4835]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 19 09:24:06 crc kubenswrapper[4835]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 19 09:24:06 crc kubenswrapper[4835]: fi Mar 19 09:24:06 crc kubenswrapper[4835]: sleep 60 & wait Mar 19 09:24:06 crc kubenswrapper[4835]: unset svc_ips Mar 19 09:24:06 crc kubenswrapper[4835]: done Mar 19 09:24:06 crc kubenswrapper[4835]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-797kf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-fg29g_openshift-dns(4d251d68-4fd1-4d04-b960-260b36d78f3f): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 09:24:06 crc kubenswrapper[4835]: > logger="UnhandledError" Mar 19 09:24:06 crc kubenswrapper[4835]: E0319 09:24:06.074608 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-fg29g" 
podUID="4d251d68-4fd1-4d04-b960-260b36d78f3f" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.082076 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-bk84k"] Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.082659 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.085117 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.085480 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.087257 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.087622 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.087958 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.088437 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-jl5x4"] Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.090179 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-lkntj"] Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.091209 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lkntj" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.091393 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.094690 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.094770 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.095255 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.095349 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.095358 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.095548 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.096558 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.101501 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fg29g" event={"ID":"4d251d68-4fd1-4d04-b960-260b36d78f3f","Type":"ContainerStarted","Data":"2c6ba51cb46e59d4b5c38e2580287983168d020b45f934ceab28c77ab3449d1e"} Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.104849 4835 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: E0319 09:24:06.109717 4835 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 09:24:06 crc kubenswrapper[4835]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 19 09:24:06 crc kubenswrapper[4835]: set -uo pipefail Mar 19 09:24:06 crc kubenswrapper[4835]: Mar 19 09:24:06 crc kubenswrapper[4835]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 19 09:24:06 crc kubenswrapper[4835]: Mar 19 09:24:06 crc kubenswrapper[4835]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 19 09:24:06 crc kubenswrapper[4835]: HOSTS_FILE="/etc/hosts" Mar 19 09:24:06 crc kubenswrapper[4835]: TEMP_FILE="/etc/hosts.tmp" Mar 19 09:24:06 crc kubenswrapper[4835]: Mar 19 09:24:06 crc kubenswrapper[4835]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 19 09:24:06 crc kubenswrapper[4835]: Mar 19 09:24:06 crc kubenswrapper[4835]: # Make a temporary file with the old hosts file's attributes. Mar 19 09:24:06 crc kubenswrapper[4835]: if ! 
cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 19 09:24:06 crc kubenswrapper[4835]: echo "Failed to preserve hosts file. Exiting." Mar 19 09:24:06 crc kubenswrapper[4835]: exit 1 Mar 19 09:24:06 crc kubenswrapper[4835]: fi Mar 19 09:24:06 crc kubenswrapper[4835]: Mar 19 09:24:06 crc kubenswrapper[4835]: while true; do Mar 19 09:24:06 crc kubenswrapper[4835]: declare -A svc_ips Mar 19 09:24:06 crc kubenswrapper[4835]: for svc in "${services[@]}"; do Mar 19 09:24:06 crc kubenswrapper[4835]: # Fetch service IP from cluster dns if present. We make several tries Mar 19 09:24:06 crc kubenswrapper[4835]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 19 09:24:06 crc kubenswrapper[4835]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 19 09:24:06 crc kubenswrapper[4835]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 19 09:24:06 crc kubenswrapper[4835]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 19 09:24:06 crc kubenswrapper[4835]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 19 09:24:06 crc kubenswrapper[4835]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 19 09:24:06 crc kubenswrapper[4835]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 19 09:24:06 crc kubenswrapper[4835]: for i in ${!cmds[*]} Mar 19 09:24:06 crc kubenswrapper[4835]: do Mar 19 09:24:06 crc kubenswrapper[4835]: ips=($(eval "${cmds[i]}")) Mar 19 09:24:06 crc kubenswrapper[4835]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 19 09:24:06 crc kubenswrapper[4835]: svc_ips["${svc}"]="${ips[@]}" Mar 19 09:24:06 crc kubenswrapper[4835]: break Mar 19 09:24:06 crc kubenswrapper[4835]: fi Mar 19 09:24:06 crc kubenswrapper[4835]: done Mar 19 09:24:06 crc kubenswrapper[4835]: done Mar 19 09:24:06 crc kubenswrapper[4835]: Mar 19 09:24:06 crc kubenswrapper[4835]: # Update /etc/hosts only if we get valid service IPs Mar 19 09:24:06 crc kubenswrapper[4835]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 19 09:24:06 crc kubenswrapper[4835]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 19 09:24:06 crc kubenswrapper[4835]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 19 09:24:06 crc kubenswrapper[4835]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 19 09:24:06 crc kubenswrapper[4835]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 19 09:24:06 crc kubenswrapper[4835]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 19 09:24:06 crc kubenswrapper[4835]: sleep 60 & wait Mar 19 09:24:06 crc kubenswrapper[4835]: continue Mar 19 09:24:06 crc kubenswrapper[4835]: fi Mar 19 09:24:06 crc kubenswrapper[4835]: Mar 19 09:24:06 crc kubenswrapper[4835]: # Append resolver entries for services Mar 19 09:24:06 crc kubenswrapper[4835]: rc=0 Mar 19 09:24:06 crc kubenswrapper[4835]: for svc in "${!svc_ips[@]}"; do Mar 19 09:24:06 crc kubenswrapper[4835]: for ip in ${svc_ips[${svc}]}; do Mar 19 09:24:06 crc kubenswrapper[4835]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Mar 19 09:24:06 crc kubenswrapper[4835]: done Mar 19 09:24:06 crc kubenswrapper[4835]: done Mar 19 09:24:06 crc kubenswrapper[4835]: if [[ $rc -ne 0 ]]; then Mar 19 09:24:06 crc kubenswrapper[4835]: sleep 60 & wait Mar 19 09:24:06 crc kubenswrapper[4835]: continue Mar 19 09:24:06 crc kubenswrapper[4835]: fi Mar 19 09:24:06 crc kubenswrapper[4835]: Mar 19 09:24:06 crc kubenswrapper[4835]: Mar 19 09:24:06 crc kubenswrapper[4835]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 19 09:24:06 crc kubenswrapper[4835]: # Replace /etc/hosts with our modified version if needed Mar 19 09:24:06 crc kubenswrapper[4835]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 19 09:24:06 crc kubenswrapper[4835]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 19 09:24:06 crc kubenswrapper[4835]: fi Mar 19 09:24:06 crc kubenswrapper[4835]: sleep 60 & wait Mar 19 09:24:06 crc kubenswrapper[4835]: unset svc_ips Mar 19 09:24:06 crc kubenswrapper[4835]: done Mar 19 09:24:06 crc kubenswrapper[4835]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-797kf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-fg29g_openshift-dns(4d251d68-4fd1-4d04-b960-260b36d78f3f): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 09:24:06 crc kubenswrapper[4835]: > logger="UnhandledError" Mar 19 09:24:06 crc kubenswrapper[4835]: E0319 09:24:06.110985 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-fg29g" podUID="4d251d68-4fd1-4d04-b960-260b36d78f3f" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.125944 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.127832 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.127873 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.127892 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.128205 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.128253 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:06Z","lastTransitionTime":"2026-03-19T09:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.143769 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.155034 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.169880 4835 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.182941 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.195234 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.206877 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.222893 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.226179 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-host-run-netns\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.226217 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-host-var-lib-kubelet\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.226244 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0c65689c-afdd-413c-92b9-bf02eeea000c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lkntj\" (UID: \"0c65689c-afdd-413c-92b9-bf02eeea000c\") " 
pod="openshift-multus/multus-additional-cni-plugins-lkntj" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.226270 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4ee35aaa-2819-432a-af95-f1078ad836fc-multus-daemon-config\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.226289 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-host-run-multus-certs\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.226308 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-os-release\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.226330 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-multus-conf-dir\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.226464 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0c65689c-afdd-413c-92b9-bf02eeea000c-os-release\") pod \"multus-additional-cni-plugins-lkntj\" (UID: \"0c65689c-afdd-413c-92b9-bf02eeea000c\") " 
pod="openshift-multus/multus-additional-cni-plugins-lkntj" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.226685 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/adf367e5-fedd-4d9e-a7af-345df1f08353-mcd-auth-proxy-config\") pod \"machine-config-daemon-bk84k\" (UID: \"adf367e5-fedd-4d9e-a7af-345df1f08353\") " pod="openshift-machine-config-operator/machine-config-daemon-bk84k" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.226780 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-multus-socket-dir-parent\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.226834 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-host-run-k8s-cni-cncf-io\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.226934 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-host-var-lib-cni-bin\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.226990 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-host-var-lib-cni-multus\") 
pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.227182 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9nz9\" (UniqueName: \"kubernetes.io/projected/0c65689c-afdd-413c-92b9-bf02eeea000c-kube-api-access-k9nz9\") pod \"multus-additional-cni-plugins-lkntj\" (UID: \"0c65689c-afdd-413c-92b9-bf02eeea000c\") " pod="openshift-multus/multus-additional-cni-plugins-lkntj" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.227275 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8s9t\" (UniqueName: \"kubernetes.io/projected/adf367e5-fedd-4d9e-a7af-345df1f08353-kube-api-access-c8s9t\") pod \"machine-config-daemon-bk84k\" (UID: \"adf367e5-fedd-4d9e-a7af-345df1f08353\") " pod="openshift-machine-config-operator/machine-config-daemon-bk84k" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.227331 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0c65689c-afdd-413c-92b9-bf02eeea000c-cni-binary-copy\") pod \"multus-additional-cni-plugins-lkntj\" (UID: \"0c65689c-afdd-413c-92b9-bf02eeea000c\") " pod="openshift-multus/multus-additional-cni-plugins-lkntj" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.227377 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/adf367e5-fedd-4d9e-a7af-345df1f08353-proxy-tls\") pod \"machine-config-daemon-bk84k\" (UID: \"adf367e5-fedd-4d9e-a7af-345df1f08353\") " pod="openshift-machine-config-operator/machine-config-daemon-bk84k" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.227425 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-system-cni-dir\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.227469 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-cnibin\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.227566 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0c65689c-afdd-413c-92b9-bf02eeea000c-cnibin\") pod \"multus-additional-cni-plugins-lkntj\" (UID: \"0c65689c-afdd-413c-92b9-bf02eeea000c\") " pod="openshift-multus/multus-additional-cni-plugins-lkntj" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.227614 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/adf367e5-fedd-4d9e-a7af-345df1f08353-rootfs\") pod \"machine-config-daemon-bk84k\" (UID: \"adf367e5-fedd-4d9e-a7af-345df1f08353\") " pod="openshift-machine-config-operator/machine-config-daemon-bk84k" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.227660 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmtx5\" (UniqueName: \"kubernetes.io/projected/4ee35aaa-2819-432a-af95-f1078ad836fc-kube-api-access-dmtx5\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.227707 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/0c65689c-afdd-413c-92b9-bf02eeea000c-system-cni-dir\") pod \"multus-additional-cni-plugins-lkntj\" (UID: \"0c65689c-afdd-413c-92b9-bf02eeea000c\") " pod="openshift-multus/multus-additional-cni-plugins-lkntj" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.227807 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-multus-cni-dir\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.227854 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-hostroot\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.227924 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0c65689c-afdd-413c-92b9-bf02eeea000c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lkntj\" (UID: \"0c65689c-afdd-413c-92b9-bf02eeea000c\") " pod="openshift-multus/multus-additional-cni-plugins-lkntj" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.228014 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4ee35aaa-2819-432a-af95-f1078ad836fc-cni-binary-copy\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.228061 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-etc-kubernetes\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.231393 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.231426 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.231437 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.231454 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.231466 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:06Z","lastTransitionTime":"2026-03-19T09:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.238171 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.252143 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.271311 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.289984 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.303197 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.323314 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.329431 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0c65689c-afdd-413c-92b9-bf02eeea000c-cni-binary-copy\") pod \"multus-additional-cni-plugins-lkntj\" (UID: \"0c65689c-afdd-413c-92b9-bf02eeea000c\") " pod="openshift-multus/multus-additional-cni-plugins-lkntj" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.329473 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/adf367e5-fedd-4d9e-a7af-345df1f08353-proxy-tls\") pod \"machine-config-daemon-bk84k\" (UID: \"adf367e5-fedd-4d9e-a7af-345df1f08353\") " pod="openshift-machine-config-operator/machine-config-daemon-bk84k" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.329493 4835 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-system-cni-dir\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.329511 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-cnibin\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.329673 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0c65689c-afdd-413c-92b9-bf02eeea000c-cnibin\") pod \"multus-additional-cni-plugins-lkntj\" (UID: \"0c65689c-afdd-413c-92b9-bf02eeea000c\") " pod="openshift-multus/multus-additional-cni-plugins-lkntj" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.329706 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/adf367e5-fedd-4d9e-a7af-345df1f08353-rootfs\") pod \"machine-config-daemon-bk84k\" (UID: \"adf367e5-fedd-4d9e-a7af-345df1f08353\") " pod="openshift-machine-config-operator/machine-config-daemon-bk84k" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.329732 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmtx5\" (UniqueName: \"kubernetes.io/projected/4ee35aaa-2819-432a-af95-f1078ad836fc-kube-api-access-dmtx5\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.329794 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/0c65689c-afdd-413c-92b9-bf02eeea000c-system-cni-dir\") pod \"multus-additional-cni-plugins-lkntj\" (UID: \"0c65689c-afdd-413c-92b9-bf02eeea000c\") " pod="openshift-multus/multus-additional-cni-plugins-lkntj" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.329816 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-cnibin\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.329875 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0c65689c-afdd-413c-92b9-bf02eeea000c-cnibin\") pod \"multus-additional-cni-plugins-lkntj\" (UID: \"0c65689c-afdd-413c-92b9-bf02eeea000c\") " pod="openshift-multus/multus-additional-cni-plugins-lkntj" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.329909 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-multus-cni-dir\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.329923 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/adf367e5-fedd-4d9e-a7af-345df1f08353-rootfs\") pod \"machine-config-daemon-bk84k\" (UID: \"adf367e5-fedd-4d9e-a7af-345df1f08353\") " pod="openshift-machine-config-operator/machine-config-daemon-bk84k" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.329976 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-system-cni-dir\") pod \"multus-jl5x4\" (UID: 
\"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.329831 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-multus-cni-dir\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.330028 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-hostroot\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.330079 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0c65689c-afdd-413c-92b9-bf02eeea000c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lkntj\" (UID: \"0c65689c-afdd-413c-92b9-bf02eeea000c\") " pod="openshift-multus/multus-additional-cni-plugins-lkntj" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.330113 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4ee35aaa-2819-432a-af95-f1078ad836fc-cni-binary-copy\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.330145 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-etc-kubernetes\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 
09:24:06.330175 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-host-run-netns\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.330200 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0c65689c-afdd-413c-92b9-bf02eeea000c-system-cni-dir\") pod \"multus-additional-cni-plugins-lkntj\" (UID: \"0c65689c-afdd-413c-92b9-bf02eeea000c\") " pod="openshift-multus/multus-additional-cni-plugins-lkntj" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.330206 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-host-var-lib-kubelet\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.330242 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-host-var-lib-kubelet\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.330257 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4ee35aaa-2819-432a-af95-f1078ad836fc-multus-daemon-config\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.330286 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" 
(UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-hostroot\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.330514 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-etc-kubernetes\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.330594 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-host-run-netns\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.330289 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0c65689c-afdd-413c-92b9-bf02eeea000c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lkntj\" (UID: \"0c65689c-afdd-413c-92b9-bf02eeea000c\") " pod="openshift-multus/multus-additional-cni-plugins-lkntj" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.330670 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-host-run-multus-certs\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.330718 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-os-release\") pod \"multus-jl5x4\" (UID: 
\"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.330772 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-multus-conf-dir\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.330823 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0c65689c-afdd-413c-92b9-bf02eeea000c-os-release\") pod \"multus-additional-cni-plugins-lkntj\" (UID: \"0c65689c-afdd-413c-92b9-bf02eeea000c\") " pod="openshift-multus/multus-additional-cni-plugins-lkntj" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.330854 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/adf367e5-fedd-4d9e-a7af-345df1f08353-mcd-auth-proxy-config\") pod \"machine-config-daemon-bk84k\" (UID: \"adf367e5-fedd-4d9e-a7af-345df1f08353\") " pod="openshift-machine-config-operator/machine-config-daemon-bk84k" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.330883 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-multus-socket-dir-parent\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.330910 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-host-run-k8s-cni-cncf-io\") pod \"multus-jl5x4\" (UID: 
\"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.330939 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-host-var-lib-cni-bin\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.330967 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-host-var-lib-cni-multus\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.330998 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9nz9\" (UniqueName: \"kubernetes.io/projected/0c65689c-afdd-413c-92b9-bf02eeea000c-kube-api-access-k9nz9\") pod \"multus-additional-cni-plugins-lkntj\" (UID: \"0c65689c-afdd-413c-92b9-bf02eeea000c\") " pod="openshift-multus/multus-additional-cni-plugins-lkntj" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.331031 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8s9t\" (UniqueName: \"kubernetes.io/projected/adf367e5-fedd-4d9e-a7af-345df1f08353-kube-api-access-c8s9t\") pod \"machine-config-daemon-bk84k\" (UID: \"adf367e5-fedd-4d9e-a7af-345df1f08353\") " pod="openshift-machine-config-operator/machine-config-daemon-bk84k" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.331237 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4ee35aaa-2819-432a-af95-f1078ad836fc-multus-daemon-config\") pod \"multus-jl5x4\" (UID: 
\"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.331238 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0c65689c-afdd-413c-92b9-bf02eeea000c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lkntj\" (UID: \"0c65689c-afdd-413c-92b9-bf02eeea000c\") " pod="openshift-multus/multus-additional-cni-plugins-lkntj" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.331423 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-host-var-lib-cni-bin\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.331365 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-multus-socket-dir-parent\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.331501 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-host-var-lib-cni-multus\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.331396 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-host-run-k8s-cni-cncf-io\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc 
kubenswrapper[4835]: I0319 09:24:06.331649 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4ee35aaa-2819-432a-af95-f1078ad836fc-cni-binary-copy\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.331687 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-multus-conf-dir\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.331718 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-host-run-multus-certs\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.331799 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0c65689c-afdd-413c-92b9-bf02eeea000c-os-release\") pod \"multus-additional-cni-plugins-lkntj\" (UID: \"0c65689c-afdd-413c-92b9-bf02eeea000c\") " pod="openshift-multus/multus-additional-cni-plugins-lkntj" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.331855 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0c65689c-afdd-413c-92b9-bf02eeea000c-cni-binary-copy\") pod \"multus-additional-cni-plugins-lkntj\" (UID: \"0c65689c-afdd-413c-92b9-bf02eeea000c\") " pod="openshift-multus/multus-additional-cni-plugins-lkntj" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.332076 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/adf367e5-fedd-4d9e-a7af-345df1f08353-mcd-auth-proxy-config\") pod \"machine-config-daemon-bk84k\" (UID: \"adf367e5-fedd-4d9e-a7af-345df1f08353\") " pod="openshift-machine-config-operator/machine-config-daemon-bk84k" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.332779 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0c65689c-afdd-413c-92b9-bf02eeea000c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lkntj\" (UID: \"0c65689c-afdd-413c-92b9-bf02eeea000c\") " pod="openshift-multus/multus-additional-cni-plugins-lkntj" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.339319 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.339360 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.339375 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.339394 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.339409 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:06Z","lastTransitionTime":"2026-03-19T09:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.340226 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.340838 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4ee35aaa-2819-432a-af95-f1078ad836fc-os-release\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.343314 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/adf367e5-fedd-4d9e-a7af-345df1f08353-proxy-tls\") pod \"machine-config-daemon-bk84k\" (UID: \"adf367e5-fedd-4d9e-a7af-345df1f08353\") " pod="openshift-machine-config-operator/machine-config-daemon-bk84k" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.352098 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.358179 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8s9t\" (UniqueName: \"kubernetes.io/projected/adf367e5-fedd-4d9e-a7af-345df1f08353-kube-api-access-c8s9t\") pod \"machine-config-daemon-bk84k\" (UID: \"adf367e5-fedd-4d9e-a7af-345df1f08353\") " pod="openshift-machine-config-operator/machine-config-daemon-bk84k" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.359609 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9nz9\" (UniqueName: \"kubernetes.io/projected/0c65689c-afdd-413c-92b9-bf02eeea000c-kube-api-access-k9nz9\") pod \"multus-additional-cni-plugins-lkntj\" (UID: \"0c65689c-afdd-413c-92b9-bf02eeea000c\") " pod="openshift-multus/multus-additional-cni-plugins-lkntj" Mar 19 09:24:06 crc 
kubenswrapper[4835]: I0319 09:24:06.359717 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmtx5\" (UniqueName: \"kubernetes.io/projected/4ee35aaa-2819-432a-af95-f1078ad836fc-kube-api-access-dmtx5\") pod \"multus-jl5x4\" (UID: \"4ee35aaa-2819-432a-af95-f1078ad836fc\") " pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.366808 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.401729 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:06 crc kubenswrapper[4835]: E0319 09:24:06.401859 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.402042 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:24:06 crc kubenswrapper[4835]: E0319 09:24:06.402184 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.402212 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:06 crc kubenswrapper[4835]: E0319 09:24:06.402391 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.418375 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.421221 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.432534 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.437621 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lkntj" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.441326 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.441370 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.441384 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.441402 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.441416 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:06Z","lastTransitionTime":"2026-03-19T09:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:06 crc kubenswrapper[4835]: E0319 09:24:06.442438 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c8s9t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.443362 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.445686 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jl5x4" Mar 19 09:24:06 crc kubenswrapper[4835]: E0319 09:24:06.446083 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c8s9t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 19 09:24:06 crc kubenswrapper[4835]: E0319 09:24:06.447597 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.455911 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qk6hn"] Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.457413 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.460214 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.460516 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.460930 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.461167 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.461391 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.461629 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.462190 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 19 09:24:06 crc kubenswrapper[4835]: W0319 09:24:06.462412 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c65689c_afdd_413c_92b9_bf02eeea000c.slice/crio-96d1efdac811f1328889384d375efe200b0f802727c687a0c2a23015c4f996e8 WatchSource:0}: Error finding container 96d1efdac811f1328889384d375efe200b0f802727c687a0c2a23015c4f996e8: Status 404 returned error can't find the container with id 96d1efdac811f1328889384d375efe200b0f802727c687a0c2a23015c4f996e8 Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.462604 4835 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: W0319 09:24:06.469103 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ee35aaa_2819_432a_af95_f1078ad836fc.slice/crio-269d6678429ce945eee52cec87f92cdaafcaff8d9151bfef4252d85edba68aa4 WatchSource:0}: Error finding container 269d6678429ce945eee52cec87f92cdaafcaff8d9151bfef4252d85edba68aa4: Status 404 returned error can't find the container with id 269d6678429ce945eee52cec87f92cdaafcaff8d9151bfef4252d85edba68aa4 Mar 19 09:24:06 crc kubenswrapper[4835]: E0319 09:24:06.469635 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k9nz9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-lkntj_openshift-multus(0c65689c-afdd-413c-92b9-bf02eeea000c): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 19 09:24:06 crc kubenswrapper[4835]: E0319 09:24:06.471439 4835 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 09:24:06 
crc kubenswrapper[4835]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 19 09:24:06 crc kubenswrapper[4835]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 19 09:24:06 crc kubenswrapper[4835]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{
Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dmtx5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-jl5x4_openshift-multus(4ee35aaa-2819-432a-af95-f1078ad836fc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 09:24:06 crc kubenswrapper[4835]: > logger="UnhandledError" Mar 19 09:24:06 crc kubenswrapper[4835]: E0319 09:24:06.471584 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-lkntj" podUID="0c65689c-afdd-413c-92b9-bf02eeea000c" Mar 19 09:24:06 crc kubenswrapper[4835]: E0319 09:24:06.472786 4835 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-jl5x4" podUID="4ee35aaa-2819-432a-af95-f1078ad836fc" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.479391 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.496145 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.508693 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.518149 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.527159 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.533196 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-run-ovn-kubernetes\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.533316 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-cni-bin\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc 
kubenswrapper[4835]: I0319 09:24:06.533407 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-slash\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.533511 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-cni-netd\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.533849 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-ovnkube-config\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.534184 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-run-netns\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.534228 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-run-systemd\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc 
kubenswrapper[4835]: I0319 09:24:06.534406 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-systemd-units\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.534534 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-run-ovn\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.534714 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.534910 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-ovn-node-metrics-cert\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.535074 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-node-log\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.535130 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-log-socket\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.535206 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-ovnkube-script-lib\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.535286 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-kubelet\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.535449 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-etc-openvswitch\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.535481 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.535606 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-env-overrides\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.535784 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-var-lib-openvswitch\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.535832 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-run-openvswitch\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.535879 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48r2z\" (UniqueName: \"kubernetes.io/projected/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-kube-api-access-48r2z\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.543702 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.545482 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.545529 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.545553 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.545583 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.545601 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:06Z","lastTransitionTime":"2026-03-19T09:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.554236 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.567545 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.576858 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.589712 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.602416 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.615864 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.636771 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-cni-bin\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.636829 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-cni-bin\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.636853 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-slash\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.636901 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-cni-netd\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.636977 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-slash\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.636989 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-ovnkube-config\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.637081 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-run-netns\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.637104 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-run-systemd\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.637022 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-cni-netd\") pod \"ovnkube-node-qk6hn\" 
(UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.637141 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-systemd-units\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.637162 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-run-ovn\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.637179 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-run-netns\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.637207 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-node-log\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.637238 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-systemd-units\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc 
kubenswrapper[4835]: I0319 09:24:06.637183 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-node-log\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.637258 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-run-systemd\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.637275 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.637270 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-run-ovn\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.637305 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-ovn-node-metrics-cert\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.637372 4835 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-kubelet\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.637417 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-log-socket\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.637438 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-ovnkube-script-lib\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.637488 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-etc-openvswitch\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.637461 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-kubelet\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.637512 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-env-overrides\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.637344 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.637621 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-etc-openvswitch\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.637806 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-var-lib-openvswitch\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.637879 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-var-lib-openvswitch\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.637890 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-run-openvswitch\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.637955 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48r2z\" (UniqueName: \"kubernetes.io/projected/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-kube-api-access-48r2z\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.638022 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-run-openvswitch\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.638040 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-run-ovn-kubernetes\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.638186 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-run-ovn-kubernetes\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.638435 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-log-socket\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.638508 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-ovnkube-config\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.638707 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-ovnkube-script-lib\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.639780 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-env-overrides\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.644324 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.644534 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-ovn-node-metrics-cert\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.649228 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.649268 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.649288 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.649311 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.649327 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:06Z","lastTransitionTime":"2026-03-19T09:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.655393 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.660000 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48r2z\" (UniqueName: \"kubernetes.io/projected/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-kube-api-access-48r2z\") pod \"ovnkube-node-qk6hn\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.674457 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.686886 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.752028 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.752157 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.752180 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.752208 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.752231 4835 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:06Z","lastTransitionTime":"2026-03-19T09:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.786448 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:06 crc kubenswrapper[4835]: W0319 09:24:06.801683 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e27fcad_b6f8_4ce8_9f2b_e112f8ae138b.slice/crio-7ccee33ef15359626a0e9e795e0e1ffb358cfb7dc8834a42d6870e63f5411bb9 WatchSource:0}: Error finding container 7ccee33ef15359626a0e9e795e0e1ffb358cfb7dc8834a42d6870e63f5411bb9: Status 404 returned error can't find the container with id 7ccee33ef15359626a0e9e795e0e1ffb358cfb7dc8834a42d6870e63f5411bb9 Mar 19 09:24:06 crc kubenswrapper[4835]: E0319 09:24:06.806092 4835 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 09:24:06 crc kubenswrapper[4835]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 19 09:24:06 crc kubenswrapper[4835]: apiVersion: v1 Mar 19 09:24:06 crc kubenswrapper[4835]: clusters: Mar 19 09:24:06 crc kubenswrapper[4835]: - cluster: Mar 19 09:24:06 crc kubenswrapper[4835]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 19 09:24:06 crc kubenswrapper[4835]: server: https://api-int.crc.testing:6443 Mar 19 09:24:06 crc kubenswrapper[4835]: name: default-cluster Mar 19 09:24:06 crc kubenswrapper[4835]: contexts: Mar 19 09:24:06 crc 
kubenswrapper[4835]: - context: Mar 19 09:24:06 crc kubenswrapper[4835]: cluster: default-cluster Mar 19 09:24:06 crc kubenswrapper[4835]: namespace: default Mar 19 09:24:06 crc kubenswrapper[4835]: user: default-auth Mar 19 09:24:06 crc kubenswrapper[4835]: name: default-context Mar 19 09:24:06 crc kubenswrapper[4835]: current-context: default-context Mar 19 09:24:06 crc kubenswrapper[4835]: kind: Config Mar 19 09:24:06 crc kubenswrapper[4835]: preferences: {} Mar 19 09:24:06 crc kubenswrapper[4835]: users: Mar 19 09:24:06 crc kubenswrapper[4835]: - name: default-auth Mar 19 09:24:06 crc kubenswrapper[4835]: user: Mar 19 09:24:06 crc kubenswrapper[4835]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 19 09:24:06 crc kubenswrapper[4835]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 19 09:24:06 crc kubenswrapper[4835]: EOF Mar 19 09:24:06 crc kubenswrapper[4835]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-48r2z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-qk6hn_openshift-ovn-kubernetes(2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b): CreateContainerConfigError: services have not yet been read at least once, cannot construct 
envvars Mar 19 09:24:06 crc kubenswrapper[4835]: > logger="UnhandledError" Mar 19 09:24:06 crc kubenswrapper[4835]: E0319 09:24:06.807702 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.855368 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.855440 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.855464 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.855495 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.855522 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:06Z","lastTransitionTime":"2026-03-19T09:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.959106 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.959165 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.959182 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.959214 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:06 crc kubenswrapper[4835]: I0319 09:24:06.959236 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:06Z","lastTransitionTime":"2026-03-19T09:24:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.061693 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.061798 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.061828 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.061856 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.061876 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:07Z","lastTransitionTime":"2026-03-19T09:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.106261 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" event={"ID":"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b","Type":"ContainerStarted","Data":"7ccee33ef15359626a0e9e795e0e1ffb358cfb7dc8834a42d6870e63f5411bb9"} Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.108342 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jl5x4" event={"ID":"4ee35aaa-2819-432a-af95-f1078ad836fc","Type":"ContainerStarted","Data":"269d6678429ce945eee52cec87f92cdaafcaff8d9151bfef4252d85edba68aa4"} Mar 19 09:24:07 crc kubenswrapper[4835]: E0319 09:24:07.108448 4835 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 09:24:07 crc kubenswrapper[4835]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 19 09:24:07 crc kubenswrapper[4835]: apiVersion: v1 Mar 19 09:24:07 crc kubenswrapper[4835]: clusters: Mar 19 09:24:07 crc kubenswrapper[4835]: - cluster: Mar 19 09:24:07 crc kubenswrapper[4835]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 19 09:24:07 crc kubenswrapper[4835]: server: https://api-int.crc.testing:6443 Mar 19 09:24:07 crc kubenswrapper[4835]: name: default-cluster Mar 19 09:24:07 crc kubenswrapper[4835]: contexts: Mar 19 09:24:07 crc kubenswrapper[4835]: - context: Mar 19 09:24:07 crc kubenswrapper[4835]: cluster: default-cluster Mar 19 09:24:07 crc kubenswrapper[4835]: namespace: default Mar 19 09:24:07 crc kubenswrapper[4835]: user: default-auth Mar 19 09:24:07 crc kubenswrapper[4835]: name: default-context Mar 19 09:24:07 crc kubenswrapper[4835]: current-context: default-context Mar 19 09:24:07 crc kubenswrapper[4835]: kind: Config Mar 19 09:24:07 crc kubenswrapper[4835]: 
preferences: {} Mar 19 09:24:07 crc kubenswrapper[4835]: users: Mar 19 09:24:07 crc kubenswrapper[4835]: - name: default-auth Mar 19 09:24:07 crc kubenswrapper[4835]: user: Mar 19 09:24:07 crc kubenswrapper[4835]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 19 09:24:07 crc kubenswrapper[4835]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 19 09:24:07 crc kubenswrapper[4835]: EOF Mar 19 09:24:07 crc kubenswrapper[4835]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-48r2z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-qk6hn_openshift-ovn-kubernetes(2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 09:24:07 crc kubenswrapper[4835]: > logger="UnhandledError" Mar 19 09:24:07 crc kubenswrapper[4835]: E0319 09:24:07.109796 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" 
podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" Mar 19 09:24:07 crc kubenswrapper[4835]: E0319 09:24:07.111279 4835 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 09:24:07 crc kubenswrapper[4835]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 19 09:24:07 crc kubenswrapper[4835]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 19 09:24:07 crc kubenswrapper[4835]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{
Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dmtx5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-jl5x4_openshift-multus(4ee35aaa-2819-432a-af95-f1078ad836fc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 09:24:07 crc kubenswrapper[4835]: > logger="UnhandledError" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.111824 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" event={"ID":"0c65689c-afdd-413c-92b9-bf02eeea000c","Type":"ContainerStarted","Data":"96d1efdac811f1328889384d375efe200b0f802727c687a0c2a23015c4f996e8"} Mar 19 09:24:07 crc kubenswrapper[4835]: E0319 09:24:07.112373 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with 
CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-jl5x4" podUID="4ee35aaa-2819-432a-af95-f1078ad836fc" Mar 19 09:24:07 crc kubenswrapper[4835]: E0319 09:24:07.113699 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k9nz9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
multus-additional-cni-plugins-lkntj_openshift-multus(0c65689c-afdd-413c-92b9-bf02eeea000c): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.114764 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerStarted","Data":"ae4cd638f1152bd633d41be9306a63f81dccddb2b958b31164f1c89afa4a1729"} Mar 19 09:24:07 crc kubenswrapper[4835]: E0319 09:24:07.114976 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-lkntj" podUID="0c65689c-afdd-413c-92b9-bf02eeea000c" Mar 19 09:24:07 crc kubenswrapper[4835]: E0319 09:24:07.116820 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c8s9t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 19 09:24:07 crc kubenswrapper[4835]: E0319 09:24:07.119127 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml 
--tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c8s9t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 19 09:24:07 crc kubenswrapper[4835]: E0319 09:24:07.120471 4835 
pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.124540 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.140833 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.155219 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.166977 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.167009 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.167019 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.167040 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.167052 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:07Z","lastTransitionTime":"2026-03-19T09:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.179244 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.193029 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.208393 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.220367 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.236036 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.251680 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.262698 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.269657 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.269701 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.269717 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.269770 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.269788 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:07Z","lastTransitionTime":"2026-03-19T09:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.281017 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.295727 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.306368 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.324870 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.338046 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.354710 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.369765 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.372262 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.372311 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.372325 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.372343 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.372357 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:07Z","lastTransitionTime":"2026-03-19T09:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.383311 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.399654 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.425833 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.441058 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.454849 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.475240 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.475306 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.475318 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.475338 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.475351 4835 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:07Z","lastTransitionTime":"2026-03-19T09:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.578472 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.578556 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.578581 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.578649 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.578673 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:07Z","lastTransitionTime":"2026-03-19T09:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.681829 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.681897 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.681915 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.681964 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.681982 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:07Z","lastTransitionTime":"2026-03-19T09:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.784472 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.784540 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.784562 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.784591 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.784614 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:07Z","lastTransitionTime":"2026-03-19T09:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.887907 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.887964 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.887980 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.888002 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.888019 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:07Z","lastTransitionTime":"2026-03-19T09:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.990640 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.990697 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.990713 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.990735 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:07 crc kubenswrapper[4835]: I0319 09:24:07.990777 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:07Z","lastTransitionTime":"2026-03-19T09:24:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.093699 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.093803 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.093822 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.093844 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.093863 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:08Z","lastTransitionTime":"2026-03-19T09:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.196611 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.196663 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.196698 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.196724 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.196779 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:08Z","lastTransitionTime":"2026-03-19T09:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.301234 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.301312 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.301333 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.301368 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.301393 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:08Z","lastTransitionTime":"2026-03-19T09:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.401902 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.401922 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:24:08 crc kubenswrapper[4835]: E0319 09:24:08.402421 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.402484 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:08 crc kubenswrapper[4835]: E0319 09:24:08.402694 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:24:08 crc kubenswrapper[4835]: E0319 09:24:08.402839 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.404556 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.404628 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.404654 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.404687 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.404709 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:08Z","lastTransitionTime":"2026-03-19T09:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.508128 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.508381 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.508409 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.508438 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.508460 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:08Z","lastTransitionTime":"2026-03-19T09:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.555947 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.556100 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.556171 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.556276 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.556340 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:24:08 crc kubenswrapper[4835]: E0319 09:24:08.556646 4835 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 09:24:08 crc kubenswrapper[4835]: E0319 09:24:08.556707 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 09:24:08 crc kubenswrapper[4835]: E0319 09:24:08.556805 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 09:24:16.556719481 +0000 UTC m=+111.405318108 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 09:24:08 crc kubenswrapper[4835]: E0319 09:24:08.556821 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 09:24:08 crc kubenswrapper[4835]: E0319 09:24:08.556848 4835 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:24:08 crc kubenswrapper[4835]: E0319 09:24:08.556903 4835 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 09:24:08 crc kubenswrapper[4835]: E0319 09:24:08.556928 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:24:16.556894846 +0000 UTC m=+111.405493473 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:24:08 crc kubenswrapper[4835]: E0319 09:24:08.557031 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 09:24:16.55700679 +0000 UTC m=+111.405605447 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:24:08 crc kubenswrapper[4835]: E0319 09:24:08.557057 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 09:24:16.557044761 +0000 UTC m=+111.405643378 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 09:24:08 crc kubenswrapper[4835]: E0319 09:24:08.557307 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 09:24:08 crc kubenswrapper[4835]: E0319 09:24:08.557345 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 09:24:08 crc kubenswrapper[4835]: E0319 09:24:08.557367 4835 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:24:08 crc kubenswrapper[4835]: E0319 09:24:08.558917 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 09:24:16.558889342 +0000 UTC m=+111.407488059 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.610930 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.611007 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.611034 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.611107 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.611163 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:08Z","lastTransitionTime":"2026-03-19T09:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.714809 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.714865 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.714882 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.714908 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.714925 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:08Z","lastTransitionTime":"2026-03-19T09:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.823332 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.824314 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.824342 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.824371 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.824393 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:08Z","lastTransitionTime":"2026-03-19T09:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.927789 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.927852 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.927872 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.927900 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:08 crc kubenswrapper[4835]: I0319 09:24:08.927922 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:08Z","lastTransitionTime":"2026-03-19T09:24:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.030831 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.030906 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.030930 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.030964 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.030988 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:09Z","lastTransitionTime":"2026-03-19T09:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.134409 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.134473 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.134496 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.134521 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.134539 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:09Z","lastTransitionTime":"2026-03-19T09:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.237815 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.237883 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.237900 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.237926 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.237943 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:09Z","lastTransitionTime":"2026-03-19T09:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.341022 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.341072 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.341085 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.341105 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.341117 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:09Z","lastTransitionTime":"2026-03-19T09:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.416122 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.416422 4835 scope.go:117] "RemoveContainer" containerID="3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af" Mar 19 09:24:09 crc kubenswrapper[4835]: E0319 09:24:09.416681 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.443778 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.443800 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.443808 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.443818 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.443827 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:09Z","lastTransitionTime":"2026-03-19T09:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.546292 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.546347 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.546360 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.546379 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.546392 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:09Z","lastTransitionTime":"2026-03-19T09:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.649680 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.649760 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.649781 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.649807 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.649825 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:09Z","lastTransitionTime":"2026-03-19T09:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.753171 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.753289 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.753315 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.753349 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.753373 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:09Z","lastTransitionTime":"2026-03-19T09:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.856323 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.856399 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.856420 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.856444 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.856461 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:09Z","lastTransitionTime":"2026-03-19T09:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.959549 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.959625 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.959646 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.959671 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:09 crc kubenswrapper[4835]: I0319 09:24:09.959692 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:09Z","lastTransitionTime":"2026-03-19T09:24:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.062229 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.062313 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.062331 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.062357 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.062376 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:10Z","lastTransitionTime":"2026-03-19T09:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.123809 4835 scope.go:117] "RemoveContainer" containerID="3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af" Mar 19 09:24:10 crc kubenswrapper[4835]: E0319 09:24:10.124083 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.166100 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.166166 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.166188 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.166218 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.166239 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:10Z","lastTransitionTime":"2026-03-19T09:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.269065 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.269113 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.269129 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.269151 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.269167 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:10Z","lastTransitionTime":"2026-03-19T09:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.372159 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.372240 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.372265 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.372295 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.372318 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:10Z","lastTransitionTime":"2026-03-19T09:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.401086 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.401135 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.401169 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:24:10 crc kubenswrapper[4835]: E0319 09:24:10.401273 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:24:10 crc kubenswrapper[4835]: E0319 09:24:10.401406 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:24:10 crc kubenswrapper[4835]: E0319 09:24:10.401515 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.476329 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.476408 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.476433 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.476469 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.476490 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:10Z","lastTransitionTime":"2026-03-19T09:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.579576 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.579656 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.579679 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.579710 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.579733 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:10Z","lastTransitionTime":"2026-03-19T09:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.682587 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.682643 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.682661 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.682688 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.682705 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:10Z","lastTransitionTime":"2026-03-19T09:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.785984 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.786076 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.786099 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.786135 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.786158 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:10Z","lastTransitionTime":"2026-03-19T09:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.889587 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.889924 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.889990 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.890095 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.890170 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:10Z","lastTransitionTime":"2026-03-19T09:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.992733 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.992854 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.992874 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.992903 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:10 crc kubenswrapper[4835]: I0319 09:24:10.992922 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:10Z","lastTransitionTime":"2026-03-19T09:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.096206 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.096268 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.096282 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.096300 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.096316 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:11Z","lastTransitionTime":"2026-03-19T09:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.200378 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.200474 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.200495 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.200520 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.200534 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:11Z","lastTransitionTime":"2026-03-19T09:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.304781 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.304854 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.304878 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.304911 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.304933 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:11Z","lastTransitionTime":"2026-03-19T09:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.411729 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.411824 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.411839 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.411866 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.411881 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:11Z","lastTransitionTime":"2026-03-19T09:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.429885 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.515379 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.515460 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.515474 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.515498 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.515512 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:11Z","lastTransitionTime":"2026-03-19T09:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.619555 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.619611 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.619625 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.619646 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.619664 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:11Z","lastTransitionTime":"2026-03-19T09:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.722288 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.722390 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.722411 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.722473 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.722496 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:11Z","lastTransitionTime":"2026-03-19T09:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.825540 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.825619 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.825645 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.825675 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.825699 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:11Z","lastTransitionTime":"2026-03-19T09:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.929278 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.929319 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.929332 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.929350 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:11 crc kubenswrapper[4835]: I0319 09:24:11.929362 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:11Z","lastTransitionTime":"2026-03-19T09:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.032539 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.032624 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.032649 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.032684 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.032715 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:12Z","lastTransitionTime":"2026-03-19T09:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.136076 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.136144 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.136161 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.136188 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.136205 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:12Z","lastTransitionTime":"2026-03-19T09:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.239931 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.240239 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.240439 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.240660 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.240875 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:12Z","lastTransitionTime":"2026-03-19T09:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.271261 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-ppv6m"] Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.271760 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-ppv6m" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.274792 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.275068 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.274875 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.275852 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.292310 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.303926 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.320090 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.329517 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.344256 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.344494 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.344626 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.344784 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.344924 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:12Z","lastTransitionTime":"2026-03-19T09:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.346342 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.362505 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.378607 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.395671 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.401835 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.401952 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.402007 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:24:12 crc kubenswrapper[4835]: E0319 09:24:12.402101 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:24:12 crc kubenswrapper[4835]: E0319 09:24:12.402212 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:24:12 crc kubenswrapper[4835]: E0319 09:24:12.402325 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.402500 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6dfdfe13-1f47-4774-89d0-5d861607ddbc-host\") pod \"node-ca-ppv6m\" (UID: \"6dfdfe13-1f47-4774-89d0-5d861607ddbc\") " pod="openshift-image-registry/node-ca-ppv6m" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.402562 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6dfdfe13-1f47-4774-89d0-5d861607ddbc-serviceca\") pod \"node-ca-ppv6m\" (UID: \"6dfdfe13-1f47-4774-89d0-5d861607ddbc\") " pod="openshift-image-registry/node-ca-ppv6m" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.402601 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt4lg\" (UniqueName: \"kubernetes.io/projected/6dfdfe13-1f47-4774-89d0-5d861607ddbc-kube-api-access-kt4lg\") pod \"node-ca-ppv6m\" (UID: \"6dfdfe13-1f47-4774-89d0-5d861607ddbc\") " pod="openshift-image-registry/node-ca-ppv6m" Mar 19 09:24:12 crc kubenswrapper[4835]: E0319 09:24:12.404868 4835 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 09:24:12 crc kubenswrapper[4835]: container 
&Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 19 09:24:12 crc kubenswrapper[4835]: if [[ -f "/env/_master" ]]; then Mar 19 09:24:12 crc kubenswrapper[4835]: set -o allexport Mar 19 09:24:12 crc kubenswrapper[4835]: source "/env/_master" Mar 19 09:24:12 crc kubenswrapper[4835]: set +o allexport Mar 19 09:24:12 crc kubenswrapper[4835]: fi Mar 19 09:24:12 crc kubenswrapper[4835]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 19 09:24:12 crc kubenswrapper[4835]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 19 09:24:12 crc kubenswrapper[4835]: ho_enable="--enable-hybrid-overlay" Mar 19 09:24:12 crc kubenswrapper[4835]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 19 09:24:12 crc kubenswrapper[4835]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 19 09:24:12 crc kubenswrapper[4835]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 19 09:24:12 crc kubenswrapper[4835]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 19 09:24:12 crc kubenswrapper[4835]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 19 09:24:12 crc kubenswrapper[4835]: --webhook-host=127.0.0.1 \ Mar 19 09:24:12 crc kubenswrapper[4835]: --webhook-port=9743 \ Mar 19 09:24:12 crc kubenswrapper[4835]: ${ho_enable} \ Mar 19 09:24:12 crc kubenswrapper[4835]: --enable-interconnect \ Mar 19 09:24:12 crc kubenswrapper[4835]: --disable-approver \ Mar 19 09:24:12 crc kubenswrapper[4835]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 19 09:24:12 crc kubenswrapper[4835]: --wait-for-kubernetes-api=200s \ Mar 19 09:24:12 crc kubenswrapper[4835]: 
--pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 19 09:24:12 crc kubenswrapper[4835]: --loglevel="${LOGLEVEL}" Mar 19 09:24:12 crc kubenswrapper[4835]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 09:24:12 crc kubenswrapper[4835]: > logger="UnhandledError" Mar 19 09:24:12 crc kubenswrapper[4835]: E0319 09:24:12.409567 4835 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 09:24:12 crc kubenswrapper[4835]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 19 09:24:12 crc kubenswrapper[4835]: if [[ -f "/env/_master" ]]; then Mar 19 09:24:12 crc kubenswrapper[4835]: set -o allexport Mar 19 09:24:12 crc kubenswrapper[4835]: source "/env/_master" Mar 19 09:24:12 crc kubenswrapper[4835]: set +o allexport Mar 19 09:24:12 crc kubenswrapper[4835]: fi Mar 19 09:24:12 crc kubenswrapper[4835]: Mar 19 09:24:12 crc kubenswrapper[4835]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 19 09:24:12 crc kubenswrapper[4835]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 19 09:24:12 crc kubenswrapper[4835]: --disable-webhook \ Mar 19 09:24:12 crc kubenswrapper[4835]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 19 09:24:12 crc kubenswrapper[4835]: --loglevel="${LOGLEVEL}" Mar 19 09:24:12 crc kubenswrapper[4835]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 09:24:12 crc kubenswrapper[4835]: > logger="UnhandledError" Mar 19 09:24:12 crc kubenswrapper[4835]: E0319 09:24:12.410878 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.417146 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.431213 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.444673 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.447986 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.448066 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 
09:24:12.448084 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.448111 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.448130 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:12Z","lastTransitionTime":"2026-03-19T09:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.458820 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.472534 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.504082 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6dfdfe13-1f47-4774-89d0-5d861607ddbc-host\") pod \"node-ca-ppv6m\" (UID: \"6dfdfe13-1f47-4774-89d0-5d861607ddbc\") " pod="openshift-image-registry/node-ca-ppv6m" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.504171 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6dfdfe13-1f47-4774-89d0-5d861607ddbc-serviceca\") pod \"node-ca-ppv6m\" (UID: \"6dfdfe13-1f47-4774-89d0-5d861607ddbc\") " pod="openshift-image-registry/node-ca-ppv6m" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.503925 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.504214 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt4lg\" (UniqueName: \"kubernetes.io/projected/6dfdfe13-1f47-4774-89d0-5d861607ddbc-kube-api-access-kt4lg\") pod \"node-ca-ppv6m\" (UID: \"6dfdfe13-1f47-4774-89d0-5d861607ddbc\") " pod="openshift-image-registry/node-ca-ppv6m" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.504300 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6dfdfe13-1f47-4774-89d0-5d861607ddbc-host\") pod \"node-ca-ppv6m\" (UID: \"6dfdfe13-1f47-4774-89d0-5d861607ddbc\") " pod="openshift-image-registry/node-ca-ppv6m" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.506399 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6dfdfe13-1f47-4774-89d0-5d861607ddbc-serviceca\") pod \"node-ca-ppv6m\" (UID: \"6dfdfe13-1f47-4774-89d0-5d861607ddbc\") " pod="openshift-image-registry/node-ca-ppv6m" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.523313 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt4lg\" (UniqueName: \"kubernetes.io/projected/6dfdfe13-1f47-4774-89d0-5d861607ddbc-kube-api-access-kt4lg\") pod \"node-ca-ppv6m\" (UID: \"6dfdfe13-1f47-4774-89d0-5d861607ddbc\") " pod="openshift-image-registry/node-ca-ppv6m" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.551149 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.551216 4835 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.551230 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.551256 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.551271 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:12Z","lastTransitionTime":"2026-03-19T09:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.593315 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-ppv6m" Mar 19 09:24:12 crc kubenswrapper[4835]: E0319 09:24:12.611655 4835 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 09:24:12 crc kubenswrapper[4835]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 19 09:24:12 crc kubenswrapper[4835]: while [ true ]; Mar 19 09:24:12 crc kubenswrapper[4835]: do Mar 19 09:24:12 crc kubenswrapper[4835]: for f in $(ls /tmp/serviceca); do Mar 19 09:24:12 crc kubenswrapper[4835]: echo $f Mar 19 09:24:12 crc kubenswrapper[4835]: ca_file_path="/tmp/serviceca/${f}" Mar 19 09:24:12 crc kubenswrapper[4835]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 19 09:24:12 crc kubenswrapper[4835]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 19 09:24:12 crc kubenswrapper[4835]: if [ -e "${reg_dir_path}" ]; then Mar 19 09:24:12 crc kubenswrapper[4835]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 19 09:24:12 crc kubenswrapper[4835]: else Mar 19 09:24:12 crc kubenswrapper[4835]: mkdir $reg_dir_path Mar 19 09:24:12 crc kubenswrapper[4835]: cp $ca_file_path $reg_dir_path/ca.crt Mar 19 09:24:12 crc kubenswrapper[4835]: fi Mar 19 09:24:12 crc kubenswrapper[4835]: done Mar 19 09:24:12 crc kubenswrapper[4835]: for d in $(ls /etc/docker/certs.d); do Mar 19 09:24:12 crc kubenswrapper[4835]: echo $d Mar 19 09:24:12 crc kubenswrapper[4835]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 19 09:24:12 crc kubenswrapper[4835]: reg_conf_path="/tmp/serviceca/${dp}" Mar 19 09:24:12 crc kubenswrapper[4835]: if [ ! 
-e "${reg_conf_path}" ]; then Mar 19 09:24:12 crc kubenswrapper[4835]: rm -rf /etc/docker/certs.d/$d Mar 19 09:24:12 crc kubenswrapper[4835]: fi Mar 19 09:24:12 crc kubenswrapper[4835]: done Mar 19 09:24:12 crc kubenswrapper[4835]: sleep 60 & wait ${!} Mar 19 09:24:12 crc kubenswrapper[4835]: done Mar 19 09:24:12 crc kubenswrapper[4835]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kt4lg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-ppv6m_openshift-image-registry(6dfdfe13-1f47-4774-89d0-5d861607ddbc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 09:24:12 crc kubenswrapper[4835]: > logger="UnhandledError" Mar 19 09:24:12 crc kubenswrapper[4835]: E0319 09:24:12.613934 4835 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-ppv6m" podUID="6dfdfe13-1f47-4774-89d0-5d861607ddbc" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.653710 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.653767 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.653782 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.653805 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.653816 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:12Z","lastTransitionTime":"2026-03-19T09:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.757581 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.757679 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.757703 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.757780 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.757807 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:12Z","lastTransitionTime":"2026-03-19T09:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.860480 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.860557 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.860581 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.860616 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.860637 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:12Z","lastTransitionTime":"2026-03-19T09:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.963658 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.963981 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.964111 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.964269 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:12 crc kubenswrapper[4835]: I0319 09:24:12.964403 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:12Z","lastTransitionTime":"2026-03-19T09:24:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.067862 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.067917 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.067933 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.067967 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.068001 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:13Z","lastTransitionTime":"2026-03-19T09:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.135370 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ppv6m" event={"ID":"6dfdfe13-1f47-4774-89d0-5d861607ddbc","Type":"ContainerStarted","Data":"abe748a1610241d6a8d9a34d50147dacb8ae0ff0ba76f7c84a1a1cd59bf84fa0"} Mar 19 09:24:13 crc kubenswrapper[4835]: E0319 09:24:13.138516 4835 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 09:24:13 crc kubenswrapper[4835]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 19 09:24:13 crc kubenswrapper[4835]: while [ true ]; Mar 19 09:24:13 crc kubenswrapper[4835]: do Mar 19 09:24:13 crc kubenswrapper[4835]: for f in $(ls /tmp/serviceca); do Mar 19 09:24:13 crc kubenswrapper[4835]: echo $f Mar 19 09:24:13 crc kubenswrapper[4835]: ca_file_path="/tmp/serviceca/${f}" Mar 19 09:24:13 crc kubenswrapper[4835]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 19 09:24:13 crc kubenswrapper[4835]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 19 09:24:13 crc kubenswrapper[4835]: if [ -e "${reg_dir_path}" ]; then Mar 19 09:24:13 crc kubenswrapper[4835]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 19 09:24:13 crc kubenswrapper[4835]: else Mar 19 09:24:13 crc kubenswrapper[4835]: mkdir $reg_dir_path Mar 19 09:24:13 crc kubenswrapper[4835]: cp $ca_file_path $reg_dir_path/ca.crt Mar 19 09:24:13 crc kubenswrapper[4835]: fi Mar 19 09:24:13 crc kubenswrapper[4835]: done Mar 19 09:24:13 crc kubenswrapper[4835]: for d in $(ls /etc/docker/certs.d); do Mar 19 09:24:13 crc kubenswrapper[4835]: echo $d Mar 19 09:24:13 crc kubenswrapper[4835]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 19 09:24:13 crc kubenswrapper[4835]: reg_conf_path="/tmp/serviceca/${dp}" Mar 19 09:24:13 crc kubenswrapper[4835]: if [ ! 
-e "${reg_conf_path}" ]; then Mar 19 09:24:13 crc kubenswrapper[4835]: rm -rf /etc/docker/certs.d/$d Mar 19 09:24:13 crc kubenswrapper[4835]: fi Mar 19 09:24:13 crc kubenswrapper[4835]: done Mar 19 09:24:13 crc kubenswrapper[4835]: sleep 60 & wait ${!} Mar 19 09:24:13 crc kubenswrapper[4835]: done Mar 19 09:24:13 crc kubenswrapper[4835]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kt4lg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-ppv6m_openshift-image-registry(6dfdfe13-1f47-4774-89d0-5d861607ddbc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 09:24:13 crc kubenswrapper[4835]: > logger="UnhandledError" Mar 19 09:24:13 crc kubenswrapper[4835]: E0319 09:24:13.141920 4835 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-ppv6m" podUID="6dfdfe13-1f47-4774-89d0-5d861607ddbc" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.153513 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.168446 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.172153 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.172206 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.172223 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.172250 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.172269 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:13Z","lastTransitionTime":"2026-03-19T09:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.184662 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.212476 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.230307 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.245238 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.255001 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.267608 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.280375 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.280487 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.280507 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.280538 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.280559 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:13Z","lastTransitionTime":"2026-03-19T09:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.293387 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.308116 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.315862 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.327025 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.338801 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.354957 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.383017 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.383047 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.383059 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.383078 4835 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.383092 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:13Z","lastTransitionTime":"2026-03-19T09:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.485971 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.485996 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.486006 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.486023 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.486034 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:13Z","lastTransitionTime":"2026-03-19T09:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.587919 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.587934 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.587941 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.587950 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.587957 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:13Z","lastTransitionTime":"2026-03-19T09:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.652335 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.652405 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:13 crc kubenswrapper[4835]: E0319 09:24:13.652461 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:24:13 crc kubenswrapper[4835]: E0319 09:24:13.652605 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.690143 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.690195 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.690212 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.690232 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.690244 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:13Z","lastTransitionTime":"2026-03-19T09:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.794778 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.794890 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.794914 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.794942 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.794959 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:13Z","lastTransitionTime":"2026-03-19T09:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.898606 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.898674 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.898698 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.898724 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:13 crc kubenswrapper[4835]: I0319 09:24:13.898778 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:13Z","lastTransitionTime":"2026-03-19T09:24:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.007237 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.007894 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.008022 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.008133 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.008219 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:14Z","lastTransitionTime":"2026-03-19T09:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.113129 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.113208 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.113228 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.113258 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.113279 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:14Z","lastTransitionTime":"2026-03-19T09:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.216844 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.216922 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.216941 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.216971 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.216989 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:14Z","lastTransitionTime":"2026-03-19T09:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.319925 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.320001 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.320019 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.320044 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.320064 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:14Z","lastTransitionTime":"2026-03-19T09:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.401109 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:24:14 crc kubenswrapper[4835]: E0319 09:24:14.401324 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.423297 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.423363 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.423376 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.423396 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.423408 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:14Z","lastTransitionTime":"2026-03-19T09:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.527041 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.527111 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.527125 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.527152 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.527166 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:14Z","lastTransitionTime":"2026-03-19T09:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.630811 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.630873 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.630883 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.630903 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.630914 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:14Z","lastTransitionTime":"2026-03-19T09:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.734229 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.734283 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.734299 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.734320 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.734333 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:14Z","lastTransitionTime":"2026-03-19T09:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.837528 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.837572 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.837585 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.837604 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.837617 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:14Z","lastTransitionTime":"2026-03-19T09:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.941346 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.941426 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.941444 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.941470 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.941485 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:14Z","lastTransitionTime":"2026-03-19T09:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.954953 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.955002 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.955015 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.955034 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.955049 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:14Z","lastTransitionTime":"2026-03-19T09:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:14 crc kubenswrapper[4835]: E0319 09:24:14.968382 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.973893 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.973962 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.973986 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.974011 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.974025 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:14Z","lastTransitionTime":"2026-03-19T09:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:14 crc kubenswrapper[4835]: E0319 09:24:14.986540 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.991286 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.991358 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.991379 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.991408 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:14 crc kubenswrapper[4835]: I0319 09:24:14.991463 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:14Z","lastTransitionTime":"2026-03-19T09:24:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:15 crc kubenswrapper[4835]: E0319 09:24:15.004908 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.009557 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.009629 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.009655 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.009687 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.009707 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:15Z","lastTransitionTime":"2026-03-19T09:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:15 crc kubenswrapper[4835]: E0319 09:24:15.022659 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.027210 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.027255 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.027273 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.027295 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.027312 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:15Z","lastTransitionTime":"2026-03-19T09:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:15 crc kubenswrapper[4835]: E0319 09:24:15.041971 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:15 crc kubenswrapper[4835]: E0319 09:24:15.042205 4835 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.044853 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.044927 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.044952 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.044978 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.044999 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:15Z","lastTransitionTime":"2026-03-19T09:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.147001 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.147064 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.147083 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.147103 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.147117 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:15Z","lastTransitionTime":"2026-03-19T09:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.250806 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.250885 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.250906 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.250940 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.250960 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:15Z","lastTransitionTime":"2026-03-19T09:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.354191 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.354262 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.354282 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.354312 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.354333 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:15Z","lastTransitionTime":"2026-03-19T09:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.401539 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.401557 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:15 crc kubenswrapper[4835]: E0319 09:24:15.401878 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:24:15 crc kubenswrapper[4835]: E0319 09:24:15.402004 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:24:15 crc kubenswrapper[4835]: E0319 09:24:15.404572 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 19 09:24:15 crc kubenswrapper[4835]: E0319 09:24:15.405091 4835 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 09:24:15 crc kubenswrapper[4835]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 19 09:24:15 crc kubenswrapper[4835]: set -o allexport Mar 19 09:24:15 crc kubenswrapper[4835]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 19 09:24:15 crc kubenswrapper[4835]: source /etc/kubernetes/apiserver-url.env Mar 19 09:24:15 crc 
kubenswrapper[4835]: else Mar 19 09:24:15 crc kubenswrapper[4835]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 19 09:24:15 crc kubenswrapper[4835]: exit 1 Mar 19 09:24:15 crc kubenswrapper[4835]: fi Mar 19 09:24:15 crc kubenswrapper[4835]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 19 09:24:15 crc kubenswrapper[4835]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,V
alue:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f
5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 09:24:15 crc kubenswrapper[4835]: > logger="UnhandledError" Mar 19 09:24:15 crc kubenswrapper[4835]: E0319 09:24:15.407631 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, 
cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 19 09:24:15 crc kubenswrapper[4835]: E0319 09:24:15.407769 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.458041 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.458108 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.458170 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.458196 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.458214 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:15Z","lastTransitionTime":"2026-03-19T09:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.561192 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.561355 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.561383 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.561414 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.561437 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:15Z","lastTransitionTime":"2026-03-19T09:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.664914 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.664967 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.664986 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.665012 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.665030 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:15Z","lastTransitionTime":"2026-03-19T09:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.768008 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.768085 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.768109 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.768142 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.768167 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:15Z","lastTransitionTime":"2026-03-19T09:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.870944 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.870988 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.870998 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.871011 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.871019 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:15Z","lastTransitionTime":"2026-03-19T09:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.973266 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.973310 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.973321 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.973336 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:15 crc kubenswrapper[4835]: I0319 09:24:15.973346 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:15Z","lastTransitionTime":"2026-03-19T09:24:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.076314 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.076378 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.076395 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.076419 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.076437 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:16Z","lastTransitionTime":"2026-03-19T09:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.179031 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.179126 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.179170 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.179204 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.179230 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:16Z","lastTransitionTime":"2026-03-19T09:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.284827 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.284864 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.284873 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.284887 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.284895 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:16Z","lastTransitionTime":"2026-03-19T09:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.386664 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.386690 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.386698 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.386710 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.386718 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:16Z","lastTransitionTime":"2026-03-19T09:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.401479 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:24:16 crc kubenswrapper[4835]: E0319 09:24:16.401669 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.417582 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.432253 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.447127 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.475960 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.489557 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.489626 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.489648 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.489679 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.489701 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:16Z","lastTransitionTime":"2026-03-19T09:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.494649 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.516047 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.532363 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.547016 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.563918 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.587204 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.587375 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.587442 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.587493 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:16 crc kubenswrapper[4835]: E0319 09:24:16.587638 4835 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:24:32.587602719 +0000 UTC m=+127.436201336 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:24:16 crc kubenswrapper[4835]: E0319 09:24:16.587714 4835 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 09:24:16 crc kubenswrapper[4835]: E0319 09:24:16.587807 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 09:24:32.587793434 +0000 UTC m=+127.436392051 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.588352 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:16 crc kubenswrapper[4835]: E0319 09:24:16.588485 4835 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 09:24:16 crc kubenswrapper[4835]: E0319 09:24:16.588536 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 09:24:32.588519506 +0000 UTC m=+127.437118123 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 09:24:16 crc kubenswrapper[4835]: E0319 09:24:16.588614 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 09:24:16 crc kubenswrapper[4835]: E0319 09:24:16.588636 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 09:24:16 crc kubenswrapper[4835]: E0319 09:24:16.588652 4835 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:24:16 crc kubenswrapper[4835]: E0319 09:24:16.588692 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 09:24:32.58868015 +0000 UTC m=+127.437278767 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:24:16 crc kubenswrapper[4835]: E0319 09:24:16.588788 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 09:24:16 crc kubenswrapper[4835]: E0319 09:24:16.588806 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 09:24:16 crc kubenswrapper[4835]: E0319 09:24:16.588819 4835 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:24:16 crc kubenswrapper[4835]: E0319 09:24:16.588859 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 09:24:32.588845705 +0000 UTC m=+127.437444322 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.598708 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.598820 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.598846 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.598877 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.598901 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:16Z","lastTransitionTime":"2026-03-19T09:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.604379 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.635020 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.647819 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.661713 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.670621 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.702097 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.702168 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.702192 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.702219 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.702248 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:16Z","lastTransitionTime":"2026-03-19T09:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.805069 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.805118 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.805134 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.805156 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.805173 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:16Z","lastTransitionTime":"2026-03-19T09:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.907582 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.907622 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.907631 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.907646 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:16 crc kubenswrapper[4835]: I0319 09:24:16.907656 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:16Z","lastTransitionTime":"2026-03-19T09:24:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.011149 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.011210 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.011221 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.011244 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.011257 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:17Z","lastTransitionTime":"2026-03-19T09:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.114883 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.114916 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.114976 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.115006 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.115040 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:17Z","lastTransitionTime":"2026-03-19T09:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.221909 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.221972 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.221992 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.222024 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.222048 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:17Z","lastTransitionTime":"2026-03-19T09:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.325504 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.325590 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.325616 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.325649 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.325673 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:17Z","lastTransitionTime":"2026-03-19T09:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.401785 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:17 crc kubenswrapper[4835]: E0319 09:24:17.402047 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.402566 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:17 crc kubenswrapper[4835]: E0319 09:24:17.402657 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:24:17 crc kubenswrapper[4835]: E0319 09:24:17.405946 4835 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 09:24:17 crc kubenswrapper[4835]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 19 09:24:17 crc kubenswrapper[4835]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 19 09:24:17 crc kubenswrapper[4835]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dmtx5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:fal
se,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-jl5x4_openshift-multus(4ee35aaa-2819-432a-af95-f1078ad836fc): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 09:24:17 crc kubenswrapper[4835]: > logger="UnhandledError" Mar 19 09:24:17 crc kubenswrapper[4835]: E0319 09:24:17.407190 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-jl5x4" podUID="4ee35aaa-2819-432a-af95-f1078ad836fc" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.428924 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.428990 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.429003 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.429026 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.429042 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:17Z","lastTransitionTime":"2026-03-19T09:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.532296 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.532345 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.532356 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.532376 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.532387 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:17Z","lastTransitionTime":"2026-03-19T09:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.636335 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.636401 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.636415 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.636441 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.636458 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:17Z","lastTransitionTime":"2026-03-19T09:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.740182 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.740248 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.740261 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.740285 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.740299 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:17Z","lastTransitionTime":"2026-03-19T09:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.846599 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.846675 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.846688 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.846720 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.846762 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:17Z","lastTransitionTime":"2026-03-19T09:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.950688 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.950766 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.950779 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.950802 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:17 crc kubenswrapper[4835]: I0319 09:24:17.950822 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:17Z","lastTransitionTime":"2026-03-19T09:24:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.100618 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.100722 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.100774 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.100804 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.100819 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:18Z","lastTransitionTime":"2026-03-19T09:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.131969 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq"] Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.132850 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.136061 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.137447 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.147638 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.166635 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.179561 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.196321 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.204106 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.204160 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.204178 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.204197 4835 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.204210 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:18Z","lastTransitionTime":"2026-03-19T09:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.207687 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/417fb0b4-abf2-4fec-abfe-70a08c00f899-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-p4qwq\" (UID: \"417fb0b4-abf2-4fec-abfe-70a08c00f899\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.207728 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/417fb0b4-abf2-4fec-abfe-70a08c00f899-env-overrides\") pod \"ovnkube-control-plane-749d76644c-p4qwq\" (UID: \"417fb0b4-abf2-4fec-abfe-70a08c00f899\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.207833 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/417fb0b4-abf2-4fec-abfe-70a08c00f899-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-p4qwq\" (UID: \"417fb0b4-abf2-4fec-abfe-70a08c00f899\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.207869 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfkvh\" (UniqueName: \"kubernetes.io/projected/417fb0b4-abf2-4fec-abfe-70a08c00f899-kube-api-access-tfkvh\") pod \"ovnkube-control-plane-749d76644c-p4qwq\" (UID: \"417fb0b4-abf2-4fec-abfe-70a08c00f899\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.212086 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.227922 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.243957 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.263773 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.277491 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.288863 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.299390 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.308351 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfkvh\" (UniqueName: \"kubernetes.io/projected/417fb0b4-abf2-4fec-abfe-70a08c00f899-kube-api-access-tfkvh\") pod \"ovnkube-control-plane-749d76644c-p4qwq\" (UID: \"417fb0b4-abf2-4fec-abfe-70a08c00f899\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.308427 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.308450 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/417fb0b4-abf2-4fec-abfe-70a08c00f899-ovnkube-config\") pod 
\"ovnkube-control-plane-749d76644c-p4qwq\" (UID: \"417fb0b4-abf2-4fec-abfe-70a08c00f899\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.308507 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/417fb0b4-abf2-4fec-abfe-70a08c00f899-env-overrides\") pod \"ovnkube-control-plane-749d76644c-p4qwq\" (UID: \"417fb0b4-abf2-4fec-abfe-70a08c00f899\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.308586 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/417fb0b4-abf2-4fec-abfe-70a08c00f899-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-p4qwq\" (UID: \"417fb0b4-abf2-4fec-abfe-70a08c00f899\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.308460 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.308659 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.308693 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.308717 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:18Z","lastTransitionTime":"2026-03-19T09:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.309236 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.309627 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/417fb0b4-abf2-4fec-abfe-70a08c00f899-env-overrides\") pod 
\"ovnkube-control-plane-749d76644c-p4qwq\" (UID: \"417fb0b4-abf2-4fec-abfe-70a08c00f899\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.309936 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/417fb0b4-abf2-4fec-abfe-70a08c00f899-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-p4qwq\" (UID: \"417fb0b4-abf2-4fec-abfe-70a08c00f899\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.315231 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/417fb0b4-abf2-4fec-abfe-70a08c00f899-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-p4qwq\" (UID: \"417fb0b4-abf2-4fec-abfe-70a08c00f899\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.324231 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.327901 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfkvh\" (UniqueName: \"kubernetes.io/projected/417fb0b4-abf2-4fec-abfe-70a08c00f899-kube-api-access-tfkvh\") pod \"ovnkube-control-plane-749d76644c-p4qwq\" (UID: \"417fb0b4-abf2-4fec-abfe-70a08c00f899\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.343314 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.352335 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.401992 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:24:18 crc kubenswrapper[4835]: E0319 09:24:18.402938 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:24:18 crc kubenswrapper[4835]: E0319 09:24:18.404059 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c8s9t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 19 09:24:18 crc kubenswrapper[4835]: E0319 09:24:18.406897 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt 
--tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c8s9t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 19 09:24:18 crc kubenswrapper[4835]: E0319 09:24:18.408205 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" 
podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.412054 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.412086 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.412098 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.412115 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.412125 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:18Z","lastTransitionTime":"2026-03-19T09:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.452086 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" Mar 19 09:24:18 crc kubenswrapper[4835]: W0319 09:24:18.468820 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod417fb0b4_abf2_4fec_abfe_70a08c00f899.slice/crio-4870ea3e36d30d7ed8155bcf5e034bfc2e3e1ffd684ac038356a3fee16a3eae6 WatchSource:0}: Error finding container 4870ea3e36d30d7ed8155bcf5e034bfc2e3e1ffd684ac038356a3fee16a3eae6: Status 404 returned error can't find the container with id 4870ea3e36d30d7ed8155bcf5e034bfc2e3e1ffd684ac038356a3fee16a3eae6 Mar 19 09:24:18 crc kubenswrapper[4835]: E0319 09:24:18.472102 4835 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 09:24:18 crc kubenswrapper[4835]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 19 09:24:18 crc kubenswrapper[4835]: set -euo pipefail Mar 19 09:24:18 crc kubenswrapper[4835]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 19 09:24:18 crc kubenswrapper[4835]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 19 09:24:18 crc kubenswrapper[4835]: # As the secret mount is optional we must wait for the files to be present. Mar 19 09:24:18 crc kubenswrapper[4835]: # The service is created in monitor.yaml and this is created in sdn.yaml. 
Mar 19 09:24:18 crc kubenswrapper[4835]: TS=$(date +%s) Mar 19 09:24:18 crc kubenswrapper[4835]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 19 09:24:18 crc kubenswrapper[4835]: HAS_LOGGED_INFO=0 Mar 19 09:24:18 crc kubenswrapper[4835]: Mar 19 09:24:18 crc kubenswrapper[4835]: log_missing_certs(){ Mar 19 09:24:18 crc kubenswrapper[4835]: CUR_TS=$(date +%s) Mar 19 09:24:18 crc kubenswrapper[4835]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 19 09:24:18 crc kubenswrapper[4835]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 19 09:24:18 crc kubenswrapper[4835]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 19 09:24:18 crc kubenswrapper[4835]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 19 09:24:18 crc kubenswrapper[4835]: HAS_LOGGED_INFO=1 Mar 19 09:24:18 crc kubenswrapper[4835]: fi Mar 19 09:24:18 crc kubenswrapper[4835]: } Mar 19 09:24:18 crc kubenswrapper[4835]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 19 09:24:18 crc kubenswrapper[4835]: log_missing_certs Mar 19 09:24:18 crc kubenswrapper[4835]: sleep 5 Mar 19 09:24:18 crc kubenswrapper[4835]: done Mar 19 09:24:18 crc kubenswrapper[4835]: Mar 19 09:24:18 crc kubenswrapper[4835]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 19 09:24:18 crc kubenswrapper[4835]: exec /usr/bin/kube-rbac-proxy \ Mar 19 09:24:18 crc kubenswrapper[4835]: --logtostderr \ Mar 19 09:24:18 crc kubenswrapper[4835]: --secure-listen-address=:9108 \ Mar 19 09:24:18 crc kubenswrapper[4835]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 19 09:24:18 crc kubenswrapper[4835]: --upstream=http://127.0.0.1:29108/ \ Mar 19 09:24:18 crc kubenswrapper[4835]: --tls-private-key-file=${TLS_PK} \ Mar 19 09:24:18 crc kubenswrapper[4835]: --tls-cert-file=${TLS_CERT} Mar 19 09:24:18 crc kubenswrapper[4835]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tfkvh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-p4qwq_openshift-ovn-kubernetes(417fb0b4-abf2-4fec-abfe-70a08c00f899): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 09:24:18 crc kubenswrapper[4835]: > logger="UnhandledError" Mar 19 09:24:18 crc kubenswrapper[4835]: E0319 09:24:18.475550 4835 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 09:24:18 crc kubenswrapper[4835]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 19 09:24:18 crc kubenswrapper[4835]: if [[ -f "/env/_master" ]]; then Mar 19 09:24:18 crc kubenswrapper[4835]: set -o allexport Mar 19 09:24:18 crc kubenswrapper[4835]: source "/env/_master" Mar 19 09:24:18 crc kubenswrapper[4835]: set +o allexport Mar 19 09:24:18 crc kubenswrapper[4835]: fi Mar 19 09:24:18 crc kubenswrapper[4835]: Mar 19 09:24:18 crc kubenswrapper[4835]: ovn_v4_join_subnet_opt= Mar 19 09:24:18 crc kubenswrapper[4835]: if [[ "" != "" ]]; then Mar 19 09:24:18 crc kubenswrapper[4835]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 19 
09:24:18 crc kubenswrapper[4835]: fi Mar 19 09:24:18 crc kubenswrapper[4835]: ovn_v6_join_subnet_opt= Mar 19 09:24:18 crc kubenswrapper[4835]: if [[ "" != "" ]]; then Mar 19 09:24:18 crc kubenswrapper[4835]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 19 09:24:18 crc kubenswrapper[4835]: fi Mar 19 09:24:18 crc kubenswrapper[4835]: Mar 19 09:24:18 crc kubenswrapper[4835]: ovn_v4_transit_switch_subnet_opt= Mar 19 09:24:18 crc kubenswrapper[4835]: if [[ "" != "" ]]; then Mar 19 09:24:18 crc kubenswrapper[4835]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 19 09:24:18 crc kubenswrapper[4835]: fi Mar 19 09:24:18 crc kubenswrapper[4835]: ovn_v6_transit_switch_subnet_opt= Mar 19 09:24:18 crc kubenswrapper[4835]: if [[ "" != "" ]]; then Mar 19 09:24:18 crc kubenswrapper[4835]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 19 09:24:18 crc kubenswrapper[4835]: fi Mar 19 09:24:18 crc kubenswrapper[4835]: Mar 19 09:24:18 crc kubenswrapper[4835]: dns_name_resolver_enabled_flag= Mar 19 09:24:18 crc kubenswrapper[4835]: if [[ "false" == "true" ]]; then Mar 19 09:24:18 crc kubenswrapper[4835]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 19 09:24:18 crc kubenswrapper[4835]: fi Mar 19 09:24:18 crc kubenswrapper[4835]: Mar 19 09:24:18 crc kubenswrapper[4835]: persistent_ips_enabled_flag= Mar 19 09:24:18 crc kubenswrapper[4835]: if [[ "true" == "true" ]]; then Mar 19 09:24:18 crc kubenswrapper[4835]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 19 09:24:18 crc kubenswrapper[4835]: fi Mar 19 09:24:18 crc kubenswrapper[4835]: Mar 19 09:24:18 crc kubenswrapper[4835]: # This is needed so that converting clusters from GA to TP Mar 19 09:24:18 crc kubenswrapper[4835]: # will rollout control plane pods as well Mar 19 09:24:18 crc kubenswrapper[4835]: network_segmentation_enabled_flag= Mar 19 09:24:18 crc kubenswrapper[4835]: multi_network_enabled_flag= Mar 19 09:24:18 crc 
kubenswrapper[4835]: if [[ "true" == "true" ]]; then Mar 19 09:24:18 crc kubenswrapper[4835]: multi_network_enabled_flag="--enable-multi-network" Mar 19 09:24:18 crc kubenswrapper[4835]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 19 09:24:18 crc kubenswrapper[4835]: fi Mar 19 09:24:18 crc kubenswrapper[4835]: Mar 19 09:24:18 crc kubenswrapper[4835]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 19 09:24:18 crc kubenswrapper[4835]: exec /usr/bin/ovnkube \ Mar 19 09:24:18 crc kubenswrapper[4835]: --enable-interconnect \ Mar 19 09:24:18 crc kubenswrapper[4835]: --init-cluster-manager "${K8S_NODE}" \ Mar 19 09:24:18 crc kubenswrapper[4835]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 19 09:24:18 crc kubenswrapper[4835]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 19 09:24:18 crc kubenswrapper[4835]: --metrics-bind-address "127.0.0.1:29108" \ Mar 19 09:24:18 crc kubenswrapper[4835]: --metrics-enable-pprof \ Mar 19 09:24:18 crc kubenswrapper[4835]: --metrics-enable-config-duration \ Mar 19 09:24:18 crc kubenswrapper[4835]: ${ovn_v4_join_subnet_opt} \ Mar 19 09:24:18 crc kubenswrapper[4835]: ${ovn_v6_join_subnet_opt} \ Mar 19 09:24:18 crc kubenswrapper[4835]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 19 09:24:18 crc kubenswrapper[4835]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 19 09:24:18 crc kubenswrapper[4835]: ${dns_name_resolver_enabled_flag} \ Mar 19 09:24:18 crc kubenswrapper[4835]: ${persistent_ips_enabled_flag} \ Mar 19 09:24:18 crc kubenswrapper[4835]: ${multi_network_enabled_flag} \ Mar 19 09:24:18 crc kubenswrapper[4835]: ${network_segmentation_enabled_flag} Mar 19 09:24:18 crc kubenswrapper[4835]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tfkvh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-p4qwq_openshift-ovn-kubernetes(417fb0b4-abf2-4fec-abfe-70a08c00f899): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 09:24:18 crc kubenswrapper[4835]: > logger="UnhandledError" Mar 19 09:24:18 crc kubenswrapper[4835]: E0319 09:24:18.478036 4835 pod_workers.go:1301] "Error syncing pod, 
skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" podUID="417fb0b4-abf2-4fec-abfe-70a08c00f899" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.514336 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.514387 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.514401 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.514420 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.514433 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:18Z","lastTransitionTime":"2026-03-19T09:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.617609 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.617677 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.617700 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.617728 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.617784 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:18Z","lastTransitionTime":"2026-03-19T09:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.720687 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.720790 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.720811 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.720835 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.720854 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:18Z","lastTransitionTime":"2026-03-19T09:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.824024 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.824087 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.824104 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.824130 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.824148 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:18Z","lastTransitionTime":"2026-03-19T09:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.872388 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-vs6hx"] Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.873534 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:24:18 crc kubenswrapper[4835]: E0319 09:24:18.873731 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.893575 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.909108 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.917520 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr99b\" (UniqueName: \"kubernetes.io/projected/7f0101ce-52a3-4e5b-8fcd-c19020fb071a-kube-api-access-lr99b\") pod \"network-metrics-daemon-vs6hx\" (UID: \"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\") " pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.917639 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7f0101ce-52a3-4e5b-8fcd-c19020fb071a-metrics-certs\") pod \"network-metrics-daemon-vs6hx\" (UID: \"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\") " pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.924863 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.926942 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.927006 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.927032 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.927060 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.927080 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:18Z","lastTransitionTime":"2026-03-19T09:24:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.955993 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.968298 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.981151 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:18 crc kubenswrapper[4835]: I0319 09:24:18.993594 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.006201 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.019141 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr99b\" (UniqueName: \"kubernetes.io/projected/7f0101ce-52a3-4e5b-8fcd-c19020fb071a-kube-api-access-lr99b\") pod \"network-metrics-daemon-vs6hx\" (UID: \"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\") " pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.019231 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7f0101ce-52a3-4e5b-8fcd-c19020fb071a-metrics-certs\") pod \"network-metrics-daemon-vs6hx\" (UID: \"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\") " pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:24:19 crc kubenswrapper[4835]: E0319 09:24:19.019395 4835 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 09:24:19 crc kubenswrapper[4835]: E0319 09:24:19.019462 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f0101ce-52a3-4e5b-8fcd-c19020fb071a-metrics-certs podName:7f0101ce-52a3-4e5b-8fcd-c19020fb071a nodeName:}" failed. No retries permitted until 2026-03-19 09:24:19.519439035 +0000 UTC m=+114.368037662 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7f0101ce-52a3-4e5b-8fcd-c19020fb071a-metrics-certs") pod "network-metrics-daemon-vs6hx" (UID: "7f0101ce-52a3-4e5b-8fcd-c19020fb071a") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.030514 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.030580 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.030600 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.030627 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.030645 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:19Z","lastTransitionTime":"2026-03-19T09:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.033071 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.045062 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.053347 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr99b\" (UniqueName: \"kubernetes.io/projected/7f0101ce-52a3-4e5b-8fcd-c19020fb071a-kube-api-access-lr99b\") pod \"network-metrics-daemon-vs6hx\" (UID: \"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\") " pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.061888 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.075637 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.089481 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.102591 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.119534 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.129611 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.132893 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.133045 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.133112 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.133177 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.133240 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:19Z","lastTransitionTime":"2026-03-19T09:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.154571 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" event={"ID":"417fb0b4-abf2-4fec-abfe-70a08c00f899","Type":"ContainerStarted","Data":"4870ea3e36d30d7ed8155bcf5e034bfc2e3e1ffd684ac038356a3fee16a3eae6"} Mar 19 09:24:19 crc kubenswrapper[4835]: E0319 09:24:19.157283 4835 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 09:24:19 crc kubenswrapper[4835]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 19 09:24:19 crc kubenswrapper[4835]: set -euo pipefail Mar 19 09:24:19 crc kubenswrapper[4835]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 19 09:24:19 crc kubenswrapper[4835]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 19 09:24:19 crc kubenswrapper[4835]: # As the secret mount is optional we must wait for the files to be present. 
Mar 19 09:24:19 crc kubenswrapper[4835]: # The service is created in monitor.yaml and this is created in sdn.yaml. Mar 19 09:24:19 crc kubenswrapper[4835]: TS=$(date +%s) Mar 19 09:24:19 crc kubenswrapper[4835]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 19 09:24:19 crc kubenswrapper[4835]: HAS_LOGGED_INFO=0 Mar 19 09:24:19 crc kubenswrapper[4835]: Mar 19 09:24:19 crc kubenswrapper[4835]: log_missing_certs(){ Mar 19 09:24:19 crc kubenswrapper[4835]: CUR_TS=$(date +%s) Mar 19 09:24:19 crc kubenswrapper[4835]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 19 09:24:19 crc kubenswrapper[4835]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 19 09:24:19 crc kubenswrapper[4835]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 19 09:24:19 crc kubenswrapper[4835]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 19 09:24:19 crc kubenswrapper[4835]: HAS_LOGGED_INFO=1 Mar 19 09:24:19 crc kubenswrapper[4835]: fi Mar 19 09:24:19 crc kubenswrapper[4835]: } Mar 19 09:24:19 crc kubenswrapper[4835]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 19 09:24:19 crc kubenswrapper[4835]: log_missing_certs Mar 19 09:24:19 crc kubenswrapper[4835]: sleep 5 Mar 19 09:24:19 crc kubenswrapper[4835]: done Mar 19 09:24:19 crc kubenswrapper[4835]: Mar 19 09:24:19 crc kubenswrapper[4835]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 19 09:24:19 crc kubenswrapper[4835]: exec /usr/bin/kube-rbac-proxy \ Mar 19 09:24:19 crc kubenswrapper[4835]: --logtostderr \ Mar 19 09:24:19 crc kubenswrapper[4835]: --secure-listen-address=:9108 \ Mar 19 09:24:19 crc kubenswrapper[4835]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 19 09:24:19 crc kubenswrapper[4835]: --upstream=http://127.0.0.1:29108/ \ Mar 19 09:24:19 crc kubenswrapper[4835]: --tls-private-key-file=${TLS_PK} \ Mar 19 09:24:19 crc kubenswrapper[4835]: --tls-cert-file=${TLS_CERT} Mar 19 09:24:19 crc kubenswrapper[4835]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tfkvh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-p4qwq_openshift-ovn-kubernetes(417fb0b4-abf2-4fec-abfe-70a08c00f899): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 09:24:19 crc kubenswrapper[4835]: > logger="UnhandledError" Mar 19 09:24:19 crc kubenswrapper[4835]: E0319 09:24:19.161398 4835 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 09:24:19 crc kubenswrapper[4835]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 19 09:24:19 crc kubenswrapper[4835]: if [[ -f "/env/_master" ]]; then Mar 19 09:24:19 crc kubenswrapper[4835]: set -o allexport Mar 19 09:24:19 crc kubenswrapper[4835]: source "/env/_master" Mar 19 09:24:19 crc kubenswrapper[4835]: set +o allexport Mar 19 09:24:19 crc kubenswrapper[4835]: fi Mar 19 09:24:19 crc kubenswrapper[4835]: Mar 19 09:24:19 crc kubenswrapper[4835]: ovn_v4_join_subnet_opt= Mar 19 09:24:19 crc kubenswrapper[4835]: if [[ "" != "" ]]; then Mar 19 09:24:19 crc kubenswrapper[4835]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 19 
09:24:19 crc kubenswrapper[4835]: fi Mar 19 09:24:19 crc kubenswrapper[4835]: ovn_v6_join_subnet_opt= Mar 19 09:24:19 crc kubenswrapper[4835]: if [[ "" != "" ]]; then Mar 19 09:24:19 crc kubenswrapper[4835]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 19 09:24:19 crc kubenswrapper[4835]: fi Mar 19 09:24:19 crc kubenswrapper[4835]: Mar 19 09:24:19 crc kubenswrapper[4835]: ovn_v4_transit_switch_subnet_opt= Mar 19 09:24:19 crc kubenswrapper[4835]: if [[ "" != "" ]]; then Mar 19 09:24:19 crc kubenswrapper[4835]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 19 09:24:19 crc kubenswrapper[4835]: fi Mar 19 09:24:19 crc kubenswrapper[4835]: ovn_v6_transit_switch_subnet_opt= Mar 19 09:24:19 crc kubenswrapper[4835]: if [[ "" != "" ]]; then Mar 19 09:24:19 crc kubenswrapper[4835]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 19 09:24:19 crc kubenswrapper[4835]: fi Mar 19 09:24:19 crc kubenswrapper[4835]: Mar 19 09:24:19 crc kubenswrapper[4835]: dns_name_resolver_enabled_flag= Mar 19 09:24:19 crc kubenswrapper[4835]: if [[ "false" == "true" ]]; then Mar 19 09:24:19 crc kubenswrapper[4835]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 19 09:24:19 crc kubenswrapper[4835]: fi Mar 19 09:24:19 crc kubenswrapper[4835]: Mar 19 09:24:19 crc kubenswrapper[4835]: persistent_ips_enabled_flag= Mar 19 09:24:19 crc kubenswrapper[4835]: if [[ "true" == "true" ]]; then Mar 19 09:24:19 crc kubenswrapper[4835]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 19 09:24:19 crc kubenswrapper[4835]: fi Mar 19 09:24:19 crc kubenswrapper[4835]: Mar 19 09:24:19 crc kubenswrapper[4835]: # This is needed so that converting clusters from GA to TP Mar 19 09:24:19 crc kubenswrapper[4835]: # will rollout control plane pods as well Mar 19 09:24:19 crc kubenswrapper[4835]: network_segmentation_enabled_flag= Mar 19 09:24:19 crc kubenswrapper[4835]: multi_network_enabled_flag= Mar 19 09:24:19 crc 
kubenswrapper[4835]: if [[ "true" == "true" ]]; then Mar 19 09:24:19 crc kubenswrapper[4835]: multi_network_enabled_flag="--enable-multi-network" Mar 19 09:24:19 crc kubenswrapper[4835]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 19 09:24:19 crc kubenswrapper[4835]: fi Mar 19 09:24:19 crc kubenswrapper[4835]: Mar 19 09:24:19 crc kubenswrapper[4835]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 19 09:24:19 crc kubenswrapper[4835]: exec /usr/bin/ovnkube \ Mar 19 09:24:19 crc kubenswrapper[4835]: --enable-interconnect \ Mar 19 09:24:19 crc kubenswrapper[4835]: --init-cluster-manager "${K8S_NODE}" \ Mar 19 09:24:19 crc kubenswrapper[4835]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 19 09:24:19 crc kubenswrapper[4835]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 19 09:24:19 crc kubenswrapper[4835]: --metrics-bind-address "127.0.0.1:29108" \ Mar 19 09:24:19 crc kubenswrapper[4835]: --metrics-enable-pprof \ Mar 19 09:24:19 crc kubenswrapper[4835]: --metrics-enable-config-duration \ Mar 19 09:24:19 crc kubenswrapper[4835]: ${ovn_v4_join_subnet_opt} \ Mar 19 09:24:19 crc kubenswrapper[4835]: ${ovn_v6_join_subnet_opt} \ Mar 19 09:24:19 crc kubenswrapper[4835]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 19 09:24:19 crc kubenswrapper[4835]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 19 09:24:19 crc kubenswrapper[4835]: ${dns_name_resolver_enabled_flag} \ Mar 19 09:24:19 crc kubenswrapper[4835]: ${persistent_ips_enabled_flag} \ Mar 19 09:24:19 crc kubenswrapper[4835]: ${multi_network_enabled_flag} \ Mar 19 09:24:19 crc kubenswrapper[4835]: ${network_segmentation_enabled_flag} Mar 19 09:24:19 crc kubenswrapper[4835]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tfkvh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-p4qwq_openshift-ovn-kubernetes(417fb0b4-abf2-4fec-abfe-70a08c00f899): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 09:24:19 crc kubenswrapper[4835]: > logger="UnhandledError" Mar 19 09:24:19 crc kubenswrapper[4835]: E0319 09:24:19.162612 4835 pod_workers.go:1301] "Error syncing pod, 
skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" podUID="417fb0b4-abf2-4fec-abfe-70a08c00f899" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.168942 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.178844 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.187479 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.206930 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.219174 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.233793 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.238903 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.238951 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.238967 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.238990 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.239003 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:19Z","lastTransitionTime":"2026-03-19T09:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.244627 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.270076 4835 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.282372 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.296260 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.308612 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.317527 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.325586 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.331184 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.340163 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.341698 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.341902 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.342083 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.342128 4835 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.342156 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:19Z","lastTransitionTime":"2026-03-19T09:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.347511 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.401857 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.401882 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:19 crc kubenswrapper[4835]: E0319 09:24:19.402015 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:24:19 crc kubenswrapper[4835]: E0319 09:24:19.402202 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.446088 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.446152 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.446170 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.446197 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.446219 4835 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:19Z","lastTransitionTime":"2026-03-19T09:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.525473 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7f0101ce-52a3-4e5b-8fcd-c19020fb071a-metrics-certs\") pod \"network-metrics-daemon-vs6hx\" (UID: \"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\") " pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:24:19 crc kubenswrapper[4835]: E0319 09:24:19.525676 4835 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 09:24:19 crc kubenswrapper[4835]: E0319 09:24:19.525781 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f0101ce-52a3-4e5b-8fcd-c19020fb071a-metrics-certs podName:7f0101ce-52a3-4e5b-8fcd-c19020fb071a nodeName:}" failed. No retries permitted until 2026-03-19 09:24:20.525735151 +0000 UTC m=+115.374333728 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7f0101ce-52a3-4e5b-8fcd-c19020fb071a-metrics-certs") pod "network-metrics-daemon-vs6hx" (UID: "7f0101ce-52a3-4e5b-8fcd-c19020fb071a") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.550064 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.550129 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.550141 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.550174 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.550195 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:19Z","lastTransitionTime":"2026-03-19T09:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.656211 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.656262 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.656276 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.656301 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.656317 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:19Z","lastTransitionTime":"2026-03-19T09:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.761815 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.761880 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.761904 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.761935 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.761958 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:19Z","lastTransitionTime":"2026-03-19T09:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.865593 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.865652 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.865664 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.865687 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.865698 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:19Z","lastTransitionTime":"2026-03-19T09:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.968802 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.968974 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.968999 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.969029 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:19 crc kubenswrapper[4835]: I0319 09:24:19.969049 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:19Z","lastTransitionTime":"2026-03-19T09:24:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.072308 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.072352 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.072364 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.072380 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.072389 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:20Z","lastTransitionTime":"2026-03-19T09:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.175594 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.175649 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.175665 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.175687 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.175704 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:20Z","lastTransitionTime":"2026-03-19T09:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.278391 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.278455 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.278480 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.278505 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.278521 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:20Z","lastTransitionTime":"2026-03-19T09:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.382571 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.382636 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.382883 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.382917 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.382936 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:20Z","lastTransitionTime":"2026-03-19T09:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.401007 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.401079 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:24:20 crc kubenswrapper[4835]: E0319 09:24:20.401199 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:24:20 crc kubenswrapper[4835]: E0319 09:24:20.401567 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:24:20 crc kubenswrapper[4835]: E0319 09:24:20.403869 4835 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 09:24:20 crc kubenswrapper[4835]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 19 09:24:20 crc kubenswrapper[4835]: set -uo pipefail Mar 19 09:24:20 crc kubenswrapper[4835]: Mar 19 09:24:20 crc kubenswrapper[4835]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 19 09:24:20 crc kubenswrapper[4835]: Mar 19 09:24:20 crc kubenswrapper[4835]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 19 09:24:20 crc kubenswrapper[4835]: HOSTS_FILE="/etc/hosts" Mar 19 09:24:20 crc kubenswrapper[4835]: TEMP_FILE="/etc/hosts.tmp" Mar 19 09:24:20 crc kubenswrapper[4835]: Mar 19 09:24:20 crc kubenswrapper[4835]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 19 09:24:20 crc kubenswrapper[4835]: Mar 19 09:24:20 crc kubenswrapper[4835]: # Make a temporary file with the old hosts file's attributes. Mar 19 09:24:20 crc kubenswrapper[4835]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 19 09:24:20 crc kubenswrapper[4835]: echo "Failed to preserve hosts file. Exiting." 
Mar 19 09:24:20 crc kubenswrapper[4835]: exit 1 Mar 19 09:24:20 crc kubenswrapper[4835]: fi Mar 19 09:24:20 crc kubenswrapper[4835]: Mar 19 09:24:20 crc kubenswrapper[4835]: while true; do Mar 19 09:24:20 crc kubenswrapper[4835]: declare -A svc_ips Mar 19 09:24:20 crc kubenswrapper[4835]: for svc in "${services[@]}"; do Mar 19 09:24:20 crc kubenswrapper[4835]: # Fetch service IP from cluster dns if present. We make several tries Mar 19 09:24:20 crc kubenswrapper[4835]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 19 09:24:20 crc kubenswrapper[4835]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 19 09:24:20 crc kubenswrapper[4835]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 19 09:24:20 crc kubenswrapper[4835]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 19 09:24:20 crc kubenswrapper[4835]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 19 09:24:20 crc kubenswrapper[4835]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 19 09:24:20 crc kubenswrapper[4835]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 19 09:24:20 crc kubenswrapper[4835]: for i in ${!cmds[*]} Mar 19 09:24:20 crc kubenswrapper[4835]: do Mar 19 09:24:20 crc kubenswrapper[4835]: ips=($(eval "${cmds[i]}")) Mar 19 09:24:20 crc kubenswrapper[4835]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 19 09:24:20 crc kubenswrapper[4835]: svc_ips["${svc}"]="${ips[@]}" Mar 19 09:24:20 crc kubenswrapper[4835]: break Mar 19 09:24:20 crc kubenswrapper[4835]: fi Mar 19 09:24:20 crc kubenswrapper[4835]: done Mar 19 09:24:20 crc kubenswrapper[4835]: done Mar 19 09:24:20 crc kubenswrapper[4835]: Mar 19 09:24:20 crc kubenswrapper[4835]: # Update /etc/hosts only if we get valid service IPs Mar 19 09:24:20 crc kubenswrapper[4835]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 19 09:24:20 crc kubenswrapper[4835]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 19 09:24:20 crc kubenswrapper[4835]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 19 09:24:20 crc kubenswrapper[4835]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 19 09:24:20 crc kubenswrapper[4835]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 19 09:24:20 crc kubenswrapper[4835]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 19 09:24:20 crc kubenswrapper[4835]: sleep 60 & wait Mar 19 09:24:20 crc kubenswrapper[4835]: continue Mar 19 09:24:20 crc kubenswrapper[4835]: fi Mar 19 09:24:20 crc kubenswrapper[4835]: Mar 19 09:24:20 crc kubenswrapper[4835]: # Append resolver entries for services Mar 19 09:24:20 crc kubenswrapper[4835]: rc=0 Mar 19 09:24:20 crc kubenswrapper[4835]: for svc in "${!svc_ips[@]}"; do Mar 19 09:24:20 crc kubenswrapper[4835]: for ip in ${svc_ips[${svc}]}; do Mar 19 09:24:20 crc kubenswrapper[4835]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Mar 19 09:24:20 crc kubenswrapper[4835]: done Mar 19 09:24:20 crc kubenswrapper[4835]: done Mar 19 09:24:20 crc kubenswrapper[4835]: if [[ $rc -ne 0 ]]; then Mar 19 09:24:20 crc kubenswrapper[4835]: sleep 60 & wait Mar 19 09:24:20 crc kubenswrapper[4835]: continue Mar 19 09:24:20 crc kubenswrapper[4835]: fi Mar 19 09:24:20 crc kubenswrapper[4835]: Mar 19 09:24:20 crc kubenswrapper[4835]: Mar 19 09:24:20 crc kubenswrapper[4835]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 19 09:24:20 crc kubenswrapper[4835]: # Replace /etc/hosts with our modified version if needed Mar 19 09:24:20 crc kubenswrapper[4835]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 19 09:24:20 crc kubenswrapper[4835]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 19 09:24:20 crc kubenswrapper[4835]: fi Mar 19 09:24:20 crc kubenswrapper[4835]: sleep 60 & wait Mar 19 09:24:20 crc kubenswrapper[4835]: unset svc_ips Mar 19 09:24:20 crc kubenswrapper[4835]: done Mar 19 09:24:20 crc kubenswrapper[4835]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-797kf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-fg29g_openshift-dns(4d251d68-4fd1-4d04-b960-260b36d78f3f): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 09:24:20 crc kubenswrapper[4835]: > logger="UnhandledError" Mar 19 09:24:20 crc kubenswrapper[4835]: E0319 09:24:20.405121 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-fg29g" podUID="4d251d68-4fd1-4d04-b960-260b36d78f3f" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.485569 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.485646 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:20 crc 
kubenswrapper[4835]: I0319 09:24:20.485669 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.485701 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.485725 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:20Z","lastTransitionTime":"2026-03-19T09:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.538393 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7f0101ce-52a3-4e5b-8fcd-c19020fb071a-metrics-certs\") pod \"network-metrics-daemon-vs6hx\" (UID: \"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\") " pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:24:20 crc kubenswrapper[4835]: E0319 09:24:20.538594 4835 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 09:24:20 crc kubenswrapper[4835]: E0319 09:24:20.538666 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f0101ce-52a3-4e5b-8fcd-c19020fb071a-metrics-certs podName:7f0101ce-52a3-4e5b-8fcd-c19020fb071a nodeName:}" failed. No retries permitted until 2026-03-19 09:24:22.538645703 +0000 UTC m=+117.387244320 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7f0101ce-52a3-4e5b-8fcd-c19020fb071a-metrics-certs") pod "network-metrics-daemon-vs6hx" (UID: "7f0101ce-52a3-4e5b-8fcd-c19020fb071a") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.589902 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.589976 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.589999 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.590034 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.590057 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:20Z","lastTransitionTime":"2026-03-19T09:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.693917 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.694234 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.694432 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.694608 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.694816 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:20Z","lastTransitionTime":"2026-03-19T09:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.797557 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.797992 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.798198 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.798397 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.798587 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:20Z","lastTransitionTime":"2026-03-19T09:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.902179 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.902271 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.902304 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.902333 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:20 crc kubenswrapper[4835]: I0319 09:24:20.902351 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:20Z","lastTransitionTime":"2026-03-19T09:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.005021 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.005725 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.005948 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.006133 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.006275 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:21Z","lastTransitionTime":"2026-03-19T09:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.110435 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.110912 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.111082 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.111376 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.111598 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:21Z","lastTransitionTime":"2026-03-19T09:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.220934 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.221613 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.221652 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.221689 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.221729 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:21Z","lastTransitionTime":"2026-03-19T09:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.325058 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.325108 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.325126 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.325156 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.325173 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:21Z","lastTransitionTime":"2026-03-19T09:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.401876 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.401872 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:21 crc kubenswrapper[4835]: E0319 09:24:21.402536 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:24:21 crc kubenswrapper[4835]: E0319 09:24:21.402667 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:24:21 crc kubenswrapper[4835]: E0319 09:24:21.405502 4835 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 09:24:21 crc kubenswrapper[4835]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 19 09:24:21 crc kubenswrapper[4835]: apiVersion: v1 Mar 19 09:24:21 crc kubenswrapper[4835]: clusters: Mar 19 09:24:21 crc kubenswrapper[4835]: - cluster: Mar 19 09:24:21 crc kubenswrapper[4835]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 19 09:24:21 crc kubenswrapper[4835]: server: https://api-int.crc.testing:6443 Mar 19 09:24:21 crc kubenswrapper[4835]: name: default-cluster Mar 19 09:24:21 crc kubenswrapper[4835]: contexts: Mar 19 09:24:21 crc kubenswrapper[4835]: - context: Mar 19 09:24:21 crc kubenswrapper[4835]: cluster: default-cluster Mar 19 09:24:21 crc kubenswrapper[4835]: namespace: default Mar 19 09:24:21 crc kubenswrapper[4835]: user: default-auth Mar 19 09:24:21 crc kubenswrapper[4835]: name: default-context Mar 19 09:24:21 crc kubenswrapper[4835]: current-context: default-context Mar 19 09:24:21 crc kubenswrapper[4835]: kind: Config Mar 19 09:24:21 crc kubenswrapper[4835]: preferences: {} Mar 19 09:24:21 
crc kubenswrapper[4835]: users: Mar 19 09:24:21 crc kubenswrapper[4835]: - name: default-auth Mar 19 09:24:21 crc kubenswrapper[4835]: user: Mar 19 09:24:21 crc kubenswrapper[4835]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 19 09:24:21 crc kubenswrapper[4835]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 19 09:24:21 crc kubenswrapper[4835]: EOF Mar 19 09:24:21 crc kubenswrapper[4835]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-48r2z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-qk6hn_openshift-ovn-kubernetes(2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 09:24:21 crc kubenswrapper[4835]: > logger="UnhandledError" Mar 19 09:24:21 crc kubenswrapper[4835]: E0319 09:24:21.406651 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" Mar 19 09:24:21 crc 
kubenswrapper[4835]: E0319 09:24:21.406702 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k9nz9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-lkntj_openshift-multus(0c65689c-afdd-413c-92b9-bf02eeea000c): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 19 09:24:21 crc 
kubenswrapper[4835]: E0319 09:24:21.408678 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-lkntj" podUID="0c65689c-afdd-413c-92b9-bf02eeea000c" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.427938 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.428009 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.428026 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.428057 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.428076 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:21Z","lastTransitionTime":"2026-03-19T09:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.532162 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.532232 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.532258 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.532293 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.532320 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:21Z","lastTransitionTime":"2026-03-19T09:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.635301 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.635346 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.635356 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.635373 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.635384 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:21Z","lastTransitionTime":"2026-03-19T09:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.739491 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.739979 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.740252 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.740507 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.740665 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:21Z","lastTransitionTime":"2026-03-19T09:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.844422 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.844854 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.845032 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.845191 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.845337 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:21Z","lastTransitionTime":"2026-03-19T09:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.948589 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.948665 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.948682 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.948712 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:21 crc kubenswrapper[4835]: I0319 09:24:21.948731 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:21Z","lastTransitionTime":"2026-03-19T09:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.052448 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.052529 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.052547 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.052573 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.052591 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:22Z","lastTransitionTime":"2026-03-19T09:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.156433 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.156532 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.156559 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.156596 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.156621 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:22Z","lastTransitionTime":"2026-03-19T09:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.260460 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.260532 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.260550 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.260579 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.260601 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:22Z","lastTransitionTime":"2026-03-19T09:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.364343 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.364424 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.364435 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.364459 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.364475 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:22Z","lastTransitionTime":"2026-03-19T09:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.401115 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.401148 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:24:22 crc kubenswrapper[4835]: E0319 09:24:22.401355 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:24:22 crc kubenswrapper[4835]: E0319 09:24:22.401421 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.402646 4835 scope.go:117] "RemoveContainer" containerID="3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.467448 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.467510 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.467529 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.467554 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.467570 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:22Z","lastTransitionTime":"2026-03-19T09:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.564503 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7f0101ce-52a3-4e5b-8fcd-c19020fb071a-metrics-certs\") pod \"network-metrics-daemon-vs6hx\" (UID: \"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\") " pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:24:22 crc kubenswrapper[4835]: E0319 09:24:22.564821 4835 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 09:24:22 crc kubenswrapper[4835]: E0319 09:24:22.564943 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f0101ce-52a3-4e5b-8fcd-c19020fb071a-metrics-certs podName:7f0101ce-52a3-4e5b-8fcd-c19020fb071a nodeName:}" failed. No retries permitted until 2026-03-19 09:24:26.564907756 +0000 UTC m=+121.413506383 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7f0101ce-52a3-4e5b-8fcd-c19020fb071a-metrics-certs") pod "network-metrics-daemon-vs6hx" (UID: "7f0101ce-52a3-4e5b-8fcd-c19020fb071a") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.570967 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.571072 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.571099 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.571129 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.571151 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:22Z","lastTransitionTime":"2026-03-19T09:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.674386 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.674427 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.674440 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.674458 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.674473 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:22Z","lastTransitionTime":"2026-03-19T09:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.776933 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.776982 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.776996 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.777016 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.777028 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:22Z","lastTransitionTime":"2026-03-19T09:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.880961 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.881015 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.881029 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.881050 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.881066 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:22Z","lastTransitionTime":"2026-03-19T09:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.983592 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.983631 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.983644 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.983662 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:22 crc kubenswrapper[4835]: I0319 09:24:22.983674 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:22Z","lastTransitionTime":"2026-03-19T09:24:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.087260 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.087304 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.087321 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.087343 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.087360 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:23Z","lastTransitionTime":"2026-03-19T09:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.169098 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.170865 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb"} Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.171636 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.188018 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.189398 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.189472 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.189494 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 
09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.189526 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.189547 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:23Z","lastTransitionTime":"2026-03-19T09:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.202961 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.219792 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.233979 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.259679 4835 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f985
56f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.272990 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.285803 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.291882 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.291921 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.291930 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.291948 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.291960 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:23Z","lastTransitionTime":"2026-03-19T09:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.299302 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.311690 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.327094 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.354192 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.369873 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.381303 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.394677 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.394716 4835 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.394725 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.394761 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.394774 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:23Z","lastTransitionTime":"2026-03-19T09:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.399224 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.401496 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:23 crc kubenswrapper[4835]: E0319 09:24:23.401592 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.401499 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:23 crc kubenswrapper[4835]: E0319 09:24:23.401929 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.413397 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.432374 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.498283 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.498343 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.498360 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.498385 4835 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.498404 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:23Z","lastTransitionTime":"2026-03-19T09:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.601793 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.601861 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.601880 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.601910 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.601934 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:23Z","lastTransitionTime":"2026-03-19T09:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.705585 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.705633 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.705646 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.705665 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.705677 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:23Z","lastTransitionTime":"2026-03-19T09:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.809805 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.809859 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.809872 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.809921 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.809936 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:23Z","lastTransitionTime":"2026-03-19T09:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.913561 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.913639 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.913656 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.913680 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:23 crc kubenswrapper[4835]: I0319 09:24:23.913697 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:23Z","lastTransitionTime":"2026-03-19T09:24:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.017467 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.017552 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.017563 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.017587 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.017600 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:24Z","lastTransitionTime":"2026-03-19T09:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.120572 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.120633 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.120650 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.120673 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.120691 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:24Z","lastTransitionTime":"2026-03-19T09:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.224344 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.224409 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.224437 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.224481 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.224505 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:24Z","lastTransitionTime":"2026-03-19T09:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.328429 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.328523 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.328546 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.328571 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.328588 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:24Z","lastTransitionTime":"2026-03-19T09:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.401641 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.401708 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:24:24 crc kubenswrapper[4835]: E0319 09:24:24.401850 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:24:24 crc kubenswrapper[4835]: E0319 09:24:24.401978 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.431929 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.431988 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.432011 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.432036 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.432056 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:24Z","lastTransitionTime":"2026-03-19T09:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.535142 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.535271 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.535300 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.535330 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.535348 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:24Z","lastTransitionTime":"2026-03-19T09:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.638494 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.638585 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.638611 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.638644 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.638662 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:24Z","lastTransitionTime":"2026-03-19T09:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.741317 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.741344 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.741355 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.741369 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.741380 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:24Z","lastTransitionTime":"2026-03-19T09:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.844065 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.844124 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.844144 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.844167 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.844185 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:24Z","lastTransitionTime":"2026-03-19T09:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.946981 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.947024 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.947034 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.947052 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:24 crc kubenswrapper[4835]: I0319 09:24:24.947064 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:24Z","lastTransitionTime":"2026-03-19T09:24:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.050224 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.050532 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.050714 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.050949 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.051090 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:25Z","lastTransitionTime":"2026-03-19T09:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.153808 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.153852 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.153863 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.153876 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.153887 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:25Z","lastTransitionTime":"2026-03-19T09:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.197890 4835 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.256580 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.256639 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.256656 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.256679 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.256697 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:25Z","lastTransitionTime":"2026-03-19T09:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.289369 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.289412 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.289427 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.289447 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.289462 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:25Z","lastTransitionTime":"2026-03-19T09:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:25 crc kubenswrapper[4835]: E0319 09:24:25.304073 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.308359 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.308417 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.308435 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.308464 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.308484 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:25Z","lastTransitionTime":"2026-03-19T09:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:25 crc kubenswrapper[4835]: E0319 09:24:25.320534 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.324030 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.324138 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.324156 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.324176 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.324191 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:25Z","lastTransitionTime":"2026-03-19T09:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:25 crc kubenswrapper[4835]: E0319 09:24:25.333438 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.336709 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.336782 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.336798 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.336820 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.336836 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:25Z","lastTransitionTime":"2026-03-19T09:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:25 crc kubenswrapper[4835]: E0319 09:24:25.347004 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.350550 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.350593 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.350606 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.350625 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.350639 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:25Z","lastTransitionTime":"2026-03-19T09:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:25 crc kubenswrapper[4835]: E0319 09:24:25.360620 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:25 crc kubenswrapper[4835]: E0319 09:24:25.360806 4835 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.362113 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.362175 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.362186 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.362200 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.362210 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:25Z","lastTransitionTime":"2026-03-19T09:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.401283 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.401338 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:25 crc kubenswrapper[4835]: E0319 09:24:25.401485 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:24:25 crc kubenswrapper[4835]: E0319 09:24:25.401643 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.464948 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.464979 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.464986 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.465000 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.465008 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:25Z","lastTransitionTime":"2026-03-19T09:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.569492 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.569553 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.569578 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.569610 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.569627 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:25Z","lastTransitionTime":"2026-03-19T09:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.697246 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.697329 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.697348 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.697378 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.697400 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:25Z","lastTransitionTime":"2026-03-19T09:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.800918 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.800979 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.800995 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.801019 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.801038 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:25Z","lastTransitionTime":"2026-03-19T09:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.903736 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.903834 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.903851 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.903881 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:25 crc kubenswrapper[4835]: I0319 09:24:25.903901 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:25Z","lastTransitionTime":"2026-03-19T09:24:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.007255 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.007312 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.007328 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.007352 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.007370 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:26Z","lastTransitionTime":"2026-03-19T09:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.110650 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.110868 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.110928 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.110959 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.110983 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:26Z","lastTransitionTime":"2026-03-19T09:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.213813 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.213890 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.213907 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.213931 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.213952 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:26Z","lastTransitionTime":"2026-03-19T09:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.316853 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.316929 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.316956 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.316987 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.317012 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:26Z","lastTransitionTime":"2026-03-19T09:24:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.401208 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.401363 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:24:26 crc kubenswrapper[4835]: E0319 09:24:26.401638 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:24:26 crc kubenswrapper[4835]: E0319 09:24:26.401874 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:24:26 crc kubenswrapper[4835]: E0319 09:24:26.417697 4835 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.418056 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.432797 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.465164 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"nam
e\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810
714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.487257 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.504439 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.520256 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.531519 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.542531 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.570554 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.590905 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.603354 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.607665 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7f0101ce-52a3-4e5b-8fcd-c19020fb071a-metrics-certs\") pod \"network-metrics-daemon-vs6hx\" (UID: \"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\") " pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:24:26 crc kubenswrapper[4835]: E0319 09:24:26.607939 4835 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 09:24:26 crc kubenswrapper[4835]: E0319 09:24:26.608038 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f0101ce-52a3-4e5b-8fcd-c19020fb071a-metrics-certs podName:7f0101ce-52a3-4e5b-8fcd-c19020fb071a nodeName:}" failed. No retries permitted until 2026-03-19 09:24:34.608008118 +0000 UTC m=+129.456606745 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7f0101ce-52a3-4e5b-8fcd-c19020fb071a-metrics-certs") pod "network-metrics-daemon-vs6hx" (UID: "7f0101ce-52a3-4e5b-8fcd-c19020fb071a") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 09:24:26 crc kubenswrapper[4835]: E0319 09:24:26.626174 4835 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.628211 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.640272 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.657475 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.673635 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:26 crc kubenswrapper[4835]: I0319 09:24:26.688317 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 09:24:27 crc kubenswrapper[4835]: I0319 09:24:27.401115 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:27 crc kubenswrapper[4835]: E0319 09:24:27.401337 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:24:27 crc kubenswrapper[4835]: I0319 09:24:27.401467 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:27 crc kubenswrapper[4835]: E0319 09:24:27.401925 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:24:28 crc kubenswrapper[4835]: I0319 09:24:28.192100 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c"} Mar 19 09:24:28 crc kubenswrapper[4835]: I0319 09:24:28.192555 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b"} Mar 19 09:24:28 crc kubenswrapper[4835]: I0319 09:24:28.194622 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ppv6m" event={"ID":"6dfdfe13-1f47-4774-89d0-5d861607ddbc","Type":"ContainerStarted","Data":"16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315"} Mar 19 09:24:28 crc kubenswrapper[4835]: I0319 09:24:28.213949 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:28Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:28 crc kubenswrapper[4835]: I0319 09:24:28.235932 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:28Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:28 crc kubenswrapper[4835]: I0319 09:24:28.257933 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:28Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:28 crc kubenswrapper[4835]: I0319 09:24:28.273546 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:28Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:28 crc 
kubenswrapper[4835]: I0319 09:24:28.308183 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:28Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:28 crc kubenswrapper[4835]: I0319 09:24:28.331438 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:28Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:28 crc kubenswrapper[4835]: I0319 09:24:28.349191 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:28Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:28 crc kubenswrapper[4835]: I0319 09:24:28.368138 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:28Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:28 crc kubenswrapper[4835]: I0319 09:24:28.387339 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:28Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:28 crc kubenswrapper[4835]: I0319 09:24:28.403009 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:24:28 crc kubenswrapper[4835]: I0319 09:24:28.403084 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:24:28 crc kubenswrapper[4835]: E0319 09:24:28.403346 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:24:28 crc kubenswrapper[4835]: E0319 09:24:28.403477 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:24:28 crc kubenswrapper[4835]: I0319 09:24:28.408992 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:28Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:28 crc kubenswrapper[4835]: I0319 09:24:28.441930 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:28Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:28 crc kubenswrapper[4835]: I0319 09:24:28.462068 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:28Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:28 crc kubenswrapper[4835]: I0319 09:24:28.480068 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:28Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:28 crc kubenswrapper[4835]: I0319 09:24:28.495795 4835 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:28Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:28 crc kubenswrapper[4835]: I0319 09:24:28.507075 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:28Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:28 crc kubenswrapper[4835]: I0319 09:24:28.521613 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:28Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:28 crc kubenswrapper[4835]: I0319 09:24:28.534870 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:28Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:28 crc kubenswrapper[4835]: I0319 09:24:28.546569 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:28Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:28 crc kubenswrapper[4835]: I0319 09:24:28.558824 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:28Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:28 crc kubenswrapper[4835]: I0319 09:24:28.568354 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:28Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:28 crc 
kubenswrapper[4835]: I0319 09:24:28.589217 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:28Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:28 crc kubenswrapper[4835]: I0319 09:24:28.603015 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:28Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:28 crc kubenswrapper[4835]: I0319 09:24:28.613110 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:28Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:28 crc kubenswrapper[4835]: I0319 09:24:28.625130 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:28Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:28 crc kubenswrapper[4835]: I0319 09:24:28.638671 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:28Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:28 crc kubenswrapper[4835]: I0319 09:24:28.651151 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:28Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:28 crc kubenswrapper[4835]: I0319 09:24:28.679585 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:28Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:28 crc kubenswrapper[4835]: I0319 09:24:28.694120 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:28Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:28 crc kubenswrapper[4835]: I0319 09:24:28.703257 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:28Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:28 crc kubenswrapper[4835]: I0319 09:24:28.719878 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:28Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:28 crc kubenswrapper[4835]: I0319 09:24:28.731520 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:28Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:28 crc kubenswrapper[4835]: I0319 09:24:28.751107 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:28Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:29 crc kubenswrapper[4835]: I0319 09:24:29.199822 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"5d294b374529227ebcd9ef645c5bd51cb0d891d0740b734d56e0e4e39b4db531"} Mar 19 09:24:29 crc kubenswrapper[4835]: I0319 09:24:29.219277 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:29Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:29 crc kubenswrapper[4835]: I0319 09:24:29.235075 4835 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:29Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:29 crc 
kubenswrapper[4835]: I0319 09:24:29.261326 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:29Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:29 crc kubenswrapper[4835]: I0319 09:24:29.283493 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:29Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:29 crc kubenswrapper[4835]: I0319 09:24:29.306594 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:29Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:29 crc kubenswrapper[4835]: I0319 09:24:29.328540 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:29Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:29 crc kubenswrapper[4835]: I0319 09:24:29.345620 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:29Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:29 crc kubenswrapper[4835]: I0319 09:24:29.362396 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:29Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:29 crc kubenswrapper[4835]: I0319 09:24:29.385815 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:29Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:29 crc kubenswrapper[4835]: I0319 09:24:29.401860 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:29 crc kubenswrapper[4835]: I0319 09:24:29.401912 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:29 crc kubenswrapper[4835]: E0319 09:24:29.402018 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:24:29 crc kubenswrapper[4835]: E0319 09:24:29.402175 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:24:29 crc kubenswrapper[4835]: I0319 09:24:29.405621 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:29Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:29 crc kubenswrapper[4835]: I0319 09:24:29.417872 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:29Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:29 crc kubenswrapper[4835]: I0319 09:24:29.437074 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d294b374529227ebcd9ef645c5bd51cb0d891d0740b734d56e0e4e39b4db531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-19T09:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:29Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:29 crc kubenswrapper[4835]: I0319 09:24:29.451274 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:29Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:29 crc kubenswrapper[4835]: I0319 09:24:29.467684 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:29Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:29 crc kubenswrapper[4835]: I0319 09:24:29.483563 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:29Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:29 crc kubenswrapper[4835]: I0319 09:24:29.500595 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:29Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:30 crc kubenswrapper[4835]: I0319 09:24:30.401154 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:24:30 crc kubenswrapper[4835]: I0319 09:24:30.401291 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:24:30 crc kubenswrapper[4835]: E0319 09:24:30.401677 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:24:30 crc kubenswrapper[4835]: E0319 09:24:30.402105 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:24:31 crc kubenswrapper[4835]: I0319 09:24:31.230779 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jl5x4" event={"ID":"4ee35aaa-2819-432a-af95-f1078ad836fc","Type":"ContainerStarted","Data":"9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e"} Mar 19 09:24:31 crc kubenswrapper[4835]: I0319 09:24:31.233900 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" event={"ID":"417fb0b4-abf2-4fec-abfe-70a08c00f899","Type":"ContainerStarted","Data":"dbabb5352299896d9918921311f30796fd6a01ee141e1cc9e172fac1dc3560e2"} Mar 19 09:24:31 crc kubenswrapper[4835]: I0319 09:24:31.233981 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" event={"ID":"417fb0b4-abf2-4fec-abfe-70a08c00f899","Type":"ContainerStarted","Data":"3af08613af9e9ce832ae101fe6390a06b5d21e92c729b8a7a2f4312aca0399c5"} Mar 19 09:24:31 crc kubenswrapper[4835]: I0319 09:24:31.249978 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:31 crc kubenswrapper[4835]: I0319 09:24:31.265064 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:31 crc kubenswrapper[4835]: I0319 09:24:31.279834 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:31 crc kubenswrapper[4835]: I0319 09:24:31.292634 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:31 crc kubenswrapper[4835]: I0319 09:24:31.305193 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:31 crc 
kubenswrapper[4835]: I0319 09:24:31.340591 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:31 crc kubenswrapper[4835]: I0319 09:24:31.363051 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:31 crc kubenswrapper[4835]: I0319 09:24:31.381710 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:31 crc kubenswrapper[4835]: I0319 09:24:31.393447 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:31 crc kubenswrapper[4835]: I0319 09:24:31.401930 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:31 crc kubenswrapper[4835]: I0319 09:24:31.401958 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:31 crc kubenswrapper[4835]: E0319 09:24:31.402073 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:24:31 crc kubenswrapper[4835]: E0319 09:24:31.402233 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:24:31 crc kubenswrapper[4835]: I0319 09:24:31.409096 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:31 crc kubenswrapper[4835]: I0319 09:24:31.429850 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T0
9:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:31 crc kubenswrapper[4835]: I0319 09:24:31.459699 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:31 crc kubenswrapper[4835]: I0319 09:24:31.485108 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:31 crc kubenswrapper[4835]: I0319 09:24:31.500932 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:31 crc kubenswrapper[4835]: I0319 09:24:31.532541 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d294b374529227ebcd9ef645c5bd51cb0d891d0740b734d56e0e4e39b4db531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-19T09:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:31 crc kubenswrapper[4835]: I0319 09:24:31.552199 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:31 crc kubenswrapper[4835]: I0319 09:24:31.564433 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d294b374529227ebcd9ef645c5bd51cb0d891d0740b734d56e0e4e39b4db531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:31 crc kubenswrapper[4835]: I0319 09:24:31.581508 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:31 crc kubenswrapper[4835]: I0319 09:24:31.603284 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:31 crc kubenswrapper[4835]: I0319 09:24:31.619038 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:31 crc kubenswrapper[4835]: E0319 09:24:31.627608 4835 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 09:24:31 crc kubenswrapper[4835]: I0319 09:24:31.637625 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:31 crc kubenswrapper[4835]: I0319 09:24:31.648932 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:31 crc kubenswrapper[4835]: I0319 09:24:31.660816 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:31 crc kubenswrapper[4835]: I0319 09:24:31.688591 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:31 crc kubenswrapper[4835]: I0319 09:24:31.723374 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:31 crc kubenswrapper[4835]: I0319 09:24:31.736804 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3af08613af9e9ce832ae101fe6390a06b5d21e92c729b8a7a2f4312aca0399c5\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbabb5352299896d9918921311f30796fd6a01ee141e1cc9e172fac1dc3560e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:31 crc kubenswrapper[4835]: I0319 09:24:31.749294 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:31 crc 
kubenswrapper[4835]: I0319 09:24:31.775333 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:31 crc kubenswrapper[4835]: I0319 09:24:31.786499 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:31 crc kubenswrapper[4835]: I0319 09:24:31.797346 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:31 crc kubenswrapper[4835]: I0319 09:24:31.806902 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:31 crc kubenswrapper[4835]: I0319 09:24:31.823116 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\
\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:32 crc kubenswrapper[4835]: I0319 09:24:32.238662 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" 
event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerStarted","Data":"c46acc963d37712862e26950ee87ff20397172ac456efa85d6fbd0332c4b0f3b"} Mar 19 09:24:32 crc kubenswrapper[4835]: I0319 09:24:32.238725 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerStarted","Data":"685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff1532a133b8cba8d5b03818133c3"} Mar 19 09:24:32 crc kubenswrapper[4835]: I0319 09:24:32.258368 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:32Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:32 crc kubenswrapper[4835]: I0319 09:24:32.279952 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:32Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:32 crc kubenswrapper[4835]: I0319 09:24:32.297129 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:32Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:32 crc kubenswrapper[4835]: I0319 09:24:32.323354 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:32Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:32 crc kubenswrapper[4835]: I0319 09:24:32.338469 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:32Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:32 crc kubenswrapper[4835]: I0319 09:24:32.349239 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3af08613af9e9ce832ae101fe6390a06b5d21e92c729b8a7a2f4312aca0399c5\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbabb5352299896d9918921311f30796fd6a01ee141e1cc9e172fac1dc3560e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:32Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:32 crc kubenswrapper[4835]: I0319 09:24:32.359317 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:32Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:32 crc 
kubenswrapper[4835]: I0319 09:24:32.370803 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:32Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:32 crc 
kubenswrapper[4835]: I0319 09:24:32.393765 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:32Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:32 crc kubenswrapper[4835]: I0319 09:24:32.401076 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:24:32 crc kubenswrapper[4835]: I0319 09:24:32.401114 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:24:32 crc kubenswrapper[4835]: E0319 09:24:32.401519 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:24:32 crc kubenswrapper[4835]: E0319 09:24:32.401854 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:24:32 crc kubenswrapper[4835]: I0319 09:24:32.406891 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:32Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:32 crc kubenswrapper[4835]: I0319 09:24:32.413409 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 19 09:24:32 crc kubenswrapper[4835]: I0319 09:24:32.422992 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:32Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:32 crc kubenswrapper[4835]: I0319 09:24:32.435675 4835 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46acc963d37712862e26950ee87ff20397172ac456efa85d6fbd0332c4b0f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff1532a133b8cba8d5b03818133c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:32Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:32 crc kubenswrapper[4835]: I0319 09:24:32.449935 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d294b374529227ebcd9ef645c5bd51cb0d891d0740b734d56e0e4e39b4db531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:32Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:32 crc kubenswrapper[4835]: I0319 09:24:32.459565 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:32Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:32 crc kubenswrapper[4835]: I0319 09:24:32.476482 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:32Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:32 crc kubenswrapper[4835]: I0319 09:24:32.489538 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:32Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:32 crc kubenswrapper[4835]: I0319 09:24:32.678021 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:24:32 crc kubenswrapper[4835]: I0319 09:24:32.678136 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:32 crc kubenswrapper[4835]: I0319 09:24:32.678172 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:32 crc kubenswrapper[4835]: E0319 09:24:32.678218 4835 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:25:04.678179135 +0000 UTC m=+159.526777762 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:24:32 crc kubenswrapper[4835]: E0319 09:24:32.678248 4835 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 09:24:32 crc kubenswrapper[4835]: E0319 09:24:32.678302 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:04.678286558 +0000 UTC m=+159.526885155 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 09:24:32 crc kubenswrapper[4835]: I0319 09:24:32.678313 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:32 crc kubenswrapper[4835]: I0319 09:24:32.678373 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:24:32 crc kubenswrapper[4835]: E0319 09:24:32.678387 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 09:24:32 crc kubenswrapper[4835]: E0319 09:24:32.678401 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 09:24:32 crc kubenswrapper[4835]: E0319 09:24:32.678413 4835 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:24:32 crc kubenswrapper[4835]: E0319 09:24:32.678441 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:04.678432512 +0000 UTC m=+159.527031109 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:24:32 crc kubenswrapper[4835]: E0319 09:24:32.678586 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 09:24:32 crc kubenswrapper[4835]: E0319 09:24:32.678611 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 09:24:32 crc kubenswrapper[4835]: E0319 09:24:32.678629 4835 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:24:32 crc kubenswrapper[4835]: E0319 09:24:32.678686 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-19 09:25:04.678671929 +0000 UTC m=+159.527270556 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:24:32 crc kubenswrapper[4835]: E0319 09:24:32.678834 4835 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 09:24:32 crc kubenswrapper[4835]: E0319 09:24:32.678879 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 09:25:04.678866415 +0000 UTC m=+159.527465032 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 09:24:33 crc kubenswrapper[4835]: I0319 09:24:33.243999 4835 generic.go:334] "Generic (PLEG): container finished" podID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerID="b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a" exitCode=0 Mar 19 09:24:33 crc kubenswrapper[4835]: I0319 09:24:33.245415 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" event={"ID":"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b","Type":"ContainerDied","Data":"b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a"} Mar 19 09:24:33 crc kubenswrapper[4835]: I0319 09:24:33.282199 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:33Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:33 crc kubenswrapper[4835]: I0319 09:24:33.305060 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:33Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:33 crc kubenswrapper[4835]: I0319 09:24:33.342134 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3af08613af9e9ce832ae101fe6390a06b5d21e92c729b8a7a2f4312aca0399c5\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbabb5352299896d9918921311f30796fd6a01ee141e1cc9e172fac1dc3560e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:33Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:33 crc kubenswrapper[4835]: I0319 09:24:33.363878 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:33Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:33 crc 
kubenswrapper[4835]: I0319 09:24:33.382526 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:33Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:33 crc 
kubenswrapper[4835]: I0319 09:24:33.401471 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:33 crc kubenswrapper[4835]: E0319 09:24:33.401580 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:24:33 crc kubenswrapper[4835]: I0319 09:24:33.401675 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:33 crc kubenswrapper[4835]: E0319 09:24:33.401915 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:24:33 crc kubenswrapper[4835]: I0319 09:24:33.404106 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:33Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:33 crc kubenswrapper[4835]: I0319 09:24:33.418187 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b9b2cb-6d1b-4ba9-8866-b38df3b492c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbde07abe8d8fcd4aa07c20eb6bc852eee70bf207f4ff2bab8ab1332261c92c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c2419d07ea971ebef8f942b93109e8d28ae31053fd3fc750d478868fa36ab9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:22:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 09:22:28.939310 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0319 09:22:28.941752 1 observer_polling.go:159] Starting file observer\\\\nI0319 09:22:29.005366 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 09:22:29.017046 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 09:22:58.844936 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 09:22:58.845102 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be66eb2db774c7aed1b45b02b72e262c1d086c0c6ed9edd85d466aa0f15f1582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08173aab5836c52b21e70067b1208d91ef8bdd7e02fbb72cb533d41616176913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fb7c70aad1d13cf3432d05f17c527f900404c5f1a389c7174b400fab9f3fcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:33Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:33 crc kubenswrapper[4835]: I0319 09:24:33.432385 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:33Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:33 crc kubenswrapper[4835]: I0319 09:24:33.445763 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:33Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:33 crc kubenswrapper[4835]: I0319 09:24:33.458858 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46acc963d37712862e26950ee87ff20397172ac456efa85d6fbd0332c4b0f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff153
2a133b8cba8d5b03818133c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:33Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:33 crc kubenswrapper[4835]: I0319 09:24:33.479500 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d294b374529227ebcd9ef645c5bd51cb0d891d0740b734d56e0e4e39b4db531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:33Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:33 crc kubenswrapper[4835]: I0319 09:24:33.491582 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:33Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:33 crc kubenswrapper[4835]: I0319 09:24:33.507465 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:33Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:33 crc kubenswrapper[4835]: I0319 09:24:33.518643 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:33Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:33 crc kubenswrapper[4835]: I0319 09:24:33.535053 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:33Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:33 crc kubenswrapper[4835]: I0319 09:24:33.551346 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:33Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:33 crc kubenswrapper[4835]: I0319 09:24:33.564417 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:33Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.250819 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"34d4a1069eb557a1baab69a634da9dfccca4f29814dcb122906e9d5a3a817696"} Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.255187 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" event={"ID":"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b","Type":"ContainerStarted","Data":"c9700b6f756c49fd4c80814e7ec8b3968b3a1a4a060af3366cfbcde9afe056d0"} Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.255242 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" event={"ID":"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b","Type":"ContainerStarted","Data":"0f876fbdfacfa3d71e2cb5481c3cc94d66376d60260a5ec90e6700979a76b0a0"} Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.255263 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" event={"ID":"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b","Type":"ContainerStarted","Data":"8df0c2ec899dfefd835123bae49121907eb0ef122be26a92587f324e9cd8ada7"} Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.255283 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" event={"ID":"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b","Type":"ContainerStarted","Data":"796b2dcca72564cdb668d41bb2a3de94b9a051b6d3cc6cff1abf790a290f2183"} Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.255301 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" event={"ID":"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b","Type":"ContainerStarted","Data":"63e7dc755b5798310eac36921fcd00df32b34ba4b6088b84297f5966f6a1bb2c"} Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.255319 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" 
event={"ID":"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b","Type":"ContainerStarted","Data":"c8db88b60673fb888e49ff1dcb9416a59ad4524a2a763f1be911053ceca827fd"} Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.257086 4835 generic.go:334] "Generic (PLEG): container finished" podID="0c65689c-afdd-413c-92b9-bf02eeea000c" containerID="beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98" exitCode=0 Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.257168 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" event={"ID":"0c65689c-afdd-413c-92b9-bf02eeea000c","Type":"ContainerDied","Data":"beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98"} Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.258688 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fg29g" event={"ID":"4d251d68-4fd1-4d04-b960-260b36d78f3f","Type":"ContainerStarted","Data":"98298fa62b16ce931a7d5f7439859452ab2d93d0abbb027c312189e7546774e9"} Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.269858 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:34 crc 
kubenswrapper[4835]: I0319 09:24:34.303380 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.316959 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.330584 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3af08613af9e9ce832ae101fe6390a06b5d21e92c729b8a7a2f4312aca0399c5\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbabb5352299896d9918921311f30796fd6a01ee141e1cc9e172fac1dc3560e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.342661 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d4a1069eb557a1baab69a634da9dfccca4f29814dcb122906e9d5a3a817696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptabl
es-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.354038 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46acc963d3771286
2e26950ee87ff20397172ac456efa85d6fbd0332c4b0f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff1532a133b8cba8d5b03818133c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTi
me\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.366050 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.383705 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.395335 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b9b2cb-6d1b-4ba9-8866-b38df3b492c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbde07abe8d8fcd4aa07c20eb6bc852eee70bf207f4ff2bab8ab1332261c92c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c2419d07ea971ebef8f942b93109e8d28ae31053fd3fc750d478868fa36ab9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:22:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 09:22:28.939310 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0319 09:22:28.941752 1 observer_polling.go:159] Starting file observer\\\\nI0319 09:22:29.005366 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 09:22:29.017046 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 09:22:58.844936 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 09:22:58.845102 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be66eb2db774c7aed1b45b02b72e262c1d086c0c6ed9edd85d466aa0f15f1582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08173aab5836c52b21e70067b1208d91ef8bdd7e02fbb72cb533d41616176913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fb7c70aad1d13cf3432d05f17c527f900404c5f1a389c7174b400fab9f3fcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.401428 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.401526 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:24:34 crc kubenswrapper[4835]: E0319 09:24:34.401549 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:24:34 crc kubenswrapper[4835]: E0319 09:24:34.401636 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.408171 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.416821 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.429025 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d294b374529227ebcd9ef645c5bd51cb0d891d0740b734d56e0e4e39b4db531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-19T09:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.438182 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.451265 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.462044 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.475937 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.487065 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.499203 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b9b2cb-6d1b-4ba9-8866-b38df3b492c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbde07abe8d8fcd4aa07c20eb6bc852eee70bf207f4ff2bab8ab1332261c92c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c2419d07ea971ebef8f942b93109e8d28ae31053fd3fc750d478868fa36ab9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:22:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0319 09:22:28.939310 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 09:22:28.941752 1 observer_polling.go:159] Starting file observer\\\\nI0319 09:22:29.005366 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 09:22:29.017046 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 09:22:58.844936 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 09:22:58.845102 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be66eb2db774c7aed1b45b02b72e262c1d086c0c6ed9edd85d466aa0f15f1582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08173aab5836c52b21e70067b1208d91ef8bdd7e02fbb72cb533d41616176913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fb7c70aad1d13cf3432d05f17c527f900404c5f1a389c7174b400fab9f3fcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.510627 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.524195 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d4a1069eb557a1baab69a634da9dfccca4f29814dcb122906e9d5a3a817696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T09:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.539690 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46acc963d37712862e26950ee87ff20397172ac456efa85d6fbd0332c4b0f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff1532a133b8cba8d5b03818133c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.551632 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.580351 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.594054 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d294b374529227ebcd9ef645c5bd51cb0d891d0740b734d56e0e4e39b4db531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.606038 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98298fa62b16ce931a7d5f7439859452ab2d93d0abbb027c312189e7546774e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.618942 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.628537 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.644176 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.658931 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.669064 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.685779 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.700435 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.707280 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7f0101ce-52a3-4e5b-8fcd-c19020fb071a-metrics-certs\") pod \"network-metrics-daemon-vs6hx\" (UID: \"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\") " pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:24:34 crc kubenswrapper[4835]: E0319 09:24:34.707459 4835 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 09:24:34 crc kubenswrapper[4835]: E0319 09:24:34.707524 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f0101ce-52a3-4e5b-8fcd-c19020fb071a-metrics-certs podName:7f0101ce-52a3-4e5b-8fcd-c19020fb071a nodeName:}" failed. No retries permitted until 2026-03-19 09:24:50.707507337 +0000 UTC m=+145.556105934 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7f0101ce-52a3-4e5b-8fcd-c19020fb071a-metrics-certs") pod "network-metrics-daemon-vs6hx" (UID: "7f0101ce-52a3-4e5b-8fcd-c19020fb071a") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.710472 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3af08613af9e9ce832ae101fe6390a06b5d21e92c729b8a7a2f4312aca0399c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbabb5352299896d9918921311f30796fd6a01ee141e1cc9e172fac1dc3560e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-19T09:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:34 crc kubenswrapper[4835]: I0319 09:24:34.724496 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:35 crc 
kubenswrapper[4835]: I0319 09:24:35.263887 4835 generic.go:334] "Generic (PLEG): container finished" podID="0c65689c-afdd-413c-92b9-bf02eeea000c" containerID="d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511" exitCode=0 Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.263949 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" event={"ID":"0c65689c-afdd-413c-92b9-bf02eeea000c","Type":"ContainerDied","Data":"d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511"} Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.292058 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d294b374529227ebcd9ef645c5bd51cb0d891d0740b734d56e0e4e39b4db531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\
\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.304355 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98298fa62b16ce931a7d5f7439859452ab2d93d0abbb027c312189e7546774e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.325430 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.338545 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.350628 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.363083 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.381744 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.401380 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:35 crc kubenswrapper[4835]: E0319 09:24:35.401509 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.401708 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:35 crc kubenswrapper[4835]: E0319 09:24:35.401788 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.408856 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036
cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117
b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\
\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.430205 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.442392 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3af08613af9e9ce832ae101fe6390a06b5d21e92c729b8a7a2f4312aca0399c5\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbabb5352299896d9918921311f30796fd6a01ee141e1cc9e172fac1dc3560e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.455805 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:35 crc 
kubenswrapper[4835]: I0319 09:24:35.467769 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46acc963d37712862e26950ee87ff20397172ac456efa85d6fbd0332c4b0f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff1532a133b8cba8d5b03818133c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.480119 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.497466 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.516450 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b9b2cb-6d1b-4ba9-8866-b38df3b492c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbde07abe8d8fcd4aa07c20eb6bc852eee70bf207f4ff2bab8ab1332261c92c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c2419d07ea971ebef8f942b93109e8d28ae31053fd3fc750d478868fa36ab9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:22:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 09:22:28.939310 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0319 09:22:28.941752 1 observer_polling.go:159] Starting file observer\\\\nI0319 09:22:29.005366 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 09:22:29.017046 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 09:22:58.844936 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 09:22:58.845102 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be66eb2db774c7aed1b45b02b72e262c1d086c0c6ed9edd85d466aa0f15f1582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08173aab5836c52b21e70067b1208d91ef8bdd7e02fbb72cb533d41616176913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fb7c70aad1d13cf3432d05f17c527f900404c5f1a389c7174b400fab9f3fcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.529414 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.540673 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d4a1069eb557a1baab69a634da9dfccca4f29814dcb122906e9d5a3a817696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T09:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.613151 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.613414 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.613494 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.613582 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.613678 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:35Z","lastTransitionTime":"2026-03-19T09:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:35 crc kubenswrapper[4835]: E0319 09:24:35.626361 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.631337 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.631395 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.631413 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.631442 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.631459 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:35Z","lastTransitionTime":"2026-03-19T09:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:35 crc kubenswrapper[4835]: E0319 09:24:35.645444 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.649626 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.649668 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.649680 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.649697 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.649712 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:35Z","lastTransitionTime":"2026-03-19T09:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:35 crc kubenswrapper[4835]: E0319 09:24:35.669179 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.674150 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.674193 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.674201 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.674234 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.674245 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:35Z","lastTransitionTime":"2026-03-19T09:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:35 crc kubenswrapper[4835]: E0319 09:24:35.687650 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.691964 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.692028 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.692047 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.692073 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:35 crc kubenswrapper[4835]: I0319 09:24:35.692093 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:35Z","lastTransitionTime":"2026-03-19T09:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:35 crc kubenswrapper[4835]: E0319 09:24:35.742818 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:35 crc kubenswrapper[4835]: E0319 09:24:35.742933 4835 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 09:24:36 crc kubenswrapper[4835]: I0319 09:24:36.270860 4835 generic.go:334] "Generic (PLEG): container finished" podID="0c65689c-afdd-413c-92b9-bf02eeea000c" containerID="f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0" exitCode=0 Mar 19 09:24:36 crc kubenswrapper[4835]: I0319 09:24:36.270983 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" event={"ID":"0c65689c-afdd-413c-92b9-bf02eeea000c","Type":"ContainerDied","Data":"f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0"} Mar 19 09:24:36 crc kubenswrapper[4835]: I0319 09:24:36.298081 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:36 crc kubenswrapper[4835]: I0319 09:24:36.314321 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e9
6f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:36 crc kubenswrapper[4835]: I0319 09:24:36.337070 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d294b374529227ebcd9ef645c5bd51cb0d891d0740b734d56e0e4e39b4db531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:36 crc kubenswrapper[4835]: I0319 09:24:36.356611 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98298fa62b16ce931a7d5f7439859452ab2d93d0abbb027c312189e7546774e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:36 crc kubenswrapper[4835]: I0319 09:24:36.374141 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-19T09:24:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:36 crc kubenswrapper[4835]: I0319 09:24:36.394502 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:36 crc kubenswrapper[4835]: I0319 09:24:36.402204 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:24:36 crc kubenswrapper[4835]: E0319 09:24:36.402542 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:24:36 crc kubenswrapper[4835]: I0319 09:24:36.402624 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:24:36 crc kubenswrapper[4835]: E0319 09:24:36.402884 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:24:36 crc kubenswrapper[4835]: I0319 09:24:36.417923 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:36 crc kubenswrapper[4835]: I0319 09:24:36.435486 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3af08613af9e9ce832ae101fe6390a06b5d21e92c729b8a7a2f4312aca0399c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbabb5352299896d9918921311f30796fd6a0
1ee141e1cc9e172fac1dc3560e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:36 crc kubenswrapper[4835]: I0319 09:24:36.455318 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:36 crc 
kubenswrapper[4835]: I0319 09:24:36.492572 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:36 crc kubenswrapper[4835]: I0319 09:24:36.521528 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:36 crc kubenswrapper[4835]: I0319 09:24:36.541607 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:36 crc kubenswrapper[4835]: I0319 09:24:36.564590 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d4a1069eb557a1baab69a634da9dfccca4f29814dcb122906e9d5a3a817696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T09:24:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:36 crc kubenswrapper[4835]: I0319 09:24:36.579325 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46acc963d37712862e26950ee87ff20397172ac456efa85d6fbd0332c4b0f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff1532a133b8cba8d5b03818133c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:36 crc kubenswrapper[4835]: I0319 09:24:36.601361 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:36 crc kubenswrapper[4835]: I0319 09:24:36.626663 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:36 crc kubenswrapper[4835]: E0319 09:24:36.628661 4835 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 19 09:24:36 crc kubenswrapper[4835]: I0319 09:24:36.648293 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b9b2cb-6d1b-4ba9-8866-b38df3b492c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbde07abe8d8fcd4aa07c20eb6bc852eee70bf207f4ff2bab8ab1332261c92c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c2419d07ea971ebef8f942b93109e8d28ae31053fd3fc750d478868fa36ab9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:22:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 09:22:28.939310 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 09:22:28.941752 1 observer_polling.go:159] Starting file observer\\\\nI0319 09:22:29.005366 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 09:22:29.017046 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 09:22:58.844936 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 09:22:58.845102 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be66eb2db774c7aed1b45b02b72e262c1d086c0c6ed9edd85d466aa0f15f1582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08173aab5836c52b21e70067b1208d91ef8bdd7e02fbb72cb533d41616176913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fb7c70aad1d13cf3432d05f17c527f900404c5f1a389c7174b400fab9f3fcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:36 crc kubenswrapper[4835]: I0319 09:24:36.667294 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 
09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:36 crc kubenswrapper[4835]: I0319 09:24:36.683079 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3af08613af9e9ce832ae101fe6390a06b5d21e92c729b8a7a2f4312aca0399c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbabb5352299896d9918921311f30796fd6a0
1ee141e1cc9e172fac1dc3560e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:36 crc kubenswrapper[4835]: I0319 09:24:36.694396 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:36 crc 
kubenswrapper[4835]: I0319 09:24:36.715284 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:36 crc kubenswrapper[4835]: I0319 09:24:36.729342 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b9b2cb-6d1b-4ba9-8866-b38df3b492c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbde07abe8d8fcd4aa07c20eb6bc852eee70bf207f4ff2bab8ab1332261c92c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c2419d07ea971ebef8f942b93109e8d28ae31053fd3fc750d478868fa36ab9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:22:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0319 09:22:28.939310 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 09:22:28.941752 1 observer_polling.go:159] Starting file observer\\\\nI0319 09:22:29.005366 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 09:22:29.017046 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 09:22:58.844936 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 09:22:58.845102 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be66eb2db774c7aed1b45b02b72e262c1d086c0c6ed9edd85d466aa0f15f1582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08173aab5836c52b21e70067b1208d91ef8bdd7e02fbb72cb533d41616176913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fb7c70aad1d13cf3432d05f17c527f900404c5f1a389c7174b400fab9f3fcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:36 crc kubenswrapper[4835]: I0319 09:24:36.749669 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:36 crc kubenswrapper[4835]: I0319 09:24:36.767628 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d4a1069eb557a1baab69a634da9dfccca4f29814dcb122906e9d5a3a817696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T09:24:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:36 crc kubenswrapper[4835]: I0319 09:24:36.782501 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46acc963d37712862e26950ee87ff20397172ac456efa85d6fbd0332c4b0f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff1532a133b8cba8d5b03818133c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:36 crc kubenswrapper[4835]: I0319 09:24:36.803171 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:36 crc kubenswrapper[4835]: E0319 09:24:36.819742 4835 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c65689c_afdd_413c_92b9_bf02eeea000c.slice/crio-c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b.scope\": RecentStats: unable to find data in memory cache]" Mar 19 09:24:36 crc kubenswrapper[4835]: I0319 09:24:36.822213 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:36 crc kubenswrapper[4835]: I0319 09:24:36.836714 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98298fa62b16ce931a7d5f7439859452ab2d93d0abbb027c312189e7546774e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:36 crc kubenswrapper[4835]: I0319 09:24:36.852923 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:36 crc kubenswrapper[4835]: I0319 
09:24:36.865214 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:36 crc kubenswrapper[4835]: I0319 09:24:36.879512 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d294b374529227ebcd9ef645c5bd51cb0d891d0740b734d56e0e4e39b4db531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:36 crc kubenswrapper[4835]: I0319 09:24:36.897155 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:36 crc kubenswrapper[4835]: I0319 09:24:36.914230 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:36 crc kubenswrapper[4835]: I0319 09:24:36.928324 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.282155 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" 
event={"ID":"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b","Type":"ContainerStarted","Data":"fcb093569be5a825491f56bf1b08d66fe7fed5c8e244231dd7f79005cbe0c852"} Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.287366 4835 generic.go:334] "Generic (PLEG): container finished" podID="0c65689c-afdd-413c-92b9-bf02eeea000c" containerID="c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b" exitCode=0 Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.287427 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" event={"ID":"0c65689c-afdd-413c-92b9-bf02eeea000c","Type":"ContainerDied","Data":"c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b"} Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.315045 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.337094 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3af08613af9e9ce832ae101fe6390a06b5d21e92c729b8a7a2f4312aca0399c5\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbabb5352299896d9918921311f30796fd6a01ee141e1cc9e172fac1dc3560e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.356865 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:37 crc 
kubenswrapper[4835]: I0319 09:24:37.395791 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.401142 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:37 crc kubenswrapper[4835]: E0319 09:24:37.401234 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.401241 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:37 crc kubenswrapper[4835]: E0319 09:24:37.401432 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.409499 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b9b2cb-6d1b-4ba9-8866-b38df3b492c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbde07abe8d8fcd4aa07c20eb6bc852eee70bf207f4ff2bab8ab1332261c92c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c2419d07ea971ebef8f942b93109e8d28ae31053fd3fc750d478868fa36ab9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:22:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 09:22:28.939310 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0319 09:22:28.941752 1 observer_polling.go:159] Starting file observer\\\\nI0319 09:22:29.005366 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 09:22:29.017046 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 09:22:58.844936 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 09:22:58.845102 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be66eb2db774c7aed1b45b02b72e262c1d086c0c6ed9edd85d466aa0f15f1582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08173aab5836c52b21e70067b1208d91ef8bdd7e02fbb72cb533d41616176913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fb7c70aad1d13cf3432d05f17c527f900404c5f1a389c7174b400fab9f3fcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.420011 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.429389 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d4a1069eb557a1baab69a634da9dfccca4f29814dcb122906e9d5a3a817696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T09:24:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.442125 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46acc963d37712862e26950ee87ff20397172ac456efa85d6fbd0332c4b0f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff1532a133b8cba8d5b03818133c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.458355 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.478539 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.488676 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98298fa62b16ce931a7d5f7439859452ab2d93d0abbb027c312189e7546774e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.506971 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.518683 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.536372 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d294b374529227ebcd9ef645c5bd51cb0d891d0740b734d56e0e4e39b4db531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.553492 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.567332 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.583156 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.639465 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 09:24:37 crc 
kubenswrapper[4835]: I0319 09:24:37.654407 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k
t4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.667458 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d294b374529227ebcd9ef645c5bd51cb0d891d0740b734d56e0e4e39b4db531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.677894 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98298fa62b16ce931a7d5f7439859452ab2d93d0abbb027c312189e7546774e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.691451 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.707797 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.720426 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.731946 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.741324 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:37 crc 
kubenswrapper[4835]: I0319 09:24:37.762422 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.777654 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.791065 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3af08613af9e9ce832ae101fe6390a06b5d21e92c729b8a7a2f4312aca0399c5\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbabb5352299896d9918921311f30796fd6a01ee141e1cc9e172fac1dc3560e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.802871 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d4a1069eb557a1baab69a634da9dfccca4f29814dcb122906e9d5a3a817696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptable
s-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.818284 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46acc963d37712862
e26950ee87ff20397172ac456efa85d6fbd0332c4b0f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff1532a133b8cba8d5b03818133c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTim
e\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.834887 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.
126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.858009 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.874959 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b9b2cb-6d1b-4ba9-8866-b38df3b492c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbde07abe8d8fcd4aa07c20eb6bc852eee70bf207f4ff2bab8ab1332261c92c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c2419d07ea971ebef8f942b93109e8d28ae31053fd3fc750d478868fa36ab9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:22:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 09:22:28.939310 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0319 09:22:28.941752 1 observer_polling.go:159] Starting file observer\\\\nI0319 09:22:29.005366 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 09:22:29.017046 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 09:22:58.844936 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 09:22:58.845102 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be66eb2db774c7aed1b45b02b72e262c1d086c0c6ed9edd85d466aa0f15f1582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08173aab5836c52b21e70067b1208d91ef8bdd7e02fbb72cb533d41616176913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fb7c70aad1d13cf3432d05f17c527f900404c5f1a389c7174b400fab9f3fcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:37 crc kubenswrapper[4835]: I0319 09:24:37.890849 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:38 crc kubenswrapper[4835]: I0319 09:24:38.294696 4835 generic.go:334] "Generic (PLEG): container finished" podID="0c65689c-afdd-413c-92b9-bf02eeea000c" containerID="852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59" exitCode=0 Mar 19 09:24:38 crc kubenswrapper[4835]: I0319 09:24:38.294764 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" event={"ID":"0c65689c-afdd-413c-92b9-bf02eeea000c","Type":"ContainerDied","Data":"852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59"} Mar 19 09:24:38 crc kubenswrapper[4835]: I0319 09:24:38.320284 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:38Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:38 crc kubenswrapper[4835]: I0319 09:24:38.344425 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:38Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:38 crc kubenswrapper[4835]: I0319 09:24:38.363682 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:38Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:38 crc kubenswrapper[4835]: I0319 09:24:38.394442 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:38Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:38 crc kubenswrapper[4835]: I0319 09:24:38.401262 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:24:38 crc kubenswrapper[4835]: I0319 09:24:38.401377 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:24:38 crc kubenswrapper[4835]: E0319 09:24:38.401539 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:24:38 crc kubenswrapper[4835]: E0319 09:24:38.401691 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:24:38 crc kubenswrapper[4835]: I0319 09:24:38.415383 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\
\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clus
ter-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:38Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:38 crc kubenswrapper[4835]: I0319 09:24:38.429332 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3af08613af9e9ce832ae101fe6390a06b5d21e92c729b8a7a2f4312aca0399c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbabb5352299896d9918921311f30796fd6a0
1ee141e1cc9e172fac1dc3560e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:38Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:38 crc kubenswrapper[4835]: I0319 09:24:38.445403 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:38Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:38 crc 
kubenswrapper[4835]: I0319 09:24:38.503046 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:38Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:38 crc kubenswrapper[4835]: I0319 09:24:38.529404 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b9b2cb-6d1b-4ba9-8866-b38df3b492c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbde07abe8d8fcd4aa07c20eb6bc852eee70bf207f4ff2bab8ab1332261c92c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c2419d07ea971ebef8f942b93109e8d28ae31053fd3fc750d478868fa36ab9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:22:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 09:22:28.939310 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0319 09:22:28.941752 1 observer_polling.go:159] Starting file observer\\\\nI0319 09:22:29.005366 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 09:22:29.017046 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 09:22:58.844936 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 09:22:58.845102 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be66eb2db774c7aed1b45b02b72e262c1d086c0c6ed9edd85d466aa0f15f1582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08173aab5836c52b21e70067b1208d91ef8bdd7e02fbb72cb533d41616176913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fb7c70aad1d13cf3432d05f17c527f900404c5f1a389c7174b400fab9f3fcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:38Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:38 crc kubenswrapper[4835]: I0319 09:24:38.542204 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:38Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:38 crc kubenswrapper[4835]: I0319 09:24:38.554712 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d4a1069eb557a1baab69a634da9dfccca4f29814dcb122906e9d5a3a817696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T09:24:38Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:38 crc kubenswrapper[4835]: I0319 09:24:38.565359 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46acc963d37712862e26950ee87ff20397172ac456efa85d6fbd0332c4b0f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff1532a133b8cba8d5b03818133c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:38Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:38 crc kubenswrapper[4835]: I0319 09:24:38.575785 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:38Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:38 crc kubenswrapper[4835]: I0319 09:24:38.590618 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d294b374529227ebcd9ef645c5bd51cb0d891d0740b734d56e0e4e39b4db531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:38Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:38 crc kubenswrapper[4835]: I0319 09:24:38.598253 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98298fa62b16ce931a7d5f7439859452ab2d93d0abbb027c312189e7546774e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:38Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:38 crc kubenswrapper[4835]: I0319 09:24:38.609498 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28e
f11376aae47d80935cf4a3a77f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:38Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:38 crc kubenswrapper[4835]: I0319 09:24:38.619300 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:38Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.364348 4835 generic.go:334] "Generic (PLEG): container finished" podID="0c65689c-afdd-413c-92b9-bf02eeea000c" containerID="7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63" exitCode=0 Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.364439 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" event={"ID":"0c65689c-afdd-413c-92b9-bf02eeea000c","Type":"ContainerDied","Data":"7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63"} Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.372888 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" event={"ID":"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b","Type":"ContainerStarted","Data":"cbb47d8e4d633dda032d64f233f777a7b4b5ef713c8be4c119294371d04a740c"} Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.373413 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.384585 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:39Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.401566 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.401637 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:39 crc kubenswrapper[4835]: E0319 09:24:39.401808 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:24:39 crc kubenswrapper[4835]: E0319 09:24:39.402079 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.410453 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d294b374529227ebcd9ef645c5bd51cb0d891d0740b734d56e0e4e39b4db531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:39Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.427644 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98298fa62b16ce931a7d5f7439859452ab2d93d0abbb027c312189e7546774e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:39Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.442772 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.445167 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:39Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.460996 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:39Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.478468 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:39Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.490831 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:39Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.503701 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:39Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:39 crc 
kubenswrapper[4835]: I0319 09:24:39.573522 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:39Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.586143 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:39Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.601145 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3af08613af9e9ce832ae101fe6390a06b5d21e92c729b8a7a2f4312aca0399c5\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbabb5352299896d9918921311f30796fd6a01ee141e1cc9e172fac1dc3560e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:39Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.618009 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d4a1069eb557a1baab69a634da9dfccca4f29814dcb122906e9d5a3a817696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptable
s-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:39Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.630297 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46acc963d37712862
e26950ee87ff20397172ac456efa85d6fbd0332c4b0f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff1532a133b8cba8d5b03818133c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTim
e\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:39Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.642857 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.
126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:39Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.660466 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:39Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.677476 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b9b2cb-6d1b-4ba9-8866-b38df3b492c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbde07abe8d8fcd4aa07c20eb6bc852eee70bf207f4ff2bab8ab1332261c92c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c2419d07ea971ebef8f942b93109e8d28ae31053fd3fc750d478868fa36ab9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:22:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 09:22:28.939310 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0319 09:22:28.941752 1 observer_polling.go:159] Starting file observer\\\\nI0319 09:22:29.005366 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 09:22:29.017046 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 09:22:58.844936 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 09:22:58.845102 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be66eb2db774c7aed1b45b02b72e262c1d086c0c6ed9edd85d466aa0f15f1582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08173aab5836c52b21e70067b1208d91ef8bdd7e02fbb72cb533d41616176913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fb7c70aad1d13cf3432d05f17c527f900404c5f1a389c7174b400fab9f3fcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:39Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.694544 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:39Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.707563 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3af08613af9e9ce832ae101fe6390a06b5d21e92c729b8a7a2f4312aca0399c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbabb5352299896d9918921311f30796fd6a0
1ee141e1cc9e172fac1dc3560e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:39Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.717366 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:39Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:39 crc 
kubenswrapper[4835]: I0319 09:24:39.734334 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:39Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.746510 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:39Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.758645 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:39Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.769802 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d4a1069eb557a1baab69a634da9dfccca4f29814dcb122906e9d5a3a817696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T09:24:39Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.780699 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46acc963d37712862e26950ee87ff20397172ac456efa85d6fbd0332c4b0f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff1532a133b8cba8d5b03818133c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:39Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.796650 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:39Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.825383 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796b2dcca72564cdb668d41bb2a3de94b9a051b6d3cc6cff1abf790a290f2183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df0c2ec899dfefd835123bae49121907eb0ef122be26a92587f324e9cd8ada7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9700b6f756c49fd4c80814e7ec8b3968b3a1a4a060af3366cfbcde9afe056d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f876fbdfacfa3d71e2cb5481c3cc94d66376d60260a5ec90e6700979a76b0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7dc755b5798310eac36921fcd00df32b34ba4b6088b84297f5966f6a1bb2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8db88b60673fb888e49ff1dcb9416a59ad4524a2a763f1be911053ceca827fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb47d8e4d633dda032d64f233f777a7b4b5ef713c8be4c119294371d04a740c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb093569be5a825491f56bf1b08d66fe7fed5c8e244231dd7f79005cbe0c852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:39Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.841448 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b9b2cb-6d1b-4ba9-8866-b38df3b492c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbde07abe8d8fcd4aa07c20eb6bc852eee70bf207f4ff2bab8ab1332261c92c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c2419d07ea971ebef8f942b93109e8d28ae31053fd3fc750d478868fa36ab9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:22:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 09:22:28.939310 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 09:22:28.941752 1 observer_polling.go:159] Starting file observer\\\\nI0319 09:22:29.005366 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 09:22:29.017046 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 09:22:58.844936 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 09:22:58.845102 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:58Z is 
after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be66eb2db774c7aed1b45b02b72e262c1d086c0c6ed9edd85d466aa0f15f1582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08173aab5836c52b21e70067b1208d91ef8bdd7e02fbb72cb533d41616176913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fb7c70aad1d13cf3432d05f17c527f900404c5f1a389c7174b400fab9f3fcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:39Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.861571 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:39Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.872501 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:39Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.886931 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d294b374529227ebcd9ef645c5bd51cb0d891d0740b734d56e0e4e39b4db531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-19T09:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:39Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.899574 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98298fa62b16ce931a7d5f7439859452ab2d93d0abbb027c312189e7546774e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:39Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.915671 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:39Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.929936 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:39Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:39 crc kubenswrapper[4835]: I0319 09:24:39.947844 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:39Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.379895 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" event={"ID":"0c65689c-afdd-413c-92b9-bf02eeea000c","Type":"ContainerStarted","Data":"a004978dc146f2dcf3ff7f2390b88ed277647d48ebae20fd5d68708536f6cfda"} Mar 19 09:24:40 crc kubenswrapper[4835]: 
I0319 09:24:40.380214 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.380243 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.397975 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:40Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.401280 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:24:40 crc kubenswrapper[4835]: E0319 09:24:40.401421 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.401548 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:24:40 crc kubenswrapper[4835]: E0319 09:24:40.401626 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.411434 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.414160 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d4a1069eb557a1baab69a634da9dfccca4f29814dcb122906e9d5a3a817696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b1988
8cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:40Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.426798 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46acc963d37712862e26950ee87ff20397172ac456efa85d6fbd0332c4b0f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff153
2a133b8cba8d5b03818133c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:40Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.443397 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:40Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.460238 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796b2dcca72564cdb668d41bb2a3de94b9a051b6d3cc6cff1abf790a290f2183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df0c2ec899dfefd835123bae49121907eb0ef122be26a92587f324e9cd8ada7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9700b6f756c49fd4c80814e7ec8b3968b3a1a4a060af3366cfbcde9afe056d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f876fbdfacfa3d71e2cb5481c3cc94d66376d60260a5ec90e6700979a76b0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7dc755b5798310eac36921fcd00df32b34ba4b6088b84297f5966f6a1bb2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8db88b60673fb888e49ff1dcb9416a59ad4524a2a763f1be911053ceca827fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb47d8e4d633dda032d64f233f777a7b4b5ef713c8be4c119294371d04a740c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb093569be5a825491f56bf1b08d66fe7fed5c8e244231dd7f79005cbe0c852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:40Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.476209 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b9b2cb-6d1b-4ba9-8866-b38df3b492c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbde07abe8d8fcd4aa07c20eb6bc852eee70bf207f4ff2bab8ab1332261c92c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c2419d07ea971ebef8f942b93109e8d28ae31053fd3fc750d478868fa36ab9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:22:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 09:22:28.939310 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 09:22:28.941752 1 observer_polling.go:159] Starting file observer\\\\nI0319 09:22:29.005366 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 09:22:29.017046 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 09:22:58.844936 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 09:22:58.845102 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:58Z is 
after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be66eb2db774c7aed1b45b02b72e262c1d086c0c6ed9edd85d466aa0f15f1582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08173aab5836c52b21e70067b1208d91ef8bdd7e02fbb72cb533d41616176913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fb7c70aad1d13cf3432d05f17c527f900404c5f1a389c7174b400fab9f3fcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:40Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.492226 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a004978dc146f2dcf3ff7f2390b88ed277647d48ebae20fd5d68708536f6cfda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5218
923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:40Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.503133 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:40Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.518191 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d294b374529227ebcd9ef645c5bd51cb0d891d0740b734d56e0e4e39b4db531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:40Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.533697 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98298fa62b16ce931a7d5f7439859452ab2d93d0abbb027c312189e7546774e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:40Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.549647 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-19T09:24:40Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.562596 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:40Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.577174 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:40Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.592479 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3af08613af9e9ce832ae101fe6390a06b5d21e92c729b8a7a2f4312aca0399c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbabb5352299896d9918921311f30796fd6a0
1ee141e1cc9e172fac1dc3560e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:40Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.606985 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:40Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:40 crc 
kubenswrapper[4835]: I0319 09:24:40.631344 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:40Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.648284 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:40Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.671135 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:40Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.685396 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:40Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.746672 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:40Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.767057 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:40Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.778219 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:40Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.789640 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3af08613af9e9ce832ae101fe6390a06b5d21e92c729b8a7a2f4312aca0399c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbabb5352299896d9918921311f30796fd6a0
1ee141e1cc9e172fac1dc3560e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:40Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.800037 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:40Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:40 crc 
kubenswrapper[4835]: I0319 09:24:40.816467 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b9b2cb-6d1b-4ba9-8866-b38df3b492c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbde07abe8d8fcd4aa07c20eb6bc852eee70bf207f4ff2bab8ab1332261c92c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c2419d07ea971ebef8f942b93109e8d28ae31053fd3fc750d478868fa36ab9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:22:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 09:22:28.939310 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 09:22:28.941752 1 observer_polling.go:159] Starting file observer\\\\nI0319 09:22:29.005366 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 09:22:29.017046 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 09:22:58.844936 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 09:22:58.845102 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be66eb2db774c7aed1b45b02b72e262c1d086c0c6ed9edd85d466aa0f15f1582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08173aab5836c52b21e70067b1208d91ef8bdd7e02fbb72cb533d41616176913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fb7c70aad1d13cf3432d05f17c527f900404c5f1a389c7174b400fab9f3fcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:40Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.827104 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:40Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.847643 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d4a1069eb557a1baab69a634da9dfccca4f29814dcb122906e9d5a3a817696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T09:24:40Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.857433 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46acc963d37712862e26950ee87ff20397172ac456efa85d6fbd0332c4b0f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff1532a133b8cba8d5b03818133c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:40Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.867259 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:40Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.881820 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796b2dcca72564cdb668d41bb2a3de94b9a051b6d3cc6cff1abf790a290f2183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df0c2ec899dfefd835123bae49121907eb0ef122be26a92587f324e9cd8ada7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9700b6f756c49fd4c80814e7ec8b3968b3a1a4a060af3366cfbcde9afe056d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f876fbdfacfa3d71e2cb5481c3cc94d66376d60260a5ec90e6700979a76b0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7dc755b5798310eac36921fcd00df32b34ba4b6088b84297f5966f6a1bb2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8db88b60673fb888e49ff1dcb9416a59ad4524a2a763f1be911053ceca827fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb47d8e4d633dda032d64f233f777a7b4b5ef713c8be4c119294371d04a740c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb093569be5a825491f56bf1b08d66fe7fed5c8e244231dd7f79005cbe0c852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:40Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.893033 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d294b374529227ebcd9ef645c5bd51cb0d891d0740b734d56e0e4e39b4db531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-1
9T09:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:40Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.902416 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98298fa62b16ce931a7d5f7439859452ab2d93d0abbb027c312189e7546774e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:40Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.913327 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a004978dc146f2dcf3ff7f2390b88ed277647d48ebae20fd5d68708536f6cfda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d
6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-19T09:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:40Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:40 crc kubenswrapper[4835]: I0319 09:24:40.920890 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:40Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:41 crc kubenswrapper[4835]: I0319 09:24:41.401653 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:41 crc kubenswrapper[4835]: I0319 09:24:41.401705 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:41 crc kubenswrapper[4835]: E0319 09:24:41.401879 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:24:41 crc kubenswrapper[4835]: E0319 09:24:41.402014 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:24:41 crc kubenswrapper[4835]: E0319 09:24:41.630283 4835 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 09:24:42 crc kubenswrapper[4835]: I0319 09:24:42.401886 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:24:42 crc kubenswrapper[4835]: E0319 09:24:42.402063 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:24:42 crc kubenswrapper[4835]: I0319 09:24:42.402434 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:24:42 crc kubenswrapper[4835]: E0319 09:24:42.402620 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:24:43 crc kubenswrapper[4835]: I0319 09:24:43.394282 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qk6hn_2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b/ovnkube-controller/0.log" Mar 19 09:24:43 crc kubenswrapper[4835]: I0319 09:24:43.398915 4835 generic.go:334] "Generic (PLEG): container finished" podID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerID="cbb47d8e4d633dda032d64f233f777a7b4b5ef713c8be4c119294371d04a740c" exitCode=1 Mar 19 09:24:43 crc kubenswrapper[4835]: I0319 09:24:43.398968 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" event={"ID":"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b","Type":"ContainerDied","Data":"cbb47d8e4d633dda032d64f233f777a7b4b5ef713c8be4c119294371d04a740c"} Mar 19 09:24:43 crc kubenswrapper[4835]: I0319 09:24:43.400017 4835 scope.go:117] "RemoveContainer" containerID="cbb47d8e4d633dda032d64f233f777a7b4b5ef713c8be4c119294371d04a740c" Mar 19 09:24:43 crc kubenswrapper[4835]: I0319 09:24:43.400880 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:43 crc kubenswrapper[4835]: I0319 09:24:43.400927 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:43 crc kubenswrapper[4835]: E0319 09:24:43.404335 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:24:43 crc kubenswrapper[4835]: E0319 09:24:43.404638 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:24:43 crc kubenswrapper[4835]: I0319 09:24:43.422561 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:43Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:43 crc kubenswrapper[4835]: I0319 09:24:43.445240 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:43Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:43 crc kubenswrapper[4835]: I0319 09:24:43.466325 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:43Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:43 crc kubenswrapper[4835]: I0319 09:24:43.507643 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:43Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:43 crc kubenswrapper[4835]: I0319 09:24:43.533286 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:43Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:43 crc kubenswrapper[4835]: I0319 09:24:43.552588 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3af08613af9e9ce832ae101fe6390a06b5d21e92c729b8a7a2f4312aca0399c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbabb5352299896d9918921311f30796fd6a0
1ee141e1cc9e172fac1dc3560e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:43Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:43 crc kubenswrapper[4835]: I0319 09:24:43.572915 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:43Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:43 crc 
kubenswrapper[4835]: I0319 09:24:43.590540 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46acc963d37712862e26950ee87ff20397172ac456efa85d6fbd0332c4b0f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff1532a133b8cba8d5b03818133c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:43Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:43 crc kubenswrapper[4835]: I0319 09:24:43.610842 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:43Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:43 crc kubenswrapper[4835]: I0319 09:24:43.641612 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796b2dcca72564cdb668d41bb2a3de94b9a051b6d3cc6cff1abf790a290f2183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df0c2ec899dfefd835123bae49121907eb0ef122be26a92587f324e9cd8ada7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9700b6f756c49fd4c80814e7ec8b3968b3a1a4a060af3366cfbcde9afe056d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f876fbdfacfa3d71e2cb5481c3cc94d66376d60260a5ec90e6700979a76b0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7dc755b5798310eac36921fcd00df32b34ba4b6088b84297f5966f6a1bb2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8db88b60673fb888e49ff1dcb9416a59ad4524a2a763f1be911053ceca827fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb47d8e4d633dda032d64f233f777a7b4b5ef713c8be4c119294371d04a740c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbb47d8e4d633dda032d64f233f777a7b4b5ef713c8be4c119294371d04a740c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T09:24:42Z\\\",\\\"message\\\":\\\"8] Removed *v1.Namespace event handler 5\\\\nI0319 09:24:41.973484 6789 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0319 09:24:41.973501 6789 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) 
from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0319 09:24:41.973592 6789 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0319 09:24:41.974038 6789 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 09:24:41.974192 6789 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 09:24:41.974253 6789 handler.go:208] Removed *v1.Node event handler 7\\\\nI0319 09:24:41.974279 6789 handler.go:208] Removed *v1.Node event handler 2\\\\nI0319 09:24:41.974466 6789 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0319 09:24:41.974536 6789 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 09:24:41.975096 6789 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb093569be5a825491f56bf1b08d66fe7fed5c8e244231dd7f79005cbe0c852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:43Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:43 crc kubenswrapper[4835]: I0319 09:24:43.658326 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b9b2cb-6d1b-4ba9-8866-b38df3b492c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbde07abe8d8fcd4aa07c20eb6bc852eee70bf207f4ff2bab8ab1332261c92c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c2419d07ea971ebef8f942b93109e8d28ae31053fd3fc750d478868fa36ab9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:22:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0319 09:22:28.939310 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 09:22:28.941752 1 observer_polling.go:159] Starting file observer\\\\nI0319 09:22:29.005366 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 09:22:29.017046 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 09:22:58.844936 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 09:22:58.845102 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be66eb2db774c7aed1b45b02b72e262c1d086c0c6ed9edd85d466aa0f15f1582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08173aab5836c52b21e70067b1208d91ef8bdd7e02fbb72cb533d41616176913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fb7c70aad1d13cf3432d05f17c527f900404c5f1a389c7174b400fab9f3fcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:43Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:43 crc kubenswrapper[4835]: I0319 09:24:43.675224 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:43Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:43 crc kubenswrapper[4835]: I0319 09:24:43.695742 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d4a1069eb557a1baab69a634da9dfccca4f29814dcb122906e9d5a3a817696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T09:24:43Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:43 crc kubenswrapper[4835]: I0319 09:24:43.715250 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d294b374529227ebcd9ef645c5bd51cb0d891d0740b734d56e0e4e39b4db531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:43Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:43 crc kubenswrapper[4835]: I0319 09:24:43.730963 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98298fa62b16ce931a7d5f7439859452ab2d93d0abbb027c312189e7546774e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:43Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:43 crc kubenswrapper[4835]: I0319 09:24:43.752510 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a004978dc146f2dcf3ff7f2390b88ed277647d48ebae20fd5d68708536f6cfda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5218
923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:43Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:43 crc kubenswrapper[4835]: I0319 09:24:43.765374 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:43Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:44 crc kubenswrapper[4835]: I0319 09:24:44.400845 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:24:44 crc kubenswrapper[4835]: I0319 09:24:44.400890 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:24:44 crc kubenswrapper[4835]: E0319 09:24:44.400968 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:24:44 crc kubenswrapper[4835]: E0319 09:24:44.401119 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:24:44 crc kubenswrapper[4835]: I0319 09:24:44.405513 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qk6hn_2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b/ovnkube-controller/0.log" Mar 19 09:24:44 crc kubenswrapper[4835]: I0319 09:24:44.408783 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" event={"ID":"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b","Type":"ContainerStarted","Data":"1baf8b123f2a2238d327b6c94655d6432a075232fbb9bb456dbb7931cc219e4f"} Mar 19 09:24:44 crc kubenswrapper[4835]: I0319 09:24:44.409163 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:24:44 crc kubenswrapper[4835]: I0319 09:24:44.421219 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:44Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:44 crc kubenswrapper[4835]: I0319 09:24:44.437801 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796b2dcca72564cdb668d41bb2a3de94b9a051b6d3cc6cff1abf790a290f2183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df0c2ec899dfefd835123bae49121907eb0ef122be26a92587f324e9cd8ada7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9700b6f756c49fd4c80814e7ec8b3968b3a1a4a060af3366cfbcde9afe056d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f876fbdfacfa3d71e2cb5481c3cc94d66376d60260a5ec90e6700979a76b0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7dc755b5798310eac36921fcd00df32b34ba4b6088b84297f5966f6a1bb2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8db88b60673fb888e49ff1dcb9416a59ad4524a2a763f1be911053ceca827fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baf8b123f2a2238d327b6c94655d6432a075232fbb9bb456dbb7931cc219e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbb47d8e4d633dda032d64f233f777a7b4b5ef713c8be4c119294371d04a740c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T09:24:42Z\\\",\\\"message\\\":\\\"8] Removed *v1.Namespace event handler 5\\\\nI0319 09:24:41.973484 6789 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0319 09:24:41.973501 6789 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0319 09:24:41.973592 6789 
reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0319 09:24:41.974038 6789 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 09:24:41.974192 6789 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 09:24:41.974253 6789 handler.go:208] Removed *v1.Node event handler 7\\\\nI0319 09:24:41.974279 6789 handler.go:208] Removed *v1.Node event handler 2\\\\nI0319 09:24:41.974466 6789 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0319 09:24:41.974536 6789 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 09:24:41.975096 6789 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb093569be5a825491f56bf1b08d66fe7fed5c8e244231dd7f79005cbe0c852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:44Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:44 crc kubenswrapper[4835]: I0319 09:24:44.456005 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b9b2cb-6d1b-4ba9-8866-b38df3b492c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbde07abe8d8fcd4aa07c20eb6bc852eee70bf207f4ff2bab8ab1332261c92c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c2419d07ea971ebef8f942b93109e8d28ae31053fd3fc750d478868fa36ab9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:22:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0319 09:22:28.939310 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 09:22:28.941752 1 observer_polling.go:159] Starting file observer\\\\nI0319 09:22:29.005366 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 09:22:29.017046 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 09:22:58.844936 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 09:22:58.845102 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be66eb2db774c7aed1b45b02b72e262c1d086c0c6ed9edd85d466aa0f15f1582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08173aab5836c52b21e70067b1208d91ef8bdd7e02fbb72cb533d41616176913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fb7c70aad1d13cf3432d05f17c527f900404c5f1a389c7174b400fab9f3fcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:44Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:44 crc kubenswrapper[4835]: I0319 09:24:44.474621 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:44Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:44 crc kubenswrapper[4835]: I0319 09:24:44.488805 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d4a1069eb557a1baab69a634da9dfccca4f29814dcb122906e9d5a3a817696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T09:24:44Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:44 crc kubenswrapper[4835]: I0319 09:24:44.501569 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46acc963d37712862e26950ee87ff20397172ac456efa85d6fbd0332c4b0f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff1532a133b8cba8d5b03818133c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:44Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:44 crc kubenswrapper[4835]: I0319 09:24:44.515864 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d294b374529227ebcd9ef645c5bd51cb0d891d0740b734d56e0e4e39b4db531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:44Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:44 crc kubenswrapper[4835]: I0319 09:24:44.527244 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98298fa62b16ce931a7d5f7439859452ab2d93d0abbb027c312189e7546774e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:44Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:44 crc kubenswrapper[4835]: I0319 09:24:44.544009 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a004978dc146f2dcf3
ff7f2390b88ed277647d48ebae20fd5d68708536f6cfda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:44Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:44 crc kubenswrapper[4835]: I0319 09:24:44.555654 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:44Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:44 crc kubenswrapper[4835]: I0319 09:24:44.566216 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:44Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:44 crc kubenswrapper[4835]: I0319 09:24:44.578868 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:44Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:44 crc kubenswrapper[4835]: I0319 09:24:44.592845 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:44Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:44 crc kubenswrapper[4835]: I0319 09:24:44.611046 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:44Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:44 crc kubenswrapper[4835]: I0319 09:24:44.624155 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:44Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:44 crc kubenswrapper[4835]: I0319 09:24:44.634910 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3af08613af9e9ce832ae101fe6390a06b5d21e92c729b8a7a2f4312aca0399c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbabb5352299896d9918921311f30796fd6a0
1ee141e1cc9e172fac1dc3560e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:44Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:44 crc kubenswrapper[4835]: I0319 09:24:44.644930 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:44Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:45 crc 
kubenswrapper[4835]: I0319 09:24:45.401695 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:45 crc kubenswrapper[4835]: E0319 09:24:45.401889 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.401955 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:45 crc kubenswrapper[4835]: E0319 09:24:45.402396 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.416302 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qk6hn_2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b/ovnkube-controller/1.log" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.418891 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qk6hn_2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b/ovnkube-controller/0.log" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.424233 4835 generic.go:334] "Generic (PLEG): container finished" podID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerID="1baf8b123f2a2238d327b6c94655d6432a075232fbb9bb456dbb7931cc219e4f" exitCode=1 Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.424355 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" event={"ID":"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b","Type":"ContainerDied","Data":"1baf8b123f2a2238d327b6c94655d6432a075232fbb9bb456dbb7931cc219e4f"} Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.424540 4835 scope.go:117] "RemoveContainer" containerID="cbb47d8e4d633dda032d64f233f777a7b4b5ef713c8be4c119294371d04a740c" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.425342 4835 scope.go:117] "RemoveContainer" containerID="1baf8b123f2a2238d327b6c94655d6432a075232fbb9bb456dbb7931cc219e4f" Mar 19 09:24:45 crc kubenswrapper[4835]: E0319 09:24:45.425570 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-qk6hn_openshift-ovn-kubernetes(2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" Mar 19 09:24:45 crc 
kubenswrapper[4835]: I0319 09:24:45.426817 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.452551 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:45Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.473582 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:45Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.488006 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:45Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.504496 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:45Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:45 crc 
kubenswrapper[4835]: I0319 09:24:45.529652 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:45Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.553513 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:45Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.569799 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3af08613af9e9ce832ae101fe6390a06b5d21e92c729b8a7a2f4312aca0399c5\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbabb5352299896d9918921311f30796fd6a01ee141e1cc9e172fac1dc3560e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:45Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.584281 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d4a1069eb557a1baab69a634da9dfccca4f29814dcb122906e9d5a3a817696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptable
s-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:45Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.596805 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46acc963d37712862
e26950ee87ff20397172ac456efa85d6fbd0332c4b0f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff1532a133b8cba8d5b03818133c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTim
e\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:45Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.612420 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.
126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:45Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.636665 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796b2dcca72564cdb668d41bb2a3de94b9a051b6d3cc6cff1abf790a290f2183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df0c2ec899dfefd835123bae49121907eb0ef122be26a92587f324e9cd8ada7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9700b6f756c49fd4c80814e7ec8b3968b3a1a4a060af3366cfbcde9afe056d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f876fbdfacfa3d71e2cb5481c3cc94d66376d60260a5ec90e6700979a76b0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7dc755b5798310eac36921fcd00df32b34ba4b6088b84297f5966f6a1bb2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8db88b60673fb888e49ff1dcb9416a59ad4524a2a763f1be911053ceca827fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baf8b123f2a2238d327b6c94655d6432a075232fbb9bb456dbb7931cc219e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbb47d8e4d633dda032d64f233f777a7b4b5ef713c8be4c119294371d04a740c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T09:24:42Z\\\",\\\"message\\\":\\\"8] Removed *v1.Namespace event handler 5\\\\nI0319 09:24:41.973484 6789 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0319 09:24:41.973501 6789 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0319 09:24:41.973592 6789 
reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0319 09:24:41.974038 6789 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 09:24:41.974192 6789 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 09:24:41.974253 6789 handler.go:208] Removed *v1.Node event handler 7\\\\nI0319 09:24:41.974279 6789 handler.go:208] Removed *v1.Node event handler 2\\\\nI0319 09:24:41.974466 6789 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0319 09:24:41.974536 6789 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 09:24:41.975096 6789 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1baf8b123f2a2238d327b6c94655d6432a075232fbb9bb456dbb7931cc219e4f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T09:24:44Z\\\",\\\"message\\\":\\\"node-qk6hn\\\\nI0319 09:24:44.528470 6951 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0319 09:24:44.528450 6951 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-vs6hx] creating logical port openshift-multus_network-metrics-daemon-vs6hx for pod on switch crc\\\\nI0319 09:24:44.528477 6951 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF0319 
09:24:44.528318 6951 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:44Z is a\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\
\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb093569be5a825491f56bf1b08d66fe7fed5c8e244231dd7f79005cbe0c852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4
8r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:45Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.655873 4835 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b9b2cb-6d1b-4ba9-8866-b38df3b492c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbde07abe8d8fcd4aa07c20eb6bc852eee70bf207f4ff2bab8ab1332261c92c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c2419d07ea971ebef8f942b93109e8d28ae31053fd3fc750d478868fa36ab9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:22:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml 
--kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 09:22:28.939310 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 09:22:28.941752 1 observer_polling.go:159] Starting file observer\\\\nI0319 09:22:29.005366 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 09:22:29.017046 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 09:22:58.844936 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 09:22:58.845102 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be66eb2db774c7aed1b45b02b72e262c1d086c0c6ed9edd85d466aa0f15f1582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08173aab5836c52b21e70067b1208d91ef8bdd7e02fbb72cb533d41616176913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fb7c70aad1d13cf3432d05f17c527f900404c5f1a389c7174b400fab9f3fcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:45Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.673691 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:45Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.690453 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:45Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.712605 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d294b374529227ebcd9ef645c5bd51cb0d891d0740b734d56e0e4e39b4db531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-19T09:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:45Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.728512 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98298fa62b16ce931a7d5f7439859452ab2d93d0abbb027c312189e7546774e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:45Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.748060 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a004978dc146f2dcf3ff7f2390b88ed277647d48ebae20fd5d68708536f6cfda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d
6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-19T09:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:45Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.808485 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.808539 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.808556 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.808577 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.808592 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:45Z","lastTransitionTime":"2026-03-19T09:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:45 crc kubenswrapper[4835]: E0319 09:24:45.828856 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:45Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.833043 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.833112 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.833130 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.833157 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.833175 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:45Z","lastTransitionTime":"2026-03-19T09:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:45 crc kubenswrapper[4835]: E0319 09:24:45.853710 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:45Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.858444 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.858481 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.858493 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.858511 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.858523 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:45Z","lastTransitionTime":"2026-03-19T09:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:45 crc kubenswrapper[4835]: E0319 09:24:45.871558 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:45Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.875421 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.875465 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.875476 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.875495 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.875511 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:45Z","lastTransitionTime":"2026-03-19T09:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:45 crc kubenswrapper[4835]: E0319 09:24:45.889386 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:45Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.892628 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.892666 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.892680 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.892700 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:45 crc kubenswrapper[4835]: I0319 09:24:45.892715 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:45Z","lastTransitionTime":"2026-03-19T09:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:45 crc kubenswrapper[4835]: E0319 09:24:45.907470 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:45Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:45 crc kubenswrapper[4835]: E0319 09:24:45.907628 4835 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 09:24:46 crc kubenswrapper[4835]: I0319 09:24:46.401467 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:24:46 crc kubenswrapper[4835]: I0319 09:24:46.402300 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:24:46 crc kubenswrapper[4835]: E0319 09:24:46.402434 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:24:46 crc kubenswrapper[4835]: E0319 09:24:46.403101 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:24:46 crc kubenswrapper[4835]: I0319 09:24:46.421020 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7faeeaa2-30ca-4d00-a512-fe9b5678b9e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f10145a7c5746baaa9a782ce65b84c18ba6ce86c26d70a8d8a20334124e2f2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f343c367c83726e1b54c7bb5e38c87226a39801dfb5d589b7eb9f5e5ac8c8162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f343c367c83726e1b54c7bb5e38c87226a39801dfb5d589b7eb9f5e5ac8c8162\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:46 crc kubenswrapper[4835]: I0319 09:24:46.421247 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 19 09:24:46 crc kubenswrapper[4835]: I0319 09:24:46.429896 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qk6hn_2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b/ovnkube-controller/1.log" Mar 19 09:24:46 crc 
kubenswrapper[4835]: I0319 09:24:46.434691 4835 scope.go:117] "RemoveContainer" containerID="1baf8b123f2a2238d327b6c94655d6432a075232fbb9bb456dbb7931cc219e4f" Mar 19 09:24:46 crc kubenswrapper[4835]: E0319 09:24:46.435027 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-qk6hn_openshift-ovn-kubernetes(2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" Mar 19 09:24:46 crc kubenswrapper[4835]: I0319 09:24:46.442467 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b9b2cb-6d1b-4ba9-8866-b38df3b492c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbde07abe8d8fcd4aa07c20eb6bc852eee70bf207f4ff2bab8ab1332261c92c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c2419d07ea971ebef8f942b93109e8d28ae31053fd3fc750d478868fa36ab9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:22:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 09:22:28.939310 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0319 09:22:28.941752 1 observer_polling.go:159] Starting file observer\\\\nI0319 09:22:29.005366 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 09:22:29.017046 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 09:22:58.844936 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 09:22:58.845102 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be66eb2db774c7aed1b45b02b72e262c1d086c0c6ed9edd85d466aa0f15f1582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08173aab5836c52b21e70067b1208d91ef8bdd7e02fbb72cb533d41616176913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fb7c70aad1d13cf3432d05f17c527f900404c5f1a389c7174b400fab9f3fcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:46 crc kubenswrapper[4835]: I0319 09:24:46.462895 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:46 crc kubenswrapper[4835]: I0319 09:24:46.481928 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d4a1069eb557a1baab69a634da9dfccca4f29814dcb122906e9d5a3a817696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T09:24:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:46 crc kubenswrapper[4835]: I0319 09:24:46.497782 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46acc963d37712862e26950ee87ff20397172ac456efa85d6fbd0332c4b0f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff1532a133b8cba8d5b03818133c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:46 crc kubenswrapper[4835]: I0319 09:24:46.517647 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:46 crc kubenswrapper[4835]: I0319 09:24:46.540085 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796b2dcca72564cdb668d41bb2a3de94b9a051b6d3cc6cff1abf790a290f2183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df0c2ec899dfefd835123bae49121907eb0ef122be26a92587f324e9cd8ada7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9700b6f756c49fd4c80814e7ec8b3968b3a1a4a060af3366cfbcde9afe056d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f876fbdfacfa3d71e2cb5481c3cc94d66376d60260a5ec90e6700979a76b0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7dc755b5798310eac36921fcd00df32b34ba4b6088b84297f5966f6a1bb2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8db88b60673fb888e49ff1dcb9416a59ad4524a2a763f1be911053ceca827fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baf8b123f2a2238d327b6c94655d6432a075232fbb9bb456dbb7931cc219e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbb47d8e4d633dda032d64f233f777a7b4b5ef713c8be4c119294371d04a740c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T09:24:42Z\\\",\\\"message\\\":\\\"8] Removed *v1.Namespace event handler 5\\\\nI0319 09:24:41.973484 6789 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0319 09:24:41.973501 6789 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0319 09:24:41.973592 6789 
reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0319 09:24:41.974038 6789 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 09:24:41.974192 6789 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 09:24:41.974253 6789 handler.go:208] Removed *v1.Node event handler 7\\\\nI0319 09:24:41.974279 6789 handler.go:208] Removed *v1.Node event handler 2\\\\nI0319 09:24:41.974466 6789 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0319 09:24:41.974536 6789 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 09:24:41.975096 6789 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1baf8b123f2a2238d327b6c94655d6432a075232fbb9bb456dbb7931cc219e4f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T09:24:44Z\\\",\\\"message\\\":\\\"node-qk6hn\\\\nI0319 09:24:44.528470 6951 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0319 09:24:44.528450 6951 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-vs6hx] creating logical port openshift-multus_network-metrics-daemon-vs6hx for pod on switch crc\\\\nI0319 09:24:44.528477 6951 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF0319 
09:24:44.528318 6951 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:44Z is a\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\
\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb093569be5a825491f56bf1b08d66fe7fed5c8e244231dd7f79005cbe0c852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4
8r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:46 crc kubenswrapper[4835]: I0319 09:24:46.561586 4835 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d294b374529227ebcd9ef645c5bd51cb0d891d0740b734d56e0e4e39b4db531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:46 crc kubenswrapper[4835]: I0319 09:24:46.575912 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98298fa62b16ce931a7d5f7439859452ab2d93d0abbb027c312189e7546774e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:46 crc kubenswrapper[4835]: I0319 09:24:46.600770 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a004978dc146f2dcf3ff7f2390b88ed277647d48ebae20fd5d68708536f6cfda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5218
923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:46 crc kubenswrapper[4835]: I0319 09:24:46.616003 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:46 crc kubenswrapper[4835]: E0319 09:24:46.631205 4835 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 19 09:24:46 crc kubenswrapper[4835]: I0319 09:24:46.633904 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:46 crc kubenswrapper[4835]: I0319 09:24:46.649421 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:46 crc kubenswrapper[4835]: I0319 09:24:46.663404 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:46 crc kubenswrapper[4835]: I0319 09:24:46.685302 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:46 crc kubenswrapper[4835]: I0319 09:24:46.704524 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:46 crc kubenswrapper[4835]: I0319 09:24:46.715381 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3af08613af9e9ce832ae101fe6390a06b5d21e92c729b8a7a2f4312aca0399c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbabb5352299896d9918921311f30796fd6a0
1ee141e1cc9e172fac1dc3560e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:46 crc kubenswrapper[4835]: I0319 09:24:46.723187 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:46 crc 
kubenswrapper[4835]: I0319 09:24:46.731574 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98298fa62b16ce931a7d5f7439859452ab2d93d0abbb027c312189e7546774e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:46 crc kubenswrapper[4835]: I0319 09:24:46.743906 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a004978dc146f2dcf3ff7f2390b88ed277647d48ebae20fd5d68708536f6cfda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808
b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3
f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:46 crc kubenswrapper[4835]: I0319 09:24:46.765082 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:46 crc kubenswrapper[4835]: I0319 09:24:46.802039 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d294b374529227ebcd9ef645c5bd51cb0d891d0740b734d56e0e4e39b4db531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-19T09:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:46 crc kubenswrapper[4835]: I0319 09:24:46.873199 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:46 crc kubenswrapper[4835]: I0319 09:24:46.884928 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:46 crc kubenswrapper[4835]: I0319 09:24:46.895108 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:46 crc kubenswrapper[4835]: I0319 09:24:46.904837 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e488e8-eec3-45d8-b590-6ca04d9753c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d768283c8f73e9c949813ba1a7bca13dcc619b413a1e21ec6328fb3e34be8c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab8198f822fe1a49d71f653e46afde1be6de32846204199d14a1211c44bfe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a71640d37ede9ec29825c63e78597c050f401286f28ea8a65a9bce2648dfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bab4d0b73b0c41b76ac6a88912497076d2461a2459812fad2cff346465ecc601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://bab4d0b73b0c41b76ac6a88912497076d2461a2459812fad2cff346465ecc601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:46 crc kubenswrapper[4835]: I0319 09:24:46.917398 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd
04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:46 crc kubenswrapper[4835]: I0319 09:24:46.926300 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3af08613af9e9ce832ae101fe6390a06b5d21e92c729b8a7a2f4312aca0399c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbabb5352299896d9918921311f30796fd6a0
1ee141e1cc9e172fac1dc3560e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:46 crc kubenswrapper[4835]: I0319 09:24:46.934656 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:46 crc 
kubenswrapper[4835]: I0319 09:24:46.950689 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:46 crc kubenswrapper[4835]: I0319 09:24:46.962643 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b9b2cb-6d1b-4ba9-8866-b38df3b492c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbde07abe8d8fcd4aa07c20eb6bc852eee70bf207f4ff2bab8ab1332261c92c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c2419d07ea971ebef8f942b93109e8d28ae31053fd3fc750d478868fa36ab9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:22:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0319 09:22:28.939310 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 09:22:28.941752 1 observer_polling.go:159] Starting file observer\\\\nI0319 09:22:29.005366 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 09:22:29.017046 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 09:22:58.844936 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 09:22:58.845102 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be66eb2db774c7aed1b45b02b72e262c1d086c0c6ed9edd85d466aa0f15f1582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08173aab5836c52b21e70067b1208d91ef8bdd7e02fbb72cb533d41616176913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fb7c70aad1d13cf3432d05f17c527f900404c5f1a389c7174b400fab9f3fcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:46 crc kubenswrapper[4835]: I0319 09:24:46.979284 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:46 crc kubenswrapper[4835]: I0319 09:24:46.991183 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d4a1069eb557a1baab69a634da9dfccca4f29814dcb122906e9d5a3a817696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T09:24:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:47 crc kubenswrapper[4835]: I0319 09:24:47.000992 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46acc963d37712862e26950ee87ff20397172ac456efa85d6fbd0332c4b0f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff1532a133b8cba8d5b03818133c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:47 crc kubenswrapper[4835]: I0319 09:24:47.011700 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:47Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:47 crc kubenswrapper[4835]: I0319 09:24:47.028222 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796b2dcca72564cdb668d41bb2a3de94b9a051b6d3cc6cff1abf790a290f2183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df0c2ec899dfefd835123bae49121907eb0ef122be26a92587f324e9cd8ada7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9700b6f756c49fd4c80814e7ec8b3968b3a1a4a060af3366cfbcde9afe056d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f876fbdfacfa3d71e2cb5481c3cc94d66376d60260a5ec90e6700979a76b0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7dc755b5798310eac36921fcd00df32b34ba4b6088b84297f5966f6a1bb2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8db88b60673fb888e49ff1dcb9416a59ad4524a2a763f1be911053ceca827fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baf8b123f2a2238d327b6c94655d6432a075232fbb9bb456dbb7931cc219e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1baf8b123f2a2238d327b6c94655d6432a075232fbb9bb456dbb7931cc219e4f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T09:24:44Z\\\",\\\"message\\\":\\\"node-qk6hn\\\\nI0319 09:24:44.528470 6951 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0319 09:24:44.528450 6951 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-vs6hx] creating logical port 
openshift-multus_network-metrics-daemon-vs6hx for pod on switch crc\\\\nI0319 09:24:44.528477 6951 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF0319 09:24:44.528318 6951 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:44Z is a\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qk6hn_openshift-ovn-kubernetes(2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb093569be5a825491f56bf1b08d66fe7fed5c8e244231dd7f79005cbe0c852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd2
83c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:47Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:47 crc kubenswrapper[4835]: I0319 09:24:47.036149 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7faeeaa2-30ca-4d00-a512-fe9b5678b9e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f10145a7c5746baaa9a782ce65b84c18ba6ce86c26d70a8d8a20334124e2f2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f343c367c83726e1b54c7bb5e38c87226a39801dfb5d589b7eb9f5e5ac8c8162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f343c367c83726e1b54c7bb5e38c87226a39801dfb5d589b7eb9f5e5ac8c8162\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:47Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:47 crc kubenswrapper[4835]: I0319 09:24:47.401017 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:47 crc kubenswrapper[4835]: I0319 09:24:47.401086 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:47 crc kubenswrapper[4835]: E0319 09:24:47.401260 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:24:47 crc kubenswrapper[4835]: E0319 09:24:47.401393 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:24:48 crc kubenswrapper[4835]: I0319 09:24:48.401102 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:24:48 crc kubenswrapper[4835]: I0319 09:24:48.401185 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:24:48 crc kubenswrapper[4835]: E0319 09:24:48.401340 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:24:48 crc kubenswrapper[4835]: E0319 09:24:48.401492 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:24:49 crc kubenswrapper[4835]: I0319 09:24:49.400928 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:49 crc kubenswrapper[4835]: I0319 09:24:49.401024 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:49 crc kubenswrapper[4835]: E0319 09:24:49.401166 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:24:49 crc kubenswrapper[4835]: E0319 09:24:49.401224 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:24:50 crc kubenswrapper[4835]: I0319 09:24:50.401446 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:24:50 crc kubenswrapper[4835]: E0319 09:24:50.401595 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:24:50 crc kubenswrapper[4835]: I0319 09:24:50.401952 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:24:50 crc kubenswrapper[4835]: E0319 09:24:50.402015 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:24:50 crc kubenswrapper[4835]: I0319 09:24:50.726179 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7f0101ce-52a3-4e5b-8fcd-c19020fb071a-metrics-certs\") pod \"network-metrics-daemon-vs6hx\" (UID: \"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\") " pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:24:50 crc kubenswrapper[4835]: E0319 09:24:50.726364 4835 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 09:24:50 crc kubenswrapper[4835]: E0319 09:24:50.726441 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f0101ce-52a3-4e5b-8fcd-c19020fb071a-metrics-certs podName:7f0101ce-52a3-4e5b-8fcd-c19020fb071a nodeName:}" failed. No retries permitted until 2026-03-19 09:25:22.726418247 +0000 UTC m=+177.575016844 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7f0101ce-52a3-4e5b-8fcd-c19020fb071a-metrics-certs") pod "network-metrics-daemon-vs6hx" (UID: "7f0101ce-52a3-4e5b-8fcd-c19020fb071a") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 09:24:51 crc kubenswrapper[4835]: I0319 09:24:51.401327 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:51 crc kubenswrapper[4835]: I0319 09:24:51.401327 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:51 crc kubenswrapper[4835]: E0319 09:24:51.401552 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:24:51 crc kubenswrapper[4835]: E0319 09:24:51.401653 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:24:51 crc kubenswrapper[4835]: E0319 09:24:51.632222 4835 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 09:24:52 crc kubenswrapper[4835]: I0319 09:24:52.401136 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:24:52 crc kubenswrapper[4835]: I0319 09:24:52.401205 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:24:52 crc kubenswrapper[4835]: E0319 09:24:52.401329 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:24:52 crc kubenswrapper[4835]: E0319 09:24:52.401479 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:24:53 crc kubenswrapper[4835]: I0319 09:24:53.401889 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:53 crc kubenswrapper[4835]: I0319 09:24:53.401983 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:53 crc kubenswrapper[4835]: E0319 09:24:53.402080 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:24:53 crc kubenswrapper[4835]: E0319 09:24:53.402156 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:24:54 crc kubenswrapper[4835]: I0319 09:24:54.401779 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:24:54 crc kubenswrapper[4835]: I0319 09:24:54.401840 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:24:54 crc kubenswrapper[4835]: E0319 09:24:54.401964 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:24:54 crc kubenswrapper[4835]: E0319 09:24:54.402104 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:24:55 crc kubenswrapper[4835]: I0319 09:24:55.400896 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:55 crc kubenswrapper[4835]: I0319 09:24:55.401000 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:55 crc kubenswrapper[4835]: E0319 09:24:55.401079 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:24:55 crc kubenswrapper[4835]: E0319 09:24:55.401170 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:24:56 crc kubenswrapper[4835]: I0319 09:24:56.292431 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:56 crc kubenswrapper[4835]: I0319 09:24:56.292507 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:56 crc kubenswrapper[4835]: I0319 09:24:56.292525 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:56 crc kubenswrapper[4835]: I0319 09:24:56.292550 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:56 crc kubenswrapper[4835]: I0319 09:24:56.292568 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:56Z","lastTransitionTime":"2026-03-19T09:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:56 crc kubenswrapper[4835]: E0319 09:24:56.313313 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:56Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:56 crc kubenswrapper[4835]: I0319 09:24:56.318275 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:56 crc kubenswrapper[4835]: I0319 09:24:56.318340 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:56 crc kubenswrapper[4835]: I0319 09:24:56.318362 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:56 crc kubenswrapper[4835]: I0319 09:24:56.318388 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:56 crc kubenswrapper[4835]: I0319 09:24:56.318407 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:56Z","lastTransitionTime":"2026-03-19T09:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:24:56 crc kubenswrapper[4835]: E0319 09:24:56.389110 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:56Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:56 crc kubenswrapper[4835]: I0319 09:24:56.394399 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:24:56 crc kubenswrapper[4835]: I0319 09:24:56.394456 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:24:56 crc kubenswrapper[4835]: I0319 09:24:56.394474 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:24:56 crc kubenswrapper[4835]: I0319 09:24:56.394498 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:24:56 crc kubenswrapper[4835]: I0319 09:24:56.394514 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:24:56Z","lastTransitionTime":"2026-03-19T09:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:24:56 crc kubenswrapper[4835]: I0319 09:24:56.401264 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:24:56 crc kubenswrapper[4835]: I0319 09:24:56.401290 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:24:56 crc kubenswrapper[4835]: E0319 09:24:56.401463 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:24:56 crc kubenswrapper[4835]: E0319 09:24:56.401621 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:24:56 crc kubenswrapper[4835]: I0319 09:24:56.418807 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98298fa62b16ce931a7d5f7439859452ab2d93d0abbb027c312189e7546774e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:56Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:56 crc kubenswrapper[4835]: E0319 09:24:56.418578 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:24:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:56Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:56 crc kubenswrapper[4835]: E0319 09:24:56.419057 4835 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 09:24:56 crc kubenswrapper[4835]: I0319 09:24:56.443375 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a004978dc146f2dcf3ff7f2390b88ed277647d48ebae20fd5d68708536f6cfda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5218
923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:56Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:56 crc kubenswrapper[4835]: I0319 09:24:56.459577 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:56Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:56 crc kubenswrapper[4835]: I0319 09:24:56.480391 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d294b374529227ebcd9ef645c5bd51cb0d891d0740b734d56e0e4e39b4db531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:56Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:56 crc kubenswrapper[4835]: I0319 09:24:56.499687 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:56Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:56 crc kubenswrapper[4835]: I0319 09:24:56.519371 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:56Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:56 crc kubenswrapper[4835]: I0319 09:24:56.540434 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:56Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:56 crc kubenswrapper[4835]: I0319 09:24:56.560791 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e488e8-eec3-45d8-b590-6ca04d9753c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d768283c8f73e9c949813ba1a7bca13dcc619b413a1e21ec6328fb3e34be8c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab8198f822fe1a49d71f653e46afde1be6de32846204199d14a1211c44bfe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a71640d37ede9ec29825c63e78597c050f401286f28ea8a65a9bce2648dfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bab4d0b73b0c41b76ac6a88912497076d2461a2459812fad2cff346465ecc601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://bab4d0b73b0c41b76ac6a88912497076d2461a2459812fad2cff346465ecc601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:56Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:56 crc kubenswrapper[4835]: I0319 09:24:56.581573 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd
04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:56Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:56 crc kubenswrapper[4835]: I0319 09:24:56.598437 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3af08613af9e9ce832ae101fe6390a06b5d21e92c729b8a7a2f4312aca0399c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbabb5352299896d9918921311f30796fd6a0
1ee141e1cc9e172fac1dc3560e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:56Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:56 crc kubenswrapper[4835]: I0319 09:24:56.614167 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:56Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:56 crc 
kubenswrapper[4835]: E0319 09:24:56.632859 4835 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 09:24:56 crc kubenswrapper[4835]: I0319 09:24:56.651465 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83a
be8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,
\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:56Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:56 crc kubenswrapper[4835]: I0319 09:24:56.667515 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b9b2cb-6d1b-4ba9-8866-b38df3b492c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbde07abe8d8fcd4aa07c20eb6bc852eee70bf207f4ff2bab8ab1332261c92c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c2419d07ea971ebef8f942b93109e8d28ae31053fd3fc750d478868fa36ab9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:22:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0319 09:22:28.939310 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 09:22:28.941752 1 observer_polling.go:159] Starting file observer\\\\nI0319 09:22:29.005366 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 09:22:29.017046 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 09:22:58.844936 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 09:22:58.845102 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be66eb2db774c7aed1b45b02b72e262c1d086c0c6ed9edd85d466aa0f15f1582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08173aab5836c52b21e70067b1208d91ef8bdd7e02fbb72cb533d41616176913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fb7c70aad1d13cf3432d05f17c527f900404c5f1a389c7174b400fab9f3fcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:56Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:56 crc kubenswrapper[4835]: I0319 09:24:56.682088 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:56Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:56 crc kubenswrapper[4835]: I0319 09:24:56.701373 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d4a1069eb557a1baab69a634da9dfccca4f29814dcb122906e9d5a3a817696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T09:24:56Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:56 crc kubenswrapper[4835]: I0319 09:24:56.714292 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46acc963d37712862e26950ee87ff20397172ac456efa85d6fbd0332c4b0f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff1532a133b8cba8d5b03818133c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:56Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:56 crc kubenswrapper[4835]: I0319 09:24:56.732736 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:56Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:56 crc kubenswrapper[4835]: I0319 09:24:56.756826 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796b2dcca72564cdb668d41bb2a3de94b9a051b6d3cc6cff1abf790a290f2183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df0c2ec899dfefd835123bae49121907eb0ef122be26a92587f324e9cd8ada7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9700b6f756c49fd4c80814e7ec8b3968b3a1a4a060af3366cfbcde9afe056d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f876fbdfacfa3d71e2cb5481c3cc94d66376d60260a5ec90e6700979a76b0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7dc755b5798310eac36921fcd00df32b34ba4b6088b84297f5966f6a1bb2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8db88b60673fb888e49ff1dcb9416a59ad4524a2a763f1be911053ceca827fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baf8b123f2a2238d327b6c94655d6432a075232fbb9bb456dbb7931cc219e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1baf8b123f2a2238d327b6c94655d6432a075232fbb9bb456dbb7931cc219e4f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T09:24:44Z\\\",\\\"message\\\":\\\"node-qk6hn\\\\nI0319 09:24:44.528470 6951 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0319 09:24:44.528450 6951 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-vs6hx] creating logical port 
openshift-multus_network-metrics-daemon-vs6hx for pod on switch crc\\\\nI0319 09:24:44.528477 6951 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF0319 09:24:44.528318 6951 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:44Z is a\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qk6hn_openshift-ovn-kubernetes(2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb093569be5a825491f56bf1b08d66fe7fed5c8e244231dd7f79005cbe0c852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd2
83c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:56Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:56 crc kubenswrapper[4835]: I0319 09:24:56.768360 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7faeeaa2-30ca-4d00-a512-fe9b5678b9e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f10145a7c5746baaa9a782ce65b84c18ba6ce86c26d70a8d8a20334124e2f2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f343c367c83726e1b54c7bb5e38c87226a39801dfb5d589b7eb9f5e5ac8c8162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f343c367c83726e1b54c7bb5e38c87226a39801dfb5d589b7eb9f5e5ac8c8162\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:56Z is after 2025-08-24T17:21:41Z" Mar 19 09:24:57 crc kubenswrapper[4835]: I0319 09:24:57.401440 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:57 crc kubenswrapper[4835]: I0319 09:24:57.401587 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:57 crc kubenswrapper[4835]: E0319 09:24:57.401668 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:24:57 crc kubenswrapper[4835]: E0319 09:24:57.401789 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:24:58 crc kubenswrapper[4835]: I0319 09:24:58.401994 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:24:58 crc kubenswrapper[4835]: I0319 09:24:58.402077 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:24:58 crc kubenswrapper[4835]: E0319 09:24:58.402222 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:24:58 crc kubenswrapper[4835]: E0319 09:24:58.402356 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:24:59 crc kubenswrapper[4835]: I0319 09:24:59.401554 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:24:59 crc kubenswrapper[4835]: E0319 09:24:59.401681 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:24:59 crc kubenswrapper[4835]: I0319 09:24:59.401555 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:24:59 crc kubenswrapper[4835]: E0319 09:24:59.402185 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:24:59 crc kubenswrapper[4835]: I0319 09:24:59.402799 4835 scope.go:117] "RemoveContainer" containerID="1baf8b123f2a2238d327b6c94655d6432a075232fbb9bb456dbb7931cc219e4f" Mar 19 09:25:00 crc kubenswrapper[4835]: I0319 09:25:00.401843 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:25:00 crc kubenswrapper[4835]: E0319 09:25:00.402181 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:25:00 crc kubenswrapper[4835]: I0319 09:25:00.401854 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:25:00 crc kubenswrapper[4835]: E0319 09:25:00.402260 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:25:00 crc kubenswrapper[4835]: I0319 09:25:00.492956 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qk6hn_2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b/ovnkube-controller/2.log" Mar 19 09:25:00 crc kubenswrapper[4835]: I0319 09:25:00.495068 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qk6hn_2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b/ovnkube-controller/1.log" Mar 19 09:25:00 crc kubenswrapper[4835]: I0319 09:25:00.499621 4835 generic.go:334] "Generic (PLEG): container finished" podID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerID="2ed88994871ac881b33094d78fabb7c871049cfa06f98945136b252eea687b18" exitCode=1 Mar 19 09:25:00 crc kubenswrapper[4835]: I0319 09:25:00.499675 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" event={"ID":"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b","Type":"ContainerDied","Data":"2ed88994871ac881b33094d78fabb7c871049cfa06f98945136b252eea687b18"} Mar 19 09:25:00 crc kubenswrapper[4835]: I0319 09:25:00.499723 4835 scope.go:117] "RemoveContainer" containerID="1baf8b123f2a2238d327b6c94655d6432a075232fbb9bb456dbb7931cc219e4f" Mar 19 09:25:00 crc kubenswrapper[4835]: I0319 09:25:00.501135 4835 scope.go:117] "RemoveContainer" containerID="2ed88994871ac881b33094d78fabb7c871049cfa06f98945136b252eea687b18" Mar 19 09:25:00 crc kubenswrapper[4835]: E0319 09:25:00.501465 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-qk6hn_openshift-ovn-kubernetes(2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" Mar 19 09:25:00 crc kubenswrapper[4835]: 
I0319 09:25:00.537804 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796b2dcca72564cdb668d41bb2a3de94b9a051b6d3cc6cff1abf790a290f2183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df0c2ec899dfefd835123bae49121907eb0ef122be26a92587f324e9cd8ada7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9700b6f756c49fd4c80814e7ec8b3968b3a1a4a060af3366cfbcde9afe056d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f876fbdfacfa3d71e2cb5481c3cc94d66376d60260a5ec90e6700979a76b0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7dc755b5798310eac36921fcd00df32b34ba4b6088b84297f5966f6a1bb2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8db88b60673fb888e49ff1dcb9416a59ad4524a2a763f1be911053ceca827fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed88994871ac881b33094d78fabb7c871049cfa06f98945136b252eea687b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1baf8b123f2a2238d327b6c94655d6432a075232fbb9bb456dbb7931cc219e4f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T09:24:44Z\\\",\\\"message\\\":\\\"node-qk6hn\\\\nI0319 09:24:44.528470 6951 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0319 09:24:44.528450 6951 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-vs6hx] creating logical port 
openshift-multus_network-metrics-daemon-vs6hx for pod on switch crc\\\\nI0319 09:24:44.528477 6951 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF0319 09:24:44.528318 6951 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:44Z is a\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed88994871ac881b33094d78fabb7c871049cfa06f98945136b252eea687b18\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T09:25:00Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 09:25:00.377585 7149 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0319 09:25:00.377602 7149 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0319 09:25:00.377641 7149 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0319 09:25:00.377666 7149 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0319 09:25:00.377712 7149 handler.go:190] Sending *v1.Namespace event 
handler 1 for removal\\\\nI0319 09:25:00.377719 7149 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0319 09:25:00.377770 7149 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0319 09:25:00.377791 7149 factory.go:656] Stopping watch factory\\\\nI0319 09:25:00.377814 7149 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0319 09:25:00.377827 7149 handler.go:208] Removed *v1.Node event handler 2\\\\nI0319 09:25:00.377838 7149 handler.go:208] Removed *v1.Node event handler 7\\\\nI0319 09:25:00.377847 7149 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0319 09:25:00.377857 7149 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0319 09:25:00.377867 7149 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0319 09:25:00.377883 7149 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-c
ni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb093569be5a825491f56bf1b08d66fe7fed5c8e244231dd7f79005cbe0c852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:00Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:00 crc kubenswrapper[4835]: I0319 09:25:00.556331 4835 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7faeeaa2-30ca-4d00-a512-fe9b5678b9e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f10145a7c5746baaa9a782ce65b84c18ba6ce86c26d70a8d8a20334124e2f2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f343c367c83726e1b54c7bb5e38c87226a39801dfb5d589b7eb9f5e5ac8c8162\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f343c367c83726e1b54c7bb5e38c87226a39801dfb5d589b7eb9f5e5ac8c8162\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:00Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:00 crc kubenswrapper[4835]: I0319 09:25:00.577554 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b9b2cb-6d1b-4ba9-8866-b38df3b492c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbde07abe8d8fcd4aa07c20eb6bc852eee70bf207f4ff2bab8ab1332261c92c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c2419d07ea971ebef8f942b93109e8d28ae31053fd3fc750d478868fa36ab9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:22:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0319 09:22:28.939310 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 09:22:28.941752 1 observer_polling.go:159] Starting file observer\\\\nI0319 09:22:29.005366 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 09:22:29.017046 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 09:22:58.844936 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 09:22:58.845102 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be66eb2db774c7aed1b45b02b72e262c1d086c0c6ed9edd85d466aa0f15f1582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08173aab5836c52b21e70067b1208d91ef8bdd7e02fbb72cb533d41616176913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fb7c70aad1d13cf3432d05f17c527f900404c5f1a389c7174b400fab9f3fcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:00Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:00 crc kubenswrapper[4835]: I0319 09:25:00.596201 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:00Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:00 crc kubenswrapper[4835]: I0319 09:25:00.614909 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d4a1069eb557a1baab69a634da9dfccca4f29814dcb122906e9d5a3a817696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T09:25:00Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:00 crc kubenswrapper[4835]: I0319 09:25:00.631510 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46acc963d37712862e26950ee87ff20397172ac456efa85d6fbd0332c4b0f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff1532a133b8cba8d5b03818133c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:00Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:00 crc kubenswrapper[4835]: I0319 09:25:00.651178 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:00Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:00 crc kubenswrapper[4835]: I0319 09:25:00.670780 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d294b374529227ebcd9ef645c5bd51cb0d891d0740b734d56e0e4e39b4db531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:00Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:00 crc kubenswrapper[4835]: I0319 09:25:00.685997 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98298fa62b16ce931a7d5f7439859452ab2d93d0abbb027c312189e7546774e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:00Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:00 crc kubenswrapper[4835]: I0319 09:25:00.709867 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a004978dc146f2dcf3ff7f2390b88ed277647d48ebae20fd5d68708536f6cfda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5218
923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:00Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:00 crc kubenswrapper[4835]: I0319 09:25:00.726798 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:00Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:00 crc kubenswrapper[4835]: I0319 09:25:00.746113 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e488e8-eec3-45d8-b590-6ca04d9753c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d768283c8f73e9c949813ba1a7bca13dcc619b413a1e21ec6328fb3e34be8c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab8198f822fe1a49d71f653e46afde1be6de32846204199d14a1211c44bfe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a71640d37ede9ec29825c63e78597c050f401286f28ea8a65a9bce2648dfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bab4d0b73b0c41b76ac6a88912497076d2461a2459812fad2cff346465ecc601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://bab4d0b73b0c41b76ac6a88912497076d2461a2459812fad2cff346465ecc601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:00Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:00 crc kubenswrapper[4835]: I0319 09:25:00.766075 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:00Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:00 crc kubenswrapper[4835]: I0319 09:25:00.786027 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:00Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:00 crc kubenswrapper[4835]: I0319 09:25:00.805572 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:00Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:00 crc kubenswrapper[4835]: I0319 09:25:00.830777 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:00Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:00 crc kubenswrapper[4835]: I0319 09:25:00.845543 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:00Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:00 crc kubenswrapper[4835]: I0319 09:25:00.855995 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3af08613af9e9ce832ae101fe6390a06b5d21e92c729b8a7a2f4312aca0399c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbabb5352299896d9918921311f30796fd6a0
1ee141e1cc9e172fac1dc3560e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:00Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:00 crc kubenswrapper[4835]: I0319 09:25:00.865070 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:00Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:01 crc 
kubenswrapper[4835]: I0319 09:25:01.401415 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:25:01 crc kubenswrapper[4835]: I0319 09:25:01.401460 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:25:01 crc kubenswrapper[4835]: E0319 09:25:01.402229 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:25:01 crc kubenswrapper[4835]: E0319 09:25:01.402417 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:25:01 crc kubenswrapper[4835]: I0319 09:25:01.506621 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qk6hn_2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b/ovnkube-controller/2.log" Mar 19 09:25:01 crc kubenswrapper[4835]: E0319 09:25:01.634605 4835 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 19 09:25:02 crc kubenswrapper[4835]: I0319 09:25:02.400996 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:25:02 crc kubenswrapper[4835]: I0319 09:25:02.401211 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:25:02 crc kubenswrapper[4835]: E0319 09:25:02.401381 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:25:02 crc kubenswrapper[4835]: E0319 09:25:02.401596 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:25:03 crc kubenswrapper[4835]: I0319 09:25:03.401444 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:25:03 crc kubenswrapper[4835]: I0319 09:25:03.401543 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:25:03 crc kubenswrapper[4835]: E0319 09:25:03.401640 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:25:03 crc kubenswrapper[4835]: E0319 09:25:03.401734 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:25:04 crc kubenswrapper[4835]: I0319 09:25:04.401104 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:25:04 crc kubenswrapper[4835]: E0319 09:25:04.401731 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:25:04 crc kubenswrapper[4835]: I0319 09:25:04.401175 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:25:04 crc kubenswrapper[4835]: E0319 09:25:04.402351 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:25:04 crc kubenswrapper[4835]: I0319 09:25:04.689624 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:25:04 crc kubenswrapper[4835]: E0319 09:25:04.689840 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:08.689807518 +0000 UTC m=+223.538406145 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:25:04 crc kubenswrapper[4835]: I0319 09:25:04.689897 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:25:04 crc kubenswrapper[4835]: I0319 09:25:04.689956 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:25:04 crc kubenswrapper[4835]: I0319 09:25:04.690002 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:25:04 crc kubenswrapper[4835]: I0319 09:25:04.690070 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:25:04 crc kubenswrapper[4835]: E0319 09:25:04.690185 4835 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 09:25:04 crc kubenswrapper[4835]: E0319 09:25:04.690223 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 09:25:04 crc kubenswrapper[4835]: E0319 09:25:04.690256 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 09:25:04 crc kubenswrapper[4835]: E0319 09:25:04.690265 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 09:26:08.69024491 +0000 UTC m=+223.538843537 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 09:25:04 crc kubenswrapper[4835]: E0319 09:25:04.690275 4835 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:25:04 crc kubenswrapper[4835]: E0319 09:25:04.690325 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 09:26:08.690309362 +0000 UTC m=+223.538907989 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:25:04 crc kubenswrapper[4835]: E0319 09:25:04.690434 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 09:25:04 crc kubenswrapper[4835]: E0319 09:25:04.690486 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 09:25:04 crc kubenswrapper[4835]: E0319 09:25:04.690508 4835 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:25:04 crc kubenswrapper[4835]: E0319 09:25:04.690665 4835 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 09:25:04 crc kubenswrapper[4835]: E0319 09:25:04.691066 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 09:26:08.69059735 +0000 UTC m=+223.539195977 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:25:04 crc kubenswrapper[4835]: E0319 09:25:04.691148 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 09:26:08.691121745 +0000 UTC m=+223.539720482 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 09:25:05 crc kubenswrapper[4835]: I0319 09:25:05.401925 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:25:05 crc kubenswrapper[4835]: I0319 09:25:05.402010 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:25:05 crc kubenswrapper[4835]: E0319 09:25:05.402169 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:25:05 crc kubenswrapper[4835]: E0319 09:25:05.402295 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.400977 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.400995 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:25:06 crc kubenswrapper[4835]: E0319 09:25:06.401101 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:25:06 crc kubenswrapper[4835]: E0319 09:25:06.401243 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.420319 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e488e8-eec3-45d8-b590-6ca04d9753c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d768283c8f73e9c949813ba1a7bca13dcc619b413a1e21ec6328fb3e34be8c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\
\\"containerID\\\":\\\"cri-o://cab8198f822fe1a49d71f653e46afde1be6de32846204199d14a1211c44bfe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a71640d37ede9ec29825c63e78597c050f401286f28ea8a65a9bce2648dfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bab4d0b73b0c41b76ac6a88912497076d2461a2459812fad2cff346465ecc601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bab4d0b73b0c41b76ac6a88912497076d2461a2459812fad2cff346465ecc601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:06Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.438845 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:06Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.456543 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:06Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.470246 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:06Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.475505 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 
09:25:06.475540 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.475556 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.475574 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.475585 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:25:06Z","lastTransitionTime":"2026-03-19T09:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:25:06 crc kubenswrapper[4835]: E0319 09:25:06.495926 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:06Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.503395 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.503456 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.503481 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.503511 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.503532 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:25:06Z","lastTransitionTime":"2026-03-19T09:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.507197 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:06Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:06 crc kubenswrapper[4835]: E0319 09:25:06.525263 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:06Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.531192 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.531240 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.531258 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.531281 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.531301 4835 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:25:06Z","lastTransitionTime":"2026-03-19T09:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.531510 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f
2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:06Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.549371 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3af08613af9e9ce832ae101fe6390a06b5d21e92c729b8a7a2f4312aca0399c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbabb5352299896d9918921311f30796fd6a0
1ee141e1cc9e172fac1dc3560e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:06Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:06 crc kubenswrapper[4835]: E0319 09:25:06.553103 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:06Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.557821 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.558048 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.558200 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.558339 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.558479 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:25:06Z","lastTransitionTime":"2026-03-19T09:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.566789 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:06Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:06 crc 
kubenswrapper[4835]: E0319 09:25:06.578437 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:06Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.582911 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.582974 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.583001 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.583034 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.583058 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:25:06Z","lastTransitionTime":"2026-03-19T09:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.595883 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796b2dcca72564cdb668d41bb2a3de94b9a051b6d3cc6cff1abf790a290f2183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df0c2ec899dfefd835123bae49121907eb0ef122be26a92587f324e9cd8ada7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9700b6f756c49fd4c80814e7ec8b3968b3a1a4a060af3366cfbcde9afe056d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f876fbdfacfa3d71e2cb5481c3cc94d66376d60260a5ec90e6700979a76b0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7dc755b5798310eac36921fcd00df32b34ba4b6088b84297f5966f6a1bb2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8db88b60673fb888e49ff1dcb9416a59ad4524a2a763f1be911053ceca827fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed88994871ac881b33094d78fabb7c871049cfa06f98945136b252eea687b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1baf8b123f2a2238d327b6c94655d6432a075232fbb9bb456dbb7931cc219e4f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T09:24:44Z\\\",\\\"message\\\":\\\"node-qk6hn\\\\nI0319 09:24:44.528470 6951 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0319 09:24:44.528450 6951 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-vs6hx] creating logical port 
openshift-multus_network-metrics-daemon-vs6hx for pod on switch crc\\\\nI0319 09:24:44.528477 6951 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF0319 09:24:44.528318 6951 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:24:44Z is a\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed88994871ac881b33094d78fabb7c871049cfa06f98945136b252eea687b18\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T09:25:00Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 09:25:00.377585 7149 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0319 09:25:00.377602 7149 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0319 09:25:00.377641 7149 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0319 09:25:00.377666 7149 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0319 09:25:00.377712 7149 handler.go:190] Sending *v1.Namespace event 
handler 1 for removal\\\\nI0319 09:25:00.377719 7149 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0319 09:25:00.377770 7149 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0319 09:25:00.377791 7149 factory.go:656] Stopping watch factory\\\\nI0319 09:25:00.377814 7149 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0319 09:25:00.377827 7149 handler.go:208] Removed *v1.Node event handler 2\\\\nI0319 09:25:00.377838 7149 handler.go:208] Removed *v1.Node event handler 7\\\\nI0319 09:25:00.377847 7149 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0319 09:25:00.377857 7149 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0319 09:25:00.377867 7149 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0319 09:25:00.377883 7149 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-c
ni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb093569be5a825491f56bf1b08d66fe7fed5c8e244231dd7f79005cbe0c852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:06Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:06 crc kubenswrapper[4835]: E0319 09:25:06.604407 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed 
to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:06Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:06 crc kubenswrapper[4835]: E0319 09:25:06.604555 4835 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.612309 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7faeeaa2-30ca-4d00-a512-fe9b5678b9e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f10145a7c5746baaa9a782ce65b84c18ba6ce86c26d70a8d8a20334124e2f2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastSta
te\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f343c367c83726e1b54c7bb5e38c87226a39801dfb5d589b7eb9f5e5ac8c8162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f343c367c83726e1b54c7bb5e38c87226a39801dfb5d589b7eb9f5e5ac8c8162\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:06Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 
09:25:06.634974 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b9b2cb-6d1b-4ba9-8866-b38df3b492c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbde07abe8d8fcd4aa07c20eb6bc852eee70bf207f4ff2bab8ab1332261c92c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c2419d07ea971ebef8f942b93109e8d28ae31053fd3fc750d478868fa36ab9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:22:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 09:22:28.939310 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 09:22:28.941752 1 observer_polling.go:159] Starting file observer\\\\nI0319 09:22:29.005366 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 09:22:29.017046 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 09:22:58.844936 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 09:22:58.845102 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be66eb2db774c7aed1b45b02b72e262c1d086c0c6ed9edd85d466aa0f15f1582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08173aab5836c52b21e70067b1208d91ef8bdd7e02fbb72cb533d41616176913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fb7c70aad1d13cf3432d05f17c527f900404c5f1a389c7174b400fab9f3fcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:06Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:06 crc kubenswrapper[4835]: E0319 09:25:06.635364 4835 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.652059 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:06Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.667514 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d4a1069eb557a1baab69a634da9dfccca4f29814dcb122906e9d5a3a817696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T09:25:06Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.683913 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46acc963d37712862e26950ee87ff20397172ac456efa85d6fbd0332c4b0f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff1532a133b8cba8d5b03818133c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:06Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.703816 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:06Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.726616 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d294b374529227ebcd9ef645c5bd51cb0d891d0740b734d56e0e4e39b4db531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:06Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.741519 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98298fa62b16ce931a7d5f7439859452ab2d93d0abbb027c312189e7546774e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:06Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.765202 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a004978dc146f2dcf3ff7f2390b88ed277647d48ebae20fd5d68708536f6cfda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5218
923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:06Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.782682 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:06Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.786884 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.791515 4835 scope.go:117] "RemoveContainer" containerID="2ed88994871ac881b33094d78fabb7c871049cfa06f98945136b252eea687b18" Mar 19 09:25:06 crc kubenswrapper[4835]: E0319 09:25:06.791905 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-qk6hn_openshift-ovn-kubernetes(2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.814157 4835 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdee
b63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"
state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:06Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.832772 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3af08613af9e9ce832ae101fe6390a06b5d21e92c729b8a7a2f4312aca0399c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbabb5352299896d9918921311f30796fd6a0
1ee141e1cc9e172fac1dc3560e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:06Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.849976 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:06Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:06 crc 
kubenswrapper[4835]: I0319 09:25:06.881922 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:06Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.900455 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b9b2cb-6d1b-4ba9-8866-b38df3b492c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbde07abe8d8fcd4aa07c20eb6bc852eee70bf207f4ff2bab8ab1332261c92c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c2419d07ea971ebef8f942b93109e8d28ae31053fd3fc750d478868fa36ab9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:22:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0319 09:22:28.939310 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 09:22:28.941752 1 observer_polling.go:159] Starting file observer\\\\nI0319 09:22:29.005366 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 09:22:29.017046 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 09:22:58.844936 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 09:22:58.845102 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be66eb2db774c7aed1b45b02b72e262c1d086c0c6ed9edd85d466aa0f15f1582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08173aab5836c52b21e70067b1208d91ef8bdd7e02fbb72cb533d41616176913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fb7c70aad1d13cf3432d05f17c527f900404c5f1a389c7174b400fab9f3fcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:06Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.917946 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:06Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.933654 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d4a1069eb557a1baab69a634da9dfccca4f29814dcb122906e9d5a3a817696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T09:25:06Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.947020 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46acc963d37712862e26950ee87ff20397172ac456efa85d6fbd0332c4b0f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff1532a133b8cba8d5b03818133c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:06Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:06 crc kubenswrapper[4835]: I0319 09:25:06.971182 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:06Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:07 crc kubenswrapper[4835]: I0319 09:25:07.001335 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796b2dcca72564cdb668d41bb2a3de94b9a051b6d3cc6cff1abf790a290f2183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df0c2ec899dfefd835123bae49121907eb0ef122be26a92587f324e9cd8ada7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9700b6f756c49fd4c80814e7ec8b3968b3a1a4a060af3366cfbcde9afe056d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f876fbdfacfa3d71e2cb5481c3cc94d66376d60260a5ec90e6700979a76b0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7dc755b5798310eac36921fcd00df32b34ba4b6088b84297f5966f6a1bb2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8db88b60673fb888e49ff1dcb9416a59ad4524a2a763f1be911053ceca827fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed88994871ac881b33094d78fabb7c871049cfa06f98945136b252eea687b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed88994871ac881b33094d78fabb7c871049cfa06f98945136b252eea687b18\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T09:25:00Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 09:25:00.377585 7149 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0319 09:25:00.377602 7149 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0319 09:25:00.377641 7149 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0319 
09:25:00.377666 7149 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0319 09:25:00.377712 7149 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0319 09:25:00.377719 7149 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0319 09:25:00.377770 7149 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0319 09:25:00.377791 7149 factory.go:656] Stopping watch factory\\\\nI0319 09:25:00.377814 7149 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0319 09:25:00.377827 7149 handler.go:208] Removed *v1.Node event handler 2\\\\nI0319 09:25:00.377838 7149 handler.go:208] Removed *v1.Node event handler 7\\\\nI0319 09:25:00.377847 7149 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0319 09:25:00.377857 7149 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0319 09:25:00.377867 7149 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0319 09:25:00.377883 7149 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qk6hn_openshift-ovn-kubernetes(2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb093569be5a825491f56bf1b08d66fe7fed5c8e244231dd7f79005cbe0c852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd2
83c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:06Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:07 crc kubenswrapper[4835]: I0319 09:25:07.015609 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7faeeaa2-30ca-4d00-a512-fe9b5678b9e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f10145a7c5746baaa9a782ce65b84c18ba6ce86c26d70a8d8a20334124e2f2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f343c367c83726e1b54c7bb5e38c87226a39801dfb5d589b7eb9f5e5ac8c8162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f343c367c83726e1b54c7bb5e38c87226a39801dfb5d589b7eb9f5e5ac8c8162\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:07Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:07 crc kubenswrapper[4835]: I0319 09:25:07.028980 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98298fa62b16ce931a7d5f7439859452ab2d93d0abbb027c312189e7546774e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:07Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:07 crc kubenswrapper[4835]: I0319 09:25:07.049694 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a004978dc146f2dcf3ff7f2390b88ed277647d48ebae20fd5d68708536f6cfda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d
6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-19T09:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:07Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:07 crc kubenswrapper[4835]: I0319 09:25:07.066042 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:07Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:07 crc kubenswrapper[4835]: I0319 09:25:07.086946 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d294b374529227ebcd9ef645c5bd51cb0d891d0740b734d56e0e4e39b4db531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-19T09:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:07Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:07 crc kubenswrapper[4835]: I0319 09:25:07.105893 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:07Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:07 crc kubenswrapper[4835]: I0319 09:25:07.123984 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:07Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:07 crc kubenswrapper[4835]: I0319 09:25:07.137586 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:07Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:07 crc kubenswrapper[4835]: I0319 09:25:07.152434 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e488e8-eec3-45d8-b590-6ca04d9753c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d768283c8f73e9c949813ba1a7bca13dcc619b413a1e21ec6328fb3e34be8c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab8198f822fe1a49d71f653e46afde1be6de32846204199d14a1211c44bfe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a71640d37ede9ec29825c63e78597c050f401286f28ea8a65a9bce2648dfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bab4d0b73b0c41b76ac6a88912497076d2461a2459812fad2cff346465ecc601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://bab4d0b73b0c41b76ac6a88912497076d2461a2459812fad2cff346465ecc601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:07Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:07 crc kubenswrapper[4835]: I0319 09:25:07.401638 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:25:07 crc kubenswrapper[4835]: E0319 09:25:07.401897 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:25:07 crc kubenswrapper[4835]: I0319 09:25:07.402518 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:25:07 crc kubenswrapper[4835]: E0319 09:25:07.402818 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:25:08 crc kubenswrapper[4835]: I0319 09:25:08.401486 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:25:08 crc kubenswrapper[4835]: I0319 09:25:08.401590 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:25:08 crc kubenswrapper[4835]: E0319 09:25:08.402492 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:25:08 crc kubenswrapper[4835]: E0319 09:25:08.402733 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:25:09 crc kubenswrapper[4835]: I0319 09:25:09.401686 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:25:09 crc kubenswrapper[4835]: I0319 09:25:09.401728 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:25:09 crc kubenswrapper[4835]: E0319 09:25:09.401989 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:25:09 crc kubenswrapper[4835]: E0319 09:25:09.402018 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:25:10 crc kubenswrapper[4835]: I0319 09:25:10.400949 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:25:10 crc kubenswrapper[4835]: I0319 09:25:10.400995 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:25:10 crc kubenswrapper[4835]: E0319 09:25:10.401087 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:25:10 crc kubenswrapper[4835]: E0319 09:25:10.401168 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:25:11 crc kubenswrapper[4835]: I0319 09:25:11.401177 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:25:11 crc kubenswrapper[4835]: I0319 09:25:11.401282 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:25:11 crc kubenswrapper[4835]: E0319 09:25:11.401401 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:25:11 crc kubenswrapper[4835]: E0319 09:25:11.401853 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:25:11 crc kubenswrapper[4835]: E0319 09:25:11.636496 4835 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 09:25:12 crc kubenswrapper[4835]: I0319 09:25:12.402101 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:25:12 crc kubenswrapper[4835]: I0319 09:25:12.402192 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:25:12 crc kubenswrapper[4835]: E0319 09:25:12.402402 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:25:12 crc kubenswrapper[4835]: E0319 09:25:12.402570 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:25:13 crc kubenswrapper[4835]: I0319 09:25:13.401606 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:25:13 crc kubenswrapper[4835]: I0319 09:25:13.401658 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:25:13 crc kubenswrapper[4835]: E0319 09:25:13.401835 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:25:13 crc kubenswrapper[4835]: E0319 09:25:13.401974 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:25:14 crc kubenswrapper[4835]: I0319 09:25:14.401125 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:25:14 crc kubenswrapper[4835]: I0319 09:25:14.401155 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:25:14 crc kubenswrapper[4835]: E0319 09:25:14.401328 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:25:14 crc kubenswrapper[4835]: E0319 09:25:14.401518 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:25:15 crc kubenswrapper[4835]: I0319 09:25:15.401363 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:25:15 crc kubenswrapper[4835]: I0319 09:25:15.401423 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:25:15 crc kubenswrapper[4835]: E0319 09:25:15.401555 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:25:15 crc kubenswrapper[4835]: E0319 09:25:15.401691 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.401806 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.401817 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:25:16 crc kubenswrapper[4835]: E0319 09:25:16.402050 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:25:16 crc kubenswrapper[4835]: E0319 09:25:16.402205 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.423236 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e488e8-eec3-45d8-b590-6ca04d9753c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d768283c8f73e9c949813ba1a7bca13dcc619b413a1e21ec6328fb3e34be8c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee
1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab8198f822fe1a49d71f653e46afde1be6de32846204199d14a1211c44bfe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a71640d37ede9ec29825c63e78597c050f401286f28ea8a65a9bce2648dfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource
-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bab4d0b73b0c41b76ac6a88912497076d2461a2459812fad2cff346465ecc601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bab4d0b73b0c41b76ac6a88912497076d2461a2459812fad2cff346465ecc601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:16Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.444163 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:16Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.464893 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:16Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.483952 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:16Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.523546 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:16Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.540637 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:16Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.558169 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3af08613af9e9ce832ae101fe6390a06b5d21e92c729b8a7a2f4312aca0399c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbabb5352299896d9918921311f30796fd6a0
1ee141e1cc9e172fac1dc3560e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:16Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.574986 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:16Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:16 crc 
kubenswrapper[4835]: I0319 09:25:16.593216 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:16Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:16 crc 
kubenswrapper[4835]: I0319 09:25:16.622246 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796b2dcca72564cdb668d41bb2a3de94b9a051b6d3cc6cff1abf790a290f2183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df0c2ec899dfefd835123bae49121907eb0ef122be26a92587f324e9cd8ada7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9700b6f756c49fd4c80814e7ec8b3968b3a1a4a060af3366cfbcde9afe056d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f876fbdfacfa3d71e2cb5481c3cc94d66376d60260a5ec90e6700979a76b0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7dc755b5798310eac36921fcd00df32b34ba4b6088b84297f5966f6a1bb2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8db88b60673fb888e49ff1dcb9416a59ad4524a2a763f1be911053ceca827fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed88994871ac881b33094d78fabb7c871049cfa06f98945136b252eea687b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed88994871ac881b33094d78fabb7c871049cfa06f98945136b252eea687b18\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T09:25:00Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 09:25:00.377585 7149 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0319 09:25:00.377602 7149 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0319 09:25:00.377641 7149 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0319 
09:25:00.377666 7149 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0319 09:25:00.377712 7149 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0319 09:25:00.377719 7149 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0319 09:25:00.377770 7149 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0319 09:25:00.377791 7149 factory.go:656] Stopping watch factory\\\\nI0319 09:25:00.377814 7149 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0319 09:25:00.377827 7149 handler.go:208] Removed *v1.Node event handler 2\\\\nI0319 09:25:00.377838 7149 handler.go:208] Removed *v1.Node event handler 7\\\\nI0319 09:25:00.377847 7149 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0319 09:25:00.377857 7149 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0319 09:25:00.377867 7149 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0319 09:25:00.377883 7149 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qk6hn_openshift-ovn-kubernetes(2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb093569be5a825491f56bf1b08d66fe7fed5c8e244231dd7f79005cbe0c852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd2
83c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:16Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:16 crc kubenswrapper[4835]: E0319 09:25:16.638015 4835 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.640136 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7faeeaa2-30ca-4d00-a512-fe9b5678b9e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f10145a7c5746baaa9a782ce65b84c18ba6ce86c26d70a8d8a20334124e2f2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://f343c367c83726e1b54c7bb5e38c87226a39801dfb5d589b7eb9f5e5ac8c8162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f343c367c83726e1b54c7bb5e38c87226a39801dfb5d589b7eb9f5e5ac8c8162\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:16Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.661616 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b9b2cb-6d1b-4ba9-8866-b38df3b492c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbde07abe8d8fcd4aa07c20eb6bc852eee70bf207f4ff2bab8ab1332261c92c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c2419d07ea971ebef8f942b93109e8d28ae31053fd3fc750d478868fa36ab9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:22:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0319 09:22:28.939310 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 09:22:28.941752 1 observer_polling.go:159] Starting file observer\\\\nI0319 09:22:29.005366 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 09:22:29.017046 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 09:22:58.844936 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 09:22:58.845102 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be66eb2db774c7aed1b45b02b72e262c1d086c0c6ed9edd85d466aa0f15f1582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08173aab5836c52b21e70067b1208d91ef8bdd7e02fbb72cb533d41616176913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fb7c70aad1d13cf3432d05f17c527f900404c5f1a389c7174b400fab9f3fcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:16Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.681146 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:16Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.686283 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.686338 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.686355 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.686380 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.686397 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:25:16Z","lastTransitionTime":"2026-03-19T09:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.702549 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d4a1069eb557a1baab69a634da9dfccca4f29814dcb122906e9d5a3a817696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:16Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:16 crc kubenswrapper[4835]: E0319 09:25:16.706227 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:16Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.711503 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.711575 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.711600 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.711630 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.711646 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:25:16Z","lastTransitionTime":"2026-03-19T09:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.720967 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46acc963d37712862e26950ee87ff20397172ac456efa85d6fbd0332c4b0f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff1532a133b8cba8d5b03818133c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:16Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:16 crc kubenswrapper[4835]: E0319 09:25:16.728544 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:16Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.733209 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.733260 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.733276 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.733299 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.733318 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:25:16Z","lastTransitionTime":"2026-03-19T09:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.740912 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d294b374529227ebcd9ef645c5bd51cb0d891d0740b734d56e0e4e39b4db531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:16Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:16 crc kubenswrapper[4835]: E0319 09:25:16.752004 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:16Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.755583 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98298fa62b16ce931a7d5f7439859452ab2d93d0abbb027
c312189e7546774e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:16Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.757489 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.757548 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.757572 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.757602 
4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.757624 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:25:16Z","lastTransitionTime":"2026-03-19T09:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:25:16 crc kubenswrapper[4835]: E0319 09:25:16.776815 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:16Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.780571 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a004978dc146f2dcf3ff7f2390b8
8ed277647d48ebae20fd5d68708536f6cfda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\
\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa
93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:16Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.782249 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.782344 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.782363 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.782391 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.782413 
4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:25:16Z","lastTransitionTime":"2026-03-19T09:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:25:16 crc kubenswrapper[4835]: I0319 09:25:16.799390 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:16Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:16 crc kubenswrapper[4835]: E0319 09:25:16.802222 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:16Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:16 crc kubenswrapper[4835]: E0319 09:25:16.802575 4835 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 09:25:17 crc kubenswrapper[4835]: I0319 09:25:17.401233 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:25:17 crc kubenswrapper[4835]: E0319 09:25:17.401409 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:25:17 crc kubenswrapper[4835]: I0319 09:25:17.401823 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:25:17 crc kubenswrapper[4835]: E0319 09:25:17.402150 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:25:18 crc kubenswrapper[4835]: I0319 09:25:18.401063 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:25:18 crc kubenswrapper[4835]: I0319 09:25:18.401139 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:25:18 crc kubenswrapper[4835]: E0319 09:25:18.401297 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:25:18 crc kubenswrapper[4835]: E0319 09:25:18.401437 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:25:18 crc kubenswrapper[4835]: I0319 09:25:18.575703 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jl5x4_4ee35aaa-2819-432a-af95-f1078ad836fc/kube-multus/0.log" Mar 19 09:25:18 crc kubenswrapper[4835]: I0319 09:25:18.575863 4835 generic.go:334] "Generic (PLEG): container finished" podID="4ee35aaa-2819-432a-af95-f1078ad836fc" containerID="9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e" exitCode=1 Mar 19 09:25:18 crc kubenswrapper[4835]: I0319 09:25:18.575930 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jl5x4" event={"ID":"4ee35aaa-2819-432a-af95-f1078ad836fc","Type":"ContainerDied","Data":"9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e"} Mar 19 09:25:18 crc kubenswrapper[4835]: I0319 09:25:18.576596 4835 scope.go:117] "RemoveContainer" containerID="9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e" Mar 19 09:25:18 crc kubenswrapper[4835]: I0319 09:25:18.614473 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:18Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:18 crc kubenswrapper[4835]: I0319 09:25:18.639703 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:18Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:18 crc kubenswrapper[4835]: I0319 09:25:18.657561 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3af08613af9e9ce832ae101fe6390a06b5d21e92c729b8a7a2f4312aca0399c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbabb5352299896d9918921311f30796fd6a0
1ee141e1cc9e172fac1dc3560e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:18Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:18 crc kubenswrapper[4835]: I0319 09:25:18.675063 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:18Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:18 crc 
kubenswrapper[4835]: I0319 09:25:18.693418 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7faeeaa2-30ca-4d00-a512-fe9b5678b9e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f10145a7c5746baaa9a782ce65b84c18ba6ce86c26d70a8d8a20334124e2f2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://f343c367c83726e1b54c7bb5e38c87226a39801dfb5d589b7eb9f5e5ac8c8162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f343c367c83726e1b54c7bb5e38c87226a39801dfb5d589b7eb9f5e5ac8c8162\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:18Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:18 crc kubenswrapper[4835]: I0319 09:25:18.713810 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b9b2cb-6d1b-4ba9-8866-b38df3b492c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbde07abe8d8fcd4aa07c20eb6bc852eee70bf207f4ff2bab8ab1332261c92c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c2419d07ea971ebef8f942b93109e8d28ae31053fd3fc750d478868fa36ab9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:22:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0319 09:22:28.939310 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 09:22:28.941752 1 observer_polling.go:159] Starting file observer\\\\nI0319 09:22:29.005366 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 09:22:29.017046 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 09:22:58.844936 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 09:22:58.845102 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be66eb2db774c7aed1b45b02b72e262c1d086c0c6ed9edd85d466aa0f15f1582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08173aab5836c52b21e70067b1208d91ef8bdd7e02fbb72cb533d41616176913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fb7c70aad1d13cf3432d05f17c527f900404c5f1a389c7174b400fab9f3fcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:18Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:18 crc kubenswrapper[4835]: I0319 09:25:18.730223 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:18Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:18 crc kubenswrapper[4835]: I0319 09:25:18.747889 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d4a1069eb557a1baab69a634da9dfccca4f29814dcb122906e9d5a3a817696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T09:25:18Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:18 crc kubenswrapper[4835]: I0319 09:25:18.761946 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46acc963d37712862e26950ee87ff20397172ac456efa85d6fbd0332c4b0f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff1532a133b8cba8d5b03818133c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:18Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:18 crc kubenswrapper[4835]: I0319 09:25:18.781782 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T09:25:17Z\\\",\\\"message\\\":\\\"2026-03-19T09:24:32+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bc8ab82d-33d0-4360-a202-1dc5b41fa547\\\\n2026-03-19T09:24:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bc8ab82d-33d0-4360-a202-1dc5b41fa547 to /host/opt/cni/bin/\\\\n2026-03-19T09:24:32Z [verbose] multus-daemon started\\\\n2026-03-19T09:24:32Z [verbose] Readiness Indicator file check\\\\n2026-03-19T09:25:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:18Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:18 crc kubenswrapper[4835]: I0319 09:25:18.810951 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796b2dcca72564cdb668d41bb2a3de94b9a051b6d3cc6cff1abf790a290f2183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df0c2ec899dfefd835123bae49121907eb0ef122be26a92587f324e9cd8ada7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9700b6f756c49fd4c80814e7ec8b3968b3a1a4a060af3366cfbcde9afe056d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f876fbdfacfa3d71e2cb5481c3cc94d66376d60260a5ec90e6700979a76b0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7dc755b5798310eac36921fcd00df32b34ba4b6088b84297f5966f6a1bb2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8db88b60673fb888e49ff1dcb9416a59ad4524a2a763f1be911053ceca827fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed88994871ac881b33094d78fabb7c871049cfa06f98945136b252eea687b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed88994871ac881b33094d78fabb7c871049cfa06f98945136b252eea687b18\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T09:25:00Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 09:25:00.377585 7149 
handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0319 09:25:00.377602 7149 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0319 09:25:00.377641 7149 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0319 09:25:00.377666 7149 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0319 09:25:00.377712 7149 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0319 09:25:00.377719 7149 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0319 09:25:00.377770 7149 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0319 09:25:00.377791 7149 factory.go:656] Stopping watch factory\\\\nI0319 09:25:00.377814 7149 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0319 09:25:00.377827 7149 handler.go:208] Removed *v1.Node event handler 2\\\\nI0319 09:25:00.377838 7149 handler.go:208] Removed *v1.Node event handler 7\\\\nI0319 09:25:00.377847 7149 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0319 09:25:00.377857 7149 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0319 09:25:00.377867 7149 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0319 09:25:00.377883 7149 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qk6hn_openshift-ovn-kubernetes(2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb093569be5a825491f56bf1b08d66fe7fed5c8e244231dd7f79005cbe0c852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd2
83c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:18Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:18 crc kubenswrapper[4835]: I0319 09:25:18.832961 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d294b374529227ebcd9ef645c5bd51cb0d891d0740b734d56e0e4e39b4db531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:18Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:18 crc kubenswrapper[4835]: I0319 09:25:18.849579 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98298fa62b16ce931a7d5f7439859452ab2d93d0abbb027c312189e7546774e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:18Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:18 crc kubenswrapper[4835]: I0319 09:25:18.874972 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a004978dc146f2dcf3
ff7f2390b88ed277647d48ebae20fd5d68708536f6cfda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:18Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:18 crc kubenswrapper[4835]: I0319 09:25:18.892672 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:18Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:18 crc kubenswrapper[4835]: I0319 09:25:18.911276 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e488e8-eec3-45d8-b590-6ca04d9753c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d768283c8f73e9c949813ba1a7bca13dcc619b413a1e21ec6328fb3e34be8c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab8198f822fe1a49d71f653e46afde1be6de32846204199d14a1211c44bfe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a71640d37ede9ec29825c63e78597c050f401286f28ea8a65a9bce2648dfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bab4d0b73b0c41b76ac6a88912497076d2461a2459812fad2cff346465ecc601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bab4d0b73b0c41b76ac6a88912497076d2461a2459812fad2cff346465ecc601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:18Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:18 crc kubenswrapper[4835]: I0319 09:25:18.931529 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:18Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:18 crc kubenswrapper[4835]: I0319 09:25:18.949873 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:18Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:18 crc kubenswrapper[4835]: I0319 09:25:18.968921 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:18Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:19 crc kubenswrapper[4835]: I0319 09:25:19.400948 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:25:19 crc kubenswrapper[4835]: I0319 09:25:19.400964 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:25:19 crc kubenswrapper[4835]: E0319 09:25:19.401129 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:25:19 crc kubenswrapper[4835]: E0319 09:25:19.401273 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:25:19 crc kubenswrapper[4835]: I0319 09:25:19.581899 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jl5x4_4ee35aaa-2819-432a-af95-f1078ad836fc/kube-multus/0.log" Mar 19 09:25:19 crc kubenswrapper[4835]: I0319 09:25:19.581962 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jl5x4" event={"ID":"4ee35aaa-2819-432a-af95-f1078ad836fc","Type":"ContainerStarted","Data":"db21550c872812553e0b8009562ceb73b976b7505db3b7f995e014e7c5ec7124"} Mar 19 09:25:19 crc kubenswrapper[4835]: I0319 09:25:19.594644 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7faeeaa2-30ca-4d00-a512-fe9b5678b9e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f10145a7c5746baaa9a782ce65b84c18ba6ce86c26d70a8d8a20334124e2f2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f343c367c83726e1b54c7bb5e38c87226a39801dfb5d589b7eb9f5e5ac8c8162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f343c367c83726e1b54c7bb5e38c87226a39801dfb5d589b7eb9f5e5ac8c8162\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-03-19T09:25:19Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:19 crc kubenswrapper[4835]: I0319 09:25:19.614345 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b9b2cb-6d1b-4ba9-8866-b38df3b492c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbde07abe8d8fcd4aa07c20eb6bc852eee70bf207f4ff2bab8ab1332261c92c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c2419d07ea971ebef8f942b93109e8d28ae31053fd3fc750d478868fa36ab9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:22:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' 
']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 09:22:28.939310 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 09:22:28.941752 1 observer_polling.go:159] Starting file observer\\\\nI0319 09:22:29.005366 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 09:22:29.017046 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 09:22:58.844936 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 09:22:58.845102 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be66eb2db774c7aed1b45b02b72e262c1d086c0c6ed9edd85d466aa0f15f1582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08173aab5836c52b21e70067b1208d91ef8bdd7e02fbb72cb533d41616176913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fb7c70aad1d13cf3432d05f17c527f900404c5f1a389c7174b400fab9f3fcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:19Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:19 crc kubenswrapper[4835]: I0319 09:25:19.632788 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:19Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:19 crc kubenswrapper[4835]: I0319 09:25:19.650821 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d4a1069eb557a1baab69a634da9dfccca4f29814dcb122906e9d5a3a817696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T09:25:19Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:19 crc kubenswrapper[4835]: I0319 09:25:19.667979 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46acc963d37712862e26950ee87ff20397172ac456efa85d6fbd0332c4b0f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff1532a133b8cba8d5b03818133c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:19Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:19 crc kubenswrapper[4835]: I0319 09:25:19.686973 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db21550c872812553e0b8009562ceb73b976b7505db3b7f995e014e7c5ec7124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T09:25:17Z\\\",\\\"message\\\":\\\"2026-03-19T09:24:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bc8ab82d-33d0-4360-a202-1dc5b41fa547\\\\n2026-03-19T09:24:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bc8ab82d-33d0-4360-a202-1dc5b41fa547 to /host/opt/cni/bin/\\\\n2026-03-19T09:24:32Z [verbose] multus-daemon started\\\\n2026-03-19T09:24:32Z [verbose] 
Readiness Indicator file check\\\\n2026-03-19T09:25:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:25:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:19Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:19 crc kubenswrapper[4835]: I0319 09:25:19.712665 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796b2dcca72564cdb668d41bb2a3de94b9a051b6d3cc6cff1abf790a290f2183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df0c2ec899dfefd835123bae49121907eb0ef122be26a92587f324e9cd8ada7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9700b6f756c49fd4c80814e7ec8b3968b3a1a4a060af3366cfbcde9afe056d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f876fbdfacfa3d71e2cb5481c3cc94d66376d60260a5ec90e6700979a76b0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7dc755b5798310eac36921fcd00df32b34ba4b6088b84297f5966f6a1bb2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8db88b60673fb888e49ff1dcb9416a59ad4524a2a763f1be911053ceca827fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed88994871ac881b33094d78fabb7c871049cfa06f98945136b252eea687b18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed88994871ac881b33094d78fabb7c871049cfa06f98945136b252eea687b18\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T09:25:00Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 09:25:00.377585 7149 
handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0319 09:25:00.377602 7149 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0319 09:25:00.377641 7149 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0319 09:25:00.377666 7149 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0319 09:25:00.377712 7149 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0319 09:25:00.377719 7149 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0319 09:25:00.377770 7149 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0319 09:25:00.377791 7149 factory.go:656] Stopping watch factory\\\\nI0319 09:25:00.377814 7149 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0319 09:25:00.377827 7149 handler.go:208] Removed *v1.Node event handler 2\\\\nI0319 09:25:00.377838 7149 handler.go:208] Removed *v1.Node event handler 7\\\\nI0319 09:25:00.377847 7149 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0319 09:25:00.377857 7149 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0319 09:25:00.377867 7149 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0319 09:25:00.377883 7149 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qk6hn_openshift-ovn-kubernetes(2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb093569be5a825491f56bf1b08d66fe7fed5c8e244231dd7f79005cbe0c852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd2
83c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:19Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:19 crc kubenswrapper[4835]: I0319 09:25:19.735176 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d294b374529227ebcd9ef645c5bd51cb0d891d0740b734d56e0e4e39b4db531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:19Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:19 crc kubenswrapper[4835]: I0319 09:25:19.753241 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98298fa62b16ce931a7d5f7439859452ab2d93d0abbb027c312189e7546774e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:19Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:19 crc kubenswrapper[4835]: I0319 09:25:19.779592 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a004978dc146f2dcf3
ff7f2390b88ed277647d48ebae20fd5d68708536f6cfda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:19Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:19 crc kubenswrapper[4835]: I0319 09:25:19.796935 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:19Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:19 crc kubenswrapper[4835]: I0319 09:25:19.814880 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e488e8-eec3-45d8-b590-6ca04d9753c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d768283c8f73e9c949813ba1a7bca13dcc619b413a1e21ec6328fb3e34be8c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab8198f822fe1a49d71f653e46afde1be6de32846204199d14a1211c44bfe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a71640d37ede9ec29825c63e78597c050f401286f28ea8a65a9bce2648dfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bab4d0b73b0c41b76ac6a88912497076d2461a2459812fad2cff346465ecc601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bab4d0b73b0c41b76ac6a88912497076d2461a2459812fad2cff346465ecc601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:19Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:19 crc kubenswrapper[4835]: I0319 09:25:19.836377 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:19Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:19 crc kubenswrapper[4835]: I0319 09:25:19.858320 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:19Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:19 crc kubenswrapper[4835]: I0319 09:25:19.876631 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:19Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:19 crc kubenswrapper[4835]: I0319 09:25:19.914789 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:19Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:19 crc kubenswrapper[4835]: I0319 09:25:19.942652 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:19Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:19 crc kubenswrapper[4835]: I0319 09:25:19.960371 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3af08613af9e9ce832ae101fe6390a06b5d21e92c729b8a7a2f4312aca0399c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbabb5352299896d9918921311f30796fd6a0
1ee141e1cc9e172fac1dc3560e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:19Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:19 crc kubenswrapper[4835]: I0319 09:25:19.980404 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:19Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:20 crc 
kubenswrapper[4835]: I0319 09:25:20.400957 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:25:20 crc kubenswrapper[4835]: I0319 09:25:20.400979 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:25:20 crc kubenswrapper[4835]: E0319 09:25:20.401197 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:25:20 crc kubenswrapper[4835]: E0319 09:25:20.401343 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:25:21 crc kubenswrapper[4835]: I0319 09:25:21.401850 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:25:21 crc kubenswrapper[4835]: I0319 09:25:21.401886 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:25:21 crc kubenswrapper[4835]: E0319 09:25:21.402030 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:25:21 crc kubenswrapper[4835]: E0319 09:25:21.402118 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:25:21 crc kubenswrapper[4835]: I0319 09:25:21.403131 4835 scope.go:117] "RemoveContainer" containerID="2ed88994871ac881b33094d78fabb7c871049cfa06f98945136b252eea687b18" Mar 19 09:25:21 crc kubenswrapper[4835]: I0319 09:25:21.593789 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qk6hn_2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b/ovnkube-controller/2.log" Mar 19 09:25:21 crc kubenswrapper[4835]: I0319 09:25:21.599350 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" event={"ID":"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b","Type":"ContainerStarted","Data":"82a0b8e26d19af43ca8c4fa57d8add7c4c786e843bddb91f8cf39dad321bcd0a"} Mar 19 09:25:21 crc kubenswrapper[4835]: I0319 09:25:21.600100 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:25:21 crc 
kubenswrapper[4835]: I0319 09:25:21.624237 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:21Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:21 crc kubenswrapper[4835]: E0319 09:25:21.638914 4835 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 19 09:25:21 crc kubenswrapper[4835]: I0319 09:25:21.652056 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:21Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:21 crc kubenswrapper[4835]: I0319 09:25:21.670304 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e488e8-eec3-45d8-b590-6ca04d9753c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d768283c8f73e9c949813ba1a7bca13dcc619b413a1e21ec6328fb3e34be8c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab8198f822fe1a49d71f653e46afde1be6de32846204199d14a1211c44bfe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a71640d37ede9ec29825c63e78597c050f401286f28ea8a65a9bce2648dfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bab4d0b73b0c41b76ac6a88912497076d2461a2459812fad2cff346465ecc601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://bab4d0b73b0c41b76ac6a88912497076d2461a2459812fad2cff346465ecc601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:21Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:21 crc kubenswrapper[4835]: I0319 09:25:21.694322 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:21Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:21 crc kubenswrapper[4835]: I0319 09:25:21.722257 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3af08613af9e9ce832ae101fe6390a06b5d21e92c729b8a7a2f4312aca0399c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbabb5352299896d9918921311f30796fd6a0
1ee141e1cc9e172fac1dc3560e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:21Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:21 crc kubenswrapper[4835]: I0319 09:25:21.743145 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:21Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:21 crc 
kubenswrapper[4835]: I0319 09:25:21.768692 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:21Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:21 crc kubenswrapper[4835]: I0319 09:25:21.784001 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:21Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:21 crc kubenswrapper[4835]: I0319 09:25:21.797683 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:21Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:21 crc kubenswrapper[4835]: I0319 09:25:21.808804 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d4a1069eb557a1baab69a634da9dfccca4f29814dcb122906e9d5a3a817696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T09:25:21Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:21 crc kubenswrapper[4835]: I0319 09:25:21.821086 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46acc963d37712862e26950ee87ff20397172ac456efa85d6fbd0332c4b0f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff1532a133b8cba8d5b03818133c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:21Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:21 crc kubenswrapper[4835]: I0319 09:25:21.839736 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db21550c872812553e0b8009562ceb73b976b7505db3b7f995e014e7c5ec7124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T09:25:17Z\\\",\\\"message\\\":\\\"2026-03-19T09:24:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bc8ab82d-33d0-4360-a202-1dc5b41fa547\\\\n2026-03-19T09:24:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bc8ab82d-33d0-4360-a202-1dc5b41fa547 to /host/opt/cni/bin/\\\\n2026-03-19T09:24:32Z [verbose] multus-daemon started\\\\n2026-03-19T09:24:32Z [verbose] 
Readiness Indicator file check\\\\n2026-03-19T09:25:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:25:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:21Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:21 crc kubenswrapper[4835]: I0319 09:25:21.858627 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796b2dcca72564cdb668d41bb2a3de94b9a051b6d3cc6cff1abf790a290f2183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df0c2ec899dfefd835123bae49121907eb0ef122be26a92587f324e9cd8ada7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9700b6f756c49fd4c80814e7ec8b3968b3a1a4a060af3366cfbcde9afe056d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f876fbdfacfa3d71e2cb5481c3cc94d66376d60260a5ec90e6700979a76b0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7dc755b5798310eac36921fcd00df32b34ba4b6088b84297f5966f6a1bb2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8db88b60673fb888e49ff1dcb9416a59ad4524a2a763f1be911053ceca827fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a0b8e26d19af43ca8c4fa57d8add7c4c786e843bddb91f8cf39dad321bcd0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed88994871ac881b33094d78fabb7c871049cfa06f98945136b252eea687b18\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T09:25:00Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 09:25:00.377585 7149 
handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0319 09:25:00.377602 7149 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0319 09:25:00.377641 7149 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0319 09:25:00.377666 7149 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0319 09:25:00.377712 7149 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0319 09:25:00.377719 7149 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0319 09:25:00.377770 7149 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0319 09:25:00.377791 7149 factory.go:656] Stopping watch factory\\\\nI0319 09:25:00.377814 7149 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0319 09:25:00.377827 7149 handler.go:208] Removed *v1.Node event handler 2\\\\nI0319 09:25:00.377838 7149 handler.go:208] Removed *v1.Node event handler 7\\\\nI0319 09:25:00.377847 7149 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0319 09:25:00.377857 7149 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0319 09:25:00.377867 7149 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0319 09:25:00.377883 7149 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:25:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb093569be5a825491f56bf1b08d66fe7fed5c8e244231dd7f79005cbe0c852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:21Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:21 crc kubenswrapper[4835]: I0319 09:25:21.874128 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7faeeaa2-30ca-4d00-a512-fe9b5678b9e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f10145a7c5746baaa9a782ce65b84c18ba6ce86c26d70a8d8a20334124e2f2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f343c367c83726e1b54c7bb5e38c87226a39801dfb5d589b7eb9f5e5ac8c8162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f343c367c83726e1b54c7bb5e38c87226a39801dfb5d589b7eb9f5e5ac8c8162\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:21Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:21 crc kubenswrapper[4835]: I0319 09:25:21.889619 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b9b2cb-6d1b-4ba9-8866-b38df3b492c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbde07abe8d8fcd4aa07c20eb6bc852eee70bf207f4ff2bab8ab1332261c92c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c2419d07ea971ebef8f942b93109e8d28ae31053fd3fc750d478868fa36ab9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:22:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0319 09:22:28.939310 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 09:22:28.941752 1 observer_polling.go:159] Starting file observer\\\\nI0319 09:22:29.005366 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 09:22:29.017046 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 09:22:58.844936 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 09:22:58.845102 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be66eb2db774c7aed1b45b02b72e262c1d086c0c6ed9edd85d466aa0f15f1582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08173aab5836c52b21e70067b1208d91ef8bdd7e02fbb72cb533d41616176913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fb7c70aad1d13cf3432d05f17c527f900404c5f1a389c7174b400fab9f3fcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:21Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:21 crc kubenswrapper[4835]: I0319 09:25:21.914266 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a004978dc146f2dcf3ff7f2390b88ed277647d48ebae20fd5d68708536f6cfda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5218
923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:21Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:21 crc kubenswrapper[4835]: I0319 09:25:21.927273 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:21Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:21 crc kubenswrapper[4835]: I0319 09:25:21.942621 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d294b374529227ebcd9ef645c5bd51cb0d891d0740b734d56e0e4e39b4db531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:21Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:21 crc kubenswrapper[4835]: I0319 09:25:21.959729 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98298fa62b16ce931a7d5f7439859452ab2d93d0abbb027c312189e7546774e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:21Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:22 crc kubenswrapper[4835]: I0319 09:25:22.401830 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:25:22 crc kubenswrapper[4835]: I0319 09:25:22.401880 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:25:22 crc kubenswrapper[4835]: E0319 09:25:22.402450 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:25:22 crc kubenswrapper[4835]: E0319 09:25:22.402289 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:25:22 crc kubenswrapper[4835]: I0319 09:25:22.606707 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qk6hn_2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b/ovnkube-controller/3.log" Mar 19 09:25:22 crc kubenswrapper[4835]: I0319 09:25:22.607537 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qk6hn_2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b/ovnkube-controller/2.log" Mar 19 09:25:22 crc kubenswrapper[4835]: I0319 09:25:22.611644 4835 generic.go:334] "Generic (PLEG): container finished" podID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerID="82a0b8e26d19af43ca8c4fa57d8add7c4c786e843bddb91f8cf39dad321bcd0a" exitCode=1 Mar 19 09:25:22 crc kubenswrapper[4835]: I0319 09:25:22.611699 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" event={"ID":"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b","Type":"ContainerDied","Data":"82a0b8e26d19af43ca8c4fa57d8add7c4c786e843bddb91f8cf39dad321bcd0a"} Mar 19 09:25:22 crc kubenswrapper[4835]: I0319 09:25:22.611788 4835 scope.go:117] "RemoveContainer" containerID="2ed88994871ac881b33094d78fabb7c871049cfa06f98945136b252eea687b18" Mar 19 09:25:22 crc kubenswrapper[4835]: I0319 09:25:22.612826 4835 scope.go:117] "RemoveContainer" containerID="82a0b8e26d19af43ca8c4fa57d8add7c4c786e843bddb91f8cf39dad321bcd0a" Mar 19 09:25:22 crc kubenswrapper[4835]: E0319 09:25:22.613294 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qk6hn_openshift-ovn-kubernetes(2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" Mar 19 09:25:22 crc kubenswrapper[4835]: 
I0319 09:25:22.630178 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7faeeaa2-30ca-4d00-a512-fe9b5678b9e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f10145a7c5746baaa9a782ce65b84c18ba6ce86c26d70a8d8a20334124e2f2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f343c36
7c83726e1b54c7bb5e38c87226a39801dfb5d589b7eb9f5e5ac8c8162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f343c367c83726e1b54c7bb5e38c87226a39801dfb5d589b7eb9f5e5ac8c8162\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:22Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:22 crc kubenswrapper[4835]: I0319 09:25:22.650943 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b9b2cb-6d1b-4ba9-8866-b38df3b492c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbde07abe8d8fcd4aa07c20eb6bc852eee70bf207f4ff2bab8ab1332261c92c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c2419d07ea971ebef8f942b93109e8d28ae31053fd3fc750d478868fa36ab9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:22:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0319 09:22:28.939310 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 09:22:28.941752 1 observer_polling.go:159] Starting file observer\\\\nI0319 09:22:29.005366 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 09:22:29.017046 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 09:22:58.844936 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 09:22:58.845102 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be66eb2db774c7aed1b45b02b72e262c1d086c0c6ed9edd85d466aa0f15f1582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08173aab5836c52b21e70067b1208d91ef8bdd7e02fbb72cb533d41616176913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fb7c70aad1d13cf3432d05f17c527f900404c5f1a389c7174b400fab9f3fcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:22Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:22 crc kubenswrapper[4835]: I0319 09:25:22.670146 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:22Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:22 crc kubenswrapper[4835]: I0319 09:25:22.689186 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d4a1069eb557a1baab69a634da9dfccca4f29814dcb122906e9d5a3a817696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T09:25:22Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:22 crc kubenswrapper[4835]: I0319 09:25:22.708701 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46acc963d37712862e26950ee87ff20397172ac456efa85d6fbd0332c4b0f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff1532a133b8cba8d5b03818133c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:22Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:22 crc kubenswrapper[4835]: I0319 09:25:22.732643 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db21550c872812553e0b8009562ceb73b976b7505db3b7f995e014e7c5ec7124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T09:25:17Z\\\",\\\"message\\\":\\\"2026-03-19T09:24:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bc8ab82d-33d0-4360-a202-1dc5b41fa547\\\\n2026-03-19T09:24:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bc8ab82d-33d0-4360-a202-1dc5b41fa547 to /host/opt/cni/bin/\\\\n2026-03-19T09:24:32Z [verbose] multus-daemon started\\\\n2026-03-19T09:24:32Z [verbose] 
Readiness Indicator file check\\\\n2026-03-19T09:25:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:25:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:22Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:22 crc kubenswrapper[4835]: I0319 09:25:22.735247 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7f0101ce-52a3-4e5b-8fcd-c19020fb071a-metrics-certs\") pod \"network-metrics-daemon-vs6hx\" (UID: \"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\") " pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:25:22 crc kubenswrapper[4835]: E0319 09:25:22.735434 4835 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 09:25:22 crc kubenswrapper[4835]: E0319 09:25:22.735547 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f0101ce-52a3-4e5b-8fcd-c19020fb071a-metrics-certs podName:7f0101ce-52a3-4e5b-8fcd-c19020fb071a nodeName:}" failed. No retries permitted until 2026-03-19 09:26:26.735522107 +0000 UTC m=+241.584120734 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7f0101ce-52a3-4e5b-8fcd-c19020fb071a-metrics-certs") pod "network-metrics-daemon-vs6hx" (UID: "7f0101ce-52a3-4e5b-8fcd-c19020fb071a") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 09:25:22 crc kubenswrapper[4835]: I0319 09:25:22.768514 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796b2dcca72564cdb668d41bb2a3de94b9a051b6d3cc6cff1abf790a290f2183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df0c2ec899dfefd835123bae49121907eb0ef122be26a92587f324e9cd8ada7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9700b6f756c49fd4c80814e7ec8b3968b3a1a4a060af3366cfbcde9afe056d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f876fbdfacfa3d71e2cb5481c3cc94d66376d60260a5ec90e6700979a76b0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7dc755b5798310eac36921fcd00df32b34ba4b6088b84297f5966f6a1bb2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8db88b60673fb888e49ff1dcb9416a59ad4524a2a763f1be911053ceca827fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a0b8e26d19af43ca8c4fa57d8add7c4c786e843bddb91f8cf39dad321bcd0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed88994871ac881b33094d78fabb7c871049cfa06f98945136b252eea687b18\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T09:25:00Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 09:25:00.377585 7149 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0319 09:25:00.377602 7149 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0319 09:25:00.377641 7149 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0319 
09:25:00.377666 7149 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0319 09:25:00.377712 7149 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0319 09:25:00.377719 7149 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0319 09:25:00.377770 7149 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0319 09:25:00.377791 7149 factory.go:656] Stopping watch factory\\\\nI0319 09:25:00.377814 7149 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0319 09:25:00.377827 7149 handler.go:208] Removed *v1.Node event handler 2\\\\nI0319 09:25:00.377838 7149 handler.go:208] Removed *v1.Node event handler 7\\\\nI0319 09:25:00.377847 7149 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0319 09:25:00.377857 7149 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0319 09:25:00.377867 7149 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0319 09:25:00.377883 7149 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a0b8e26d19af43ca8c4fa57d8add7c4c786e843bddb91f8cf39dad321bcd0a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T09:25:22Z\\\",\\\"message\\\":\\\" 7410 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0319 09:25:22.495871 7410 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0319 09:25:22.495900 7410 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0319 09:25:22.495905 7410 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0319 09:25:22.495928 7410 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0319 09:25:22.495959 7410 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0319 09:25:22.495971 
7410 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0319 09:25:22.495984 7410 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0319 09:25:22.495986 7410 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0319 09:25:22.496005 7410 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0319 09:25:22.496004 7410 handler.go:208] Removed *v1.Node event handler 2\\\\nI0319 09:25:22.496019 7410 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0319 09:25:22.496028 7410 handler.go:208] Removed *v1.Node event handler 7\\\\nI0319 09:25:22.496039 7410 factory.go:656] Stopping watch factory\\\\nI0319 09:25:22.496045 7410 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0319 09:25:22.496051 7410 ovnkube.go:599] Stopped ovnkube\\\\nI0319 09:25:2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:25:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\
"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb093569be5a825491f56bf1b08d66fe7fed5c8e244231dd7f79005cbe0c852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:22Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:22 crc kubenswrapper[4835]: I0319 09:25:22.791380 4835 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d294b374529227ebcd9ef645c5bd51cb0d891d0740b734d56e0e4e39b4db531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:22Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:22 crc kubenswrapper[4835]: I0319 09:25:22.808640 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98298fa62b16ce931a7d5f7439859452ab2d93d0abbb027c312189e7546774e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:22Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:22 crc kubenswrapper[4835]: I0319 09:25:22.832948 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a004978dc146f2dcf3ff7f2390b88ed277647d48ebae20fd5d68708536f6cfda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5218
923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:22Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:22 crc kubenswrapper[4835]: I0319 09:25:22.851223 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:22Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:22 crc kubenswrapper[4835]: I0319 09:25:22.870091 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e488e8-eec3-45d8-b590-6ca04d9753c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d768283c8f73e9c949813ba1a7bca13dcc619b413a1e21ec6328fb3e34be8c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab8198f822fe1a49d71f653e46afde1be6de32846204199d14a1211c44bfe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a71640d37ede9ec29825c63e78597c050f401286f28ea8a65a9bce2648dfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bab4d0b73b0c41b76ac6a88912497076d2461a2459812fad2cff346465ecc601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://bab4d0b73b0c41b76ac6a88912497076d2461a2459812fad2cff346465ecc601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:22Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:22 crc kubenswrapper[4835]: I0319 09:25:22.893892 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:22Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:22 crc kubenswrapper[4835]: I0319 09:25:22.913606 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:22Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:22 crc kubenswrapper[4835]: I0319 09:25:22.930790 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:22Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:22 crc kubenswrapper[4835]: I0319 09:25:22.965828 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:22Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:22 crc kubenswrapper[4835]: I0319 09:25:22.990309 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:22Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:23 crc kubenswrapper[4835]: I0319 09:25:23.008467 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3af08613af9e9ce832ae101fe6390a06b5d21e92c729b8a7a2f4312aca0399c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbabb5352299896d9918921311f30796fd6a0
1ee141e1cc9e172fac1dc3560e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:23Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:23 crc kubenswrapper[4835]: I0319 09:25:23.025531 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:23Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:23 crc 
kubenswrapper[4835]: I0319 09:25:23.401879 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:25:23 crc kubenswrapper[4835]: I0319 09:25:23.401891 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:25:23 crc kubenswrapper[4835]: E0319 09:25:23.402071 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:25:23 crc kubenswrapper[4835]: E0319 09:25:23.402256 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:25:23 crc kubenswrapper[4835]: I0319 09:25:23.618376 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qk6hn_2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b/ovnkube-controller/3.log" Mar 19 09:25:23 crc kubenswrapper[4835]: I0319 09:25:23.623811 4835 scope.go:117] "RemoveContainer" containerID="82a0b8e26d19af43ca8c4fa57d8add7c4c786e843bddb91f8cf39dad321bcd0a" Mar 19 09:25:23 crc kubenswrapper[4835]: E0319 09:25:23.624099 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qk6hn_openshift-ovn-kubernetes(2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" Mar 19 09:25:23 crc kubenswrapper[4835]: I0319 09:25:23.656815 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:23Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:23 crc kubenswrapper[4835]: I0319 09:25:23.680927 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:23Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:23 crc kubenswrapper[4835]: I0319 09:25:23.701376 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3af08613af9e9ce832ae101fe6390a06b5d21e92c729b8a7a2f4312aca0399c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbabb5352299896d9918921311f30796fd6a0
1ee141e1cc9e172fac1dc3560e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:23Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:23 crc kubenswrapper[4835]: I0319 09:25:23.716500 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:23Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:23 crc 
kubenswrapper[4835]: I0319 09:25:23.739434 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796b2dcca72564cdb668d41bb2a3de94b9a051b6d3cc6cff1abf790a290f2183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df0c2ec899dfefd835123bae49121907eb0ef122be26a92587f324e9cd8ada7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9700b6f756c49fd4c80814e7ec8b3968b3a1a4a060af3366cfbcde9afe056d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f876fbdfacfa3d71e2cb5481c3cc94d66376d60260a5ec90e6700979a76b0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7dc755b5798310eac36921fcd00df32b34ba4b6088b84297f5966f6a1bb2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8db88b60673fb888e49ff1dcb9416a59ad4524a2a763f1be911053ceca827fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a0b8e26d19af43ca8c4fa57d8add7c4c786e843bddb91f8cf39dad321bcd0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a0b8e26d19af43ca8c4fa57d8add7c4c786e843bddb91f8cf39dad321bcd0a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T09:25:22Z\\\",\\\"message\\\":\\\" 7410 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0319 09:25:22.495871 7410 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0319 09:25:22.495900 7410 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0319 09:25:22.495905 7410 handler.go:190] Sending *v1.Namespace event handler 5 for 
removal\\\\nI0319 09:25:22.495928 7410 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0319 09:25:22.495959 7410 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0319 09:25:22.495971 7410 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0319 09:25:22.495984 7410 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0319 09:25:22.495986 7410 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0319 09:25:22.496005 7410 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0319 09:25:22.496004 7410 handler.go:208] Removed *v1.Node event handler 2\\\\nI0319 09:25:22.496019 7410 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0319 09:25:22.496028 7410 handler.go:208] Removed *v1.Node event handler 7\\\\nI0319 09:25:22.496039 7410 factory.go:656] Stopping watch factory\\\\nI0319 09:25:22.496045 7410 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0319 09:25:22.496051 7410 ovnkube.go:599] Stopped ovnkube\\\\nI0319 09:25:2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:25:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qk6hn_openshift-ovn-kubernetes(2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb093569be5a825491f56bf1b08d66fe7fed5c8e244231dd7f79005cbe0c852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd2
83c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:23Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:23 crc kubenswrapper[4835]: I0319 09:25:23.756955 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7faeeaa2-30ca-4d00-a512-fe9b5678b9e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f10145a7c5746baaa9a782ce65b84c18ba6ce86c26d70a8d8a20334124e2f2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f343c367c83726e1b54c7bb5e38c87226a39801dfb5d589b7eb9f5e5ac8c8162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f343c367c83726e1b54c7bb5e38c87226a39801dfb5d589b7eb9f5e5ac8c8162\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:23Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:23 crc kubenswrapper[4835]: I0319 09:25:23.777122 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b9b2cb-6d1b-4ba9-8866-b38df3b492c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbde07abe8d8fcd4aa07c20eb6bc852eee70bf207f4ff2bab8ab1332261c92c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c2419d07ea971ebef8f942b93109e8d28ae31053fd3fc750d478868fa36ab9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:22:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0319 09:22:28.939310 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 09:22:28.941752 1 observer_polling.go:159] Starting file observer\\\\nI0319 09:22:29.005366 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 09:22:29.017046 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 09:22:58.844936 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 09:22:58.845102 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be66eb2db774c7aed1b45b02b72e262c1d086c0c6ed9edd85d466aa0f15f1582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08173aab5836c52b21e70067b1208d91ef8bdd7e02fbb72cb533d41616176913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fb7c70aad1d13cf3432d05f17c527f900404c5f1a389c7174b400fab9f3fcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:23Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:23 crc kubenswrapper[4835]: I0319 09:25:23.793855 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:23Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:23 crc kubenswrapper[4835]: I0319 09:25:23.813331 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d4a1069eb557a1baab69a634da9dfccca4f29814dcb122906e9d5a3a817696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T09:25:23Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:23 crc kubenswrapper[4835]: I0319 09:25:23.830349 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46acc963d37712862e26950ee87ff20397172ac456efa85d6fbd0332c4b0f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff1532a133b8cba8d5b03818133c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:23Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:23 crc kubenswrapper[4835]: I0319 09:25:23.851530 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db21550c872812553e0b8009562ceb73b976b7505db3b7f995e014e7c5ec7124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T09:25:17Z\\\",\\\"message\\\":\\\"2026-03-19T09:24:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bc8ab82d-33d0-4360-a202-1dc5b41fa547\\\\n2026-03-19T09:24:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bc8ab82d-33d0-4360-a202-1dc5b41fa547 to /host/opt/cni/bin/\\\\n2026-03-19T09:24:32Z [verbose] multus-daemon started\\\\n2026-03-19T09:24:32Z [verbose] 
Readiness Indicator file check\\\\n2026-03-19T09:25:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:25:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:23Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:23 crc kubenswrapper[4835]: I0319 09:25:23.874931 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d294b374529227ebcd9ef645c5bd51cb0d891d0740b734d56e0e4e39b4db531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:23Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:23 crc kubenswrapper[4835]: I0319 09:25:23.890700 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98298fa62b16ce931a7d5f7439859452ab2d93d0abbb027c312189e7546774e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:23Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:23 crc kubenswrapper[4835]: I0319 09:25:23.914286 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a004978dc146f2dcf3
ff7f2390b88ed277647d48ebae20fd5d68708536f6cfda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b
64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:23Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:23 crc kubenswrapper[4835]: I0319 09:25:23.927947 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:23Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:23 crc kubenswrapper[4835]: I0319 09:25:23.944688 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e488e8-eec3-45d8-b590-6ca04d9753c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d768283c8f73e9c949813ba1a7bca13dcc619b413a1e21ec6328fb3e34be8c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab8198f822fe1a49d71f653e46afde1be6de32846204199d14a1211c44bfe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a71640d37ede9ec29825c63e78597c050f401286f28ea8a65a9bce2648dfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bab4d0b73b0c41b76ac6a88912497076d2461a2459812fad2cff346465ecc601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bab4d0b73b0c41b76ac6a88912497076d2461a2459812fad2cff346465ecc601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:23Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:23 crc kubenswrapper[4835]: I0319 09:25:23.963258 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:23Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:23 crc kubenswrapper[4835]: I0319 09:25:23.979999 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:23Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:23 crc kubenswrapper[4835]: I0319 09:25:23.999001 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:23Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:24 crc kubenswrapper[4835]: I0319 09:25:24.401454 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:25:24 crc kubenswrapper[4835]: E0319 09:25:24.401672 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:25:24 crc kubenswrapper[4835]: I0319 09:25:24.402046 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:25:24 crc kubenswrapper[4835]: E0319 09:25:24.402223 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:25:25 crc kubenswrapper[4835]: I0319 09:25:25.401239 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:25:25 crc kubenswrapper[4835]: I0319 09:25:25.401260 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:25:25 crc kubenswrapper[4835]: E0319 09:25:25.401463 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:25:25 crc kubenswrapper[4835]: E0319 09:25:25.401640 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:25:26 crc kubenswrapper[4835]: I0319 09:25:26.401127 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:25:26 crc kubenswrapper[4835]: I0319 09:25:26.401127 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:25:26 crc kubenswrapper[4835]: E0319 09:25:26.401378 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:25:26 crc kubenswrapper[4835]: E0319 09:25:26.401506 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:25:26 crc kubenswrapper[4835]: I0319 09:25:26.422957 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d4a1069eb557a1baab69a634da9dfccca4f29814dcb122906e9d5a3a817696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:26Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:26 crc kubenswrapper[4835]: I0319 09:25:26.443701 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46acc963d37712862e26950ee87ff20397172ac456efa85d6fbd0332c4b0f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff1532a133b8cba8d5b03818133c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2026-03-19T09:25:26Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:26 crc kubenswrapper[4835]: I0319 09:25:26.464321 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db21550c872812553e0b8009562ceb73b976b7505db3b7f995e014e7c5ec7124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T09:25:17Z\\\",\\\"message\\\":\\\"2026-03-19T09:24:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_bc8ab82d-33d0-4360-a202-1dc5b41fa547\\\\n2026-03-19T09:24:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bc8ab82d-33d0-4360-a202-1dc5b41fa547 to /host/opt/cni/bin/\\\\n2026-03-19T09:24:32Z [verbose] multus-daemon started\\\\n2026-03-19T09:24:32Z [verbose] Readiness Indicator file check\\\\n2026-03-19T09:25:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:25:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:26Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:26 crc kubenswrapper[4835]: I0319 09:25:26.496333 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796b2dcca72564cdb668d41bb2a3de94b9a051b6d3cc6cff1abf790a290f2183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df0c2ec899dfefd835123bae49121907eb0ef122be26a92587f324e9cd8ada7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9700b6f756c49fd4c80814e7ec8b3968b3a1a4a060af3366cfbcde9afe056d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f876fbdfacfa3d71e2cb5481c3cc94d66376d60260a5ec90e6700979a76b0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7dc755b5798310eac36921fcd00df32b34ba4b6088b84297f5966f6a1bb2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8db88b60673fb888e49ff1dcb9416a59ad4524a2a763f1be911053ceca827fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a0b8e26d19af43ca8c4fa57d8add7c4c786e843bddb91f8cf39dad321bcd0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a0b8e26d19af43ca8c4fa57d8add7c4c786e843bddb91f8cf39dad321bcd0a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T09:25:22Z\\\",\\\"message\\\":\\\" 7410 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0319 09:25:22.495871 7410 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0319 09:25:22.495900 7410 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0319 09:25:22.495905 7410 handler.go:190] Sending *v1.Namespace event handler 5 for 
removal\\\\nI0319 09:25:22.495928 7410 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0319 09:25:22.495959 7410 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0319 09:25:22.495971 7410 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0319 09:25:22.495984 7410 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0319 09:25:22.495986 7410 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0319 09:25:22.496005 7410 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0319 09:25:22.496004 7410 handler.go:208] Removed *v1.Node event handler 2\\\\nI0319 09:25:22.496019 7410 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0319 09:25:22.496028 7410 handler.go:208] Removed *v1.Node event handler 7\\\\nI0319 09:25:22.496039 7410 factory.go:656] Stopping watch factory\\\\nI0319 09:25:22.496045 7410 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0319 09:25:22.496051 7410 ovnkube.go:599] Stopped ovnkube\\\\nI0319 09:25:2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:25:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qk6hn_openshift-ovn-kubernetes(2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb093569be5a825491f56bf1b08d66fe7fed5c8e244231dd7f79005cbe0c852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd2
83c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:26Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:26 crc kubenswrapper[4835]: I0319 09:25:26.516111 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7faeeaa2-30ca-4d00-a512-fe9b5678b9e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f10145a7c5746baaa9a782ce65b84c18ba6ce86c26d70a8d8a20334124e2f2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f343c367c83726e1b54c7bb5e38c87226a39801dfb5d589b7eb9f5e5ac8c8162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f343c367c83726e1b54c7bb5e38c87226a39801dfb5d589b7eb9f5e5ac8c8162\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:26Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:26 crc kubenswrapper[4835]: I0319 09:25:26.535911 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b9b2cb-6d1b-4ba9-8866-b38df3b492c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbde07abe8d8fcd4aa07c20eb6bc852eee70bf207f4ff2bab8ab1332261c92c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c2419d07ea971ebef8f942b93109e8d28ae31053fd3fc750d478868fa36ab9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:22:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0319 09:22:28.939310 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 09:22:28.941752 1 observer_polling.go:159] Starting file observer\\\\nI0319 09:22:29.005366 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 09:22:29.017046 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 09:22:58.844936 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 09:22:58.845102 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be66eb2db774c7aed1b45b02b72e262c1d086c0c6ed9edd85d466aa0f15f1582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08173aab5836c52b21e70067b1208d91ef8bdd7e02fbb72cb533d41616176913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fb7c70aad1d13cf3432d05f17c527f900404c5f1a389c7174b400fab9f3fcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:26Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:26 crc kubenswrapper[4835]: I0319 09:25:26.556699 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:26Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:26 crc kubenswrapper[4835]: I0319 09:25:26.573552 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:26Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:26 crc kubenswrapper[4835]: I0319 09:25:26.593070 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d294b374529227ebcd9ef645c5bd51cb0d891d0740b734d56e0e4e39b4db531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-19T09:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:26Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:26 crc kubenswrapper[4835]: I0319 09:25:26.605785 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98298fa62b16ce931a7d5f7439859452ab2d93d0abbb027c312189e7546774e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:26Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:26 crc kubenswrapper[4835]: I0319 09:25:26.628927 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a004978dc146f2dcf3ff7f2390b88ed277647d48ebae20fd5d68708536f6cfda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d
6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-19T09:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:26Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:26 crc kubenswrapper[4835]: E0319 09:25:26.640068 4835 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 19 09:25:26 crc kubenswrapper[4835]: I0319 09:25:26.648933 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:26Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:26 crc kubenswrapper[4835]: I0319 09:25:26.671965 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e488e8-eec3-45d8-b590-6ca04d9753c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d768283c8f73e9c949813ba1a7bca13dcc619b413a1e21ec6328fb3e34be8c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab8198f822fe1a49d71f653e46afde1be6de32846204199d14a1211c44bfe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a71640d37ede9ec29825c63e78597c050f401286f28ea8a65a9bce2648dfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bab4d0b73b0c41b76ac6a88912497076d2461a2459812fad2cff346465ecc601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://bab4d0b73b0c41b76ac6a88912497076d2461a2459812fad2cff346465ecc601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:26Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:26 crc kubenswrapper[4835]: I0319 09:25:26.693097 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:26Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:26 crc kubenswrapper[4835]: I0319 09:25:26.715609 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:26Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:26 crc kubenswrapper[4835]: I0319 09:25:26.732252 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:26Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:26 crc 
kubenswrapper[4835]: I0319 09:25:26.769109 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:26Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:26 crc kubenswrapper[4835]: I0319 09:25:26.791365 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:26Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:26 crc kubenswrapper[4835]: I0319 09:25:26.804180 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3af08613af9e9ce832ae101fe6390a06b5d21e92c729b8a7a2f4312aca0399c5\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbabb5352299896d9918921311f30796fd6a01ee141e1cc9e172fac1dc3560e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:26Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:27 crc kubenswrapper[4835]: I0319 09:25:27.046854 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:25:27 crc kubenswrapper[4835]: I0319 09:25:27.046916 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:25:27 crc kubenswrapper[4835]: I0319 09:25:27.046933 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:25:27 crc kubenswrapper[4835]: I0319 09:25:27.046957 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:25:27 crc kubenswrapper[4835]: I0319 09:25:27.046975 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:25:27Z","lastTransitionTime":"2026-03-19T09:25:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:25:27 crc kubenswrapper[4835]: E0319 09:25:27.065257 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:27Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:27 crc kubenswrapper[4835]: I0319 09:25:27.070724 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:25:27 crc kubenswrapper[4835]: I0319 09:25:27.070807 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:25:27 crc kubenswrapper[4835]: I0319 09:25:27.070824 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:25:27 crc kubenswrapper[4835]: I0319 09:25:27.070847 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:25:27 crc kubenswrapper[4835]: I0319 09:25:27.070864 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:25:27Z","lastTransitionTime":"2026-03-19T09:25:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:25:27 crc kubenswrapper[4835]: E0319 09:25:27.092269 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:27Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:27 crc kubenswrapper[4835]: I0319 09:25:27.097725 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:25:27 crc kubenswrapper[4835]: I0319 09:25:27.097831 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:25:27 crc kubenswrapper[4835]: I0319 09:25:27.097883 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:25:27 crc kubenswrapper[4835]: I0319 09:25:27.097944 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:25:27 crc kubenswrapper[4835]: I0319 09:25:27.097964 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:25:27Z","lastTransitionTime":"2026-03-19T09:25:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:25:27 crc kubenswrapper[4835]: E0319 09:25:27.115368 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:27Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:27 crc kubenswrapper[4835]: I0319 09:25:27.120077 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:25:27 crc kubenswrapper[4835]: I0319 09:25:27.120166 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:25:27 crc kubenswrapper[4835]: I0319 09:25:27.120187 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:25:27 crc kubenswrapper[4835]: I0319 09:25:27.120207 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:25:27 crc kubenswrapper[4835]: I0319 09:25:27.120261 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:25:27Z","lastTransitionTime":"2026-03-19T09:25:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:25:27 crc kubenswrapper[4835]: E0319 09:25:27.138397 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:27Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:27 crc kubenswrapper[4835]: I0319 09:25:27.143324 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:25:27 crc kubenswrapper[4835]: I0319 09:25:27.143362 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:25:27 crc kubenswrapper[4835]: I0319 09:25:27.143374 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:25:27 crc kubenswrapper[4835]: I0319 09:25:27.143391 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:25:27 crc kubenswrapper[4835]: I0319 09:25:27.143403 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:25:27Z","lastTransitionTime":"2026-03-19T09:25:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:25:27 crc kubenswrapper[4835]: E0319 09:25:27.159580 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:27Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:27 crc kubenswrapper[4835]: E0319 09:25:27.159702 4835 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 09:25:27 crc kubenswrapper[4835]: I0319 09:25:27.401869 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:25:27 crc kubenswrapper[4835]: I0319 09:25:27.401883 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:25:27 crc kubenswrapper[4835]: E0319 09:25:27.402069 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:25:27 crc kubenswrapper[4835]: E0319 09:25:27.402228 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:25:28 crc kubenswrapper[4835]: I0319 09:25:28.401620 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:25:28 crc kubenswrapper[4835]: E0319 09:25:28.401869 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:25:28 crc kubenswrapper[4835]: I0319 09:25:28.401942 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:25:28 crc kubenswrapper[4835]: E0319 09:25:28.402134 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:25:29 crc kubenswrapper[4835]: I0319 09:25:29.401891 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:25:29 crc kubenswrapper[4835]: I0319 09:25:29.401915 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:25:29 crc kubenswrapper[4835]: E0319 09:25:29.402090 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:25:29 crc kubenswrapper[4835]: E0319 09:25:29.402266 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:25:30 crc kubenswrapper[4835]: I0319 09:25:30.401406 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:25:30 crc kubenswrapper[4835]: I0319 09:25:30.401527 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:25:30 crc kubenswrapper[4835]: E0319 09:25:30.401613 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:25:30 crc kubenswrapper[4835]: E0319 09:25:30.401734 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:25:31 crc kubenswrapper[4835]: I0319 09:25:31.401247 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:25:31 crc kubenswrapper[4835]: E0319 09:25:31.401507 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:25:31 crc kubenswrapper[4835]: I0319 09:25:31.401247 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:25:31 crc kubenswrapper[4835]: E0319 09:25:31.401860 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:25:31 crc kubenswrapper[4835]: E0319 09:25:31.642157 4835 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 09:25:32 crc kubenswrapper[4835]: I0319 09:25:32.401893 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:25:32 crc kubenswrapper[4835]: I0319 09:25:32.401969 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:25:32 crc kubenswrapper[4835]: E0319 09:25:32.403150 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:25:32 crc kubenswrapper[4835]: E0319 09:25:32.403234 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:25:33 crc kubenswrapper[4835]: I0319 09:25:33.401126 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:25:33 crc kubenswrapper[4835]: I0319 09:25:33.401126 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:25:33 crc kubenswrapper[4835]: E0319 09:25:33.401299 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:25:33 crc kubenswrapper[4835]: E0319 09:25:33.401407 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:25:34 crc kubenswrapper[4835]: I0319 09:25:34.401351 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:25:34 crc kubenswrapper[4835]: I0319 09:25:34.401404 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:25:34 crc kubenswrapper[4835]: E0319 09:25:34.402220 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:25:34 crc kubenswrapper[4835]: E0319 09:25:34.402396 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:25:35 crc kubenswrapper[4835]: I0319 09:25:35.401608 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:25:35 crc kubenswrapper[4835]: E0319 09:25:35.403028 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:25:35 crc kubenswrapper[4835]: I0319 09:25:35.401642 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:25:35 crc kubenswrapper[4835]: E0319 09:25:35.403178 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:25:36 crc kubenswrapper[4835]: I0319 09:25:36.401354 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:25:36 crc kubenswrapper[4835]: I0319 09:25:36.401458 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:25:36 crc kubenswrapper[4835]: E0319 09:25:36.401528 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:25:36 crc kubenswrapper[4835]: E0319 09:25:36.401643 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:25:36 crc kubenswrapper[4835]: I0319 09:25:36.422587 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:36 crc kubenswrapper[4835]: I0319 09:25:36.440880 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d4a1069eb557a1baab69a634da9dfccca4f29814dcb122906e9d5a3a817696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T09:25:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:36 crc kubenswrapper[4835]: I0319 09:25:36.458510 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46acc963d37712862e26950ee87ff20397172ac456efa85d6fbd0332c4b0f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff1532a133b8cba8d5b03818133c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:36 crc kubenswrapper[4835]: I0319 09:25:36.480168 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db21550c872812553e0b8009562ceb73b976b7505db3b7f995e014e7c5ec7124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T09:25:17Z\\\",\\\"message\\\":\\\"2026-03-19T09:24:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bc8ab82d-33d0-4360-a202-1dc5b41fa547\\\\n2026-03-19T09:24:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bc8ab82d-33d0-4360-a202-1dc5b41fa547 to /host/opt/cni/bin/\\\\n2026-03-19T09:24:32Z [verbose] multus-daemon started\\\\n2026-03-19T09:24:32Z [verbose] 
Readiness Indicator file check\\\\n2026-03-19T09:25:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:25:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:36 crc kubenswrapper[4835]: I0319 09:25:36.511954 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796b2dcca72564cdb668d41bb2a3de94b9a051b6d3cc6cff1abf790a290f2183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df0c2ec899dfefd835123bae49121907eb0ef122be26a92587f324e9cd8ada7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9700b6f756c49fd4c80814e7ec8b3968b3a1a4a060af3366cfbcde9afe056d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f876fbdfacfa3d71e2cb5481c3cc94d66376d60260a5ec90e6700979a76b0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7dc755b5798310eac36921fcd00df32b34ba4b6088b84297f5966f6a1bb2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8db88b60673fb888e49ff1dcb9416a59ad4524a2a763f1be911053ceca827fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a0b8e26d19af43ca8c4fa57d8add7c4c786e843bddb91f8cf39dad321bcd0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a0b8e26d19af43ca8c4fa57d8add7c4c786e843bddb91f8cf39dad321bcd0a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T09:25:22Z\\\",\\\"message\\\":\\\" 7410 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0319 09:25:22.495871 7410 
handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0319 09:25:22.495900 7410 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0319 09:25:22.495905 7410 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0319 09:25:22.495928 7410 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0319 09:25:22.495959 7410 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0319 09:25:22.495971 7410 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0319 09:25:22.495984 7410 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0319 09:25:22.495986 7410 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0319 09:25:22.496005 7410 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0319 09:25:22.496004 7410 handler.go:208] Removed *v1.Node event handler 2\\\\nI0319 09:25:22.496019 7410 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0319 09:25:22.496028 7410 handler.go:208] Removed *v1.Node event handler 7\\\\nI0319 09:25:22.496039 7410 factory.go:656] Stopping watch factory\\\\nI0319 09:25:22.496045 7410 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0319 09:25:22.496051 7410 ovnkube.go:599] Stopped ovnkube\\\\nI0319 09:25:2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:25:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qk6hn_openshift-ovn-kubernetes(2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb093569be5a825491f56bf1b08d66fe7fed5c8e244231dd7f79005cbe0c852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd2
83c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:36 crc kubenswrapper[4835]: I0319 09:25:36.529343 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7faeeaa2-30ca-4d00-a512-fe9b5678b9e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f10145a7c5746baaa9a782ce65b84c18ba6ce86c26d70a8d8a20334124e2f2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f343c367c83726e1b54c7bb5e38c87226a39801dfb5d589b7eb9f5e5ac8c8162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f343c367c83726e1b54c7bb5e38c87226a39801dfb5d589b7eb9f5e5ac8c8162\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:36 crc kubenswrapper[4835]: I0319 09:25:36.549592 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b9b2cb-6d1b-4ba9-8866-b38df3b492c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbde07abe8d8fcd4aa07c20eb6bc852eee70bf207f4ff2bab8ab1332261c92c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c2419d07ea971ebef8f942b93109e8d28ae31053fd3fc750d478868fa36ab9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:22:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0319 09:22:28.939310 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 09:22:28.941752 1 observer_polling.go:159] Starting file observer\\\\nI0319 09:22:29.005366 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 09:22:29.017046 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 09:22:58.844936 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 09:22:58.845102 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be66eb2db774c7aed1b45b02b72e262c1d086c0c6ed9edd85d466aa0f15f1582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08173aab5836c52b21e70067b1208d91ef8bdd7e02fbb72cb533d41616176913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fb7c70aad1d13cf3432d05f17c527f900404c5f1a389c7174b400fab9f3fcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:36 crc kubenswrapper[4835]: I0319 09:25:36.572461 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a004978dc146f2dcf3ff7f2390b88ed277647d48ebae20fd5d68708536f6cfda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5218
923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:36 crc kubenswrapper[4835]: I0319 09:25:36.589274 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:36 crc kubenswrapper[4835]: I0319 09:25:36.618609 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d294b374529227ebcd9ef645c5bd51cb0d891d0740b734d56e0e4e39b4db531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:36 crc kubenswrapper[4835]: I0319 09:25:36.636369 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98298fa62b16ce931a7d5f7439859452ab2d93d0abbb027c312189e7546774e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:36 crc kubenswrapper[4835]: E0319 09:25:36.643225 4835 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 19 09:25:36 crc kubenswrapper[4835]: I0319 09:25:36.657256 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:36 crc kubenswrapper[4835]: I0319 09:25:36.678440 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:36 crc kubenswrapper[4835]: I0319 09:25:36.697301 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e488e8-eec3-45d8-b590-6ca04d9753c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d768283c8f73e9c949813ba1a7bca13dcc619b413a1e21ec6328fb3e34be8c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab8198f822fe1a49d71f653e46afde1be6de32846204199d14a1211c44bfe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a71640d37ede9ec29825c63e78597c050f401286f28ea8a65a9bce2648dfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bab4d0b73b0c41b76ac6a88912497076d2461a2459812fad2cff346465ecc601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://bab4d0b73b0c41b76ac6a88912497076d2461a2459812fad2cff346465ecc601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:36 crc kubenswrapper[4835]: I0319 09:25:36.713165 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:36 crc kubenswrapper[4835]: I0319 09:25:36.730898 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3af08613af9e9ce832ae101fe6390a06b5d21e92c729b8a7a2f4312aca0399c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbabb5352299896d9918921311f30796fd6a0
1ee141e1cc9e172fac1dc3560e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:36 crc kubenswrapper[4835]: I0319 09:25:36.748546 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:36 crc 
kubenswrapper[4835]: I0319 09:25:36.782115 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:36 crc kubenswrapper[4835]: I0319 09:25:36.804891 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:36Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:37 crc kubenswrapper[4835]: I0319 09:25:37.401854 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:25:37 crc kubenswrapper[4835]: I0319 09:25:37.401868 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:25:37 crc kubenswrapper[4835]: E0319 09:25:37.402052 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:25:37 crc kubenswrapper[4835]: E0319 09:25:37.402165 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:25:37 crc kubenswrapper[4835]: I0319 09:25:37.495268 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:25:37 crc kubenswrapper[4835]: I0319 09:25:37.495334 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:25:37 crc kubenswrapper[4835]: I0319 09:25:37.495351 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:25:37 crc kubenswrapper[4835]: I0319 09:25:37.495376 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:25:37 crc kubenswrapper[4835]: I0319 09:25:37.495393 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:25:37Z","lastTransitionTime":"2026-03-19T09:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:25:37 crc kubenswrapper[4835]: E0319 09:25:37.517290 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:37 crc kubenswrapper[4835]: I0319 09:25:37.522975 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:25:37 crc kubenswrapper[4835]: I0319 09:25:37.523039 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:25:37 crc kubenswrapper[4835]: I0319 09:25:37.523056 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:25:37 crc kubenswrapper[4835]: I0319 09:25:37.523082 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:25:37 crc kubenswrapper[4835]: I0319 09:25:37.523101 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:25:37Z","lastTransitionTime":"2026-03-19T09:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:25:37 crc kubenswrapper[4835]: E0319 09:25:37.544561 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:37 crc kubenswrapper[4835]: I0319 09:25:37.550156 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:25:37 crc kubenswrapper[4835]: I0319 09:25:37.550210 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:25:37 crc kubenswrapper[4835]: I0319 09:25:37.550227 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:25:37 crc kubenswrapper[4835]: I0319 09:25:37.550256 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:25:37 crc kubenswrapper[4835]: I0319 09:25:37.550276 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:25:37Z","lastTransitionTime":"2026-03-19T09:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:25:37 crc kubenswrapper[4835]: E0319 09:25:37.571058 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:37 crc kubenswrapper[4835]: I0319 09:25:37.575997 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:25:37 crc kubenswrapper[4835]: I0319 09:25:37.576061 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:25:37 crc kubenswrapper[4835]: I0319 09:25:37.576078 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:25:37 crc kubenswrapper[4835]: I0319 09:25:37.576107 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:25:37 crc kubenswrapper[4835]: I0319 09:25:37.576129 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:25:37Z","lastTransitionTime":"2026-03-19T09:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:25:37 crc kubenswrapper[4835]: E0319 09:25:37.597994 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:37 crc kubenswrapper[4835]: I0319 09:25:37.603198 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:25:37 crc kubenswrapper[4835]: I0319 09:25:37.603253 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:25:37 crc kubenswrapper[4835]: I0319 09:25:37.603271 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:25:37 crc kubenswrapper[4835]: I0319 09:25:37.603294 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:25:37 crc kubenswrapper[4835]: I0319 09:25:37.603311 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:25:37Z","lastTransitionTime":"2026-03-19T09:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:25:37 crc kubenswrapper[4835]: E0319 09:25:37.623271 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:37Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:37 crc kubenswrapper[4835]: E0319 09:25:37.623527 4835 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 09:25:38 crc kubenswrapper[4835]: I0319 09:25:38.401194 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:25:38 crc kubenswrapper[4835]: I0319 09:25:38.401216 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:25:38 crc kubenswrapper[4835]: E0319 09:25:38.401815 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:25:38 crc kubenswrapper[4835]: E0319 09:25:38.402042 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:25:38 crc kubenswrapper[4835]: I0319 09:25:38.402316 4835 scope.go:117] "RemoveContainer" containerID="82a0b8e26d19af43ca8c4fa57d8add7c4c786e843bddb91f8cf39dad321bcd0a" Mar 19 09:25:38 crc kubenswrapper[4835]: E0319 09:25:38.402618 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qk6hn_openshift-ovn-kubernetes(2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" Mar 19 09:25:39 crc kubenswrapper[4835]: I0319 09:25:39.401412 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:25:39 crc kubenswrapper[4835]: I0319 09:25:39.401510 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:25:39 crc kubenswrapper[4835]: E0319 09:25:39.401795 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:25:39 crc kubenswrapper[4835]: E0319 09:25:39.402049 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:25:40 crc kubenswrapper[4835]: I0319 09:25:40.401843 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:25:40 crc kubenswrapper[4835]: I0319 09:25:40.401999 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:25:40 crc kubenswrapper[4835]: E0319 09:25:40.402056 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:25:40 crc kubenswrapper[4835]: E0319 09:25:40.402222 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:25:41 crc kubenswrapper[4835]: I0319 09:25:41.401223 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:25:41 crc kubenswrapper[4835]: I0319 09:25:41.401335 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:25:41 crc kubenswrapper[4835]: E0319 09:25:41.401679 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:25:41 crc kubenswrapper[4835]: E0319 09:25:41.401970 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:25:41 crc kubenswrapper[4835]: E0319 09:25:41.644642 4835 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 09:25:42 crc kubenswrapper[4835]: I0319 09:25:42.401947 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:25:42 crc kubenswrapper[4835]: I0319 09:25:42.401972 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:25:42 crc kubenswrapper[4835]: E0319 09:25:42.402255 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:25:42 crc kubenswrapper[4835]: E0319 09:25:42.402387 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:25:43 crc kubenswrapper[4835]: I0319 09:25:43.400929 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:25:43 crc kubenswrapper[4835]: I0319 09:25:43.400968 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:25:43 crc kubenswrapper[4835]: E0319 09:25:43.401138 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:25:43 crc kubenswrapper[4835]: E0319 09:25:43.401287 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:25:44 crc kubenswrapper[4835]: I0319 09:25:44.401658 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:25:44 crc kubenswrapper[4835]: I0319 09:25:44.401773 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:25:44 crc kubenswrapper[4835]: E0319 09:25:44.401816 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:25:44 crc kubenswrapper[4835]: E0319 09:25:44.401933 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:25:45 crc kubenswrapper[4835]: I0319 09:25:45.401262 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:25:45 crc kubenswrapper[4835]: I0319 09:25:45.401303 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:25:45 crc kubenswrapper[4835]: E0319 09:25:45.401488 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:25:45 crc kubenswrapper[4835]: E0319 09:25:45.401633 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:25:46 crc kubenswrapper[4835]: I0319 09:25:46.401089 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:25:46 crc kubenswrapper[4835]: I0319 09:25:46.401176 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:25:46 crc kubenswrapper[4835]: E0319 09:25:46.401409 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:25:46 crc kubenswrapper[4835]: E0319 09:25:46.401552 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:25:46 crc kubenswrapper[4835]: I0319 09:25:46.425009 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b56d6f1-6518-4341-8c9d-3026798d33ca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:23:37Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0319 09:23:37.034535 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 09:23:37.034821 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 09:23:37.036051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2184523076/tls.crt::/tmp/serving-cert-2184523076/tls.key\\\\\\\"\\\\nI0319 09:23:37.595454 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 09:23:37.601391 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 09:23:37.601431 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 09:23:37.601469 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 09:23:37.601483 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 09:23:37.609665 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 09:23:37.609688 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 09:23:37.609696 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609713 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 09:23:37.609719 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 09:23:37.609722 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 09:23:37.609725 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 09:23:37.609728 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 09:23:37.613936 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:23:36Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9bbbbebb146f9382305e554c31da6d777cd
04291ccbc37fef1c3d7fb8d05848a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:46 crc kubenswrapper[4835]: I0319 09:25:46.444295 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"417fb0b4-abf2-4fec-abfe-70a08c00f899\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3af08613af9e9ce832ae101fe6390a06b5d21e92c729b8a7a2f4312aca0399c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbabb5352299896d9918921311f30796fd6a0
1ee141e1cc9e172fac1dc3560e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tfkvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-p4qwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:46 crc kubenswrapper[4835]: I0319 09:25:46.461822 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lr99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:18Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vs6hx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:46 crc 
kubenswrapper[4835]: I0319 09:25:46.494976 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85a46a2a-d38c-4cbf-a5b6-f2a99a37cfed\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68eb9384b4c98434890ae706fde77bb17ef6c67f418add49fdfebd1971cae7d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://6cbfee48562033dc59daa82d86cce16438625819b5a07404668905249dc376ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18ef5ea005c7545b1a8ab2121b205db3a23a33cfd0d6338026c5fcfc9cbb4f5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d83abe8f08575812d0aea8ff0c001349c0d3894b9e423233a63cca103e8328f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc9d1681f8c31d01194c4359ab9d78ba6ad193b29666e077fa7556e7b907d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7cd3fd4e67a88e72489187d17804caf0626b50672f98556f6af1b7bd14a383fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8e85fcdb72f80936fdb91e7cd9b0cb7919f252e7a82be42b4630412468477d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0f08613dd910957aa11f8015ce3bba1a9bc5846baac4b871bae0b4810714fa5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:46 crc kubenswrapper[4835]: I0319 09:25:46.519722 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2b9b2cb-6d1b-4ba9-8866-b38df3b492c2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbde07abe8d8fcd4aa07c20eb6bc852eee70bf207f4ff2bab8ab1332261c92c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55c2419d07ea971ebef8f942b93109e8d28ae31053fd3fc750d478868fa36ab9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T09:22:58Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0319 09:22:28.939310 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 09:22:28.941752 1 observer_polling.go:159] Starting file observer\\\\nI0319 09:22:29.005366 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 09:22:29.017046 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 09:22:58.844936 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 09:22:58.845102 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:22:58Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be66eb2db774c7aed1b45b02b72e262c1d086c0c6ed9edd85d466aa0f15f1582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://08173aab5836c52b21e70067b1208d91ef8bdd7e02fbb72cb533d41616176913\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fb7c70aad1d13cf3432d05f17c527f900404c5f1a389c7174b400fab9f3fcd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:46 crc kubenswrapper[4835]: I0319 09:25:46.541542 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:46 crc kubenswrapper[4835]: I0319 09:25:46.557555 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d4a1069eb557a1baab69a634da9dfccca4f29814dcb122906e9d5a3a817696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T09:25:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:46 crc kubenswrapper[4835]: I0319 09:25:46.573075 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"adf367e5-fedd-4d9e-a7af-345df1f08353\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46acc963d37712862e26950ee87ff20397172ac456efa85d6fbd0332c4b0f3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff1532a133b8cba8d5b03818133c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8s9t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bk84k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:46 crc kubenswrapper[4835]: I0319 09:25:46.588387 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jl5x4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ee35aaa-2819-432a-af95-f1078ad836fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db21550c872812553e0b8009562ceb73b976b7505db3b7f995e014e7c5ec7124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T09:25:17Z\\\",\\\"message\\\":\\\"2026-03-19T09:24:32+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bc8ab82d-33d0-4360-a202-1dc5b41fa547\\\\n2026-03-19T09:24:32+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bc8ab82d-33d0-4360-a202-1dc5b41fa547 to /host/opt/cni/bin/\\\\n2026-03-19T09:24:32Z [verbose] multus-daemon started\\\\n2026-03-19T09:24:32Z [verbose] 
Readiness Indicator file check\\\\n2026-03-19T09:25:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:25:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmtx5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jl5x4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:46 crc kubenswrapper[4835]: I0319 09:25:46.608810 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://796b2dcca72564cdb668d41bb2a3de94b9a051b6d3cc6cff1abf790a290f2183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8df0c2ec899dfefd835123bae49121907eb0ef122be26a92587f324e9cd8ada7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9700b6f756c49fd4c80814e7ec8b3968b3a1a4a060af3366cfbcde9afe056d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f876fbdfacfa3d71e2cb5481c3cc94d66376d60260a5ec90e6700979a76b0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e7dc755b5798310eac36921fcd00df32b34ba4b6088b84297f5966f6a1bb2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8db88b60673fb888e49ff1dcb9416a59ad4524a2a763f1be911053ceca827fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82a0b8e26d19af43ca8c4fa57d8add7c4c786e843bddb91f8cf39dad321bcd0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82a0b8e26d19af43ca8c4fa57d8add7c4c786e843bddb91f8cf39dad321bcd0a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T09:25:22Z\\\",\\\"message\\\":\\\" 7410 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0319 09:25:22.495871 7410 
handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0319 09:25:22.495900 7410 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0319 09:25:22.495905 7410 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0319 09:25:22.495928 7410 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0319 09:25:22.495959 7410 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0319 09:25:22.495971 7410 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0319 09:25:22.495984 7410 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0319 09:25:22.495986 7410 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0319 09:25:22.496005 7410 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0319 09:25:22.496004 7410 handler.go:208] Removed *v1.Node event handler 2\\\\nI0319 09:25:22.496019 7410 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0319 09:25:22.496028 7410 handler.go:208] Removed *v1.Node event handler 7\\\\nI0319 09:25:22.496039 7410 factory.go:656] Stopping watch factory\\\\nI0319 09:25:22.496045 7410 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0319 09:25:22.496051 7410 ovnkube.go:599] Stopped ovnkube\\\\nI0319 09:25:2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T09:25:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qk6hn_openshift-ovn-kubernetes(2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb093569be5a825491f56bf1b08d66fe7fed5c8e244231dd7f79005cbe0c852\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c96a9fe91a913dd2
83c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48r2z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qk6hn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:46 crc kubenswrapper[4835]: I0319 09:25:46.622595 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7faeeaa2-30ca-4d00-a512-fe9b5678b9e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f10145a7c5746baaa9a782ce65b84c18ba6ce86c26d70a8d8a20334124e2f2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f343c367c83726e1b54c7bb5e38c87226a39801dfb5d589b7eb9f5e5ac8c8162\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f343c367c83726e1b54c7bb5e38c87226a39801dfb5d589b7eb9f5e5ac8c8162\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:46 crc kubenswrapper[4835]: I0319 09:25:46.633669 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fg29g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d251d68-4fd1-4d04-b960-260b36d78f3f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98298fa62b16ce931a7d5f7439859452ab2d93d0abbb027c312189e7546774e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-797kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fg29g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:46 crc kubenswrapper[4835]: E0319 09:25:46.645378 4835 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 09:25:46 crc kubenswrapper[4835]: I0319 09:25:46.655603 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-lkntj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c65689c-afdd-413c-92b9-bf02eeea000c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a004978dc14
6f2dcf3ff7f2390b88ed277647d48ebae20fd5d68708536f6cfda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beff5e37cae0750a4828b6d818f25ea29e8d6e7902b779405453cd1e4873ca98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3bbc5353b20b9141a174d17f069ea7d681c42225a4aedf01be76f68db91f511\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd
367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4b1a94989494dcdf4718b35b11b09745b28ef11376aae47d80935cf4a3a77f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5218923d0174ffe6d009b6b09eb29ac0ab4f9bf312ed6e5aac991431540349b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://852808bfb4ef1a9b9f31162da33fb92d6a77bc2b7d22e7783775cfa07d653d59\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\
\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7797bf5cc98e3f42d6617a31fdd0e385c798769c48d8719fc8f0a85f92df2c63\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:24:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:24:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9nz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-lkntj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:46 crc kubenswrapper[4835]: I0319 09:25:46.671045 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ppv6m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dfdfe13-1f47-4774-89d0-5d861607ddbc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16ccd2e3ab61ffbbb33a359428c26afc748e11f9c39a4047a8b600002b408315\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kt4lg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:24:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ppv6m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:46 crc kubenswrapper[4835]: I0319 09:25:46.689691 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d294b374529227ebcd9ef645c5bd51cb0d891d0740b734d56e0e4e39b4db531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-19T09:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:46 crc kubenswrapper[4835]: I0319 09:25:46.700446 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:46 crc kubenswrapper[4835]: I0319 09:25:46.715438 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6f2ab03dadfd1d82e8ad1c3d1a7eb125343b053e0cfcda59cd308e9e70611d9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6b6a60b134c434c010775afaf60a88649c4e494efb81b23f9066c964f41f22b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:24:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:46 crc kubenswrapper[4835]: I0319 09:25:46.733602 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T09:24:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:46Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:46 crc kubenswrapper[4835]: I0319 09:25:46.750462 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51e488e8-eec3-45d8-b590-6ca04d9753c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:23:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T09:22:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d768283c8f73e9c949813ba1a7bca13dcc619b413a1e21ec6328fb3e34be8c0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab8198f822fe1a49d71f653e46afde1be6de32846204199d14a1211c44bfe1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26a71640d37ede9ec29825c63e78597c050f401286f28ea8a65a9bce2648dfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:22:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bab4d0b73b0c41b76ac6a88912497076d2461a2459812fad2cff346465ecc601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://bab4d0b73b0c41b76ac6a88912497076d2461a2459812fad2cff346465ecc601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T09:22:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T09:22:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T09:22:26Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:46Z is after 2025-08-24T17:21:41Z"
Mar 19 09:25:47 crc kubenswrapper[4835]: I0319 09:25:47.401014 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 09:25:47 crc kubenswrapper[4835]: I0319 09:25:47.401053 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 09:25:47 crc kubenswrapper[4835]: E0319 09:25:47.401246 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 09:25:47 crc kubenswrapper[4835]: E0319 09:25:47.401515 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 09:25:47 crc kubenswrapper[4835]: I0319 09:25:47.777802 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 09:25:47 crc kubenswrapper[4835]: I0319 09:25:47.777865 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 09:25:47 crc kubenswrapper[4835]: I0319 09:25:47.777882 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 09:25:47 crc kubenswrapper[4835]: I0319 09:25:47.777905 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 09:25:47 crc kubenswrapper[4835]: I0319 09:25:47.777922 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:25:47Z","lastTransitionTime":"2026-03-19T09:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:25:47 crc kubenswrapper[4835]: E0319 09:25:47.799055 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:47Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:47 crc kubenswrapper[4835]: I0319 09:25:47.803978 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:25:47 crc kubenswrapper[4835]: I0319 09:25:47.804044 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:25:47 crc kubenswrapper[4835]: I0319 09:25:47.804068 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:25:47 crc kubenswrapper[4835]: I0319 09:25:47.804096 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:25:47 crc kubenswrapper[4835]: I0319 09:25:47.804118 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:25:47Z","lastTransitionTime":"2026-03-19T09:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:25:47 crc kubenswrapper[4835]: E0319 09:25:47.824938 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:47Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:47 crc kubenswrapper[4835]: I0319 09:25:47.829778 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:25:47 crc kubenswrapper[4835]: I0319 09:25:47.829829 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:25:47 crc kubenswrapper[4835]: I0319 09:25:47.829846 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:25:47 crc kubenswrapper[4835]: I0319 09:25:47.829870 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:25:47 crc kubenswrapper[4835]: I0319 09:25:47.829886 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:25:47Z","lastTransitionTime":"2026-03-19T09:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:25:47 crc kubenswrapper[4835]: E0319 09:25:47.849306 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:47Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:47 crc kubenswrapper[4835]: I0319 09:25:47.854264 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:25:47 crc kubenswrapper[4835]: I0319 09:25:47.854327 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:25:47 crc kubenswrapper[4835]: I0319 09:25:47.854345 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:25:47 crc kubenswrapper[4835]: I0319 09:25:47.854373 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:25:47 crc kubenswrapper[4835]: I0319 09:25:47.854391 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:25:47Z","lastTransitionTime":"2026-03-19T09:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:25:47 crc kubenswrapper[4835]: E0319 09:25:47.874300 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:47Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:47 crc kubenswrapper[4835]: I0319 09:25:47.879815 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:25:47 crc kubenswrapper[4835]: I0319 09:25:47.879869 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:25:47 crc kubenswrapper[4835]: I0319 09:25:47.879886 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:25:47 crc kubenswrapper[4835]: I0319 09:25:47.879922 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:25:47 crc kubenswrapper[4835]: I0319 09:25:47.879942 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:25:47Z","lastTransitionTime":"2026-03-19T09:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 09:25:47 crc kubenswrapper[4835]: E0319 09:25:47.902327 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T09:25:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d455f31a-96a2-4159-bc94-bb9403ca471c\\\",\\\"systemUUID\\\":\\\"018fc9bf-6313-48f6-b70c-1716ce86e066\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T09:25:47Z is after 2025-08-24T17:21:41Z" Mar 19 09:25:47 crc kubenswrapper[4835]: E0319 09:25:47.902545 4835 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 09:25:48 crc kubenswrapper[4835]: I0319 09:25:48.402029 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:25:48 crc kubenswrapper[4835]: E0319 09:25:48.402205 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:25:48 crc kubenswrapper[4835]: I0319 09:25:48.402023 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:25:48 crc kubenswrapper[4835]: E0319 09:25:48.402451 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:25:49 crc kubenswrapper[4835]: I0319 09:25:49.400923 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:25:49 crc kubenswrapper[4835]: I0319 09:25:49.400983 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:25:49 crc kubenswrapper[4835]: E0319 09:25:49.401112 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:25:49 crc kubenswrapper[4835]: E0319 09:25:49.401244 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:25:50 crc kubenswrapper[4835]: I0319 09:25:50.401268 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:25:50 crc kubenswrapper[4835]: E0319 09:25:50.401509 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:25:50 crc kubenswrapper[4835]: I0319 09:25:50.401637 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:25:50 crc kubenswrapper[4835]: E0319 09:25:50.401941 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:25:51 crc kubenswrapper[4835]: I0319 09:25:51.401844 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:25:51 crc kubenswrapper[4835]: I0319 09:25:51.401868 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:25:51 crc kubenswrapper[4835]: E0319 09:25:51.402077 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:25:51 crc kubenswrapper[4835]: E0319 09:25:51.402205 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:25:51 crc kubenswrapper[4835]: E0319 09:25:51.647128 4835 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 09:25:52 crc kubenswrapper[4835]: I0319 09:25:52.401964 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:25:52 crc kubenswrapper[4835]: I0319 09:25:52.401964 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:25:52 crc kubenswrapper[4835]: E0319 09:25:52.402202 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:25:52 crc kubenswrapper[4835]: E0319 09:25:52.402374 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:25:53 crc kubenswrapper[4835]: I0319 09:25:53.401537 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:25:53 crc kubenswrapper[4835]: I0319 09:25:53.401696 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:25:53 crc kubenswrapper[4835]: E0319 09:25:53.402018 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:25:53 crc kubenswrapper[4835]: E0319 09:25:53.402635 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:25:53 crc kubenswrapper[4835]: I0319 09:25:53.403101 4835 scope.go:117] "RemoveContainer" containerID="82a0b8e26d19af43ca8c4fa57d8add7c4c786e843bddb91f8cf39dad321bcd0a" Mar 19 09:25:53 crc kubenswrapper[4835]: E0319 09:25:53.403371 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qk6hn_openshift-ovn-kubernetes(2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" Mar 19 09:25:54 crc kubenswrapper[4835]: I0319 09:25:54.401565 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:25:54 crc kubenswrapper[4835]: I0319 09:25:54.401832 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:25:54 crc kubenswrapper[4835]: E0319 09:25:54.402923 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:25:54 crc kubenswrapper[4835]: E0319 09:25:54.403114 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:25:55 crc kubenswrapper[4835]: I0319 09:25:55.401931 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:25:55 crc kubenswrapper[4835]: I0319 09:25:55.402024 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:25:55 crc kubenswrapper[4835]: E0319 09:25:55.402111 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:25:55 crc kubenswrapper[4835]: E0319 09:25:55.402236 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:25:56 crc kubenswrapper[4835]: I0319 09:25:56.401194 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:25:56 crc kubenswrapper[4835]: I0319 09:25:56.401273 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:25:56 crc kubenswrapper[4835]: E0319 09:25:56.401398 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:25:56 crc kubenswrapper[4835]: E0319 09:25:56.401541 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:25:56 crc kubenswrapper[4835]: I0319 09:25:56.453591 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-ppv6m" podStartSLOduration=158.453563541 podStartE2EDuration="2m38.453563541s" podCreationTimestamp="2026-03-19 09:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:25:56.430703215 +0000 UTC m=+211.279301842" watchObservedRunningTime="2026-03-19 09:25:56.453563541 +0000 UTC m=+211.302162158" Mar 19 09:25:56 crc kubenswrapper[4835]: I0319 09:25:56.472017 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-fg29g" podStartSLOduration=158.471993024 podStartE2EDuration="2m38.471993024s" podCreationTimestamp="2026-03-19 09:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-19 09:25:56.470979396 +0000 UTC m=+211.319578073" watchObservedRunningTime="2026-03-19 09:25:56.471993024 +0000 UTC m=+211.320591651" Mar 19 09:25:56 crc kubenswrapper[4835]: I0319 09:25:56.501586 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-lkntj" podStartSLOduration=158.501557377 podStartE2EDuration="2m38.501557377s" podCreationTimestamp="2026-03-19 09:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:25:56.500645412 +0000 UTC m=+211.349244009" watchObservedRunningTime="2026-03-19 09:25:56.501557377 +0000 UTC m=+211.350156004" Mar 19 09:25:56 crc kubenswrapper[4835]: I0319 09:25:56.545582 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=70.545554421 podStartE2EDuration="1m10.545554421s" podCreationTimestamp="2026-03-19 09:24:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:25:56.543913935 +0000 UTC m=+211.392512562" watchObservedRunningTime="2026-03-19 09:25:56.545554421 +0000 UTC m=+211.394153048" Mar 19 09:25:56 crc kubenswrapper[4835]: I0319 09:25:56.644160 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=105.644141765 podStartE2EDuration="1m45.644141765s" podCreationTimestamp="2026-03-19 09:24:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:25:56.643912728 +0000 UTC m=+211.492511395" watchObservedRunningTime="2026-03-19 09:25:56.644141765 +0000 UTC m=+211.492740362" Mar 19 09:25:56 crc kubenswrapper[4835]: E0319 09:25:56.648185 4835 kubelet.go:2916] "Container runtime 
network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 09:25:56 crc kubenswrapper[4835]: I0319 09:25:56.675452 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=107.675432445 podStartE2EDuration="1m47.675432445s" podCreationTimestamp="2026-03-19 09:24:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:25:56.675192439 +0000 UTC m=+211.523791066" watchObservedRunningTime="2026-03-19 09:25:56.675432445 +0000 UTC m=+211.524031042" Mar 19 09:25:56 crc kubenswrapper[4835]: I0319 09:25:56.705998 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4qwq" podStartSLOduration=157.705972925 podStartE2EDuration="2m37.705972925s" podCreationTimestamp="2026-03-19 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:25:56.692394907 +0000 UTC m=+211.540993504" watchObservedRunningTime="2026-03-19 09:25:56.705972925 +0000 UTC m=+211.554571532" Mar 19 09:25:56 crc kubenswrapper[4835]: I0319 09:25:56.718328 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podStartSLOduration=158.718298419 podStartE2EDuration="2m38.718298419s" podCreationTimestamp="2026-03-19 09:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:25:56.717763293 +0000 UTC m=+211.566361890" watchObservedRunningTime="2026-03-19 09:25:56.718298419 +0000 UTC m=+211.566897056" Mar 19 09:25:56 
crc kubenswrapper[4835]: I0319 09:25:56.770804 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-jl5x4" podStartSLOduration=158.770785419 podStartE2EDuration="2m38.770785419s" podCreationTimestamp="2026-03-19 09:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:25:56.74170775 +0000 UTC m=+211.590306347" watchObservedRunningTime="2026-03-19 09:25:56.770785419 +0000 UTC m=+211.619384006" Mar 19 09:25:56 crc kubenswrapper[4835]: I0319 09:25:56.784803 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=71.784705706 podStartE2EDuration="1m11.784705706s" podCreationTimestamp="2026-03-19 09:24:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:25:56.783555825 +0000 UTC m=+211.632154422" watchObservedRunningTime="2026-03-19 09:25:56.784705706 +0000 UTC m=+211.633304313" Mar 19 09:25:56 crc kubenswrapper[4835]: I0319 09:25:56.799988 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=84.799968351 podStartE2EDuration="1m24.799968351s" podCreationTimestamp="2026-03-19 09:24:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:25:56.799454806 +0000 UTC m=+211.648053393" watchObservedRunningTime="2026-03-19 09:25:56.799968351 +0000 UTC m=+211.648566938" Mar 19 09:25:57 crc kubenswrapper[4835]: I0319 09:25:57.400943 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:25:57 crc kubenswrapper[4835]: I0319 09:25:57.401094 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:25:57 crc kubenswrapper[4835]: E0319 09:25:57.401179 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:25:57 crc kubenswrapper[4835]: E0319 09:25:57.401309 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:25:57 crc kubenswrapper[4835]: I0319 09:25:57.911289 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 09:25:57 crc kubenswrapper[4835]: I0319 09:25:57.911334 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 09:25:57 crc kubenswrapper[4835]: I0319 09:25:57.911346 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 09:25:57 crc kubenswrapper[4835]: I0319 09:25:57.911365 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 09:25:57 crc kubenswrapper[4835]: I0319 09:25:57.911379 4835 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T09:25:57Z","lastTransitionTime":"2026-03-19T09:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 09:25:57 crc kubenswrapper[4835]: I0319 09:25:57.964946 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-zk2hc"] Mar 19 09:25:57 crc kubenswrapper[4835]: I0319 09:25:57.965351 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zk2hc" Mar 19 09:25:57 crc kubenswrapper[4835]: I0319 09:25:57.967025 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 19 09:25:57 crc kubenswrapper[4835]: I0319 09:25:57.968129 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 19 09:25:57 crc kubenswrapper[4835]: I0319 09:25:57.968453 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 19 09:25:57 crc kubenswrapper[4835]: I0319 09:25:57.968130 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 19 09:25:58 crc kubenswrapper[4835]: I0319 09:25:58.083614 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/92504380-cedb-436f-b211-c464a0fef5db-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-zk2hc\" (UID: \"92504380-cedb-436f-b211-c464a0fef5db\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zk2hc" Mar 19 09:25:58 crc kubenswrapper[4835]: I0319 09:25:58.084024 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/92504380-cedb-436f-b211-c464a0fef5db-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-zk2hc\" (UID: \"92504380-cedb-436f-b211-c464a0fef5db\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zk2hc" Mar 19 09:25:58 crc kubenswrapper[4835]: I0319 09:25:58.084129 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/92504380-cedb-436f-b211-c464a0fef5db-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-zk2hc\" (UID: \"92504380-cedb-436f-b211-c464a0fef5db\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zk2hc" Mar 19 09:25:58 crc kubenswrapper[4835]: I0319 09:25:58.084165 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/92504380-cedb-436f-b211-c464a0fef5db-service-ca\") pod \"cluster-version-operator-5c965bbfc6-zk2hc\" (UID: \"92504380-cedb-436f-b211-c464a0fef5db\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zk2hc" Mar 19 09:25:58 crc kubenswrapper[4835]: I0319 09:25:58.084194 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/92504380-cedb-436f-b211-c464a0fef5db-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-zk2hc\" (UID: \"92504380-cedb-436f-b211-c464a0fef5db\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zk2hc" Mar 19 09:25:58 crc kubenswrapper[4835]: I0319 09:25:58.185519 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92504380-cedb-436f-b211-c464a0fef5db-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-zk2hc\" (UID: \"92504380-cedb-436f-b211-c464a0fef5db\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zk2hc" Mar 19 09:25:58 crc kubenswrapper[4835]: I0319 09:25:58.185599 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/92504380-cedb-436f-b211-c464a0fef5db-service-ca\") pod \"cluster-version-operator-5c965bbfc6-zk2hc\" (UID: \"92504380-cedb-436f-b211-c464a0fef5db\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zk2hc" Mar 19 
09:25:58 crc kubenswrapper[4835]: I0319 09:25:58.185641 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/92504380-cedb-436f-b211-c464a0fef5db-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-zk2hc\" (UID: \"92504380-cedb-436f-b211-c464a0fef5db\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zk2hc" Mar 19 09:25:58 crc kubenswrapper[4835]: I0319 09:25:58.185704 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/92504380-cedb-436f-b211-c464a0fef5db-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-zk2hc\" (UID: \"92504380-cedb-436f-b211-c464a0fef5db\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zk2hc" Mar 19 09:25:58 crc kubenswrapper[4835]: I0319 09:25:58.185786 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/92504380-cedb-436f-b211-c464a0fef5db-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-zk2hc\" (UID: \"92504380-cedb-436f-b211-c464a0fef5db\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zk2hc" Mar 19 09:25:58 crc kubenswrapper[4835]: I0319 09:25:58.185882 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/92504380-cedb-436f-b211-c464a0fef5db-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-zk2hc\" (UID: \"92504380-cedb-436f-b211-c464a0fef5db\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zk2hc" Mar 19 09:25:58 crc kubenswrapper[4835]: I0319 09:25:58.186090 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/92504380-cedb-436f-b211-c464a0fef5db-etc-ssl-certs\") pod 
\"cluster-version-operator-5c965bbfc6-zk2hc\" (UID: \"92504380-cedb-436f-b211-c464a0fef5db\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zk2hc" Mar 19 09:25:58 crc kubenswrapper[4835]: I0319 09:25:58.187301 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/92504380-cedb-436f-b211-c464a0fef5db-service-ca\") pod \"cluster-version-operator-5c965bbfc6-zk2hc\" (UID: \"92504380-cedb-436f-b211-c464a0fef5db\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zk2hc" Mar 19 09:25:58 crc kubenswrapper[4835]: I0319 09:25:58.195884 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92504380-cedb-436f-b211-c464a0fef5db-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-zk2hc\" (UID: \"92504380-cedb-436f-b211-c464a0fef5db\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zk2hc" Mar 19 09:25:58 crc kubenswrapper[4835]: I0319 09:25:58.205467 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/92504380-cedb-436f-b211-c464a0fef5db-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-zk2hc\" (UID: \"92504380-cedb-436f-b211-c464a0fef5db\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zk2hc" Mar 19 09:25:58 crc kubenswrapper[4835]: I0319 09:25:58.291466 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zk2hc" Mar 19 09:25:58 crc kubenswrapper[4835]: W0319 09:25:58.309592 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92504380_cedb_436f_b211_c464a0fef5db.slice/crio-63039d20fbef2482238acc5d0805378403ab13ead52b15d12b6fd9a01df9fe38 WatchSource:0}: Error finding container 63039d20fbef2482238acc5d0805378403ab13ead52b15d12b6fd9a01df9fe38: Status 404 returned error can't find the container with id 63039d20fbef2482238acc5d0805378403ab13ead52b15d12b6fd9a01df9fe38 Mar 19 09:25:58 crc kubenswrapper[4835]: I0319 09:25:58.401572 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:25:58 crc kubenswrapper[4835]: I0319 09:25:58.401724 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:25:58 crc kubenswrapper[4835]: E0319 09:25:58.401797 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:25:58 crc kubenswrapper[4835]: E0319 09:25:58.401865 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:25:58 crc kubenswrapper[4835]: I0319 09:25:58.754538 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zk2hc" event={"ID":"92504380-cedb-436f-b211-c464a0fef5db","Type":"ContainerStarted","Data":"6d1ffe3bfe0a85c0336969ff66e9253b6d02a0196ac85a8491a68718e601c0a3"} Mar 19 09:25:58 crc kubenswrapper[4835]: I0319 09:25:58.754600 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zk2hc" event={"ID":"92504380-cedb-436f-b211-c464a0fef5db","Type":"ContainerStarted","Data":"63039d20fbef2482238acc5d0805378403ab13ead52b15d12b6fd9a01df9fe38"} Mar 19 09:25:58 crc kubenswrapper[4835]: I0319 09:25:58.775057 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zk2hc" podStartSLOduration=160.775037615 podStartE2EDuration="2m40.775037615s" podCreationTimestamp="2026-03-19 09:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:25:58.774087738 +0000 UTC m=+213.622686355" watchObservedRunningTime="2026-03-19 09:25:58.775037615 +0000 UTC m=+213.623636212" Mar 19 09:25:58 crc kubenswrapper[4835]: I0319 09:25:58.869212 4835 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 19 09:25:58 crc kubenswrapper[4835]: I0319 09:25:58.880659 4835 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 19 09:25:59 crc kubenswrapper[4835]: I0319 09:25:59.401673 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:25:59 crc kubenswrapper[4835]: I0319 09:25:59.402931 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:25:59 crc kubenswrapper[4835]: E0319 09:25:59.403281 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:25:59 crc kubenswrapper[4835]: E0319 09:25:59.403621 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:26:00 crc kubenswrapper[4835]: I0319 09:26:00.400849 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:26:00 crc kubenswrapper[4835]: E0319 09:26:00.401179 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:26:00 crc kubenswrapper[4835]: I0319 09:26:00.400958 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:26:00 crc kubenswrapper[4835]: E0319 09:26:00.401381 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:26:01 crc kubenswrapper[4835]: I0319 09:26:01.400975 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:26:01 crc kubenswrapper[4835]: I0319 09:26:01.400976 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:26:01 crc kubenswrapper[4835]: E0319 09:26:01.401117 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:26:01 crc kubenswrapper[4835]: E0319 09:26:01.401256 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:26:01 crc kubenswrapper[4835]: E0319 09:26:01.650189 4835 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 09:26:02 crc kubenswrapper[4835]: I0319 09:26:02.401720 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:26:02 crc kubenswrapper[4835]: I0319 09:26:02.401873 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:26:02 crc kubenswrapper[4835]: E0319 09:26:02.403056 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:26:02 crc kubenswrapper[4835]: E0319 09:26:02.403304 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:26:03 crc kubenswrapper[4835]: I0319 09:26:03.401645 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:26:03 crc kubenswrapper[4835]: I0319 09:26:03.401680 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:26:03 crc kubenswrapper[4835]: E0319 09:26:03.401861 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:26:03 crc kubenswrapper[4835]: E0319 09:26:03.401988 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:26:04 crc kubenswrapper[4835]: I0319 09:26:04.400982 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:26:04 crc kubenswrapper[4835]: I0319 09:26:04.401097 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:26:04 crc kubenswrapper[4835]: E0319 09:26:04.401187 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:26:04 crc kubenswrapper[4835]: E0319 09:26:04.401292 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:26:04 crc kubenswrapper[4835]: I0319 09:26:04.778625 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jl5x4_4ee35aaa-2819-432a-af95-f1078ad836fc/kube-multus/1.log" Mar 19 09:26:04 crc kubenswrapper[4835]: I0319 09:26:04.779372 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jl5x4_4ee35aaa-2819-432a-af95-f1078ad836fc/kube-multus/0.log" Mar 19 09:26:04 crc kubenswrapper[4835]: I0319 09:26:04.779439 4835 generic.go:334] "Generic (PLEG): container finished" podID="4ee35aaa-2819-432a-af95-f1078ad836fc" containerID="db21550c872812553e0b8009562ceb73b976b7505db3b7f995e014e7c5ec7124" exitCode=1 Mar 19 09:26:04 crc kubenswrapper[4835]: I0319 09:26:04.779480 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jl5x4" event={"ID":"4ee35aaa-2819-432a-af95-f1078ad836fc","Type":"ContainerDied","Data":"db21550c872812553e0b8009562ceb73b976b7505db3b7f995e014e7c5ec7124"} Mar 19 09:26:04 crc kubenswrapper[4835]: I0319 09:26:04.779525 4835 scope.go:117] "RemoveContainer" containerID="9e407e817466bbf7af02e2f32a66e04b607dd9c2744426709f93b14f29f8e24e" Mar 19 09:26:04 crc kubenswrapper[4835]: I0319 09:26:04.780721 4835 scope.go:117] "RemoveContainer" containerID="db21550c872812553e0b8009562ceb73b976b7505db3b7f995e014e7c5ec7124" Mar 19 09:26:04 crc kubenswrapper[4835]: E0319 
09:26:04.781366 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-jl5x4_openshift-multus(4ee35aaa-2819-432a-af95-f1078ad836fc)\"" pod="openshift-multus/multus-jl5x4" podUID="4ee35aaa-2819-432a-af95-f1078ad836fc" Mar 19 09:26:05 crc kubenswrapper[4835]: I0319 09:26:05.401429 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:26:05 crc kubenswrapper[4835]: I0319 09:26:05.401479 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:26:05 crc kubenswrapper[4835]: E0319 09:26:05.402102 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:26:05 crc kubenswrapper[4835]: E0319 09:26:05.402252 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:26:05 crc kubenswrapper[4835]: I0319 09:26:05.402654 4835 scope.go:117] "RemoveContainer" containerID="82a0b8e26d19af43ca8c4fa57d8add7c4c786e843bddb91f8cf39dad321bcd0a" Mar 19 09:26:05 crc kubenswrapper[4835]: I0319 09:26:05.784343 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jl5x4_4ee35aaa-2819-432a-af95-f1078ad836fc/kube-multus/1.log" Mar 19 09:26:05 crc kubenswrapper[4835]: I0319 09:26:05.787692 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qk6hn_2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b/ovnkube-controller/3.log" Mar 19 09:26:05 crc kubenswrapper[4835]: I0319 09:26:05.791270 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" event={"ID":"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b","Type":"ContainerStarted","Data":"f7beea1efc6f7ae2f60e690b578a0dc2d1eeda3a71590efa41f1df002699cf3c"} Mar 19 09:26:05 crc kubenswrapper[4835]: I0319 09:26:05.791624 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:26:05 crc kubenswrapper[4835]: I0319 09:26:05.831810 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" podStartSLOduration=166.831791844 podStartE2EDuration="2m46.831791844s" podCreationTimestamp="2026-03-19 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:05.830911649 +0000 UTC m=+220.679510246" watchObservedRunningTime="2026-03-19 09:26:05.831791844 +0000 UTC m=+220.680390431" Mar 19 09:26:06 crc kubenswrapper[4835]: I0319 09:26:06.277846 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vs6hx"] 
Mar 19 09:26:06 crc kubenswrapper[4835]: I0319 09:26:06.277945 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:26:06 crc kubenswrapper[4835]: E0319 09:26:06.278034 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:26:06 crc kubenswrapper[4835]: I0319 09:26:06.401733 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:26:06 crc kubenswrapper[4835]: E0319 09:26:06.404247 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:26:06 crc kubenswrapper[4835]: E0319 09:26:06.650840 4835 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 09:26:07 crc kubenswrapper[4835]: I0319 09:26:07.401131 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:26:07 crc kubenswrapper[4835]: I0319 09:26:07.401172 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:26:07 crc kubenswrapper[4835]: E0319 09:26:07.401318 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:26:07 crc kubenswrapper[4835]: E0319 09:26:07.401663 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:26:08 crc kubenswrapper[4835]: I0319 09:26:08.401069 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:26:08 crc kubenswrapper[4835]: I0319 09:26:08.401122 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:26:08 crc kubenswrapper[4835]: E0319 09:26:08.401290 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:26:08 crc kubenswrapper[4835]: E0319 09:26:08.401420 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:26:08 crc kubenswrapper[4835]: I0319 09:26:08.703801 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:08 crc kubenswrapper[4835]: I0319 09:26:08.703966 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:26:08 crc kubenswrapper[4835]: I0319 09:26:08.704014 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:26:08 crc kubenswrapper[4835]: E0319 09:26:08.704052 4835 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:28:10.704029474 +0000 UTC m=+345.552628061 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:08 crc kubenswrapper[4835]: E0319 09:26:08.704070 4835 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 09:26:08 crc kubenswrapper[4835]: I0319 09:26:08.704092 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:26:08 crc kubenswrapper[4835]: E0319 09:26:08.704150 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 09:28:10.704128877 +0000 UTC m=+345.552727504 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 09:26:08 crc kubenswrapper[4835]: E0319 09:26:08.704158 4835 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 09:26:08 crc kubenswrapper[4835]: I0319 09:26:08.704171 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:26:08 crc kubenswrapper[4835]: E0319 09:26:08.704232 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 09:28:10.704211409 +0000 UTC m=+345.552810036 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 09:26:08 crc kubenswrapper[4835]: E0319 09:26:08.704271 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 09:26:08 crc kubenswrapper[4835]: E0319 09:26:08.704283 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 09:26:08 crc kubenswrapper[4835]: E0319 09:26:08.704295 4835 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:26:08 crc kubenswrapper[4835]: E0319 09:26:08.704329 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 09:28:10.704321552 +0000 UTC m=+345.552920129 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:26:08 crc kubenswrapper[4835]: E0319 09:26:08.704441 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 09:26:08 crc kubenswrapper[4835]: E0319 09:26:08.704492 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 09:26:08 crc kubenswrapper[4835]: E0319 09:26:08.704514 4835 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:26:08 crc kubenswrapper[4835]: E0319 09:26:08.704603 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 09:28:10.704577419 +0000 UTC m=+345.553176046 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 09:26:09 crc kubenswrapper[4835]: I0319 09:26:09.401407 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:26:09 crc kubenswrapper[4835]: E0319 09:26:09.401734 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:26:09 crc kubenswrapper[4835]: I0319 09:26:09.401422 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:26:09 crc kubenswrapper[4835]: E0319 09:26:09.402070 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:26:10 crc kubenswrapper[4835]: I0319 09:26:10.401868 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:26:10 crc kubenswrapper[4835]: E0319 09:26:10.402072 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:26:10 crc kubenswrapper[4835]: I0319 09:26:10.402129 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:26:10 crc kubenswrapper[4835]: E0319 09:26:10.402310 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:26:11 crc kubenswrapper[4835]: I0319 09:26:11.401796 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:26:11 crc kubenswrapper[4835]: I0319 09:26:11.401800 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:26:11 crc kubenswrapper[4835]: E0319 09:26:11.402007 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:26:11 crc kubenswrapper[4835]: E0319 09:26:11.402103 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:26:11 crc kubenswrapper[4835]: E0319 09:26:11.652080 4835 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 09:26:12 crc kubenswrapper[4835]: I0319 09:26:12.401512 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:26:12 crc kubenswrapper[4835]: I0319 09:26:12.401622 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:26:12 crc kubenswrapper[4835]: E0319 09:26:12.401708 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:26:12 crc kubenswrapper[4835]: E0319 09:26:12.402035 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:26:13 crc kubenswrapper[4835]: I0319 09:26:13.401889 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:26:13 crc kubenswrapper[4835]: E0319 09:26:13.402035 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:26:13 crc kubenswrapper[4835]: I0319 09:26:13.401888 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:26:13 crc kubenswrapper[4835]: E0319 09:26:13.402311 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:26:14 crc kubenswrapper[4835]: I0319 09:26:14.401426 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:26:14 crc kubenswrapper[4835]: I0319 09:26:14.401454 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:26:14 crc kubenswrapper[4835]: E0319 09:26:14.401619 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:26:14 crc kubenswrapper[4835]: E0319 09:26:14.401811 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:26:15 crc kubenswrapper[4835]: I0319 09:26:15.527567 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:26:15 crc kubenswrapper[4835]: I0319 09:26:15.527642 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:26:15 crc kubenswrapper[4835]: I0319 09:26:15.527594 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:26:15 crc kubenswrapper[4835]: E0319 09:26:15.527783 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:26:15 crc kubenswrapper[4835]: E0319 09:26:15.527925 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:26:15 crc kubenswrapper[4835]: I0319 09:26:15.528011 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:26:15 crc kubenswrapper[4835]: E0319 09:26:15.528152 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:26:15 crc kubenswrapper[4835]: E0319 09:26:15.528307 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:26:16 crc kubenswrapper[4835]: E0319 09:26:16.652962 4835 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 09:26:17 crc kubenswrapper[4835]: I0319 09:26:17.401053 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:26:17 crc kubenswrapper[4835]: I0319 09:26:17.401051 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:26:17 crc kubenswrapper[4835]: E0319 09:26:17.401271 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:26:17 crc kubenswrapper[4835]: I0319 09:26:17.401093 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:26:17 crc kubenswrapper[4835]: I0319 09:26:17.401084 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:26:17 crc kubenswrapper[4835]: E0319 09:26:17.401430 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:26:17 crc kubenswrapper[4835]: E0319 09:26:17.401561 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:26:17 crc kubenswrapper[4835]: E0319 09:26:17.401730 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:26:18 crc kubenswrapper[4835]: I0319 09:26:18.402024 4835 scope.go:117] "RemoveContainer" containerID="db21550c872812553e0b8009562ceb73b976b7505db3b7f995e014e7c5ec7124" Mar 19 09:26:18 crc kubenswrapper[4835]: I0319 09:26:18.839192 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jl5x4_4ee35aaa-2819-432a-af95-f1078ad836fc/kube-multus/1.log" Mar 19 09:26:18 crc kubenswrapper[4835]: I0319 09:26:18.839537 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jl5x4" event={"ID":"4ee35aaa-2819-432a-af95-f1078ad836fc","Type":"ContainerStarted","Data":"e9fa86028b1f00abb5fe0bde1d574b1a2d6aae726bddff744f4b1ac500cc935d"} Mar 19 09:26:19 crc kubenswrapper[4835]: I0319 09:26:19.401776 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:26:19 crc kubenswrapper[4835]: I0319 09:26:19.401819 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:26:19 crc kubenswrapper[4835]: I0319 09:26:19.401843 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:26:19 crc kubenswrapper[4835]: I0319 09:26:19.401894 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:26:19 crc kubenswrapper[4835]: E0319 09:26:19.402021 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:26:19 crc kubenswrapper[4835]: E0319 09:26:19.402281 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:26:19 crc kubenswrapper[4835]: E0319 09:26:19.402306 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:26:19 crc kubenswrapper[4835]: E0319 09:26:19.402384 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:26:21 crc kubenswrapper[4835]: I0319 09:26:21.401535 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:26:21 crc kubenswrapper[4835]: I0319 09:26:21.401595 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:26:21 crc kubenswrapper[4835]: I0319 09:26:21.401650 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:26:21 crc kubenswrapper[4835]: I0319 09:26:21.401558 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:26:21 crc kubenswrapper[4835]: E0319 09:26:21.401720 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:26:21 crc kubenswrapper[4835]: E0319 09:26:21.401910 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:26:21 crc kubenswrapper[4835]: E0319 09:26:21.402104 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vs6hx" podUID="7f0101ce-52a3-4e5b-8fcd-c19020fb071a" Mar 19 09:26:21 crc kubenswrapper[4835]: E0319 09:26:21.402224 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:26:23 crc kubenswrapper[4835]: I0319 09:26:23.401827 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:26:23 crc kubenswrapper[4835]: I0319 09:26:23.401885 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx" Mar 19 09:26:23 crc kubenswrapper[4835]: I0319 09:26:23.401887 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:26:23 crc kubenswrapper[4835]: I0319 09:26:23.402080 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 09:26:23 crc kubenswrapper[4835]: I0319 09:26:23.405686 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 19 09:26:23 crc kubenswrapper[4835]: I0319 09:26:23.406097 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 19 09:26:23 crc kubenswrapper[4835]: I0319 09:26:23.406207 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 19 09:26:23 crc kubenswrapper[4835]: I0319 09:26:23.407997 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 19 09:26:23 crc kubenswrapper[4835]: I0319 09:26:23.408940 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 19 09:26:23 crc kubenswrapper[4835]: I0319 09:26:23.409986 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 19 09:26:26 crc kubenswrapper[4835]: I0319 09:26:26.766797 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7f0101ce-52a3-4e5b-8fcd-c19020fb071a-metrics-certs\") pod \"network-metrics-daemon-vs6hx\" (UID: \"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\") " pod="openshift-multus/network-metrics-daemon-vs6hx"
Mar 19 09:26:26 crc kubenswrapper[4835]: I0319 09:26:26.769360 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 19 09:26:26 crc kubenswrapper[4835]: I0319 09:26:26.785687 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7f0101ce-52a3-4e5b-8fcd-c19020fb071a-metrics-certs\") pod \"network-metrics-daemon-vs6hx\" (UID: \"7f0101ce-52a3-4e5b-8fcd-c19020fb071a\") " pod="openshift-multus/network-metrics-daemon-vs6hx"
Mar 19 09:26:27 crc kubenswrapper[4835]: I0319 09:26:27.047612 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 19 09:26:27 crc kubenswrapper[4835]: I0319 09:26:27.055687 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vs6hx"
Mar 19 09:26:27 crc kubenswrapper[4835]: I0319 09:26:27.312167 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vs6hx"]
Mar 19 09:26:27 crc kubenswrapper[4835]: I0319 09:26:27.872665 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" event={"ID":"7f0101ce-52a3-4e5b-8fcd-c19020fb071a","Type":"ContainerStarted","Data":"53544c4daee20519d4e957751429cf02a8213ecf75d1f92546edad05b1ec15f3"}
Mar 19 09:26:27 crc kubenswrapper[4835]: I0319 09:26:27.872990 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" event={"ID":"7f0101ce-52a3-4e5b-8fcd-c19020fb071a","Type":"ContainerStarted","Data":"0ada677d30e042fa8c9b862a76ff83694e916b64407efcca72352f32c8bf0c8c"}
Mar 19 09:26:27 crc kubenswrapper[4835]: I0319 09:26:27.873004 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vs6hx" event={"ID":"7f0101ce-52a3-4e5b-8fcd-c19020fb071a","Type":"ContainerStarted","Data":"bf043e6a2699c60c393e92b9c694fd97fb93a76d71b2e819fc162e76d51f1882"}
Mar 19 09:26:27 crc kubenswrapper[4835]: I0319 09:26:27.885002 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-vs6hx" podStartSLOduration=189.884987605 podStartE2EDuration="3m9.884987605s" podCreationTimestamp="2026-03-19 09:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:27.884409449 +0000 UTC m=+242.733008086" watchObservedRunningTime="2026-03-19 09:26:27.884987605 +0000 UTC m=+242.733586202"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.434259 4835 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.510121 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-49lsz"]
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.511202 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-49lsz"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.512618 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-djtzl"]
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.513273 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-djtzl"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.515087 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.515138 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gg2zr"]
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.516266 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-gg2zr"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.517490 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m"]
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.518393 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.520009 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fbsvh"]
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.520220 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.520636 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fbsvh"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.520884 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.522000 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.522272 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.523655 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.525036 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.525774 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.527075 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.527883 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.533847 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.533991 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.534538 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.546480 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.546846 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.547101 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.547149 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.547180 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.547552 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.547179 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.547958 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.547191 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.548502 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.548630 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.548635 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.548663 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.548859 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.548936 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.548997 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.549080 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.549132 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.549187 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.549317 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.549394 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.549494 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.549620 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.552061 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.552338 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bg8tj"]
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.553455 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bg8tj"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.555625 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-pw4cg"]
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.572663 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.574831 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.579557 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vrd5p"]
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.594540 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-pw4cg"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.595081 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.595585 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrd5p"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.603300 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-svxz5"]
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.603722 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-w457v"]
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.604081 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-w457v"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.604151 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.604437 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.604701 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-svxz5"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.605308 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.605575 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.607299 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.607456 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.609823 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.615053 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.615414 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.615734 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.615917 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.616077 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.616303 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-4tpxz"]
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.616774 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9phkm"]
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.617132 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9phkm"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.617225 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-4tpxz"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.618302 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.619722 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.619830 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.619958 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.620031 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.620279 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.620334 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.620480 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.620540 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.620628 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.620670 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.620785 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.620839 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.621187 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.621467 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.621918 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.622134 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.626473 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-vzxqb"]
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.627002 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-vzxqb"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.631391 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.632630 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.635378 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.635536 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.635619 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.635786 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.635940 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.635976 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.636126 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.636429 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-m6s8l"]
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.636464 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.636621 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.636912 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-m6s8l"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.643551 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nq"]
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.644163 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nq"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.651384 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.651606 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.653079 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.653404 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.653555 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.654938 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.655079 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.655777 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.658095 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.658293 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.658343 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.658732 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.659196 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.659720 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-mljh4"]
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.660300 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-mljh4"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.687389 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.697469 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-cwlsr"]
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.709492 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-hm9hx"]
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.709844 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gf5f4"]
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.710259 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gf5f4"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.711539 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.711783 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7"]
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.712352 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.712628 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.712875 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.713057 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.713089 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.713091 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.712627 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.713313 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1573ddf0-1418-4b22-afd8-eab0c11e0fe0-machine-approver-tls\") pod \"machine-approver-56656f9798-49lsz\" (UID: \"1573ddf0-1418-4b22-afd8-eab0c11e0fe0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-49lsz"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.713347 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.713603 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cwlsr"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.713347 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f443320-81b9-4f4e-a12b-62023d6074ef-config\") pod \"controller-manager-879f6c89f-fbsvh\" (UID: \"9f443320-81b9-4f4e-a12b-62023d6074ef\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fbsvh"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.713849 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hm9hx"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.713941 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skj6b\" (UniqueName: \"kubernetes.io/projected/d111a8cb-8053-4b32-a9d0-8325de3c057f-kube-api-access-skj6b\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.714782 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.715033 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhc79\" (UniqueName: \"kubernetes.io/projected/b9eb63f6-7fbc-4f82-946b-8844751bb402-kube-api-access-rhc79\") pod \"openshift-config-operator-7777fb866f-vrd5p\" (UID: \"b9eb63f6-7fbc-4f82-946b-8844751bb402\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrd5p"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.715075 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.715211 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.715293 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1573ddf0-1418-4b22-afd8-eab0c11e0fe0-auth-proxy-config\") pod \"machine-approver-56656f9798-49lsz\" (UID: \"1573ddf0-1418-4b22-afd8-eab0c11e0fe0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-49lsz"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.715331 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/232fe661-d645-4591-864d-00acac127390-audit-dir\") pod \"apiserver-7bbb656c7d-pww7m\" (UID: \"232fe661-d645-4591-864d-00acac127390\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.715368 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6r2j\" (UniqueName: \"kubernetes.io/projected/00ca197e-7ecb-49f7-b477-f1e5ffaab8c1-kube-api-access-s6r2j\") pod \"openshift-apiserver-operator-796bbdcf4f-w457v\" (UID: \"00ca197e-7ecb-49f7-b477-f1e5ffaab8c1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-w457v"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.715396 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfdfs\" (UniqueName: \"kubernetes.io/projected/4e16135b-13e3-4236-9fa7-94ceb99131e2-kube-api-access-vfdfs\") pod \"machine-api-operator-5694c8668f-pw4cg\" (UID: \"4e16135b-13e3-4236-9fa7-94ceb99131e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pw4cg"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.715418 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e16135b-13e3-4236-9fa7-94ceb99131e2-config\") pod \"machine-api-operator-5694c8668f-pw4cg\" (UID: \"4e16135b-13e3-4236-9fa7-94ceb99131e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pw4cg"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.715473 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.715510 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f443320-81b9-4f4e-a12b-62023d6074ef-serving-cert\") pod \"controller-manager-879f6c89f-fbsvh\" (UID: \"9f443320-81b9-4f4e-a12b-62023d6074ef\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fbsvh"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.715537 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/232fe661-d645-4591-864d-00acac127390-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-pww7m\" (UID: \"232fe661-d645-4591-864d-00acac127390\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.715561 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00ca197e-7ecb-49f7-b477-f1e5ffaab8c1-config\") pod \"openshift-apiserver-operator-796bbdcf4f-w457v\" (UID: \"00ca197e-7ecb-49f7-b477-f1e5ffaab8c1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-w457v"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.715805 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/31ed1214-265d-4179-b4f4-f77dedb20eb4-etcd-serving-ca\") pod \"apiserver-76f77b778f-gg2zr\" (UID: \"31ed1214-265d-4179-b4f4-f77dedb20eb4\") " pod="openshift-apiserver/apiserver-76f77b778f-gg2zr"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.716554 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-sb8zm"]
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.716635 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7490a09e-a8be-4931-a282-38989ba640b3-service-ca-bundle\") pod \"authentication-operator-69f744f599-djtzl\" (UID: \"7490a09e-a8be-4931-a282-38989ba640b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-djtzl"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.717045 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/232fe661-d645-4591-864d-00acac127390-serving-cert\") pod \"apiserver-7bbb656c7d-pww7m\" (UID: \"232fe661-d645-4591-864d-00acac127390\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.717092 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7490a09e-a8be-4931-a282-38989ba640b3-config\") pod \"authentication-operator-69f744f599-djtzl\" (UID: \"7490a09e-a8be-4931-a282-38989ba640b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-djtzl"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.717156 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/232fe661-d645-4591-864d-00acac127390-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-pww7m\" (UID: \"232fe661-d645-4591-864d-00acac127390\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.717183 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7490a09e-a8be-4931-a282-38989ba640b3-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-djtzl\" (UID: \"7490a09e-a8be-4931-a282-38989ba640b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-djtzl"
Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.717239 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1573ddf0-1418-4b22-afd8-eab0c11e0fe0-config\") pod \"machine-approver-56656f9798-49lsz\" (UID: \"1573ddf0-1418-4b22-afd8-eab0c11e0fe0\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-49lsz" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.717269 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00ca197e-7ecb-49f7-b477-f1e5ffaab8c1-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-w457v\" (UID: \"00ca197e-7ecb-49f7-b477-f1e5ffaab8c1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-w457v" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.717311 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/31ed1214-265d-4179-b4f4-f77dedb20eb4-node-pullsecrets\") pod \"apiserver-76f77b778f-gg2zr\" (UID: \"31ed1214-265d-4179-b4f4-f77dedb20eb4\") " pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.717351 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sb8zm" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.717478 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-dzx2v"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.717353 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b9eb63f6-7fbc-4f82-946b-8844751bb402-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vrd5p\" (UID: \"b9eb63f6-7fbc-4f82-946b-8844751bb402\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrd5p" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.717538 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/232fe661-d645-4591-864d-00acac127390-encryption-config\") pod \"apiserver-7bbb656c7d-pww7m\" (UID: \"232fe661-d645-4591-864d-00acac127390\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.717736 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.718129 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-dzx2v" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.718246 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.718317 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbjl4\" (UniqueName: \"kubernetes.io/projected/31ed1214-265d-4179-b4f4-f77dedb20eb4-kube-api-access-jbjl4\") pod \"apiserver-76f77b778f-gg2zr\" (UID: \"31ed1214-265d-4179-b4f4-f77dedb20eb4\") " pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.718359 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2rzc\" (UniqueName: \"kubernetes.io/projected/232fe661-d645-4591-864d-00acac127390-kube-api-access-n2rzc\") pod \"apiserver-7bbb656c7d-pww7m\" (UID: \"232fe661-d645-4591-864d-00acac127390\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.722128 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31ed1214-265d-4179-b4f4-f77dedb20eb4-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gg2zr\" (UID: \"31ed1214-265d-4179-b4f4-f77dedb20eb4\") " pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.722156 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31ed1214-265d-4179-b4f4-f77dedb20eb4-config\") pod \"apiserver-76f77b778f-gg2zr\" (UID: \"31ed1214-265d-4179-b4f4-f77dedb20eb4\") " pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 
09:26:28.722182 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4e16135b-13e3-4236-9fa7-94ceb99131e2-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-pw4cg\" (UID: \"4e16135b-13e3-4236-9fa7-94ceb99131e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pw4cg" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.722227 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.722249 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.722282 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.722307 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.722330 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9eb63f6-7fbc-4f82-946b-8844751bb402-serving-cert\") pod \"openshift-config-operator-7777fb866f-vrd5p\" (UID: \"b9eb63f6-7fbc-4f82-946b-8844751bb402\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrd5p" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.722351 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/31ed1214-265d-4179-b4f4-f77dedb20eb4-audit-dir\") pod \"apiserver-76f77b778f-gg2zr\" (UID: \"31ed1214-265d-4179-b4f4-f77dedb20eb4\") " pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.722372 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcldd\" (UniqueName: \"kubernetes.io/projected/7490a09e-a8be-4931-a282-38989ba640b3-kube-api-access-fcldd\") pod \"authentication-operator-69f744f599-djtzl\" (UID: \"7490a09e-a8be-4931-a282-38989ba640b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-djtzl" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.722400 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d111a8cb-8053-4b32-a9d0-8325de3c057f-audit-policies\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.722421 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d111a8cb-8053-4b32-a9d0-8325de3c057f-audit-dir\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.722443 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.722465 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/232fe661-d645-4591-864d-00acac127390-etcd-client\") pod \"apiserver-7bbb656c7d-pww7m\" (UID: \"232fe661-d645-4591-864d-00acac127390\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.722499 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-cpfq7"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.722542 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lscm\" (UniqueName: \"kubernetes.io/projected/49daa1c0-0b39-4ebf-9e81-38885e24bc4d-kube-api-access-8lscm\") pod \"cluster-samples-operator-665b6dd947-bg8tj\" (UID: \"49daa1c0-0b39-4ebf-9e81-38885e24bc4d\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bg8tj" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.722569 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.722604 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f443320-81b9-4f4e-a12b-62023d6074ef-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fbsvh\" (UID: \"9f443320-81b9-4f4e-a12b-62023d6074ef\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fbsvh" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.722627 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/31ed1214-265d-4179-b4f4-f77dedb20eb4-image-import-ca\") pod \"apiserver-76f77b778f-gg2zr\" (UID: \"31ed1214-265d-4179-b4f4-f77dedb20eb4\") " pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.722645 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/31ed1214-265d-4179-b4f4-f77dedb20eb4-encryption-config\") pod \"apiserver-76f77b778f-gg2zr\" (UID: \"31ed1214-265d-4179-b4f4-f77dedb20eb4\") " pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.722672 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/232fe661-d645-4591-864d-00acac127390-audit-policies\") pod \"apiserver-7bbb656c7d-pww7m\" (UID: \"232fe661-d645-4591-864d-00acac127390\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.722695 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/49daa1c0-0b39-4ebf-9e81-38885e24bc4d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-bg8tj\" (UID: \"49daa1c0-0b39-4ebf-9e81-38885e24bc4d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bg8tj" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.722767 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.722802 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/31ed1214-265d-4179-b4f4-f77dedb20eb4-etcd-client\") pod \"apiserver-76f77b778f-gg2zr\" (UID: \"31ed1214-265d-4179-b4f4-f77dedb20eb4\") " pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.722825 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4e16135b-13e3-4236-9fa7-94ceb99131e2-images\") pod \"machine-api-operator-5694c8668f-pw4cg\" (UID: \"4e16135b-13e3-4236-9fa7-94ceb99131e2\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-pw4cg" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.722845 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.722868 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/31ed1214-265d-4179-b4f4-f77dedb20eb4-audit\") pod \"apiserver-76f77b778f-gg2zr\" (UID: \"31ed1214-265d-4179-b4f4-f77dedb20eb4\") " pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.722910 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46lfr\" (UniqueName: \"kubernetes.io/projected/1573ddf0-1418-4b22-afd8-eab0c11e0fe0-kube-api-access-46lfr\") pod \"machine-approver-56656f9798-49lsz\" (UID: \"1573ddf0-1418-4b22-afd8-eab0c11e0fe0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-49lsz" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.722934 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/31ed1214-265d-4179-b4f4-f77dedb20eb4-serving-cert\") pod \"apiserver-76f77b778f-gg2zr\" (UID: \"31ed1214-265d-4179-b4f4-f77dedb20eb4\") " pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.722979 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7490a09e-a8be-4931-a282-38989ba640b3-serving-cert\") pod \"authentication-operator-69f744f599-djtzl\" (UID: \"7490a09e-a8be-4931-a282-38989ba640b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-djtzl" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.723028 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f443320-81b9-4f4e-a12b-62023d6074ef-client-ca\") pod \"controller-manager-879f6c89f-fbsvh\" (UID: \"9f443320-81b9-4f4e-a12b-62023d6074ef\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fbsvh" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.723053 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hvpt\" (UniqueName: \"kubernetes.io/projected/9f443320-81b9-4f4e-a12b-62023d6074ef-kube-api-access-8hvpt\") pod \"controller-manager-879f6c89f-fbsvh\" (UID: \"9f443320-81b9-4f4e-a12b-62023d6074ef\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fbsvh" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.723066 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-cpfq7" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.723174 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v55z2"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.723639 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v55z2" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.733600 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565195-xtph2"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.734382 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565195-xtph2" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.735214 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565204-mwv2v"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.735597 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565204-mwv2v" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.739453 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7mr2d"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.739803 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qnlkj"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.740066 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-bpqll"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.740896 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-7mr2d" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.740859 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-bpqll" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.741143 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qnlkj" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.742503 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cvbbw"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.743083 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.743108 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-kj82v"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.743614 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cvbbw" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.744487 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7sdz9"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.744811 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7sdz9" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.744977 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kj82v" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.746159 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-56v85"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.746521 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-56v85" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.747956 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q886g"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.748567 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q886g" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.750470 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.750986 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.752083 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-tl8b7"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.752672 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tl8b7" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.753197 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.754123 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-46rgq"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.755209 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ntrvm"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.755653 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ntrvm" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.756799 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p9l9"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.757321 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p9l9" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.759522 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.759521 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565206-wgvsr"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.760159 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565206-wgvsr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.760559 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-djtzl"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.762531 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fbsvh"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.763713 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gg2zr"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.769586 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-pw4cg"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.772945 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-w457v"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.774547 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bg8tj"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.775016 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.778557 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vrd5p"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.780649 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-8sbc8"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.782024 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8sbc8" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.782406 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.783455 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-4tpxz"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.784460 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nq"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.786996 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-tl8b7"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.788035 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-m6s8l"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.789112 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7mr2d"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.790429 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cvbbw"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.791459 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-dzx2v"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.792563 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-vzxqb"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.792675 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 19 09:26:28 
crc kubenswrapper[4835]: I0319 09:26:28.793671 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gf5f4"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.794794 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-cwlsr"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.795799 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.796786 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9phkm"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.797847 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v55z2"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.798928 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-bpqll"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.800513 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-hm9hx"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.801602 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-7wwdm"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.802372 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7wwdm" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.803579 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-cl8p7"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.804507 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-cl8p7" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.804628 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-svxz5"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.805790 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qnlkj"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.806834 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-46rgq"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.807896 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7sdz9"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.808973 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q886g"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.810007 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565195-xtph2"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.810983 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565204-mwv2v"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.812038 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.813263 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7wwdm"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.813894 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.814590 4835 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-kj82v"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.815662 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-cpfq7"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.816897 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ntrvm"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.818979 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-56v85"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.820781 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-sb8zm"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.822484 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cl8p7"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.823495 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f-tmpfs\") pod \"packageserver-d55dfcdfc-6lp28\" (UID: \"0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.823528 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4cd6d8ad-9a41-4796-9684-2b6e09675bd9-srv-cert\") pod \"catalog-operator-68c6474976-9qxb7\" (UID: \"4cd6d8ad-9a41-4796-9684-2b6e09675bd9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.823577 
4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2rzc\" (UniqueName: \"kubernetes.io/projected/232fe661-d645-4591-864d-00acac127390-kube-api-access-n2rzc\") pod \"apiserver-7bbb656c7d-pww7m\" (UID: \"232fe661-d645-4591-864d-00acac127390\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.823647 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3bd5062-5030-4847-a7c8-0671269debfe-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-9phkm\" (UID: \"c3bd5062-5030-4847-a7c8-0671269debfe\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9phkm" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.823672 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/540880cb-fdb4-4672-81e3-60adfd584bdf-default-certificate\") pod \"router-default-5444994796-mljh4\" (UID: \"540880cb-fdb4-4672-81e3-60adfd584bdf\") " pod="openshift-ingress/router-default-5444994796-mljh4" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.823698 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx2j7\" (UniqueName: \"kubernetes.io/projected/cc6bb0cd-bbc5-4464-86b7-41ce95891f07-kube-api-access-bx2j7\") pod \"service-ca-operator-777779d784-hm9hx\" (UID: \"cc6bb0cd-bbc5-4464-86b7-41ce95891f07\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hm9hx" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.823724 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqkbs\" (UniqueName: 
\"kubernetes.io/projected/0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f-kube-api-access-fqkbs\") pod \"packageserver-d55dfcdfc-6lp28\" (UID: \"0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.823793 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31ed1214-265d-4179-b4f4-f77dedb20eb4-config\") pod \"apiserver-76f77b778f-gg2zr\" (UID: \"31ed1214-265d-4179-b4f4-f77dedb20eb4\") " pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.823835 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4e16135b-13e3-4236-9fa7-94ceb99131e2-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-pw4cg\" (UID: \"4e16135b-13e3-4236-9fa7-94ceb99131e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pw4cg" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.823860 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-service-ca\") pod \"console-f9d7485db-m6s8l\" (UID: \"c32cf7e2-4523-41b3-b8a8-7564cca67f4a\") " pod="openshift-console/console-f9d7485db-m6s8l" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.823884 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2fb87b5-ea97-4d21-9129-e1cf0db01fea-config\") pod \"route-controller-manager-6576b87f9c-qnlkj\" (UID: \"c2fb87b5-ea97-4d21-9129-e1cf0db01fea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qnlkj" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.823922 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/31ed1214-265d-4179-b4f4-f77dedb20eb4-audit-dir\") pod \"apiserver-76f77b778f-gg2zr\" (UID: \"31ed1214-265d-4179-b4f4-f77dedb20eb4\") " pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.823948 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fcph\" (UniqueName: \"kubernetes.io/projected/98617442-57ff-4996-b772-1639551ecc89-kube-api-access-9fcph\") pod \"dns-default-cl8p7\" (UID: \"98617442-57ff-4996-b772-1639551ecc89\") " pod="openshift-dns/dns-default-cl8p7" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824006 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d111a8cb-8053-4b32-a9d0-8325de3c057f-audit-policies\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824033 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824051 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/31ed1214-265d-4179-b4f4-f77dedb20eb4-audit-dir\") pod \"apiserver-76f77b778f-gg2zr\" (UID: \"31ed1214-265d-4179-b4f4-f77dedb20eb4\") " pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 
09:26:28.824060 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc6bb0cd-bbc5-4464-86b7-41ce95891f07-serving-cert\") pod \"service-ca-operator-777779d784-hm9hx\" (UID: \"cc6bb0cd-bbc5-4464-86b7-41ce95891f07\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hm9hx" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824119 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/232fe661-d645-4591-864d-00acac127390-etcd-client\") pod \"apiserver-7bbb656c7d-pww7m\" (UID: \"232fe661-d645-4591-864d-00acac127390\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824144 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824166 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c03f3929-621d-4627-9d85-243b6cc02240-etcd-client\") pod \"etcd-operator-b45778765-cpfq7\" (UID: \"c03f3929-621d-4627-9d85-243b6cc02240\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cpfq7" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824184 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f443320-81b9-4f4e-a12b-62023d6074ef-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fbsvh\" (UID: \"9f443320-81b9-4f4e-a12b-62023d6074ef\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-fbsvh" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824203 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/31ed1214-265d-4179-b4f4-f77dedb20eb4-encryption-config\") pod \"apiserver-76f77b778f-gg2zr\" (UID: \"31ed1214-265d-4179-b4f4-f77dedb20eb4\") " pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824221 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/232fe661-d645-4591-864d-00acac127390-audit-policies\") pod \"apiserver-7bbb656c7d-pww7m\" (UID: \"232fe661-d645-4591-864d-00acac127390\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824237 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6d8b5f04-3b9c-4656-bf70-203d508949a0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-q886g\" (UID: \"6d8b5f04-3b9c-4656-bf70-203d508949a0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q886g" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824253 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824272 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4e16135b-13e3-4236-9fa7-94ceb99131e2-images\") pod 
\"machine-api-operator-5694c8668f-pw4cg\" (UID: \"4e16135b-13e3-4236-9fa7-94ceb99131e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pw4cg" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824288 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abdeec51-5ca9-442c-9937-67dd8f50d88d-serving-cert\") pod \"console-operator-58897d9998-4tpxz\" (UID: \"abdeec51-5ca9-442c-9937-67dd8f50d88d\") " pod="openshift-console-operator/console-operator-58897d9998-4tpxz" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824308 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824323 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/31ed1214-265d-4179-b4f4-f77dedb20eb4-audit\") pod \"apiserver-76f77b778f-gg2zr\" (UID: \"31ed1214-265d-4179-b4f4-f77dedb20eb4\") " pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824340 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98617442-57ff-4996-b772-1639551ecc89-config-volume\") pod \"dns-default-cl8p7\" (UID: \"98617442-57ff-4996-b772-1639551ecc89\") " pod="openshift-dns/dns-default-cl8p7" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824363 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6k5jq\" (UniqueName: 
\"kubernetes.io/projected/c3bd5062-5030-4847-a7c8-0671269debfe-kube-api-access-6k5jq\") pod \"openshift-controller-manager-operator-756b6f6bc6-9phkm\" (UID: \"c3bd5062-5030-4847-a7c8-0671269debfe\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9phkm" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824383 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/540880cb-fdb4-4672-81e3-60adfd584bdf-service-ca-bundle\") pod \"router-default-5444994796-mljh4\" (UID: \"540880cb-fdb4-4672-81e3-60adfd584bdf\") " pod="openshift-ingress/router-default-5444994796-mljh4" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824401 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c03f3929-621d-4627-9d85-243b6cc02240-serving-cert\") pod \"etcd-operator-b45778765-cpfq7\" (UID: \"c03f3929-621d-4627-9d85-243b6cc02240\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cpfq7" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824416 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4588x\" (UniqueName: \"kubernetes.io/projected/5fdabffe-d734-4b9a-8fea-d1608dcef7a2-kube-api-access-4588x\") pod \"control-plane-machine-set-operator-78cbb6b69f-v55z2\" (UID: \"5fdabffe-d734-4b9a-8fea-d1608dcef7a2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v55z2" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824432 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd152343-688c-493c-8cee-9d498a043182-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-gf5f4\" (UID: \"bd152343-688c-493c-8cee-9d498a043182\") 
" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gf5f4" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824461 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f443320-81b9-4f4e-a12b-62023d6074ef-client-ca\") pod \"controller-manager-879f6c89f-fbsvh\" (UID: \"9f443320-81b9-4f4e-a12b-62023d6074ef\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fbsvh" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824478 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hvpt\" (UniqueName: \"kubernetes.io/projected/9f443320-81b9-4f4e-a12b-62023d6074ef-kube-api-access-8hvpt\") pod \"controller-manager-879f6c89f-fbsvh\" (UID: \"9f443320-81b9-4f4e-a12b-62023d6074ef\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fbsvh" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824492 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f443320-81b9-4f4e-a12b-62023d6074ef-config\") pod \"controller-manager-879f6c89f-fbsvh\" (UID: \"9f443320-81b9-4f4e-a12b-62023d6074ef\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fbsvh" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824513 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824531 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824533 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31ed1214-265d-4179-b4f4-f77dedb20eb4-config\") pod \"apiserver-76f77b778f-gg2zr\" (UID: \"31ed1214-265d-4179-b4f4-f77dedb20eb4\") " pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824549 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/540880cb-fdb4-4672-81e3-60adfd584bdf-stats-auth\") pod \"router-default-5444994796-mljh4\" (UID: \"540880cb-fdb4-4672-81e3-60adfd584bdf\") " pod="openshift-ingress/router-default-5444994796-mljh4" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824567 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/232fe661-d645-4591-864d-00acac127390-audit-dir\") pod \"apiserver-7bbb656c7d-pww7m\" (UID: \"232fe661-d645-4591-864d-00acac127390\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824588 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e16135b-13e3-4236-9fa7-94ceb99131e2-config\") pod \"machine-api-operator-5694c8668f-pw4cg\" (UID: \"4e16135b-13e3-4236-9fa7-94ceb99131e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pw4cg" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824611 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-console-oauth-config\") pod \"console-f9d7485db-m6s8l\" (UID: \"c32cf7e2-4523-41b3-b8a8-7564cca67f4a\") " pod="openshift-console/console-f9d7485db-m6s8l" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824632 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f-apiservice-cert\") pod \"packageserver-d55dfcdfc-6lp28\" (UID: \"0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824650 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00ca197e-7ecb-49f7-b477-f1e5ffaab8c1-config\") pod \"openshift-apiserver-operator-796bbdcf4f-w457v\" (UID: \"00ca197e-7ecb-49f7-b477-f1e5ffaab8c1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-w457v" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824670 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpzdl\" (UniqueName: \"kubernetes.io/projected/d49c138a-fe33-4b41-991a-af15d6bf286d-kube-api-access-cpzdl\") pod \"multus-admission-controller-857f4d67dd-dzx2v\" (UID: \"d49c138a-fe33-4b41-991a-af15d6bf286d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dzx2v" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824686 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2fb87b5-ea97-4d21-9129-e1cf0db01fea-serving-cert\") pod \"route-controller-manager-6576b87f9c-qnlkj\" (UID: \"c2fb87b5-ea97-4d21-9129-e1cf0db01fea\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qnlkj" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824704 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/31ed1214-265d-4179-b4f4-f77dedb20eb4-etcd-serving-ca\") pod \"apiserver-76f77b778f-gg2zr\" (UID: \"31ed1214-265d-4179-b4f4-f77dedb20eb4\") " pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824735 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6dzs\" (UniqueName: \"kubernetes.io/projected/884a59ba-e2aa-42a5-87ba-cff1f2c1acea-kube-api-access-g6dzs\") pod \"ingress-operator-5b745b69d9-cwlsr\" (UID: \"884a59ba-e2aa-42a5-87ba-cff1f2c1acea\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cwlsr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824889 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d111a8cb-8053-4b32-a9d0-8325de3c057f-audit-policies\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.824986 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/31ed1214-265d-4179-b4f4-f77dedb20eb4-audit\") pod \"apiserver-76f77b778f-gg2zr\" (UID: \"31ed1214-265d-4179-b4f4-f77dedb20eb4\") " pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.825569 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f443320-81b9-4f4e-a12b-62023d6074ef-client-ca\") pod \"controller-manager-879f6c89f-fbsvh\" (UID: 
\"9f443320-81b9-4f4e-a12b-62023d6074ef\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fbsvh" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.826922 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7490a09e-a8be-4931-a282-38989ba640b3-config\") pod \"authentication-operator-69f744f599-djtzl\" (UID: \"7490a09e-a8be-4931-a282-38989ba640b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-djtzl" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.826994 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd152343-688c-493c-8cee-9d498a043182-config\") pod \"kube-apiserver-operator-766d6c64bb-gf5f4\" (UID: \"bd152343-688c-493c-8cee-9d498a043182\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gf5f4" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.827022 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/abdeec51-5ca9-442c-9937-67dd8f50d88d-trusted-ca\") pod \"console-operator-58897d9998-4tpxz\" (UID: \"abdeec51-5ca9-442c-9937-67dd8f50d88d\") " pod="openshift-console-operator/console-operator-58897d9998-4tpxz" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.827067 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee1c4052-bbfe-4c1d-82be-10ca3e2c0212-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gx7nq\" (UID: \"ee1c4052-bbfe-4c1d-82be-10ca3e2c0212\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nq" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.827091 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c03f3929-621d-4627-9d85-243b6cc02240-etcd-service-ca\") pod \"etcd-operator-b45778765-cpfq7\" (UID: \"c03f3929-621d-4627-9d85-243b6cc02240\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cpfq7" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.827119 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00ca197e-7ecb-49f7-b477-f1e5ffaab8c1-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-w457v\" (UID: \"00ca197e-7ecb-49f7-b477-f1e5ffaab8c1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-w457v" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.827142 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/31ed1214-265d-4179-b4f4-f77dedb20eb4-node-pullsecrets\") pod \"apiserver-76f77b778f-gg2zr\" (UID: \"31ed1214-265d-4179-b4f4-f77dedb20eb4\") " pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.827164 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzbxv\" (UniqueName: \"kubernetes.io/projected/ee1c4052-bbfe-4c1d-82be-10ca3e2c0212-kube-api-access-gzbxv\") pod \"cluster-image-registry-operator-dc59b4c8b-gx7nq\" (UID: \"ee1c4052-bbfe-4c1d-82be-10ca3e2c0212\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nq" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.827196 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/232fe661-d645-4591-864d-00acac127390-encryption-config\") pod \"apiserver-7bbb656c7d-pww7m\" (UID: \"232fe661-d645-4591-864d-00acac127390\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.827218 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3bd5062-5030-4847-a7c8-0671269debfe-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-9phkm\" (UID: \"c3bd5062-5030-4847-a7c8-0671269debfe\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9phkm" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.828149 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f443320-81b9-4f4e-a12b-62023d6074ef-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fbsvh\" (UID: \"9f443320-81b9-4f4e-a12b-62023d6074ef\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fbsvh" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.828255 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/31ed1214-265d-4179-b4f4-f77dedb20eb4-etcd-serving-ca\") pod \"apiserver-76f77b778f-gg2zr\" (UID: \"31ed1214-265d-4179-b4f4-f77dedb20eb4\") " pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.828302 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p9l9"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.828330 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565206-wgvsr"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.828433 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-hzjn2"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.828603 4835 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/232fe661-d645-4591-864d-00acac127390-audit-dir\") pod \"apiserver-7bbb656c7d-pww7m\" (UID: \"232fe661-d645-4591-864d-00acac127390\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.828758 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbjl4\" (UniqueName: \"kubernetes.io/projected/31ed1214-265d-4179-b4f4-f77dedb20eb4-kube-api-access-jbjl4\") pod \"apiserver-76f77b778f-gg2zr\" (UID: \"31ed1214-265d-4179-b4f4-f77dedb20eb4\") " pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.829042 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e16135b-13e3-4236-9fa7-94ceb99131e2-config\") pod \"machine-api-operator-5694c8668f-pw4cg\" (UID: \"4e16135b-13e3-4236-9fa7-94ceb99131e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pw4cg" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.829050 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4cd6d8ad-9a41-4796-9684-2b6e09675bd9-profile-collector-cert\") pod \"catalog-operator-68c6474976-9qxb7\" (UID: \"4cd6d8ad-9a41-4796-9684-2b6e09675bd9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.828446 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/31ed1214-265d-4179-b4f4-f77dedb20eb4-node-pullsecrets\") pod \"apiserver-76f77b778f-gg2zr\" (UID: \"31ed1214-265d-4179-b4f4-f77dedb20eb4\") " pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.829215 4835 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f443320-81b9-4f4e-a12b-62023d6074ef-config\") pod \"controller-manager-879f6c89f-fbsvh\" (UID: \"9f443320-81b9-4f4e-a12b-62023d6074ef\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fbsvh" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.829399 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7d6w\" (UniqueName: \"kubernetes.io/projected/4cd6d8ad-9a41-4796-9684-2b6e09675bd9-kube-api-access-s7d6w\") pod \"catalog-operator-68c6474976-9qxb7\" (UID: \"4cd6d8ad-9a41-4796-9684-2b6e09675bd9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.829900 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c03f3929-621d-4627-9d85-243b6cc02240-config\") pod \"etcd-operator-b45778765-cpfq7\" (UID: \"c03f3929-621d-4627-9d85-243b6cc02240\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cpfq7" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.829523 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-hzjn2" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.830008 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/232fe661-d645-4591-864d-00acac127390-etcd-client\") pod \"apiserver-7bbb656c7d-pww7m\" (UID: \"232fe661-d645-4591-864d-00acac127390\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.829700 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/232fe661-d645-4591-864d-00acac127390-audit-policies\") pod \"apiserver-7bbb656c7d-pww7m\" (UID: \"232fe661-d645-4591-864d-00acac127390\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.829923 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.829442 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00ca197e-7ecb-49f7-b477-f1e5ffaab8c1-config\") pod \"openshift-apiserver-operator-796bbdcf4f-w457v\" (UID: \"00ca197e-7ecb-49f7-b477-f1e5ffaab8c1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-w457v" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.829985 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j88gc\" (UniqueName: \"kubernetes.io/projected/c03f3929-621d-4627-9d85-243b6cc02240-kube-api-access-j88gc\") 
pod \"etcd-operator-b45778765-cpfq7\" (UID: \"c03f3929-621d-4627-9d85-243b6cc02240\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cpfq7" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.829546 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.830081 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00ca197e-7ecb-49f7-b477-f1e5ffaab8c1-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-w457v\" (UID: \"00ca197e-7ecb-49f7-b477-f1e5ffaab8c1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-w457v" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.830100 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4e16135b-13e3-4236-9fa7-94ceb99131e2-images\") pod \"machine-api-operator-5694c8668f-pw4cg\" (UID: \"4e16135b-13e3-4236-9fa7-94ceb99131e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pw4cg" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.830449 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31ed1214-265d-4179-b4f4-f77dedb20eb4-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gg2zr\" (UID: \"31ed1214-265d-4179-b4f4-f77dedb20eb4\") " pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.830506 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.830534 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-oauth-serving-cert\") pod \"console-f9d7485db-m6s8l\" (UID: \"c32cf7e2-4523-41b3-b8a8-7564cca67f4a\") " pod="openshift-console/console-f9d7485db-m6s8l" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.830547 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.830557 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.830617 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 
19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.830643 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/884a59ba-e2aa-42a5-87ba-cff1f2c1acea-metrics-tls\") pod \"ingress-operator-5b745b69d9-cwlsr\" (UID: \"884a59ba-e2aa-42a5-87ba-cff1f2c1acea\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cwlsr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.830665 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.830684 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2ccd766-0aac-4194-a7cd-447941e75a46-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-56v85\" (UID: \"f2ccd766-0aac-4194-a7cd-447941e75a46\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-56v85" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.830703 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9eb63f6-7fbc-4f82-946b-8844751bb402-serving-cert\") pod \"openshift-config-operator-7777fb866f-vrd5p\" (UID: \"b9eb63f6-7fbc-4f82-946b-8844751bb402\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrd5p" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.830720 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d111a8cb-8053-4b32-a9d0-8325de3c057f-audit-dir\") pod 
\"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.830783 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcldd\" (UniqueName: \"kubernetes.io/projected/7490a09e-a8be-4931-a282-38989ba640b3-kube-api-access-fcldd\") pod \"authentication-operator-69f744f599-djtzl\" (UID: \"7490a09e-a8be-4931-a282-38989ba640b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-djtzl" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.830811 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98617442-57ff-4996-b772-1639551ecc89-metrics-tls\") pod \"dns-default-cl8p7\" (UID: \"98617442-57ff-4996-b772-1639551ecc89\") " pod="openshift-dns/dns-default-cl8p7" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.830833 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d49c138a-fe33-4b41-991a-af15d6bf286d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-dzx2v\" (UID: \"d49c138a-fe33-4b41-991a-af15d6bf286d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dzx2v" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.830865 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2fb87b5-ea97-4d21-9129-e1cf0db01fea-client-ca\") pod \"route-controller-manager-6576b87f9c-qnlkj\" (UID: \"c2fb87b5-ea97-4d21-9129-e1cf0db01fea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qnlkj" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.830886 4835 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-8lscm\" (UniqueName: \"kubernetes.io/projected/49daa1c0-0b39-4ebf-9e81-38885e24bc4d-kube-api-access-8lscm\") pod \"cluster-samples-operator-665b6dd947-bg8tj\" (UID: \"49daa1c0-0b39-4ebf-9e81-38885e24bc4d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bg8tj" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.830903 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-trusted-ca-bundle\") pod \"console-f9d7485db-m6s8l\" (UID: \"c32cf7e2-4523-41b3-b8a8-7564cca67f4a\") " pod="openshift-console/console-f9d7485db-m6s8l" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.830924 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/31ed1214-265d-4179-b4f4-f77dedb20eb4-image-import-ca\") pod \"apiserver-76f77b778f-gg2zr\" (UID: \"31ed1214-265d-4179-b4f4-f77dedb20eb4\") " pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.830942 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cpm5\" (UniqueName: \"kubernetes.io/projected/930d85cd-ba00-4c27-b728-dbdeaab91ca5-kube-api-access-6cpm5\") pod \"auto-csr-approver-29565204-mwv2v\" (UID: \"930d85cd-ba00-4c27-b728-dbdeaab91ca5\") " pod="openshift-infra/auto-csr-approver-29565204-mwv2v" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.830959 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6d8b5f04-3b9c-4656-bf70-203d508949a0-srv-cert\") pod \"olm-operator-6b444d44fb-q886g\" (UID: \"6d8b5f04-3b9c-4656-bf70-203d508949a0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q886g" Mar 
19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.830988 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/49daa1c0-0b39-4ebf-9e81-38885e24bc4d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-bg8tj\" (UID: \"49daa1c0-0b39-4ebf-9e81-38885e24bc4d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bg8tj" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.831006 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-console-serving-cert\") pod \"console-f9d7485db-m6s8l\" (UID: \"c32cf7e2-4523-41b3-b8a8-7564cca67f4a\") " pod="openshift-console/console-f9d7485db-m6s8l" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.831021 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/884a59ba-e2aa-42a5-87ba-cff1f2c1acea-trusted-ca\") pod \"ingress-operator-5b745b69d9-cwlsr\" (UID: \"884a59ba-e2aa-42a5-87ba-cff1f2c1acea\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cwlsr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.831031 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/4e16135b-13e3-4236-9fa7-94ceb99131e2-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-pw4cg\" (UID: \"4e16135b-13e3-4236-9fa7-94ceb99131e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pw4cg" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.831037 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f-webhook-cert\") pod 
\"packageserver-d55dfcdfc-6lp28\" (UID: \"0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.831085 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/31ed1214-265d-4179-b4f4-f77dedb20eb4-etcd-client\") pod \"apiserver-76f77b778f-gg2zr\" (UID: \"31ed1214-265d-4179-b4f4-f77dedb20eb4\") " pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.831110 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c03f3929-621d-4627-9d85-243b6cc02240-etcd-ca\") pod \"etcd-operator-b45778765-cpfq7\" (UID: \"c03f3929-621d-4627-9d85-243b6cc02240\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cpfq7" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.831122 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.831135 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46lfr\" (UniqueName: \"kubernetes.io/projected/1573ddf0-1418-4b22-afd8-eab0c11e0fe0-kube-api-access-46lfr\") pod \"machine-approver-56656f9798-49lsz\" (UID: \"1573ddf0-1418-4b22-afd8-eab0c11e0fe0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-49lsz" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.831162 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/31ed1214-265d-4179-b4f4-f77dedb20eb4-serving-cert\") pod \"apiserver-76f77b778f-gg2zr\" (UID: \"31ed1214-265d-4179-b4f4-f77dedb20eb4\") " pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.831166 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31ed1214-265d-4179-b4f4-f77dedb20eb4-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gg2zr\" (UID: \"31ed1214-265d-4179-b4f4-f77dedb20eb4\") " pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.831190 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/540880cb-fdb4-4672-81e3-60adfd584bdf-metrics-certs\") pod \"router-default-5444994796-mljh4\" (UID: \"540880cb-fdb4-4672-81e3-60adfd584bdf\") " pod="openshift-ingress/router-default-5444994796-mljh4" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.831213 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8585eb29-2833-4cd2-900c-83cc7ddee5d1-node-bootstrap-token\") pod \"machine-config-server-8sbc8\" (UID: \"8585eb29-2833-4cd2-900c-83cc7ddee5d1\") " pod="openshift-machine-config-operator/machine-config-server-8sbc8" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.831237 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7490a09e-a8be-4931-a282-38989ba640b3-serving-cert\") pod \"authentication-operator-69f744f599-djtzl\" (UID: \"7490a09e-a8be-4931-a282-38989ba640b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-djtzl" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.831260 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-console-config\") pod \"console-f9d7485db-m6s8l\" (UID: \"c32cf7e2-4523-41b3-b8a8-7564cca67f4a\") " pod="openshift-console/console-f9d7485db-m6s8l" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.831283 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vqjq\" (UniqueName: \"kubernetes.io/projected/80ebd3c0-282e-427f-bedd-f06c26bce30a-kube-api-access-4vqjq\") pod \"downloads-7954f5f757-vzxqb\" (UID: \"80ebd3c0-282e-427f-bedd-f06c26bce30a\") " pod="openshift-console/downloads-7954f5f757-vzxqb" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.831293 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.831309 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1573ddf0-1418-4b22-afd8-eab0c11e0fe0-machine-approver-tls\") pod \"machine-approver-56656f9798-49lsz\" (UID: \"1573ddf0-1418-4b22-afd8-eab0c11e0fe0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-49lsz" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.831333 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skj6b\" (UniqueName: \"kubernetes.io/projected/d111a8cb-8053-4b32-a9d0-8325de3c057f-kube-api-access-skj6b\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.831357 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nwgg\" (UniqueName: \"kubernetes.io/projected/540880cb-fdb4-4672-81e3-60adfd584bdf-kube-api-access-4nwgg\") pod \"router-default-5444994796-mljh4\" (UID: \"540880cb-fdb4-4672-81e3-60adfd584bdf\") " pod="openshift-ingress/router-default-5444994796-mljh4" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.831379 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql2cr\" (UniqueName: \"kubernetes.io/projected/c232f12f-f539-42a1-9eec-965d59127ca0-kube-api-access-ql2cr\") pod \"auto-csr-approver-29565206-wgvsr\" (UID: \"c232f12f-f539-42a1-9eec-965d59127ca0\") " pod="openshift-infra/auto-csr-approver-29565206-wgvsr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.831400 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d111a8cb-8053-4b32-a9d0-8325de3c057f-audit-dir\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.831652 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.831403 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vthjx\" (UniqueName: 
\"kubernetes.io/projected/6d8b5f04-3b9c-4656-bf70-203d508949a0-kube-api-access-vthjx\") pod \"olm-operator-6b444d44fb-q886g\" (UID: \"6d8b5f04-3b9c-4656-bf70-203d508949a0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q886g" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.831950 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.832000 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-hzjn2"] Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.832325 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2ccd766-0aac-4194-a7cd-447941e75a46-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-56v85\" (UID: \"f2ccd766-0aac-4194-a7cd-447941e75a46\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-56v85" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.832422 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8585eb29-2833-4cd2-900c-83cc7ddee5d1-certs\") pod \"machine-config-server-8sbc8\" (UID: \"8585eb29-2833-4cd2-900c-83cc7ddee5d1\") " pod="openshift-machine-config-operator/machine-config-server-8sbc8" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.832449 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhc79\" (UniqueName: 
\"kubernetes.io/projected/b9eb63f6-7fbc-4f82-946b-8844751bb402-kube-api-access-rhc79\") pod \"openshift-config-operator-7777fb866f-vrd5p\" (UID: \"b9eb63f6-7fbc-4f82-946b-8844751bb402\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrd5p" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.832511 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6r2j\" (UniqueName: \"kubernetes.io/projected/00ca197e-7ecb-49f7-b477-f1e5ffaab8c1-kube-api-access-s6r2j\") pod \"openshift-apiserver-operator-796bbdcf4f-w457v\" (UID: \"00ca197e-7ecb-49f7-b477-f1e5ffaab8c1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-w457v" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.832532 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfdfs\" (UniqueName: \"kubernetes.io/projected/4e16135b-13e3-4236-9fa7-94ceb99131e2-kube-api-access-vfdfs\") pod \"machine-api-operator-5694c8668f-pw4cg\" (UID: \"4e16135b-13e3-4236-9fa7-94ceb99131e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pw4cg" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.832577 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd65d\" (UniqueName: \"kubernetes.io/projected/8585eb29-2833-4cd2-900c-83cc7ddee5d1-kube-api-access-zd65d\") pod \"machine-config-server-8sbc8\" (UID: \"8585eb29-2833-4cd2-900c-83cc7ddee5d1\") " pod="openshift-machine-config-operator/machine-config-server-8sbc8" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.832548 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7490a09e-a8be-4931-a282-38989ba640b3-config\") pod \"authentication-operator-69f744f599-djtzl\" (UID: \"7490a09e-a8be-4931-a282-38989ba640b3\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-djtzl" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.832701 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/31ed1214-265d-4179-b4f4-f77dedb20eb4-image-import-ca\") pod \"apiserver-76f77b778f-gg2zr\" (UID: \"31ed1214-265d-4179-b4f4-f77dedb20eb4\") " pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.832729 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1573ddf0-1418-4b22-afd8-eab0c11e0fe0-auth-proxy-config\") pod \"machine-approver-56656f9798-49lsz\" (UID: \"1573ddf0-1418-4b22-afd8-eab0c11e0fe0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-49lsz" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.832795 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.832846 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ee1c4052-bbfe-4c1d-82be-10ca3e2c0212-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gx7nq\" (UID: \"ee1c4052-bbfe-4c1d-82be-10ca3e2c0212\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nq" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.832882 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9f443320-81b9-4f4e-a12b-62023d6074ef-serving-cert\") pod \"controller-manager-879f6c89f-fbsvh\" (UID: \"9f443320-81b9-4f4e-a12b-62023d6074ef\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fbsvh" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.832939 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/232fe661-d645-4591-864d-00acac127390-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-pww7m\" (UID: \"232fe661-d645-4591-864d-00acac127390\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.833048 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/884a59ba-e2aa-42a5-87ba-cff1f2c1acea-bound-sa-token\") pod \"ingress-operator-5b745b69d9-cwlsr\" (UID: \"884a59ba-e2aa-42a5-87ba-cff1f2c1acea\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cwlsr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.833501 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/232fe661-d645-4591-864d-00acac127390-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-pww7m\" (UID: \"232fe661-d645-4591-864d-00acac127390\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.833539 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.833550 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7490a09e-a8be-4931-a282-38989ba640b3-service-ca-bundle\") pod \"authentication-operator-69f744f599-djtzl\" (UID: 
\"7490a09e-a8be-4931-a282-38989ba640b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-djtzl" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.833584 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd152343-688c-493c-8cee-9d498a043182-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-gf5f4\" (UID: \"bd152343-688c-493c-8cee-9d498a043182\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gf5f4" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.833607 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abdeec51-5ca9-442c-9937-67dd8f50d88d-config\") pod \"console-operator-58897d9998-4tpxz\" (UID: \"abdeec51-5ca9-442c-9937-67dd8f50d88d\") " pod="openshift-console-operator/console-operator-58897d9998-4tpxz" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.833632 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/232fe661-d645-4591-864d-00acac127390-serving-cert\") pod \"apiserver-7bbb656c7d-pww7m\" (UID: \"232fe661-d645-4591-864d-00acac127390\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.833649 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc6bb0cd-bbc5-4464-86b7-41ce95891f07-config\") pod \"service-ca-operator-777779d784-hm9hx\" (UID: \"cc6bb0cd-bbc5-4464-86b7-41ce95891f07\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hm9hx" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.833665 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/f2ccd766-0aac-4194-a7cd-447941e75a46-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-56v85\" (UID: \"f2ccd766-0aac-4194-a7cd-447941e75a46\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-56v85" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.833716 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7490a09e-a8be-4931-a282-38989ba640b3-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-djtzl\" (UID: \"7490a09e-a8be-4931-a282-38989ba640b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-djtzl" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.833734 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c54b4\" (UniqueName: \"kubernetes.io/projected/c2fb87b5-ea97-4d21-9129-e1cf0db01fea-kube-api-access-c54b4\") pod \"route-controller-manager-6576b87f9c-qnlkj\" (UID: \"c2fb87b5-ea97-4d21-9129-e1cf0db01fea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qnlkj" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.833768 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/232fe661-d645-4591-864d-00acac127390-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-pww7m\" (UID: \"232fe661-d645-4591-864d-00acac127390\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.833828 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1573ddf0-1418-4b22-afd8-eab0c11e0fe0-config\") pod \"machine-approver-56656f9798-49lsz\" (UID: \"1573ddf0-1418-4b22-afd8-eab0c11e0fe0\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-49lsz" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.833847 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn79n\" (UniqueName: \"kubernetes.io/projected/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-kube-api-access-kn79n\") pod \"console-f9d7485db-m6s8l\" (UID: \"c32cf7e2-4523-41b3-b8a8-7564cca67f4a\") " pod="openshift-console/console-f9d7485db-m6s8l" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.833871 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b94g4\" (UniqueName: \"kubernetes.io/projected/abdeec51-5ca9-442c-9937-67dd8f50d88d-kube-api-access-b94g4\") pod \"console-operator-58897d9998-4tpxz\" (UID: \"abdeec51-5ca9-442c-9937-67dd8f50d88d\") " pod="openshift-console-operator/console-operator-58897d9998-4tpxz" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.833946 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/31ed1214-265d-4179-b4f4-f77dedb20eb4-serving-cert\") pod \"apiserver-76f77b778f-gg2zr\" (UID: \"31ed1214-265d-4179-b4f4-f77dedb20eb4\") " pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.834151 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7490a09e-a8be-4931-a282-38989ba640b3-service-ca-bundle\") pod \"authentication-operator-69f744f599-djtzl\" (UID: \"7490a09e-a8be-4931-a282-38989ba640b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-djtzl" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.834195 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/b9eb63f6-7fbc-4f82-946b-8844751bb402-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vrd5p\" (UID: \"b9eb63f6-7fbc-4f82-946b-8844751bb402\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrd5p" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.834231 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5fdabffe-d734-4b9a-8fea-d1608dcef7a2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-v55z2\" (UID: \"5fdabffe-d734-4b9a-8fea-d1608dcef7a2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v55z2" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.834345 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/232fe661-d645-4591-864d-00acac127390-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-pww7m\" (UID: \"232fe661-d645-4591-864d-00acac127390\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.834417 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee1c4052-bbfe-4c1d-82be-10ca3e2c0212-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gx7nq\" (UID: \"ee1c4052-bbfe-4c1d-82be-10ca3e2c0212\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nq" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.834440 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.834455 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1573ddf0-1418-4b22-afd8-eab0c11e0fe0-config\") pod \"machine-approver-56656f9798-49lsz\" (UID: \"1573ddf0-1418-4b22-afd8-eab0c11e0fe0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-49lsz" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.834489 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b9eb63f6-7fbc-4f82-946b-8844751bb402-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vrd5p\" (UID: \"b9eb63f6-7fbc-4f82-946b-8844751bb402\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrd5p" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.834606 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7490a09e-a8be-4931-a282-38989ba640b3-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-djtzl\" (UID: \"7490a09e-a8be-4931-a282-38989ba640b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-djtzl" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.834711 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.834877 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/1573ddf0-1418-4b22-afd8-eab0c11e0fe0-auth-proxy-config\") pod \"machine-approver-56656f9798-49lsz\" (UID: \"1573ddf0-1418-4b22-afd8-eab0c11e0fe0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-49lsz" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.835138 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/31ed1214-265d-4179-b4f4-f77dedb20eb4-encryption-config\") pod \"apiserver-76f77b778f-gg2zr\" (UID: \"31ed1214-265d-4179-b4f4-f77dedb20eb4\") " pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.835573 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9eb63f6-7fbc-4f82-946b-8844751bb402-serving-cert\") pod \"openshift-config-operator-7777fb866f-vrd5p\" (UID: \"b9eb63f6-7fbc-4f82-946b-8844751bb402\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrd5p" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.835918 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7490a09e-a8be-4931-a282-38989ba640b3-serving-cert\") pod \"authentication-operator-69f744f599-djtzl\" (UID: \"7490a09e-a8be-4931-a282-38989ba640b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-djtzl" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.835999 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/49daa1c0-0b39-4ebf-9e81-38885e24bc4d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-bg8tj\" (UID: \"49daa1c0-0b39-4ebf-9e81-38885e24bc4d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bg8tj" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.836187 4835 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.836373 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/31ed1214-265d-4179-b4f4-f77dedb20eb4-etcd-client\") pod \"apiserver-76f77b778f-gg2zr\" (UID: \"31ed1214-265d-4179-b4f4-f77dedb20eb4\") " pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.836656 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/232fe661-d645-4591-864d-00acac127390-encryption-config\") pod \"apiserver-7bbb656c7d-pww7m\" (UID: \"232fe661-d645-4591-864d-00acac127390\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.836845 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.836897 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f443320-81b9-4f4e-a12b-62023d6074ef-serving-cert\") pod \"controller-manager-879f6c89f-fbsvh\" (UID: \"9f443320-81b9-4f4e-a12b-62023d6074ef\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fbsvh" Mar 
19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.837235 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1573ddf0-1418-4b22-afd8-eab0c11e0fe0-machine-approver-tls\") pod \"machine-approver-56656f9798-49lsz\" (UID: \"1573ddf0-1418-4b22-afd8-eab0c11e0fe0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-49lsz" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.838477 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/232fe661-d645-4591-864d-00acac127390-serving-cert\") pod \"apiserver-7bbb656c7d-pww7m\" (UID: \"232fe661-d645-4591-864d-00acac127390\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.853817 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.874216 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.893141 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.918407 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.934084 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.935981 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/540880cb-fdb4-4672-81e3-60adfd584bdf-stats-auth\") pod 
\"router-default-5444994796-mljh4\" (UID: \"540880cb-fdb4-4672-81e3-60adfd584bdf\") " pod="openshift-ingress/router-default-5444994796-mljh4" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.936022 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-console-oauth-config\") pod \"console-f9d7485db-m6s8l\" (UID: \"c32cf7e2-4523-41b3-b8a8-7564cca67f4a\") " pod="openshift-console/console-f9d7485db-m6s8l" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.936040 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f-apiservice-cert\") pod \"packageserver-d55dfcdfc-6lp28\" (UID: \"0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.936060 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpzdl\" (UniqueName: \"kubernetes.io/projected/d49c138a-fe33-4b41-991a-af15d6bf286d-kube-api-access-cpzdl\") pod \"multus-admission-controller-857f4d67dd-dzx2v\" (UID: \"d49c138a-fe33-4b41-991a-af15d6bf286d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dzx2v" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.936080 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2fb87b5-ea97-4d21-9129-e1cf0db01fea-serving-cert\") pod \"route-controller-manager-6576b87f9c-qnlkj\" (UID: \"c2fb87b5-ea97-4d21-9129-e1cf0db01fea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qnlkj" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.936097 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6dzs\" 
(UniqueName: \"kubernetes.io/projected/884a59ba-e2aa-42a5-87ba-cff1f2c1acea-kube-api-access-g6dzs\") pod \"ingress-operator-5b745b69d9-cwlsr\" (UID: \"884a59ba-e2aa-42a5-87ba-cff1f2c1acea\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cwlsr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.936259 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd152343-688c-493c-8cee-9d498a043182-config\") pod \"kube-apiserver-operator-766d6c64bb-gf5f4\" (UID: \"bd152343-688c-493c-8cee-9d498a043182\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gf5f4" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.936277 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/abdeec51-5ca9-442c-9937-67dd8f50d88d-trusted-ca\") pod \"console-operator-58897d9998-4tpxz\" (UID: \"abdeec51-5ca9-442c-9937-67dd8f50d88d\") " pod="openshift-console-operator/console-operator-58897d9998-4tpxz" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.936295 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee1c4052-bbfe-4c1d-82be-10ca3e2c0212-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gx7nq\" (UID: \"ee1c4052-bbfe-4c1d-82be-10ca3e2c0212\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nq" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.936314 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c03f3929-621d-4627-9d85-243b6cc02240-etcd-service-ca\") pod \"etcd-operator-b45778765-cpfq7\" (UID: \"c03f3929-621d-4627-9d85-243b6cc02240\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cpfq7" Mar 19 09:26:28 crc 
kubenswrapper[4835]: I0319 09:26:28.936332 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzbxv\" (UniqueName: \"kubernetes.io/projected/ee1c4052-bbfe-4c1d-82be-10ca3e2c0212-kube-api-access-gzbxv\") pod \"cluster-image-registry-operator-dc59b4c8b-gx7nq\" (UID: \"ee1c4052-bbfe-4c1d-82be-10ca3e2c0212\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nq" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.936425 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3bd5062-5030-4847-a7c8-0671269debfe-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-9phkm\" (UID: \"c3bd5062-5030-4847-a7c8-0671269debfe\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9phkm" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.936458 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c03f3929-621d-4627-9d85-243b6cc02240-config\") pod \"etcd-operator-b45778765-cpfq7\" (UID: \"c03f3929-621d-4627-9d85-243b6cc02240\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cpfq7" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.936493 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j88gc\" (UniqueName: \"kubernetes.io/projected/c03f3929-621d-4627-9d85-243b6cc02240-kube-api-access-j88gc\") pod \"etcd-operator-b45778765-cpfq7\" (UID: \"c03f3929-621d-4627-9d85-243b6cc02240\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cpfq7" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.936523 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4cd6d8ad-9a41-4796-9684-2b6e09675bd9-profile-collector-cert\") pod 
\"catalog-operator-68c6474976-9qxb7\" (UID: \"4cd6d8ad-9a41-4796-9684-2b6e09675bd9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937013 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7d6w\" (UniqueName: \"kubernetes.io/projected/4cd6d8ad-9a41-4796-9684-2b6e09675bd9-kube-api-access-s7d6w\") pod \"catalog-operator-68c6474976-9qxb7\" (UID: \"4cd6d8ad-9a41-4796-9684-2b6e09675bd9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937030 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3bd5062-5030-4847-a7c8-0671269debfe-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-9phkm\" (UID: \"c3bd5062-5030-4847-a7c8-0671269debfe\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9phkm" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937084 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-oauth-serving-cert\") pod \"console-f9d7485db-m6s8l\" (UID: \"c32cf7e2-4523-41b3-b8a8-7564cca67f4a\") " pod="openshift-console/console-f9d7485db-m6s8l" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937105 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/884a59ba-e2aa-42a5-87ba-cff1f2c1acea-metrics-tls\") pod \"ingress-operator-5b745b69d9-cwlsr\" (UID: \"884a59ba-e2aa-42a5-87ba-cff1f2c1acea\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cwlsr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937126 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/f2ccd766-0aac-4194-a7cd-447941e75a46-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-56v85\" (UID: \"f2ccd766-0aac-4194-a7cd-447941e75a46\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-56v85" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937176 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98617442-57ff-4996-b772-1639551ecc89-metrics-tls\") pod \"dns-default-cl8p7\" (UID: \"98617442-57ff-4996-b772-1639551ecc89\") " pod="openshift-dns/dns-default-cl8p7" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937200 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d49c138a-fe33-4b41-991a-af15d6bf286d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-dzx2v\" (UID: \"d49c138a-fe33-4b41-991a-af15d6bf286d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dzx2v" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937218 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2fb87b5-ea97-4d21-9129-e1cf0db01fea-client-ca\") pod \"route-controller-manager-6576b87f9c-qnlkj\" (UID: \"c2fb87b5-ea97-4d21-9129-e1cf0db01fea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qnlkj" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937279 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-trusted-ca-bundle\") pod \"console-f9d7485db-m6s8l\" (UID: \"c32cf7e2-4523-41b3-b8a8-7564cca67f4a\") " pod="openshift-console/console-f9d7485db-m6s8l" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937298 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6cpm5\" (UniqueName: \"kubernetes.io/projected/930d85cd-ba00-4c27-b728-dbdeaab91ca5-kube-api-access-6cpm5\") pod \"auto-csr-approver-29565204-mwv2v\" (UID: \"930d85cd-ba00-4c27-b728-dbdeaab91ca5\") " pod="openshift-infra/auto-csr-approver-29565204-mwv2v" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937315 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6d8b5f04-3b9c-4656-bf70-203d508949a0-srv-cert\") pod \"olm-operator-6b444d44fb-q886g\" (UID: \"6d8b5f04-3b9c-4656-bf70-203d508949a0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q886g" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937335 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-console-serving-cert\") pod \"console-f9d7485db-m6s8l\" (UID: \"c32cf7e2-4523-41b3-b8a8-7564cca67f4a\") " pod="openshift-console/console-f9d7485db-m6s8l" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937359 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/884a59ba-e2aa-42a5-87ba-cff1f2c1acea-trusted-ca\") pod \"ingress-operator-5b745b69d9-cwlsr\" (UID: \"884a59ba-e2aa-42a5-87ba-cff1f2c1acea\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cwlsr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937377 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f-webhook-cert\") pod \"packageserver-d55dfcdfc-6lp28\" (UID: \"0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 
09:26:28.937446 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c03f3929-621d-4627-9d85-243b6cc02240-etcd-ca\") pod \"etcd-operator-b45778765-cpfq7\" (UID: \"c03f3929-621d-4627-9d85-243b6cc02240\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cpfq7" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937473 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/540880cb-fdb4-4672-81e3-60adfd584bdf-metrics-certs\") pod \"router-default-5444994796-mljh4\" (UID: \"540880cb-fdb4-4672-81e3-60adfd584bdf\") " pod="openshift-ingress/router-default-5444994796-mljh4" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937491 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8585eb29-2833-4cd2-900c-83cc7ddee5d1-node-bootstrap-token\") pod \"machine-config-server-8sbc8\" (UID: \"8585eb29-2833-4cd2-900c-83cc7ddee5d1\") " pod="openshift-machine-config-operator/machine-config-server-8sbc8" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937508 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-console-config\") pod \"console-f9d7485db-m6s8l\" (UID: \"c32cf7e2-4523-41b3-b8a8-7564cca67f4a\") " pod="openshift-console/console-f9d7485db-m6s8l" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937526 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vqjq\" (UniqueName: \"kubernetes.io/projected/80ebd3c0-282e-427f-bedd-f06c26bce30a-kube-api-access-4vqjq\") pod \"downloads-7954f5f757-vzxqb\" (UID: \"80ebd3c0-282e-427f-bedd-f06c26bce30a\") " pod="openshift-console/downloads-7954f5f757-vzxqb" Mar 19 09:26:28 crc kubenswrapper[4835]: 
I0319 09:26:28.937576 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vthjx\" (UniqueName: \"kubernetes.io/projected/6d8b5f04-3b9c-4656-bf70-203d508949a0-kube-api-access-vthjx\") pod \"olm-operator-6b444d44fb-q886g\" (UID: \"6d8b5f04-3b9c-4656-bf70-203d508949a0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q886g" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937631 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nwgg\" (UniqueName: \"kubernetes.io/projected/540880cb-fdb4-4672-81e3-60adfd584bdf-kube-api-access-4nwgg\") pod \"router-default-5444994796-mljh4\" (UID: \"540880cb-fdb4-4672-81e3-60adfd584bdf\") " pod="openshift-ingress/router-default-5444994796-mljh4" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937650 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql2cr\" (UniqueName: \"kubernetes.io/projected/c232f12f-f539-42a1-9eec-965d59127ca0-kube-api-access-ql2cr\") pod \"auto-csr-approver-29565206-wgvsr\" (UID: \"c232f12f-f539-42a1-9eec-965d59127ca0\") " pod="openshift-infra/auto-csr-approver-29565206-wgvsr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937672 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2ccd766-0aac-4194-a7cd-447941e75a46-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-56v85\" (UID: \"f2ccd766-0aac-4194-a7cd-447941e75a46\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-56v85" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937688 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8585eb29-2833-4cd2-900c-83cc7ddee5d1-certs\") pod \"machine-config-server-8sbc8\" (UID: \"8585eb29-2833-4cd2-900c-83cc7ddee5d1\") " 
pod="openshift-machine-config-operator/machine-config-server-8sbc8" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937700 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-oauth-serving-cert\") pod \"console-f9d7485db-m6s8l\" (UID: \"c32cf7e2-4523-41b3-b8a8-7564cca67f4a\") " pod="openshift-console/console-f9d7485db-m6s8l" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937777 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd65d\" (UniqueName: \"kubernetes.io/projected/8585eb29-2833-4cd2-900c-83cc7ddee5d1-kube-api-access-zd65d\") pod \"machine-config-server-8sbc8\" (UID: \"8585eb29-2833-4cd2-900c-83cc7ddee5d1\") " pod="openshift-machine-config-operator/machine-config-server-8sbc8" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937799 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ee1c4052-bbfe-4c1d-82be-10ca3e2c0212-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gx7nq\" (UID: \"ee1c4052-bbfe-4c1d-82be-10ca3e2c0212\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nq" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937816 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/884a59ba-e2aa-42a5-87ba-cff1f2c1acea-bound-sa-token\") pod \"ingress-operator-5b745b69d9-cwlsr\" (UID: \"884a59ba-e2aa-42a5-87ba-cff1f2c1acea\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cwlsr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937832 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd152343-688c-493c-8cee-9d498a043182-serving-cert\") pod 
\"kube-apiserver-operator-766d6c64bb-gf5f4\" (UID: \"bd152343-688c-493c-8cee-9d498a043182\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gf5f4" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937850 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abdeec51-5ca9-442c-9937-67dd8f50d88d-config\") pod \"console-operator-58897d9998-4tpxz\" (UID: \"abdeec51-5ca9-442c-9937-67dd8f50d88d\") " pod="openshift-console-operator/console-operator-58897d9998-4tpxz" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937867 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc6bb0cd-bbc5-4464-86b7-41ce95891f07-config\") pod \"service-ca-operator-777779d784-hm9hx\" (UID: \"cc6bb0cd-bbc5-4464-86b7-41ce95891f07\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hm9hx" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937884 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f2ccd766-0aac-4194-a7cd-447941e75a46-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-56v85\" (UID: \"f2ccd766-0aac-4194-a7cd-447941e75a46\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-56v85" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937902 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c54b4\" (UniqueName: \"kubernetes.io/projected/c2fb87b5-ea97-4d21-9129-e1cf0db01fea-kube-api-access-c54b4\") pod \"route-controller-manager-6576b87f9c-qnlkj\" (UID: \"c2fb87b5-ea97-4d21-9129-e1cf0db01fea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qnlkj" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937923 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kn79n\" (UniqueName: \"kubernetes.io/projected/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-kube-api-access-kn79n\") pod \"console-f9d7485db-m6s8l\" (UID: \"c32cf7e2-4523-41b3-b8a8-7564cca67f4a\") " pod="openshift-console/console-f9d7485db-m6s8l" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937939 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b94g4\" (UniqueName: \"kubernetes.io/projected/abdeec51-5ca9-442c-9937-67dd8f50d88d-kube-api-access-b94g4\") pod \"console-operator-58897d9998-4tpxz\" (UID: \"abdeec51-5ca9-442c-9937-67dd8f50d88d\") " pod="openshift-console-operator/console-operator-58897d9998-4tpxz" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937934 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/abdeec51-5ca9-442c-9937-67dd8f50d88d-trusted-ca\") pod \"console-operator-58897d9998-4tpxz\" (UID: \"abdeec51-5ca9-442c-9937-67dd8f50d88d\") " pod="openshift-console-operator/console-operator-58897d9998-4tpxz" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937957 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5fdabffe-d734-4b9a-8fea-d1608dcef7a2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-v55z2\" (UID: \"5fdabffe-d734-4b9a-8fea-d1608dcef7a2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v55z2" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937977 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee1c4052-bbfe-4c1d-82be-10ca3e2c0212-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gx7nq\" (UID: \"ee1c4052-bbfe-4c1d-82be-10ca3e2c0212\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nq" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.937994 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f-tmpfs\") pod \"packageserver-d55dfcdfc-6lp28\" (UID: \"0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.938010 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4cd6d8ad-9a41-4796-9684-2b6e09675bd9-srv-cert\") pod \"catalog-operator-68c6474976-9qxb7\" (UID: \"4cd6d8ad-9a41-4796-9684-2b6e09675bd9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.938027 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx2j7\" (UniqueName: \"kubernetes.io/projected/cc6bb0cd-bbc5-4464-86b7-41ce95891f07-kube-api-access-bx2j7\") pod \"service-ca-operator-777779d784-hm9hx\" (UID: \"cc6bb0cd-bbc5-4464-86b7-41ce95891f07\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hm9hx" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.938044 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqkbs\" (UniqueName: \"kubernetes.io/projected/0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f-kube-api-access-fqkbs\") pod \"packageserver-d55dfcdfc-6lp28\" (UID: \"0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.938066 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c3bd5062-5030-4847-a7c8-0671269debfe-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-9phkm\" (UID: \"c3bd5062-5030-4847-a7c8-0671269debfe\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9phkm" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.938084 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/540880cb-fdb4-4672-81e3-60adfd584bdf-default-certificate\") pod \"router-default-5444994796-mljh4\" (UID: \"540880cb-fdb4-4672-81e3-60adfd584bdf\") " pod="openshift-ingress/router-default-5444994796-mljh4" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.938107 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2fb87b5-ea97-4d21-9129-e1cf0db01fea-config\") pod \"route-controller-manager-6576b87f9c-qnlkj\" (UID: \"c2fb87b5-ea97-4d21-9129-e1cf0db01fea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qnlkj" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.938125 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-service-ca\") pod \"console-f9d7485db-m6s8l\" (UID: \"c32cf7e2-4523-41b3-b8a8-7564cca67f4a\") " pod="openshift-console/console-f9d7485db-m6s8l" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.938142 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fcph\" (UniqueName: \"kubernetes.io/projected/98617442-57ff-4996-b772-1639551ecc89-kube-api-access-9fcph\") pod \"dns-default-cl8p7\" (UID: \"98617442-57ff-4996-b772-1639551ecc89\") " pod="openshift-dns/dns-default-cl8p7" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.938161 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc6bb0cd-bbc5-4464-86b7-41ce95891f07-serving-cert\") pod \"service-ca-operator-777779d784-hm9hx\" (UID: \"cc6bb0cd-bbc5-4464-86b7-41ce95891f07\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hm9hx" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.938180 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c03f3929-621d-4627-9d85-243b6cc02240-etcd-client\") pod \"etcd-operator-b45778765-cpfq7\" (UID: \"c03f3929-621d-4627-9d85-243b6cc02240\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cpfq7" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.938200 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6d8b5f04-3b9c-4656-bf70-203d508949a0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-q886g\" (UID: \"6d8b5f04-3b9c-4656-bf70-203d508949a0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q886g" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.938220 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abdeec51-5ca9-442c-9937-67dd8f50d88d-serving-cert\") pod \"console-operator-58897d9998-4tpxz\" (UID: \"abdeec51-5ca9-442c-9937-67dd8f50d88d\") " pod="openshift-console-operator/console-operator-58897d9998-4tpxz" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.938237 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98617442-57ff-4996-b772-1639551ecc89-config-volume\") pod \"dns-default-cl8p7\" (UID: \"98617442-57ff-4996-b772-1639551ecc89\") " pod="openshift-dns/dns-default-cl8p7" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.938255 4835 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6k5jq\" (UniqueName: \"kubernetes.io/projected/c3bd5062-5030-4847-a7c8-0671269debfe-kube-api-access-6k5jq\") pod \"openshift-controller-manager-operator-756b6f6bc6-9phkm\" (UID: \"c3bd5062-5030-4847-a7c8-0671269debfe\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9phkm" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.938274 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/540880cb-fdb4-4672-81e3-60adfd584bdf-service-ca-bundle\") pod \"router-default-5444994796-mljh4\" (UID: \"540880cb-fdb4-4672-81e3-60adfd584bdf\") " pod="openshift-ingress/router-default-5444994796-mljh4" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.938292 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd152343-688c-493c-8cee-9d498a043182-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-gf5f4\" (UID: \"bd152343-688c-493c-8cee-9d498a043182\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gf5f4" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.938310 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c03f3929-621d-4627-9d85-243b6cc02240-serving-cert\") pod \"etcd-operator-b45778765-cpfq7\" (UID: \"c03f3929-621d-4627-9d85-243b6cc02240\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cpfq7" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.938327 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4588x\" (UniqueName: \"kubernetes.io/projected/5fdabffe-d734-4b9a-8fea-d1608dcef7a2-kube-api-access-4588x\") pod \"control-plane-machine-set-operator-78cbb6b69f-v55z2\" (UID: 
\"5fdabffe-d734-4b9a-8fea-d1608dcef7a2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v55z2" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.939549 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-console-config\") pod \"console-f9d7485db-m6s8l\" (UID: \"c32cf7e2-4523-41b3-b8a8-7564cca67f4a\") " pod="openshift-console/console-f9d7485db-m6s8l" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.939621 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-trusted-ca-bundle\") pod \"console-f9d7485db-m6s8l\" (UID: \"c32cf7e2-4523-41b3-b8a8-7564cca67f4a\") " pod="openshift-console/console-f9d7485db-m6s8l" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.940164 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abdeec51-5ca9-442c-9937-67dd8f50d88d-config\") pod \"console-operator-58897d9998-4tpxz\" (UID: \"abdeec51-5ca9-442c-9937-67dd8f50d88d\") " pod="openshift-console-operator/console-operator-58897d9998-4tpxz" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.940212 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee1c4052-bbfe-4c1d-82be-10ca3e2c0212-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gx7nq\" (UID: \"ee1c4052-bbfe-4c1d-82be-10ca3e2c0212\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nq" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.940292 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/540880cb-fdb4-4672-81e3-60adfd584bdf-service-ca-bundle\") pod 
\"router-default-5444994796-mljh4\" (UID: \"540880cb-fdb4-4672-81e3-60adfd584bdf\") " pod="openshift-ingress/router-default-5444994796-mljh4" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.940686 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-console-oauth-config\") pod \"console-f9d7485db-m6s8l\" (UID: \"c32cf7e2-4523-41b3-b8a8-7564cca67f4a\") " pod="openshift-console/console-f9d7485db-m6s8l" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.940896 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/540880cb-fdb4-4672-81e3-60adfd584bdf-stats-auth\") pod \"router-default-5444994796-mljh4\" (UID: \"540880cb-fdb4-4672-81e3-60adfd584bdf\") " pod="openshift-ingress/router-default-5444994796-mljh4" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.941336 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/884a59ba-e2aa-42a5-87ba-cff1f2c1acea-metrics-tls\") pod \"ingress-operator-5b745b69d9-cwlsr\" (UID: \"884a59ba-e2aa-42a5-87ba-cff1f2c1acea\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cwlsr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.941437 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/884a59ba-e2aa-42a5-87ba-cff1f2c1acea-trusted-ca\") pod \"ingress-operator-5b745b69d9-cwlsr\" (UID: \"884a59ba-e2aa-42a5-87ba-cff1f2c1acea\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cwlsr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.941561 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f-tmpfs\") pod \"packageserver-d55dfcdfc-6lp28\" (UID: 
\"0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.942091 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/540880cb-fdb4-4672-81e3-60adfd584bdf-metrics-certs\") pod \"router-default-5444994796-mljh4\" (UID: \"540880cb-fdb4-4672-81e3-60adfd584bdf\") " pod="openshift-ingress/router-default-5444994796-mljh4" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.942485 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-service-ca\") pod \"console-f9d7485db-m6s8l\" (UID: \"c32cf7e2-4523-41b3-b8a8-7564cca67f4a\") " pod="openshift-console/console-f9d7485db-m6s8l" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.943360 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee1c4052-bbfe-4c1d-82be-10ca3e2c0212-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gx7nq\" (UID: \"ee1c4052-bbfe-4c1d-82be-10ca3e2c0212\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nq" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.943779 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/540880cb-fdb4-4672-81e3-60adfd584bdf-default-certificate\") pod \"router-default-5444994796-mljh4\" (UID: \"540880cb-fdb4-4672-81e3-60adfd584bdf\") " pod="openshift-ingress/router-default-5444994796-mljh4" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.944392 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3bd5062-5030-4847-a7c8-0671269debfe-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-9phkm\" (UID: \"c3bd5062-5030-4847-a7c8-0671269debfe\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9phkm" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.944402 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4cd6d8ad-9a41-4796-9684-2b6e09675bd9-profile-collector-cert\") pod \"catalog-operator-68c6474976-9qxb7\" (UID: \"4cd6d8ad-9a41-4796-9684-2b6e09675bd9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.944549 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6d8b5f04-3b9c-4656-bf70-203d508949a0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-q886g\" (UID: \"6d8b5f04-3b9c-4656-bf70-203d508949a0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q886g" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.944989 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-console-serving-cert\") pod \"console-f9d7485db-m6s8l\" (UID: \"c32cf7e2-4523-41b3-b8a8-7564cca67f4a\") " pod="openshift-console/console-f9d7485db-m6s8l" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.945098 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4cd6d8ad-9a41-4796-9684-2b6e09675bd9-srv-cert\") pod \"catalog-operator-68c6474976-9qxb7\" (UID: \"4cd6d8ad-9a41-4796-9684-2b6e09675bd9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.949248 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/abdeec51-5ca9-442c-9937-67dd8f50d88d-serving-cert\") pod \"console-operator-58897d9998-4tpxz\" (UID: \"abdeec51-5ca9-442c-9937-67dd8f50d88d\") " pod="openshift-console-operator/console-operator-58897d9998-4tpxz" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.953986 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.978282 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 19 09:26:28 crc kubenswrapper[4835]: I0319 09:26:28.993708 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.002220 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd152343-688c-493c-8cee-9d498a043182-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-gf5f4\" (UID: \"bd152343-688c-493c-8cee-9d498a043182\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gf5f4" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.012454 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.017029 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd152343-688c-493c-8cee-9d498a043182-config\") pod \"kube-apiserver-operator-766d6c64bb-gf5f4\" (UID: \"bd152343-688c-493c-8cee-9d498a043182\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gf5f4" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.034097 4835 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.053001 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.064071 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc6bb0cd-bbc5-4464-86b7-41ce95891f07-serving-cert\") pod \"service-ca-operator-777779d784-hm9hx\" (UID: \"cc6bb0cd-bbc5-4464-86b7-41ce95891f07\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hm9hx" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.072890 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.093157 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.101966 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc6bb0cd-bbc5-4464-86b7-41ce95891f07-config\") pod \"service-ca-operator-777779d784-hm9hx\" (UID: \"cc6bb0cd-bbc5-4464-86b7-41ce95891f07\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hm9hx" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.113285 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.133092 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.153628 4835 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.175113 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.182525 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d49c138a-fe33-4b41-991a-af15d6bf286d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-dzx2v\" (UID: \"d49c138a-fe33-4b41-991a-af15d6bf286d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dzx2v" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.193088 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.239855 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.247649 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c03f3929-621d-4627-9d85-243b6cc02240-serving-cert\") pod \"etcd-operator-b45778765-cpfq7\" (UID: \"c03f3929-621d-4627-9d85-243b6cc02240\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cpfq7" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.254537 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.275015 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.282451 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/c03f3929-621d-4627-9d85-243b6cc02240-etcd-client\") pod \"etcd-operator-b45778765-cpfq7\" (UID: \"c03f3929-621d-4627-9d85-243b6cc02240\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cpfq7" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.294398 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.297029 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c03f3929-621d-4627-9d85-243b6cc02240-config\") pod \"etcd-operator-b45778765-cpfq7\" (UID: \"c03f3929-621d-4627-9d85-243b6cc02240\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cpfq7" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.313958 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.333869 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.339159 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c03f3929-621d-4627-9d85-243b6cc02240-etcd-ca\") pod \"etcd-operator-b45778765-cpfq7\" (UID: \"c03f3929-621d-4627-9d85-243b6cc02240\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cpfq7" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.353879 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.357311 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c03f3929-621d-4627-9d85-243b6cc02240-etcd-service-ca\") pod 
\"etcd-operator-b45778765-cpfq7\" (UID: \"c03f3929-621d-4627-9d85-243b6cc02240\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cpfq7" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.373758 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.393477 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.403267 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5fdabffe-d734-4b9a-8fea-d1608dcef7a2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-v55z2\" (UID: \"5fdabffe-d734-4b9a-8fea-d1608dcef7a2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v55z2" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.414052 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.433944 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.454022 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.474245 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.495298 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 09:26:29 crc kubenswrapper[4835]: 
I0319 09:26:29.514049 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.534344 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.553234 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.573627 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.593899 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.614485 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.635005 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.653895 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.674452 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.682117 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2fb87b5-ea97-4d21-9129-e1cf0db01fea-config\") pod \"route-controller-manager-6576b87f9c-qnlkj\" (UID: \"c2fb87b5-ea97-4d21-9129-e1cf0db01fea\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qnlkj" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.694052 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.713942 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.721713 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2fb87b5-ea97-4d21-9129-e1cf0db01fea-serving-cert\") pod \"route-controller-manager-6576b87f9c-qnlkj\" (UID: \"c2fb87b5-ea97-4d21-9129-e1cf0db01fea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qnlkj" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.735197 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.751632 4835 request.go:700] Waited for 1.010238243s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/configmaps?fieldSelector=metadata.name%3Dclient-ca&limit=500&resourceVersion=0 Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.753808 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.761356 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2fb87b5-ea97-4d21-9129-e1cf0db01fea-client-ca\") pod \"route-controller-manager-6576b87f9c-qnlkj\" (UID: \"c2fb87b5-ea97-4d21-9129-e1cf0db01fea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qnlkj" Mar 19 09:26:29 
crc kubenswrapper[4835]: I0319 09:26:29.774437 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.794561 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.813938 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.834808 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.857571 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.874092 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.894374 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.914114 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.934764 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 19 09:26:29 crc kubenswrapper[4835]: E0319 09:26:29.937195 4835 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the 
condition Mar 19 09:26:29 crc kubenswrapper[4835]: E0319 09:26:29.937339 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f-apiservice-cert podName:0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f nodeName:}" failed. No retries permitted until 2026-03-19 09:26:30.437306619 +0000 UTC m=+245.285905236 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f-apiservice-cert") pod "packageserver-d55dfcdfc-6lp28" (UID: "0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f") : failed to sync secret cache: timed out waiting for the condition Mar 19 09:26:29 crc kubenswrapper[4835]: E0319 09:26:29.939362 4835 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 19 09:26:29 crc kubenswrapper[4835]: E0319 09:26:29.939428 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d8b5f04-3b9c-4656-bf70-203d508949a0-srv-cert podName:6d8b5f04-3b9c-4656-bf70-203d508949a0 nodeName:}" failed. No retries permitted until 2026-03-19 09:26:30.439410487 +0000 UTC m=+245.288009074 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/6d8b5f04-3b9c-4656-bf70-203d508949a0-srv-cert") pod "olm-operator-6b444d44fb-q886g" (UID: "6d8b5f04-3b9c-4656-bf70-203d508949a0") : failed to sync secret cache: timed out waiting for the condition Mar 19 09:26:29 crc kubenswrapper[4835]: E0319 09:26:29.939454 4835 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Mar 19 09:26:29 crc kubenswrapper[4835]: E0319 09:26:29.939477 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8585eb29-2833-4cd2-900c-83cc7ddee5d1-node-bootstrap-token podName:8585eb29-2833-4cd2-900c-83cc7ddee5d1 nodeName:}" failed. No retries permitted until 2026-03-19 09:26:30.439471819 +0000 UTC m=+245.288070406 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/8585eb29-2833-4cd2-900c-83cc7ddee5d1-node-bootstrap-token") pod "machine-config-server-8sbc8" (UID: "8585eb29-2833-4cd2-900c-83cc7ddee5d1") : failed to sync secret cache: timed out waiting for the condition Mar 19 09:26:29 crc kubenswrapper[4835]: E0319 09:26:29.939492 4835 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition Mar 19 09:26:29 crc kubenswrapper[4835]: E0319 09:26:29.939522 4835 configmap.go:193] Couldn't get configMap openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-config: failed to sync configmap cache: timed out waiting for the condition Mar 19 09:26:29 crc kubenswrapper[4835]: E0319 09:26:29.939551 4835 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition Mar 19 09:26:29 crc kubenswrapper[4835]: E0319 09:26:29.939565 4835 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/98617442-57ff-4996-b772-1639551ecc89-config-volume podName:98617442-57ff-4996-b772-1639551ecc89 nodeName:}" failed. No retries permitted until 2026-03-19 09:26:30.439543081 +0000 UTC m=+245.288141698 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/98617442-57ff-4996-b772-1639551ecc89-config-volume") pod "dns-default-cl8p7" (UID: "98617442-57ff-4996-b772-1639551ecc89") : failed to sync configmap cache: timed out waiting for the condition Mar 19 09:26:29 crc kubenswrapper[4835]: E0319 09:26:29.939648 4835 secret.go:188] Couldn't get secret openshift-kube-scheduler-operator/kube-scheduler-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 19 09:26:29 crc kubenswrapper[4835]: E0319 09:26:29.939661 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f2ccd766-0aac-4194-a7cd-447941e75a46-config podName:f2ccd766-0aac-4194-a7cd-447941e75a46 nodeName:}" failed. No retries permitted until 2026-03-19 09:26:30.439635823 +0000 UTC m=+245.288234480 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/f2ccd766-0aac-4194-a7cd-447941e75a46-config") pod "openshift-kube-scheduler-operator-5fdd9b5758-56v85" (UID: "f2ccd766-0aac-4194-a7cd-447941e75a46") : failed to sync configmap cache: timed out waiting for the condition Mar 19 09:26:29 crc kubenswrapper[4835]: E0319 09:26:29.939665 4835 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Mar 19 09:26:29 crc kubenswrapper[4835]: E0319 09:26:29.939684 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98617442-57ff-4996-b772-1639551ecc89-metrics-tls podName:98617442-57ff-4996-b772-1639551ecc89 nodeName:}" failed. 
No retries permitted until 2026-03-19 09:26:30.439674824 +0000 UTC m=+245.288273411 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/98617442-57ff-4996-b772-1639551ecc89-metrics-tls") pod "dns-default-cl8p7" (UID: "98617442-57ff-4996-b772-1639551ecc89") : failed to sync secret cache: timed out waiting for the condition Mar 19 09:26:29 crc kubenswrapper[4835]: E0319 09:26:29.939703 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8585eb29-2833-4cd2-900c-83cc7ddee5d1-certs podName:8585eb29-2833-4cd2-900c-83cc7ddee5d1 nodeName:}" failed. No retries permitted until 2026-03-19 09:26:30.439693445 +0000 UTC m=+245.288292032 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/8585eb29-2833-4cd2-900c-83cc7ddee5d1-certs") pod "machine-config-server-8sbc8" (UID: "8585eb29-2833-4cd2-900c-83cc7ddee5d1") : failed to sync secret cache: timed out waiting for the condition Mar 19 09:26:29 crc kubenswrapper[4835]: E0319 09:26:29.939721 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2ccd766-0aac-4194-a7cd-447941e75a46-serving-cert podName:f2ccd766-0aac-4194-a7cd-447941e75a46 nodeName:}" failed. No retries permitted until 2026-03-19 09:26:30.439712655 +0000 UTC m=+245.288311362 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f2ccd766-0aac-4194-a7cd-447941e75a46-serving-cert") pod "openshift-kube-scheduler-operator-5fdd9b5758-56v85" (UID: "f2ccd766-0aac-4194-a7cd-447941e75a46") : failed to sync secret cache: timed out waiting for the condition Mar 19 09:26:29 crc kubenswrapper[4835]: E0319 09:26:29.941926 4835 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Mar 19 09:26:29 crc kubenswrapper[4835]: E0319 09:26:29.942039 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f-webhook-cert podName:0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f nodeName:}" failed. No retries permitted until 2026-03-19 09:26:30.442004649 +0000 UTC m=+245.290603286 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f-webhook-cert") pod "packageserver-d55dfcdfc-6lp28" (UID: "0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f") : failed to sync secret cache: timed out waiting for the condition Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.953707 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 19 09:26:29 crc kubenswrapper[4835]: I0319 09:26:29.974039 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.006309 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.013641 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 19 09:26:30 crc kubenswrapper[4835]: 
I0319 09:26:30.033772 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.053838 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.074851 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.094146 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.114186 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.133932 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.153356 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.174119 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.193704 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.215174 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 19 09:26:30 crc 
kubenswrapper[4835]: I0319 09:26:30.234272 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.254227 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.273604 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.294276 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.314080 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.334127 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.353597 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.374145 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.394405 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.415159 4835 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.435364 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.454663 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.464096 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98617442-57ff-4996-b772-1639551ecc89-config-volume\") pod \"dns-default-cl8p7\" (UID: \"98617442-57ff-4996-b772-1639551ecc89\") " pod="openshift-dns/dns-default-cl8p7" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.464191 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f-apiservice-cert\") pod \"packageserver-d55dfcdfc-6lp28\" (UID: \"0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.464311 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2ccd766-0aac-4194-a7cd-447941e75a46-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-56v85\" (UID: \"f2ccd766-0aac-4194-a7cd-447941e75a46\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-56v85" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.464346 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98617442-57ff-4996-b772-1639551ecc89-metrics-tls\") pod \"dns-default-cl8p7\" (UID: \"98617442-57ff-4996-b772-1639551ecc89\") " 
pod="openshift-dns/dns-default-cl8p7" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.464412 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6d8b5f04-3b9c-4656-bf70-203d508949a0-srv-cert\") pod \"olm-operator-6b444d44fb-q886g\" (UID: \"6d8b5f04-3b9c-4656-bf70-203d508949a0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q886g" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.464444 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f-webhook-cert\") pod \"packageserver-d55dfcdfc-6lp28\" (UID: \"0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.464480 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8585eb29-2833-4cd2-900c-83cc7ddee5d1-node-bootstrap-token\") pod \"machine-config-server-8sbc8\" (UID: \"8585eb29-2833-4cd2-900c-83cc7ddee5d1\") " pod="openshift-machine-config-operator/machine-config-server-8sbc8" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.464549 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2ccd766-0aac-4194-a7cd-447941e75a46-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-56v85\" (UID: \"f2ccd766-0aac-4194-a7cd-447941e75a46\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-56v85" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.464571 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8585eb29-2833-4cd2-900c-83cc7ddee5d1-certs\") pod \"machine-config-server-8sbc8\" (UID: 
\"8585eb29-2833-4cd2-900c-83cc7ddee5d1\") " pod="openshift-machine-config-operator/machine-config-server-8sbc8" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.465172 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2ccd766-0aac-4194-a7cd-447941e75a46-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-56v85\" (UID: \"f2ccd766-0aac-4194-a7cd-447941e75a46\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-56v85" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.468200 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f-webhook-cert\") pod \"packageserver-d55dfcdfc-6lp28\" (UID: \"0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.469230 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f-apiservice-cert\") pod \"packageserver-d55dfcdfc-6lp28\" (UID: \"0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.470100 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8585eb29-2833-4cd2-900c-83cc7ddee5d1-node-bootstrap-token\") pod \"machine-config-server-8sbc8\" (UID: \"8585eb29-2833-4cd2-900c-83cc7ddee5d1\") " pod="openshift-machine-config-operator/machine-config-server-8sbc8" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.470453 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8585eb29-2833-4cd2-900c-83cc7ddee5d1-certs\") 
pod \"machine-config-server-8sbc8\" (UID: \"8585eb29-2833-4cd2-900c-83cc7ddee5d1\") " pod="openshift-machine-config-operator/machine-config-server-8sbc8" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.473542 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6d8b5f04-3b9c-4656-bf70-203d508949a0-srv-cert\") pod \"olm-operator-6b444d44fb-q886g\" (UID: \"6d8b5f04-3b9c-4656-bf70-203d508949a0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q886g" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.473862 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.481079 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2ccd766-0aac-4194-a7cd-447941e75a46-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-56v85\" (UID: \"f2ccd766-0aac-4194-a7cd-447941e75a46\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-56v85" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.495030 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.514386 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.534226 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.553328 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.555683 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98617442-57ff-4996-b772-1639551ecc89-config-volume\") pod \"dns-default-cl8p7\" (UID: \"98617442-57ff-4996-b772-1639551ecc89\") " pod="openshift-dns/dns-default-cl8p7" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.574671 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.578833 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98617442-57ff-4996-b772-1639551ecc89-metrics-tls\") pod \"dns-default-cl8p7\" (UID: \"98617442-57ff-4996-b772-1639551ecc89\") " pod="openshift-dns/dns-default-cl8p7" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.628042 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2rzc\" (UniqueName: \"kubernetes.io/projected/232fe661-d645-4591-864d-00acac127390-kube-api-access-n2rzc\") pod \"apiserver-7bbb656c7d-pww7m\" (UID: \"232fe661-d645-4591-864d-00acac127390\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.647703 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hvpt\" (UniqueName: \"kubernetes.io/projected/9f443320-81b9-4f4e-a12b-62023d6074ef-kube-api-access-8hvpt\") pod \"controller-manager-879f6c89f-fbsvh\" (UID: \"9f443320-81b9-4f4e-a12b-62023d6074ef\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fbsvh" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.667720 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbjl4\" (UniqueName: \"kubernetes.io/projected/31ed1214-265d-4179-b4f4-f77dedb20eb4-kube-api-access-jbjl4\") pod \"apiserver-76f77b778f-gg2zr\" (UID: \"31ed1214-265d-4179-b4f4-f77dedb20eb4\") " pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 
09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.673891 4835 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.693626 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.714221 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.724195 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.741675 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.747252 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcldd\" (UniqueName: \"kubernetes.io/projected/7490a09e-a8be-4931-a282-38989ba640b3-kube-api-access-fcldd\") pod \"authentication-operator-69f744f599-djtzl\" (UID: \"7490a09e-a8be-4931-a282-38989ba640b3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-djtzl" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.752478 4835 request.go:700] Waited for 1.920639779s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/serviceaccounts/cluster-samples-operator/token Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.758534 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fbsvh" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.769113 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lscm\" (UniqueName: \"kubernetes.io/projected/49daa1c0-0b39-4ebf-9e81-38885e24bc4d-kube-api-access-8lscm\") pod \"cluster-samples-operator-665b6dd947-bg8tj\" (UID: \"49daa1c0-0b39-4ebf-9e81-38885e24bc4d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bg8tj" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.791324 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46lfr\" (UniqueName: \"kubernetes.io/projected/1573ddf0-1418-4b22-afd8-eab0c11e0fe0-kube-api-access-46lfr\") pod \"machine-approver-56656f9798-49lsz\" (UID: \"1573ddf0-1418-4b22-afd8-eab0c11e0fe0\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-49lsz" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.801012 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bg8tj" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.809579 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skj6b\" (UniqueName: \"kubernetes.io/projected/d111a8cb-8053-4b32-a9d0-8325de3c057f-kube-api-access-skj6b\") pod \"oauth-openshift-558db77b4-svxz5\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") " pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.828371 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhc79\" (UniqueName: \"kubernetes.io/projected/b9eb63f6-7fbc-4f82-946b-8844751bb402-kube-api-access-rhc79\") pod \"openshift-config-operator-7777fb866f-vrd5p\" (UID: \"b9eb63f6-7fbc-4f82-946b-8844751bb402\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrd5p" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.844547 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrd5p" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.863297 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.883626 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6r2j\" (UniqueName: \"kubernetes.io/projected/00ca197e-7ecb-49f7-b477-f1e5ffaab8c1-kube-api-access-s6r2j\") pod \"openshift-apiserver-operator-796bbdcf4f-w457v\" (UID: \"00ca197e-7ecb-49f7-b477-f1e5ffaab8c1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-w457v" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.903425 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfdfs\" (UniqueName: \"kubernetes.io/projected/4e16135b-13e3-4236-9fa7-94ceb99131e2-kube-api-access-vfdfs\") pod \"machine-api-operator-5694c8668f-pw4cg\" (UID: \"4e16135b-13e3-4236-9fa7-94ceb99131e2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pw4cg" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.915380 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpzdl\" (UniqueName: \"kubernetes.io/projected/d49c138a-fe33-4b41-991a-af15d6bf286d-kube-api-access-cpzdl\") pod \"multus-admission-controller-857f4d67dd-dzx2v\" (UID: \"d49c138a-fe33-4b41-991a-af15d6bf286d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dzx2v" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.922783 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6dzs\" (UniqueName: \"kubernetes.io/projected/884a59ba-e2aa-42a5-87ba-cff1f2c1acea-kube-api-access-g6dzs\") pod \"ingress-operator-5b745b69d9-cwlsr\" (UID: \"884a59ba-e2aa-42a5-87ba-cff1f2c1acea\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cwlsr" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.937242 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzbxv\" (UniqueName: 
\"kubernetes.io/projected/ee1c4052-bbfe-4c1d-82be-10ca3e2c0212-kube-api-access-gzbxv\") pod \"cluster-image-registry-operator-dc59b4c8b-gx7nq\" (UID: \"ee1c4052-bbfe-4c1d-82be-10ca3e2c0212\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nq" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.938216 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-49lsz" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.951114 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j88gc\" (UniqueName: \"kubernetes.io/projected/c03f3929-621d-4627-9d85-243b6cc02240-kube-api-access-j88gc\") pod \"etcd-operator-b45778765-cpfq7\" (UID: \"c03f3929-621d-4627-9d85-243b6cc02240\") " pod="openshift-etcd-operator/etcd-operator-b45778765-cpfq7" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.970448 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7d6w\" (UniqueName: \"kubernetes.io/projected/4cd6d8ad-9a41-4796-9684-2b6e09675bd9-kube-api-access-s7d6w\") pod \"catalog-operator-68c6474976-9qxb7\" (UID: \"4cd6d8ad-9a41-4796-9684-2b6e09675bd9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.970518 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-djtzl" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.981169 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-dzx2v" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.987312 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4588x\" (UniqueName: \"kubernetes.io/projected/5fdabffe-d734-4b9a-8fea-d1608dcef7a2-kube-api-access-4588x\") pod \"control-plane-machine-set-operator-78cbb6b69f-v55z2\" (UID: \"5fdabffe-d734-4b9a-8fea-d1608dcef7a2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v55z2" Mar 19 09:26:30 crc kubenswrapper[4835]: I0319 09:26:30.987932 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-cpfq7" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.001811 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gg2zr"] Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.002068 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v55z2" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.008252 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fcph\" (UniqueName: \"kubernetes.io/projected/98617442-57ff-4996-b772-1639551ecc89-kube-api-access-9fcph\") pod \"dns-default-cl8p7\" (UID: \"98617442-57ff-4996-b772-1639551ecc89\") " pod="openshift-dns/dns-default-cl8p7" Mar 19 09:26:31 crc kubenswrapper[4835]: W0319 09:26:31.017446 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31ed1214_265d_4179_b4f4_f77dedb20eb4.slice/crio-eac4962d86631f820eb1a51eccff9197c5acc2c4eea02123b59af2f3daf62c59 WatchSource:0}: Error finding container eac4962d86631f820eb1a51eccff9197c5acc2c4eea02123b59af2f3daf62c59: Status 404 returned error can't find the container with id eac4962d86631f820eb1a51eccff9197c5acc2c4eea02123b59af2f3daf62c59 Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.029019 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cpm5\" (UniqueName: \"kubernetes.io/projected/930d85cd-ba00-4c27-b728-dbdeaab91ca5-kube-api-access-6cpm5\") pod \"auto-csr-approver-29565204-mwv2v\" (UID: \"930d85cd-ba00-4c27-b728-dbdeaab91ca5\") " pod="openshift-infra/auto-csr-approver-29565204-mwv2v" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.047172 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqkbs\" (UniqueName: \"kubernetes.io/projected/0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f-kube-api-access-fqkbs\") pod \"packageserver-d55dfcdfc-6lp28\" (UID: \"0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.069697 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/884a59ba-e2aa-42a5-87ba-cff1f2c1acea-bound-sa-token\") pod \"ingress-operator-5b745b69d9-cwlsr\" (UID: \"884a59ba-e2aa-42a5-87ba-cff1f2c1acea\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cwlsr" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.097130 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vqjq\" (UniqueName: \"kubernetes.io/projected/80ebd3c0-282e-427f-bedd-f06c26bce30a-kube-api-access-4vqjq\") pod \"downloads-7954f5f757-vzxqb\" (UID: \"80ebd3c0-282e-427f-bedd-f06c26bce30a\") " pod="openshift-console/downloads-7954f5f757-vzxqb" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.107052 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vthjx\" (UniqueName: \"kubernetes.io/projected/6d8b5f04-3b9c-4656-bf70-203d508949a0-kube-api-access-vthjx\") pod \"olm-operator-6b444d44fb-q886g\" (UID: \"6d8b5f04-3b9c-4656-bf70-203d508949a0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q886g" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.114328 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-pw4cg" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.126827 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q886g" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.131916 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nwgg\" (UniqueName: \"kubernetes.io/projected/540880cb-fdb4-4672-81e3-60adfd584bdf-kube-api-access-4nwgg\") pod \"router-default-5444994796-mljh4\" (UID: \"540880cb-fdb4-4672-81e3-60adfd584bdf\") " pod="openshift-ingress/router-default-5444994796-mljh4" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.133539 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.155513 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql2cr\" (UniqueName: \"kubernetes.io/projected/c232f12f-f539-42a1-9eec-965d59127ca0-kube-api-access-ql2cr\") pod \"auto-csr-approver-29565206-wgvsr\" (UID: \"c232f12f-f539-42a1-9eec-965d59127ca0\") " pod="openshift-infra/auto-csr-approver-29565206-wgvsr" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.156852 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-w457v" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.173675 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565206-wgvsr" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.180164 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-vzxqb" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.189624 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd65d\" (UniqueName: \"kubernetes.io/projected/8585eb29-2833-4cd2-900c-83cc7ddee5d1-kube-api-access-zd65d\") pod \"machine-config-server-8sbc8\" (UID: \"8585eb29-2833-4cd2-900c-83cc7ddee5d1\") " pod="openshift-machine-config-operator/machine-config-server-8sbc8" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.194502 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ee1c4052-bbfe-4c1d-82be-10ca3e2c0212-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gx7nq\" (UID: \"ee1c4052-bbfe-4c1d-82be-10ca3e2c0212\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nq" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.197502 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nq" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.199013 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8sbc8" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.201531 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-djtzl"] Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.204642 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-mljh4" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.208958 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f2ccd766-0aac-4194-a7cd-447941e75a46-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-56v85\" (UID: \"f2ccd766-0aac-4194-a7cd-447941e75a46\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-56v85" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.216016 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cl8p7" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.219103 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.229066 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cwlsr" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.240160 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c54b4\" (UniqueName: \"kubernetes.io/projected/c2fb87b5-ea97-4d21-9129-e1cf0db01fea-kube-api-access-c54b4\") pod \"route-controller-manager-6576b87f9c-qnlkj\" (UID: \"c2fb87b5-ea97-4d21-9129-e1cf0db01fea\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qnlkj" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.245959 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fbsvh"] Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.254777 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx2j7\" (UniqueName: \"kubernetes.io/projected/cc6bb0cd-bbc5-4464-86b7-41ce95891f07-kube-api-access-bx2j7\") pod \"service-ca-operator-777779d784-hm9hx\" (UID: \"cc6bb0cd-bbc5-4464-86b7-41ce95891f07\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hm9hx" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.264026 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m"] Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.267445 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hm9hx" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.270826 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn79n\" (UniqueName: \"kubernetes.io/projected/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-kube-api-access-kn79n\") pod \"console-f9d7485db-m6s8l\" (UID: \"c32cf7e2-4523-41b3-b8a8-7564cca67f4a\") " pod="openshift-console/console-f9d7485db-m6s8l" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.280137 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-dzx2v"] Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.297803 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6k5jq\" (UniqueName: \"kubernetes.io/projected/c3bd5062-5030-4847-a7c8-0671269debfe-kube-api-access-6k5jq\") pod \"openshift-controller-manager-operator-756b6f6bc6-9phkm\" (UID: \"c3bd5062-5030-4847-a7c8-0671269debfe\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9phkm" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.310517 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b94g4\" (UniqueName: \"kubernetes.io/projected/abdeec51-5ca9-442c-9937-67dd8f50d88d-kube-api-access-b94g4\") pod \"console-operator-58897d9998-4tpxz\" (UID: \"abdeec51-5ca9-442c-9937-67dd8f50d88d\") " pod="openshift-console-operator/console-operator-58897d9998-4tpxz" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.313248 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bg8tj"] Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.331721 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565204-mwv2v" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.340155 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bd152343-688c-493c-8cee-9d498a043182-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-gf5f4\" (UID: \"bd152343-688c-493c-8cee-9d498a043182\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gf5f4" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.342687 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vrd5p"] Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.368599 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-svxz5"] Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.372224 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v55z2"] Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.378682 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/19327bb9-2207-4c03-a828-ed4d3b1df6bb-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ntrvm\" (UID: \"19327bb9-2207-4c03-a828-ed4d3b1df6bb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ntrvm" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.378715 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7b318ba6-6979-4547-ae35-0e8b29abf6e2-images\") pod \"machine-config-operator-74547568cd-tl8b7\" (UID: \"7b318ba6-6979-4547-ae35-0e8b29abf6e2\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tl8b7" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.378761 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7b318ba6-6979-4547-ae35-0e8b29abf6e2-auth-proxy-config\") pod \"machine-config-operator-74547568cd-tl8b7\" (UID: \"7b318ba6-6979-4547-ae35-0e8b29abf6e2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tl8b7" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.378785 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.378805 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68957ece-d303-4cd7-9e45-cc960a83a7b0-trusted-ca\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.378883 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/16938ea3-e31b-4c22-9e31-ca2be501fa58-metrics-tls\") pod \"dns-operator-744455d44c-bpqll\" (UID: \"16938ea3-e31b-4c22-9e31-ca2be501fa58\") " pod="openshift-dns-operator/dns-operator-744455d44c-bpqll" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.378901 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/59da5ba1-62fc-4eec-aeed-45e9c9388496-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-kj82v\" (UID: \"59da5ba1-62fc-4eec-aeed-45e9c9388496\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kj82v" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.378929 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19327bb9-2207-4c03-a828-ed4d3b1df6bb-config\") pod \"kube-controller-manager-operator-78b949d7b-ntrvm\" (UID: \"19327bb9-2207-4c03-a828-ed4d3b1df6bb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ntrvm" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.378949 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/68957ece-d303-4cd7-9e45-cc960a83a7b0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.378967 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpkfn\" (UniqueName: \"kubernetes.io/projected/20ff7170-fb4a-4481-9f62-1e93e95329f5-kube-api-access-vpkfn\") pod \"kube-storage-version-migrator-operator-b67b599dd-cvbbw\" (UID: \"20ff7170-fb4a-4481-9f62-1e93e95329f5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cvbbw" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.378983 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/68957ece-d303-4cd7-9e45-cc960a83a7b0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:31 crc kubenswrapper[4835]: E0319 09:26:31.379079 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:31.879064171 +0000 UTC m=+246.727662758 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.379013 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23206b12-5e7f-4ce8-aa17-7eac028bd9fe-config-volume\") pod \"collect-profiles-29565195-xtph2\" (UID: \"23206b12-5e7f-4ce8-aa17-7eac028bd9fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565195-xtph2" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.379119 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/39262985-f63c-4717-ba48-5c374565ce21-cert\") pod \"ingress-canary-7wwdm\" (UID: \"39262985-f63c-4717-ba48-5c374565ce21\") " pod="openshift-ingress-canary/ingress-canary-7wwdm" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.379138 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20ff7170-fb4a-4481-9f62-1e93e95329f5-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-cvbbw\" (UID: \"20ff7170-fb4a-4481-9f62-1e93e95329f5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cvbbw" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.379186 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/31d3620d-57fe-40e0-b079-f567671475e9-signing-cabundle\") pod \"service-ca-9c57cc56f-7mr2d\" (UID: \"31d3620d-57fe-40e0-b079-f567671475e9\") " pod="openshift-service-ca/service-ca-9c57cc56f-7mr2d" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.379215 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/04d6a038-d68f-4f5a-ad4d-95cd686d1d24-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6p9l9\" (UID: \"04d6a038-d68f-4f5a-ad4d-95cd686d1d24\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p9l9" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.379273 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qnlkj" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.379796 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmvrg\" (UniqueName: \"kubernetes.io/projected/7b318ba6-6979-4547-ae35-0e8b29abf6e2-kube-api-access-vmvrg\") pod \"machine-config-operator-74547568cd-tl8b7\" (UID: \"7b318ba6-6979-4547-ae35-0e8b29abf6e2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tl8b7" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.379839 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b318ba6-6979-4547-ae35-0e8b29abf6e2-proxy-tls\") pod \"machine-config-operator-74547568cd-tl8b7\" (UID: \"7b318ba6-6979-4547-ae35-0e8b29abf6e2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tl8b7" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.379859 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20ff7170-fb4a-4481-9f62-1e93e95329f5-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-cvbbw\" (UID: \"20ff7170-fb4a-4481-9f62-1e93e95329f5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cvbbw" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.379878 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/68957ece-d303-4cd7-9e45-cc960a83a7b0-registry-certificates\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 
09:26:31.379895 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59da5ba1-62fc-4eec-aeed-45e9c9388496-proxy-tls\") pod \"machine-config-controller-84d6567774-kj82v\" (UID: \"59da5ba1-62fc-4eec-aeed-45e9c9388496\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kj82v" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.379911 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwfns\" (UniqueName: \"kubernetes.io/projected/31d3620d-57fe-40e0-b079-f567671475e9-kube-api-access-bwfns\") pod \"service-ca-9c57cc56f-7mr2d\" (UID: \"31d3620d-57fe-40e0-b079-f567671475e9\") " pod="openshift-service-ca/service-ca-9c57cc56f-7mr2d" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.379953 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64tvc\" (UniqueName: \"kubernetes.io/projected/68957ece-d303-4cd7-9e45-cc960a83a7b0-kube-api-access-64tvc\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.380021 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt97w\" (UniqueName: \"kubernetes.io/projected/04d6a038-d68f-4f5a-ad4d-95cd686d1d24-kube-api-access-xt97w\") pod \"package-server-manager-789f6589d5-6p9l9\" (UID: \"04d6a038-d68f-4f5a-ad4d-95cd686d1d24\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p9l9" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.380052 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/31d3620d-57fe-40e0-b079-f567671475e9-signing-key\") pod \"service-ca-9c57cc56f-7mr2d\" (UID: \"31d3620d-57fe-40e0-b079-f567671475e9\") " pod="openshift-service-ca/service-ca-9c57cc56f-7mr2d" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.380079 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e9d6532-5a6a-4c10-90ac-92bcef610d29-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7sdz9\" (UID: \"1e9d6532-5a6a-4c10-90ac-92bcef610d29\") " pod="openshift-marketplace/marketplace-operator-79b997595-7sdz9" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.380106 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/68957ece-d303-4cd7-9e45-cc960a83a7b0-registry-tls\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.380121 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvrgv\" (UniqueName: \"kubernetes.io/projected/39262985-f63c-4717-ba48-5c374565ce21-kube-api-access-vvrgv\") pod \"ingress-canary-7wwdm\" (UID: \"39262985-f63c-4717-ba48-5c374565ce21\") " pod="openshift-ingress-canary/ingress-canary-7wwdm" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.380160 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19327bb9-2207-4c03-a828-ed4d3b1df6bb-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ntrvm\" (UID: \"19327bb9-2207-4c03-a828-ed4d3b1df6bb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ntrvm" Mar 
19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.380199 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmbh7\" (UniqueName: \"kubernetes.io/projected/1e9d6532-5a6a-4c10-90ac-92bcef610d29-kube-api-access-jmbh7\") pod \"marketplace-operator-79b997595-7sdz9\" (UID: \"1e9d6532-5a6a-4c10-90ac-92bcef610d29\") " pod="openshift-marketplace/marketplace-operator-79b997595-7sdz9" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.380225 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23206b12-5e7f-4ce8-aa17-7eac028bd9fe-secret-volume\") pod \"collect-profiles-29565195-xtph2\" (UID: \"23206b12-5e7f-4ce8-aa17-7eac028bd9fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565195-xtph2" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.380240 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67ldf\" (UniqueName: \"kubernetes.io/projected/23206b12-5e7f-4ce8-aa17-7eac028bd9fe-kube-api-access-67ldf\") pod \"collect-profiles-29565195-xtph2\" (UID: \"23206b12-5e7f-4ce8-aa17-7eac028bd9fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565195-xtph2" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.380269 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qfc9\" (UniqueName: \"kubernetes.io/projected/59da5ba1-62fc-4eec-aeed-45e9c9388496-kube-api-access-9qfc9\") pod \"machine-config-controller-84d6567774-kj82v\" (UID: \"59da5ba1-62fc-4eec-aeed-45e9c9388496\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kj82v" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.380294 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgdrc\" 
(UniqueName: \"kubernetes.io/projected/52a39d64-5b48-4b1f-b754-cee9a2843333-kube-api-access-pgdrc\") pod \"migrator-59844c95c7-sb8zm\" (UID: \"52a39d64-5b48-4b1f-b754-cee9a2843333\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sb8zm" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.380328 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2cpz\" (UniqueName: \"kubernetes.io/projected/16938ea3-e31b-4c22-9e31-ca2be501fa58-kube-api-access-k2cpz\") pod \"dns-operator-744455d44c-bpqll\" (UID: \"16938ea3-e31b-4c22-9e31-ca2be501fa58\") " pod="openshift-dns-operator/dns-operator-744455d44c-bpqll" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.380344 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68957ece-d303-4cd7-9e45-cc960a83a7b0-bound-sa-token\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.380359 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e9d6532-5a6a-4c10-90ac-92bcef610d29-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7sdz9\" (UID: \"1e9d6532-5a6a-4c10-90ac-92bcef610d29\") " pod="openshift-marketplace/marketplace-operator-79b997595-7sdz9" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.390559 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28"] Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.417421 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-56v85" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.462729 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9phkm" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.465544 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-cpfq7"] Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.480917 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.481114 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59da5ba1-62fc-4eec-aeed-45e9c9388496-proxy-tls\") pod \"machine-config-controller-84d6567774-kj82v\" (UID: \"59da5ba1-62fc-4eec-aeed-45e9c9388496\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kj82v" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.481135 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwfns\" (UniqueName: \"kubernetes.io/projected/31d3620d-57fe-40e0-b079-f567671475e9-kube-api-access-bwfns\") pod \"service-ca-9c57cc56f-7mr2d\" (UID: \"31d3620d-57fe-40e0-b079-f567671475e9\") " pod="openshift-service-ca/service-ca-9c57cc56f-7mr2d" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.481176 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: 
\"kubernetes.io/host-path/981b2eee-0205-4314-b31a-d01123049b7b-plugins-dir\") pod \"csi-hostpathplugin-hzjn2\" (UID: \"981b2eee-0205-4314-b31a-d01123049b7b\") " pod="hostpath-provisioner/csi-hostpathplugin-hzjn2" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.481192 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fzjw\" (UniqueName: \"kubernetes.io/projected/981b2eee-0205-4314-b31a-d01123049b7b-kube-api-access-4fzjw\") pod \"csi-hostpathplugin-hzjn2\" (UID: \"981b2eee-0205-4314-b31a-d01123049b7b\") " pod="hostpath-provisioner/csi-hostpathplugin-hzjn2" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.481210 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64tvc\" (UniqueName: \"kubernetes.io/projected/68957ece-d303-4cd7-9e45-cc960a83a7b0-kube-api-access-64tvc\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.481225 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/981b2eee-0205-4314-b31a-d01123049b7b-registration-dir\") pod \"csi-hostpathplugin-hzjn2\" (UID: \"981b2eee-0205-4314-b31a-d01123049b7b\") " pod="hostpath-provisioner/csi-hostpathplugin-hzjn2" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.481290 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/981b2eee-0205-4314-b31a-d01123049b7b-mountpoint-dir\") pod \"csi-hostpathplugin-hzjn2\" (UID: \"981b2eee-0205-4314-b31a-d01123049b7b\") " pod="hostpath-provisioner/csi-hostpathplugin-hzjn2" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.481315 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xt97w\" (UniqueName: \"kubernetes.io/projected/04d6a038-d68f-4f5a-ad4d-95cd686d1d24-kube-api-access-xt97w\") pod \"package-server-manager-789f6589d5-6p9l9\" (UID: \"04d6a038-d68f-4f5a-ad4d-95cd686d1d24\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p9l9" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.481339 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/31d3620d-57fe-40e0-b079-f567671475e9-signing-key\") pod \"service-ca-9c57cc56f-7mr2d\" (UID: \"31d3620d-57fe-40e0-b079-f567671475e9\") " pod="openshift-service-ca/service-ca-9c57cc56f-7mr2d" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.481355 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e9d6532-5a6a-4c10-90ac-92bcef610d29-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7sdz9\" (UID: \"1e9d6532-5a6a-4c10-90ac-92bcef610d29\") " pod="openshift-marketplace/marketplace-operator-79b997595-7sdz9" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.481389 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/68957ece-d303-4cd7-9e45-cc960a83a7b0-registry-tls\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.481403 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvrgv\" (UniqueName: \"kubernetes.io/projected/39262985-f63c-4717-ba48-5c374565ce21-kube-api-access-vvrgv\") pod \"ingress-canary-7wwdm\" (UID: \"39262985-f63c-4717-ba48-5c374565ce21\") " pod="openshift-ingress-canary/ingress-canary-7wwdm" Mar 
19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.481425 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19327bb9-2207-4c03-a828-ed4d3b1df6bb-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ntrvm\" (UID: \"19327bb9-2207-4c03-a828-ed4d3b1df6bb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ntrvm" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.481444 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmbh7\" (UniqueName: \"kubernetes.io/projected/1e9d6532-5a6a-4c10-90ac-92bcef610d29-kube-api-access-jmbh7\") pod \"marketplace-operator-79b997595-7sdz9\" (UID: \"1e9d6532-5a6a-4c10-90ac-92bcef610d29\") " pod="openshift-marketplace/marketplace-operator-79b997595-7sdz9" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.481469 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23206b12-5e7f-4ce8-aa17-7eac028bd9fe-secret-volume\") pod \"collect-profiles-29565195-xtph2\" (UID: \"23206b12-5e7f-4ce8-aa17-7eac028bd9fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565195-xtph2" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.481483 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67ldf\" (UniqueName: \"kubernetes.io/projected/23206b12-5e7f-4ce8-aa17-7eac028bd9fe-kube-api-access-67ldf\") pod \"collect-profiles-29565195-xtph2\" (UID: \"23206b12-5e7f-4ce8-aa17-7eac028bd9fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565195-xtph2" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.481520 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qfc9\" (UniqueName: 
\"kubernetes.io/projected/59da5ba1-62fc-4eec-aeed-45e9c9388496-kube-api-access-9qfc9\") pod \"machine-config-controller-84d6567774-kj82v\" (UID: \"59da5ba1-62fc-4eec-aeed-45e9c9388496\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kj82v" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.481535 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgdrc\" (UniqueName: \"kubernetes.io/projected/52a39d64-5b48-4b1f-b754-cee9a2843333-kube-api-access-pgdrc\") pod \"migrator-59844c95c7-sb8zm\" (UID: \"52a39d64-5b48-4b1f-b754-cee9a2843333\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sb8zm" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.481565 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2cpz\" (UniqueName: \"kubernetes.io/projected/16938ea3-e31b-4c22-9e31-ca2be501fa58-kube-api-access-k2cpz\") pod \"dns-operator-744455d44c-bpqll\" (UID: \"16938ea3-e31b-4c22-9e31-ca2be501fa58\") " pod="openshift-dns-operator/dns-operator-744455d44c-bpqll" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.481583 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68957ece-d303-4cd7-9e45-cc960a83a7b0-bound-sa-token\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.481610 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e9d6532-5a6a-4c10-90ac-92bcef610d29-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7sdz9\" (UID: \"1e9d6532-5a6a-4c10-90ac-92bcef610d29\") " pod="openshift-marketplace/marketplace-operator-79b997595-7sdz9" Mar 19 09:26:31 crc 
kubenswrapper[4835]: I0319 09:26:31.481628 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/19327bb9-2207-4c03-a828-ed4d3b1df6bb-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ntrvm\" (UID: \"19327bb9-2207-4c03-a828-ed4d3b1df6bb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ntrvm" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.481645 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7b318ba6-6979-4547-ae35-0e8b29abf6e2-images\") pod \"machine-config-operator-74547568cd-tl8b7\" (UID: \"7b318ba6-6979-4547-ae35-0e8b29abf6e2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tl8b7" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.481686 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7b318ba6-6979-4547-ae35-0e8b29abf6e2-auth-proxy-config\") pod \"machine-config-operator-74547568cd-tl8b7\" (UID: \"7b318ba6-6979-4547-ae35-0e8b29abf6e2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tl8b7" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.481703 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/981b2eee-0205-4314-b31a-d01123049b7b-socket-dir\") pod \"csi-hostpathplugin-hzjn2\" (UID: \"981b2eee-0205-4314-b31a-d01123049b7b\") " pod="hostpath-provisioner/csi-hostpathplugin-hzjn2" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.481727 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68957ece-d303-4cd7-9e45-cc960a83a7b0-trusted-ca\") pod \"image-registry-697d97f7c8-46rgq\" 
(UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.481774 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/16938ea3-e31b-4c22-9e31-ca2be501fa58-metrics-tls\") pod \"dns-operator-744455d44c-bpqll\" (UID: \"16938ea3-e31b-4c22-9e31-ca2be501fa58\") " pod="openshift-dns-operator/dns-operator-744455d44c-bpqll" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.481794 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/59da5ba1-62fc-4eec-aeed-45e9c9388496-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-kj82v\" (UID: \"59da5ba1-62fc-4eec-aeed-45e9c9388496\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kj82v" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.481855 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19327bb9-2207-4c03-a828-ed4d3b1df6bb-config\") pod \"kube-controller-manager-operator-78b949d7b-ntrvm\" (UID: \"19327bb9-2207-4c03-a828-ed4d3b1df6bb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ntrvm" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.481880 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/68957ece-d303-4cd7-9e45-cc960a83a7b0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.481906 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpkfn\" 
(UniqueName: \"kubernetes.io/projected/20ff7170-fb4a-4481-9f62-1e93e95329f5-kube-api-access-vpkfn\") pod \"kube-storage-version-migrator-operator-b67b599dd-cvbbw\" (UID: \"20ff7170-fb4a-4481-9f62-1e93e95329f5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cvbbw" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.481922 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/68957ece-d303-4cd7-9e45-cc960a83a7b0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.481941 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23206b12-5e7f-4ce8-aa17-7eac028bd9fe-config-volume\") pod \"collect-profiles-29565195-xtph2\" (UID: \"23206b12-5e7f-4ce8-aa17-7eac028bd9fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565195-xtph2" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.481964 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/39262985-f63c-4717-ba48-5c374565ce21-cert\") pod \"ingress-canary-7wwdm\" (UID: \"39262985-f63c-4717-ba48-5c374565ce21\") " pod="openshift-ingress-canary/ingress-canary-7wwdm" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.481979 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20ff7170-fb4a-4481-9f62-1e93e95329f5-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-cvbbw\" (UID: \"20ff7170-fb4a-4481-9f62-1e93e95329f5\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cvbbw" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.482032 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/31d3620d-57fe-40e0-b079-f567671475e9-signing-cabundle\") pod \"service-ca-9c57cc56f-7mr2d\" (UID: \"31d3620d-57fe-40e0-b079-f567671475e9\") " pod="openshift-service-ca/service-ca-9c57cc56f-7mr2d" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.482058 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/04d6a038-d68f-4f5a-ad4d-95cd686d1d24-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6p9l9\" (UID: \"04d6a038-d68f-4f5a-ad4d-95cd686d1d24\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p9l9" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.482077 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmvrg\" (UniqueName: \"kubernetes.io/projected/7b318ba6-6979-4547-ae35-0e8b29abf6e2-kube-api-access-vmvrg\") pod \"machine-config-operator-74547568cd-tl8b7\" (UID: \"7b318ba6-6979-4547-ae35-0e8b29abf6e2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tl8b7" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.482109 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b318ba6-6979-4547-ae35-0e8b29abf6e2-proxy-tls\") pod \"machine-config-operator-74547568cd-tl8b7\" (UID: \"7b318ba6-6979-4547-ae35-0e8b29abf6e2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tl8b7" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.482127 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/981b2eee-0205-4314-b31a-d01123049b7b-csi-data-dir\") pod \"csi-hostpathplugin-hzjn2\" (UID: \"981b2eee-0205-4314-b31a-d01123049b7b\") " pod="hostpath-provisioner/csi-hostpathplugin-hzjn2" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.482143 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20ff7170-fb4a-4481-9f62-1e93e95329f5-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-cvbbw\" (UID: \"20ff7170-fb4a-4481-9f62-1e93e95329f5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cvbbw" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.482170 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/68957ece-d303-4cd7-9e45-cc960a83a7b0-registry-certificates\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:31 crc kubenswrapper[4835]: E0319 09:26:31.482593 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:31.982570291 +0000 UTC m=+246.831168878 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.483951 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/68957ece-d303-4cd7-9e45-cc960a83a7b0-registry-certificates\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.483972 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19327bb9-2207-4c03-a828-ed4d3b1df6bb-config\") pod \"kube-controller-manager-operator-78b949d7b-ntrvm\" (UID: \"19327bb9-2207-4c03-a828-ed4d3b1df6bb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ntrvm" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.484425 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/68957ece-d303-4cd7-9e45-cc960a83a7b0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.485357 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/31d3620d-57fe-40e0-b079-f567671475e9-signing-cabundle\") pod 
\"service-ca-9c57cc56f-7mr2d\" (UID: \"31d3620d-57fe-40e0-b079-f567671475e9\") " pod="openshift-service-ca/service-ca-9c57cc56f-7mr2d" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.487817 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/59da5ba1-62fc-4eec-aeed-45e9c9388496-proxy-tls\") pod \"machine-config-controller-84d6567774-kj82v\" (UID: \"59da5ba1-62fc-4eec-aeed-45e9c9388496\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kj82v" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.487832 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/68957ece-d303-4cd7-9e45-cc960a83a7b0-registry-tls\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.488728 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/59da5ba1-62fc-4eec-aeed-45e9c9388496-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-kj82v\" (UID: \"59da5ba1-62fc-4eec-aeed-45e9c9388496\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kj82v" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.489783 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7b318ba6-6979-4547-ae35-0e8b29abf6e2-auth-proxy-config\") pod \"machine-config-operator-74547568cd-tl8b7\" (UID: \"7b318ba6-6979-4547-ae35-0e8b29abf6e2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tl8b7" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.490377 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/20ff7170-fb4a-4481-9f62-1e93e95329f5-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-cvbbw\" (UID: \"20ff7170-fb4a-4481-9f62-1e93e95329f5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cvbbw" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.492681 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-m6s8l" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.492985 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e9d6532-5a6a-4c10-90ac-92bcef610d29-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7sdz9\" (UID: \"1e9d6532-5a6a-4c10-90ac-92bcef610d29\") " pod="openshift-marketplace/marketplace-operator-79b997595-7sdz9" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.493173 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/68957ece-d303-4cd7-9e45-cc960a83a7b0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.493328 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e9d6532-5a6a-4c10-90ac-92bcef610d29-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7sdz9\" (UID: \"1e9d6532-5a6a-4c10-90ac-92bcef610d29\") " pod="openshift-marketplace/marketplace-operator-79b997595-7sdz9" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.493379 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/23206b12-5e7f-4ce8-aa17-7eac028bd9fe-config-volume\") pod \"collect-profiles-29565195-xtph2\" (UID: \"23206b12-5e7f-4ce8-aa17-7eac028bd9fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565195-xtph2" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.493549 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19327bb9-2207-4c03-a828-ed4d3b1df6bb-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ntrvm\" (UID: \"19327bb9-2207-4c03-a828-ed4d3b1df6bb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ntrvm" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.493558 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-4tpxz" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.493664 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7b318ba6-6979-4547-ae35-0e8b29abf6e2-images\") pod \"machine-config-operator-74547568cd-tl8b7\" (UID: \"7b318ba6-6979-4547-ae35-0e8b29abf6e2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tl8b7" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.495267 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b318ba6-6979-4547-ae35-0e8b29abf6e2-proxy-tls\") pod \"machine-config-operator-74547568cd-tl8b7\" (UID: \"7b318ba6-6979-4547-ae35-0e8b29abf6e2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tl8b7" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.495287 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23206b12-5e7f-4ce8-aa17-7eac028bd9fe-secret-volume\") pod 
\"collect-profiles-29565195-xtph2\" (UID: \"23206b12-5e7f-4ce8-aa17-7eac028bd9fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565195-xtph2" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.495359 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/39262985-f63c-4717-ba48-5c374565ce21-cert\") pod \"ingress-canary-7wwdm\" (UID: \"39262985-f63c-4717-ba48-5c374565ce21\") " pod="openshift-ingress-canary/ingress-canary-7wwdm" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.498547 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/16938ea3-e31b-4c22-9e31-ca2be501fa58-metrics-tls\") pod \"dns-operator-744455d44c-bpqll\" (UID: \"16938ea3-e31b-4c22-9e31-ca2be501fa58\") " pod="openshift-dns-operator/dns-operator-744455d44c-bpqll" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.499956 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/31d3620d-57fe-40e0-b079-f567671475e9-signing-key\") pod \"service-ca-9c57cc56f-7mr2d\" (UID: \"31d3620d-57fe-40e0-b079-f567671475e9\") " pod="openshift-service-ca/service-ca-9c57cc56f-7mr2d" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.500626 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/04d6a038-d68f-4f5a-ad4d-95cd686d1d24-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-6p9l9\" (UID: \"04d6a038-d68f-4f5a-ad4d-95cd686d1d24\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p9l9" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.505235 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20ff7170-fb4a-4481-9f62-1e93e95329f5-serving-cert\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-cvbbw\" (UID: \"20ff7170-fb4a-4481-9f62-1e93e95329f5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cvbbw" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.510942 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68957ece-d303-4cd7-9e45-cc960a83a7b0-trusted-ca\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.511184 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gf5f4" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.530595 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64tvc\" (UniqueName: \"kubernetes.io/projected/68957ece-d303-4cd7-9e45-cc960a83a7b0-kube-api-access-64tvc\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.573092 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpkfn\" (UniqueName: \"kubernetes.io/projected/20ff7170-fb4a-4481-9f62-1e93e95329f5-kube-api-access-vpkfn\") pod \"kube-storage-version-migrator-operator-b67b599dd-cvbbw\" (UID: \"20ff7170-fb4a-4481-9f62-1e93e95329f5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cvbbw" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.583931 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/981b2eee-0205-4314-b31a-d01123049b7b-socket-dir\") pod 
\"csi-hostpathplugin-hzjn2\" (UID: \"981b2eee-0205-4314-b31a-d01123049b7b\") " pod="hostpath-provisioner/csi-hostpathplugin-hzjn2" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.583971 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.584063 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/981b2eee-0205-4314-b31a-d01123049b7b-csi-data-dir\") pod \"csi-hostpathplugin-hzjn2\" (UID: \"981b2eee-0205-4314-b31a-d01123049b7b\") " pod="hostpath-provisioner/csi-hostpathplugin-hzjn2" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.584101 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/981b2eee-0205-4314-b31a-d01123049b7b-plugins-dir\") pod \"csi-hostpathplugin-hzjn2\" (UID: \"981b2eee-0205-4314-b31a-d01123049b7b\") " pod="hostpath-provisioner/csi-hostpathplugin-hzjn2" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.584119 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fzjw\" (UniqueName: \"kubernetes.io/projected/981b2eee-0205-4314-b31a-d01123049b7b-kube-api-access-4fzjw\") pod \"csi-hostpathplugin-hzjn2\" (UID: \"981b2eee-0205-4314-b31a-d01123049b7b\") " pod="hostpath-provisioner/csi-hostpathplugin-hzjn2" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.584135 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/981b2eee-0205-4314-b31a-d01123049b7b-registration-dir\") pod \"csi-hostpathplugin-hzjn2\" (UID: \"981b2eee-0205-4314-b31a-d01123049b7b\") " pod="hostpath-provisioner/csi-hostpathplugin-hzjn2" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.584152 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/981b2eee-0205-4314-b31a-d01123049b7b-mountpoint-dir\") pod \"csi-hostpathplugin-hzjn2\" (UID: \"981b2eee-0205-4314-b31a-d01123049b7b\") " pod="hostpath-provisioner/csi-hostpathplugin-hzjn2" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.584248 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/981b2eee-0205-4314-b31a-d01123049b7b-mountpoint-dir\") pod \"csi-hostpathplugin-hzjn2\" (UID: \"981b2eee-0205-4314-b31a-d01123049b7b\") " pod="hostpath-provisioner/csi-hostpathplugin-hzjn2" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.584464 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/981b2eee-0205-4314-b31a-d01123049b7b-plugins-dir\") pod \"csi-hostpathplugin-hzjn2\" (UID: \"981b2eee-0205-4314-b31a-d01123049b7b\") " pod="hostpath-provisioner/csi-hostpathplugin-hzjn2" Mar 19 09:26:31 crc kubenswrapper[4835]: E0319 09:26:31.584479 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:32.084461177 +0000 UTC m=+246.933059844 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.584525 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/981b2eee-0205-4314-b31a-d01123049b7b-registration-dir\") pod \"csi-hostpathplugin-hzjn2\" (UID: \"981b2eee-0205-4314-b31a-d01123049b7b\") " pod="hostpath-provisioner/csi-hostpathplugin-hzjn2" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.584554 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/981b2eee-0205-4314-b31a-d01123049b7b-csi-data-dir\") pod \"csi-hostpathplugin-hzjn2\" (UID: \"981b2eee-0205-4314-b31a-d01123049b7b\") " pod="hostpath-provisioner/csi-hostpathplugin-hzjn2" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.584567 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/981b2eee-0205-4314-b31a-d01123049b7b-socket-dir\") pod \"csi-hostpathplugin-hzjn2\" (UID: \"981b2eee-0205-4314-b31a-d01123049b7b\") " pod="hostpath-provisioner/csi-hostpathplugin-hzjn2" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.590125 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwfns\" (UniqueName: \"kubernetes.io/projected/31d3620d-57fe-40e0-b079-f567671475e9-kube-api-access-bwfns\") pod \"service-ca-9c57cc56f-7mr2d\" (UID: \"31d3620d-57fe-40e0-b079-f567671475e9\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-7mr2d" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.593769 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvrgv\" (UniqueName: \"kubernetes.io/projected/39262985-f63c-4717-ba48-5c374565ce21-kube-api-access-vvrgv\") pod \"ingress-canary-7wwdm\" (UID: \"39262985-f63c-4717-ba48-5c374565ce21\") " pod="openshift-ingress-canary/ingress-canary-7wwdm" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.611615 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmbh7\" (UniqueName: \"kubernetes.io/projected/1e9d6532-5a6a-4c10-90ac-92bcef610d29-kube-api-access-jmbh7\") pod \"marketplace-operator-79b997595-7sdz9\" (UID: \"1e9d6532-5a6a-4c10-90ac-92bcef610d29\") " pod="openshift-marketplace/marketplace-operator-79b997595-7sdz9" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.627886 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmvrg\" (UniqueName: \"kubernetes.io/projected/7b318ba6-6979-4547-ae35-0e8b29abf6e2-kube-api-access-vmvrg\") pod \"machine-config-operator-74547568cd-tl8b7\" (UID: \"7b318ba6-6979-4547-ae35-0e8b29abf6e2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tl8b7" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.629326 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-pw4cg"] Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.635112 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-7mr2d" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.651403 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt97w\" (UniqueName: \"kubernetes.io/projected/04d6a038-d68f-4f5a-ad4d-95cd686d1d24-kube-api-access-xt97w\") pod \"package-server-manager-789f6589d5-6p9l9\" (UID: \"04d6a038-d68f-4f5a-ad4d-95cd686d1d24\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p9l9" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.686689 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.687444 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cvbbw" Mar 19 09:26:31 crc kubenswrapper[4835]: E0319 09:26:31.687711 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:32.187690409 +0000 UTC m=+247.036288996 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.689975 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67ldf\" (UniqueName: \"kubernetes.io/projected/23206b12-5e7f-4ce8-aa17-7eac028bd9fe-kube-api-access-67ldf\") pod \"collect-profiles-29565195-xtph2\" (UID: \"23206b12-5e7f-4ce8-aa17-7eac028bd9fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565195-xtph2" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.695045 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7sdz9" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.697967 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qfc9\" (UniqueName: \"kubernetes.io/projected/59da5ba1-62fc-4eec-aeed-45e9c9388496-kube-api-access-9qfc9\") pod \"machine-config-controller-84d6567774-kj82v\" (UID: \"59da5ba1-62fc-4eec-aeed-45e9c9388496\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kj82v" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.713447 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgdrc\" (UniqueName: \"kubernetes.io/projected/52a39d64-5b48-4b1f-b754-cee9a2843333-kube-api-access-pgdrc\") pod \"migrator-59844c95c7-sb8zm\" (UID: \"52a39d64-5b48-4b1f-b754-cee9a2843333\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sb8zm" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 
09:26:31.714857 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kj82v" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.723644 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q886g"] Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.730240 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-w457v"] Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.740070 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/19327bb9-2207-4c03-a828-ed4d3b1df6bb-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ntrvm\" (UID: \"19327bb9-2207-4c03-a828-ed4d3b1df6bb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ntrvm" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.740516 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tl8b7" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.747875 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2cpz\" (UniqueName: \"kubernetes.io/projected/16938ea3-e31b-4c22-9e31-ca2be501fa58-kube-api-access-k2cpz\") pod \"dns-operator-744455d44c-bpqll\" (UID: \"16938ea3-e31b-4c22-9e31-ca2be501fa58\") " pod="openshift-dns-operator/dns-operator-744455d44c-bpqll" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.750979 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ntrvm" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.755303 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p9l9" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.769607 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68957ece-d303-4cd7-9e45-cc960a83a7b0-bound-sa-token\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.787901 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:31 crc kubenswrapper[4835]: E0319 09:26:31.788146 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:32.288135455 +0000 UTC m=+247.136734042 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.807149 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7wwdm" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.821058 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fzjw\" (UniqueName: \"kubernetes.io/projected/981b2eee-0205-4314-b31a-d01123049b7b-kube-api-access-4fzjw\") pod \"csi-hostpathplugin-hzjn2\" (UID: \"981b2eee-0205-4314-b31a-d01123049b7b\") " pod="hostpath-provisioner/csi-hostpathplugin-hzjn2" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.836828 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-hm9hx"] Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.864401 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nq"] Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.866906 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565206-wgvsr"] Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.874141 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sb8zm" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.888875 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:31 crc kubenswrapper[4835]: E0319 09:26:31.889493 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:32.389478055 +0000 UTC m=+247.238076642 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.904518 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565195-xtph2" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.919102 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-dzx2v" event={"ID":"d49c138a-fe33-4b41-991a-af15d6bf286d","Type":"ContainerStarted","Data":"b8b67896833176b12b3a193a64c2e1bff3f2670400a5abee2f08234028493c1a"} Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.919139 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-dzx2v" event={"ID":"d49c138a-fe33-4b41-991a-af15d6bf286d","Type":"ContainerStarted","Data":"3e607d054041fe18b2a4ac9729d27d59e16ad1172d8d31ff1b298f71799e99b8"} Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.921400 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-mljh4" event={"ID":"540880cb-fdb4-4672-81e3-60adfd584bdf","Type":"ContainerStarted","Data":"8a273fd388c013266f4b23547e3b59acc69f8b5bd2689bf42f5cfecb24529e33"} Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.922159 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m" event={"ID":"232fe661-d645-4591-864d-00acac127390","Type":"ContainerStarted","Data":"b83b320ff060236946cb31e90a75cd88758ebf99442b2f99e8463e379eba410f"} Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.923024 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-49lsz" event={"ID":"1573ddf0-1418-4b22-afd8-eab0c11e0fe0","Type":"ContainerStarted","Data":"5872390669eb44d02cb34869d5d7ca6a9da1008ea88dfa6f814388a711432921"} Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.923098 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-49lsz" 
event={"ID":"1573ddf0-1418-4b22-afd8-eab0c11e0fe0","Type":"ContainerStarted","Data":"8a20b238272411513657af135e6c94b3a61f6c8a9f69b7fa8012b05699dc76c1"} Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.923998 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q886g" event={"ID":"6d8b5f04-3b9c-4656-bf70-203d508949a0","Type":"ContainerStarted","Data":"31b2eac54a9cdb621f7c10b1a8438684e5d04ba8d67dc1149327c6f3e29bd15e"} Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.925305 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-pw4cg" event={"ID":"4e16135b-13e3-4236-9fa7-94ceb99131e2","Type":"ContainerStarted","Data":"bff38bc4b3814f5479afe7a78ca6a4a9b5d4dc9f3a4d0bd35be8b5299bb1c0c1"} Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.926329 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fbsvh" event={"ID":"9f443320-81b9-4f4e-a12b-62023d6074ef","Type":"ContainerStarted","Data":"c6bfa0350a2801562aa3eed9f00675b536f1eb5d08aad8fe311625c7f237ff98"} Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.926349 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fbsvh" event={"ID":"9f443320-81b9-4f4e-a12b-62023d6074ef","Type":"ContainerStarted","Data":"aa33d29c9b78b9fe2f4d8ddd7c9137356ea07124c65c813bc3913738cf564c40"} Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.927331 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-fbsvh" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.938824 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v55z2" 
event={"ID":"5fdabffe-d734-4b9a-8fea-d1608dcef7a2","Type":"ContainerStarted","Data":"2fda93c70fb5812978654953f19d932892b2e361ced8d28d837567f2afd97013"} Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.939825 4835 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-fbsvh container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.940283 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-fbsvh" podUID="9f443320-81b9-4f4e-a12b-62023d6074ef" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.942547 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" event={"ID":"0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f","Type":"ContainerStarted","Data":"48f9742be58b4921fea864b941eb6c6541d4a5448431d69bc42c70f151c0db07"} Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.947891 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" event={"ID":"d111a8cb-8053-4b32-a9d0-8325de3c057f","Type":"ContainerStarted","Data":"7bc97b3569d6412ac3275138bcc0c81b25039672811df1af1fe35da5b0bed00a"} Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.954890 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bg8tj" event={"ID":"49daa1c0-0b39-4ebf-9e81-38885e24bc4d","Type":"ContainerStarted","Data":"adcd3e0d0edaa0afd29ade0cc3b9a91edb4177d1e165f3ecf2a952cac987ed64"} Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.959593 
4835 generic.go:334] "Generic (PLEG): container finished" podID="31ed1214-265d-4179-b4f4-f77dedb20eb4" containerID="5ef08324d187d33f3d21ba5849c65dd822b685043db60a26288deb599cb88f87" exitCode=0 Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.959670 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" event={"ID":"31ed1214-265d-4179-b4f4-f77dedb20eb4","Type":"ContainerDied","Data":"5ef08324d187d33f3d21ba5849c65dd822b685043db60a26288deb599cb88f87"} Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.959700 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" event={"ID":"31ed1214-265d-4179-b4f4-f77dedb20eb4","Type":"ContainerStarted","Data":"eac4962d86631f820eb1a51eccff9197c5acc2c4eea02123b59af2f3daf62c59"} Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.965883 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-w457v" event={"ID":"00ca197e-7ecb-49f7-b477-f1e5ffaab8c1","Type":"ContainerStarted","Data":"70b98379f19f6f7a76ff82ea5f738e074044a766154e9dda34b9be226ea06bca"} Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.968869 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-djtzl" event={"ID":"7490a09e-a8be-4931-a282-38989ba640b3","Type":"ContainerStarted","Data":"b8708c7ef18e9cd9095e10195cbb1a442053e10631e321cc169d1b7760cf53f5"} Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.968904 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-djtzl" event={"ID":"7490a09e-a8be-4931-a282-38989ba640b3","Type":"ContainerStarted","Data":"f80220d579e09b8077ba53986898279955499f25dc3090f3abc01688efd7b131"} Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.970897 4835 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-cpfq7" event={"ID":"c03f3929-621d-4627-9d85-243b6cc02240","Type":"ContainerStarted","Data":"855c2c5fabf4e9f84d98d163a23b1c7a9f1e6f51207be0b2aa4f4a9972a69dd6"} Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.976039 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cl8p7"] Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.976796 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-bpqll" Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.982019 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-56v85"] Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.982360 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8sbc8" event={"ID":"8585eb29-2833-4cd2-900c-83cc7ddee5d1","Type":"ContainerStarted","Data":"d1e9e3dbbbdeadd5f40d2852d668d75366a1fd2e38aa40825fe2ff767400fb4f"} Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.982401 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8sbc8" event={"ID":"8585eb29-2833-4cd2-900c-83cc7ddee5d1","Type":"ContainerStarted","Data":"e4d3c638ec81782519527d87d1dddf2d799fc002f88b9380ab869a5e4d715d6f"} Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.983525 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrd5p" event={"ID":"b9eb63f6-7fbc-4f82-946b-8844751bb402","Type":"ContainerStarted","Data":"9c7fe26fa82dc7baacfacbbd80b64664a050d7312d7bfc673f6cc6582d76116c"} Mar 19 09:26:31 crc kubenswrapper[4835]: I0319 09:26:31.990725 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:31 crc kubenswrapper[4835]: E0319 09:26:31.992513 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:32.492476721 +0000 UTC m=+247.341075308 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:32 crc kubenswrapper[4835]: I0319 09:26:32.000015 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-vzxqb"] Mar 19 09:26:32 crc kubenswrapper[4835]: I0319 09:26:32.009202 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-cwlsr"] Mar 19 09:26:32 crc kubenswrapper[4835]: W0319 09:26:32.018163 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc232f12f_f539_42a1_9eec_965d59127ca0.slice/crio-bed873ab9dff6c3747ab5baa078ee2bd060ed41ec9634631a0fe89e5794d3672 WatchSource:0}: Error finding container bed873ab9dff6c3747ab5baa078ee2bd060ed41ec9634631a0fe89e5794d3672: Status 404 returned error can't find the container with id bed873ab9dff6c3747ab5baa078ee2bd060ed41ec9634631a0fe89e5794d3672 Mar 19 09:26:32 crc 
kubenswrapper[4835]: I0319 09:26:32.032050 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 09:26:32 crc kubenswrapper[4835]: I0319 09:26:32.086600 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qnlkj"] Mar 19 09:26:32 crc kubenswrapper[4835]: I0319 09:26:32.091456 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:32 crc kubenswrapper[4835]: E0319 09:26:32.091995 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:32.591674572 +0000 UTC m=+247.440273159 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:32 crc kubenswrapper[4835]: I0319 09:26:32.092692 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:32 crc kubenswrapper[4835]: E0319 09:26:32.093129 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:32.593117712 +0000 UTC m=+247.441716299 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:32 crc kubenswrapper[4835]: I0319 09:26:32.094767 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7"] Mar 19 09:26:32 crc kubenswrapper[4835]: I0319 09:26:32.096818 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565204-mwv2v"] Mar 19 09:26:32 crc kubenswrapper[4835]: I0319 09:26:32.121287 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-hzjn2" Mar 19 09:26:32 crc kubenswrapper[4835]: I0319 09:26:32.197128 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:32 crc kubenswrapper[4835]: E0319 09:26:32.200009 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:32.699961535 +0000 UTC m=+247.548560122 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:32 crc kubenswrapper[4835]: I0319 09:26:32.200195 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:32 crc kubenswrapper[4835]: E0319 09:26:32.200584 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:32.700576432 +0000 UTC m=+247.549175019 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:32 crc kubenswrapper[4835]: I0319 09:26:32.201932 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7mr2d"] Mar 19 09:26:32 crc kubenswrapper[4835]: I0319 09:26:32.300904 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:32 crc kubenswrapper[4835]: E0319 09:26:32.301864 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:32.801846221 +0000 UTC m=+247.650444808 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:32 crc kubenswrapper[4835]: I0319 09:26:32.347717 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9phkm"] Mar 19 09:26:32 crc kubenswrapper[4835]: I0319 09:26:32.361426 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-m6s8l"] Mar 19 09:26:32 crc kubenswrapper[4835]: I0319 09:26:32.403506 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:32 crc kubenswrapper[4835]: E0319 09:26:32.404170 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:32.904150008 +0000 UTC m=+247.752748685 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:32 crc kubenswrapper[4835]: I0319 09:26:32.416592 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-kj82v"] Mar 19 09:26:32 crc kubenswrapper[4835]: W0319 09:26:32.485670 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc32cf7e2_4523_41b3_b8a8_7564cca67f4a.slice/crio-7f826b21797f767ea24461f5577dab5a1f8e75b977689b0d8e294d4c8cc575e9 WatchSource:0}: Error finding container 7f826b21797f767ea24461f5577dab5a1f8e75b977689b0d8e294d4c8cc575e9: Status 404 returned error can't find the container with id 7f826b21797f767ea24461f5577dab5a1f8e75b977689b0d8e294d4c8cc575e9 Mar 19 09:26:32 crc kubenswrapper[4835]: I0319 09:26:32.506354 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:32 crc kubenswrapper[4835]: E0319 09:26:32.506620 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:33.006594519 +0000 UTC m=+247.855193156 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:32 crc kubenswrapper[4835]: I0319 09:26:32.517372 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gf5f4"] Mar 19 09:26:32 crc kubenswrapper[4835]: I0319 09:26:32.529089 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-4tpxz"] Mar 19 09:26:32 crc kubenswrapper[4835]: I0319 09:26:32.536649 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cvbbw"] Mar 19 09:26:32 crc kubenswrapper[4835]: I0319 09:26:32.563506 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-tl8b7"] Mar 19 09:26:32 crc kubenswrapper[4835]: I0319 09:26:32.597312 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-bpqll"] Mar 19 09:26:32 crc kubenswrapper[4835]: I0319 09:26:32.607971 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:32 crc kubenswrapper[4835]: E0319 09:26:32.608317 4835 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:33.108302269 +0000 UTC m=+247.956900856 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:32 crc kubenswrapper[4835]: I0319 09:26:32.681385 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7sdz9"] Mar 19 09:26:32 crc kubenswrapper[4835]: I0319 09:26:32.698113 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ntrvm"] Mar 19 09:26:32 crc kubenswrapper[4835]: W0319 09:26:32.700518 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b318ba6_6979_4547_ae35_0e8b29abf6e2.slice/crio-498a20dcfd2d0a0c98ff5c535fdf0bb1b89ab28c43c777219df1f9c8ca3a7557 WatchSource:0}: Error finding container 498a20dcfd2d0a0c98ff5c535fdf0bb1b89ab28c43c777219df1f9c8ca3a7557: Status 404 returned error can't find the container with id 498a20dcfd2d0a0c98ff5c535fdf0bb1b89ab28c43c777219df1f9c8ca3a7557 Mar 19 09:26:32 crc kubenswrapper[4835]: I0319 09:26:32.709332 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:32 crc kubenswrapper[4835]: E0319 09:26:32.709507 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:33.209482435 +0000 UTC m=+248.058081022 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:32 crc kubenswrapper[4835]: I0319 09:26:32.709580 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:32 crc kubenswrapper[4835]: E0319 09:26:32.709861 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:33.209849775 +0000 UTC m=+248.058448362 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:32 crc kubenswrapper[4835]: I0319 09:26:32.777017 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-sb8zm"] Mar 19 09:26:32 crc kubenswrapper[4835]: W0319 09:26:32.780472 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19327bb9_2207_4c03_a828_ed4d3b1df6bb.slice/crio-260310c1e498c3d21c5bebccbed6efbf2014208db4ea36b35396ac933d3d6560 WatchSource:0}: Error finding container 260310c1e498c3d21c5bebccbed6efbf2014208db4ea36b35396ac933d3d6560: Status 404 returned error can't find the container with id 260310c1e498c3d21c5bebccbed6efbf2014208db4ea36b35396ac933d3d6560 Mar 19 09:26:32 crc kubenswrapper[4835]: I0319 09:26:32.804858 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7wwdm"] Mar 19 09:26:32 crc kubenswrapper[4835]: I0319 09:26:32.811794 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:32 crc kubenswrapper[4835]: E0319 09:26:32.823479 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:33.323446687 +0000 UTC m=+248.172045274 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:32 crc kubenswrapper[4835]: I0319 09:26:32.824850 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:32 crc kubenswrapper[4835]: E0319 09:26:32.825347 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:33.325339329 +0000 UTC m=+248.173937916 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:32 crc kubenswrapper[4835]: I0319 09:26:32.928581 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:32 crc kubenswrapper[4835]: E0319 09:26:32.930101 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:33.430052523 +0000 UTC m=+248.278651110 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:32 crc kubenswrapper[4835]: I0319 09:26:32.934799 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565195-xtph2"] Mar 19 09:26:32 crc kubenswrapper[4835]: I0319 09:26:32.937641 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p9l9"] Mar 19 09:26:32 crc kubenswrapper[4835]: I0319 09:26:32.951013 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:32 crc kubenswrapper[4835]: E0319 09:26:32.951431 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:33.451419517 +0000 UTC m=+248.300018094 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:32 crc kubenswrapper[4835]: I0319 09:26:32.993801 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-hzjn2"] Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.038211 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hm9hx" event={"ID":"cc6bb0cd-bbc5-4464-86b7-41ce95891f07","Type":"ContainerStarted","Data":"5687c114a7e669698562b4d1a7f2fd64cb24f2a1e5ec1c99920557db3a9fbc94"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.038423 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hm9hx" event={"ID":"cc6bb0cd-bbc5-4464-86b7-41ce95891f07","Type":"ContainerStarted","Data":"de658cebad6ef79730b9027305f81e0144837454096c78f0a3aacec8c5193fd6"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.044060 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-cpfq7" event={"ID":"c03f3929-621d-4627-9d85-243b6cc02240","Type":"ContainerStarted","Data":"9e10d6c5543b67ee57690cf0194326d02ba327c03d264684efb2611af991f062"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.053896 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:33 crc kubenswrapper[4835]: E0319 09:26:33.054285 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:33.554260499 +0000 UTC m=+248.402859076 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.054708 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-w457v" event={"ID":"00ca197e-7ecb-49f7-b477-f1e5ffaab8c1","Type":"ContainerStarted","Data":"bf3954e13078b5379a5115ae4a42cb9eb09957e9607e59ab30c7f2157050ad79"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.058174 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:33 crc kubenswrapper[4835]: E0319 09:26:33.058611 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-19 09:26:33.558598511 +0000 UTC m=+248.407197088 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.082874 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7" event={"ID":"4cd6d8ad-9a41-4796-9684-2b6e09675bd9","Type":"ContainerStarted","Data":"e727cea677ded79327ba3e32f642a9ea78fa299c657b29426c5ee7f852148d47"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.082916 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7" event={"ID":"4cd6d8ad-9a41-4796-9684-2b6e09675bd9","Type":"ContainerStarted","Data":"8caf1038168e006b7ecb538ae805b8cab71ef075487c6b932862ae4661140b33"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.083575 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7" Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.084489 4835 generic.go:334] "Generic (PLEG): container finished" podID="b9eb63f6-7fbc-4f82-946b-8844751bb402" containerID="58d0d8c922003059258d7ddf2711a18c3eced8107a27ae13ebcc235998ae8542" exitCode=0 Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.084686 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrd5p" 
event={"ID":"b9eb63f6-7fbc-4f82-946b-8844751bb402","Type":"ContainerDied","Data":"58d0d8c922003059258d7ddf2711a18c3eced8107a27ae13ebcc235998ae8542"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.086706 4835 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-9qxb7 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.086843 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7" podUID="4cd6d8ad-9a41-4796-9684-2b6e09675bd9" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.087068 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9phkm" event={"ID":"c3bd5062-5030-4847-a7c8-0671269debfe","Type":"ContainerStarted","Data":"8116e86398341cea51f01a8ddcc2c5a0db34378d6fbd0a6c167d7181210f93f4"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.088436 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-56v85" event={"ID":"f2ccd766-0aac-4194-a7cd-447941e75a46","Type":"ContainerStarted","Data":"3eb26301c801aafe39fea8e9afb6d8ed092b463d45afb98702e7734a20ee854c"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.090436 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-bpqll" event={"ID":"16938ea3-e31b-4c22-9e31-ca2be501fa58","Type":"ContainerStarted","Data":"bf169f63f1f0208135bd6df979a89ec74f7cd147333aa560ee74f2c91ea16d29"} Mar 19 09:26:33 crc 
kubenswrapper[4835]: W0319 09:26:33.102831 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod981b2eee_0205_4314_b31a_d01123049b7b.slice/crio-2bf8f1c98da4ffbb048014dc4865045618920dd63e4dbee719d8139f4407de79 WatchSource:0}: Error finding container 2bf8f1c98da4ffbb048014dc4865045618920dd63e4dbee719d8139f4407de79: Status 404 returned error can't find the container with id 2bf8f1c98da4ffbb048014dc4865045618920dd63e4dbee719d8139f4407de79 Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.102975 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7wwdm" event={"ID":"39262985-f63c-4717-ba48-5c374565ce21","Type":"ContainerStarted","Data":"38df6aa11dac6b6dc0a438549979b0007e8d435c5c0bcf89b8666a8deba4b636"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.115663 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" event={"ID":"0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f","Type":"ContainerStarted","Data":"cea0cde6bb3738605e0cfad5c42f31c29bae177ecdce6e83d43a882ffd6dc8b3"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.116429 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.124674 4835 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6lp28 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body= Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.124767 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" podUID="0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f" 
containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.132842 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-dzx2v" event={"ID":"d49c138a-fe33-4b41-991a-af15d6bf286d","Type":"ContainerStarted","Data":"382ce093d7ac9f311c58dd82bd41a5dd4b51ed0439c15d8a533092b72ea6f199"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.140448 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cl8p7" event={"ID":"98617442-57ff-4996-b772-1639551ecc89","Type":"ContainerStarted","Data":"41f6fb9021ec6bc48f4f8092de83b9844ff821545851f81a9e0493bd5c0c666e"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.141275 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gf5f4" event={"ID":"bd152343-688c-493c-8cee-9d498a043182","Type":"ContainerStarted","Data":"74c7eedf2bb55a687c40dc65ccc159361ce61567cf74b22ea37dfd9c2117ab32"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.142438 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cwlsr" event={"ID":"884a59ba-e2aa-42a5-87ba-cff1f2c1acea","Type":"ContainerStarted","Data":"82af81e504c0ace5dbf9182e40edb9946662490ef25ddf9f3946c70fe25d4414"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.146110 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-pw4cg" event={"ID":"4e16135b-13e3-4236-9fa7-94ceb99131e2","Type":"ContainerStarted","Data":"5eb2664f5789e2a028039ebb44d5e2ef8138006718a87ba7c39cfcd3f7584f91"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.151513 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29565206-wgvsr" event={"ID":"c232f12f-f539-42a1-9eec-965d59127ca0","Type":"ContainerStarted","Data":"bed873ab9dff6c3747ab5baa078ee2bd060ed41ec9634631a0fe89e5794d3672"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.159049 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-7mr2d" event={"ID":"31d3620d-57fe-40e0-b079-f567671475e9","Type":"ContainerStarted","Data":"508eca16627e0f7331b26175a3b12c38d1bbcea61ee74ef42ec87e66b20ea55e"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.159163 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:33 crc kubenswrapper[4835]: E0319 09:26:33.159996 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:33.659973272 +0000 UTC m=+248.508571849 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.161407 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ntrvm" event={"ID":"19327bb9-2207-4c03-a828-ed4d3b1df6bb","Type":"ContainerStarted","Data":"260310c1e498c3d21c5bebccbed6efbf2014208db4ea36b35396ac933d3d6560"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.175761 4835 generic.go:334] "Generic (PLEG): container finished" podID="232fe661-d645-4591-864d-00acac127390" containerID="ca7b613665fc78a79e736a9fc4275137054832cd684b853db9a5129f797c02fc" exitCode=0 Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.176173 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m" event={"ID":"232fe661-d645-4591-864d-00acac127390","Type":"ContainerDied","Data":"ca7b613665fc78a79e736a9fc4275137054832cd684b853db9a5129f797c02fc"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.181695 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sb8zm" event={"ID":"52a39d64-5b48-4b1f-b754-cee9a2843333","Type":"ContainerStarted","Data":"c9df2d6e9fb5079cbc3b3463391000a3a954ab09363a8153955ff802ec5740cf"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.188777 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nq" 
event={"ID":"ee1c4052-bbfe-4c1d-82be-10ca3e2c0212","Type":"ContainerStarted","Data":"0266bf645c407b739ff73f96222f8f1fae79a18beea0eef69840e563f776c650"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.189102 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nq" event={"ID":"ee1c4052-bbfe-4c1d-82be-10ca3e2c0212","Type":"ContainerStarted","Data":"2f7a03124e14f9be963edb0e1d4718a19f78642fd93c944a818616910a8758ef"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.195318 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v55z2" event={"ID":"5fdabffe-d734-4b9a-8fea-d1608dcef7a2","Type":"ContainerStarted","Data":"848be71432d3a28a8024304cf38c6a8ccfe589fb039565cf0e219d028c9b304a"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.214274 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tl8b7" event={"ID":"7b318ba6-6979-4547-ae35-0e8b29abf6e2","Type":"ContainerStarted","Data":"498a20dcfd2d0a0c98ff5c535fdf0bb1b89ab28c43c777219df1f9c8ca3a7557"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.237045 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kj82v" event={"ID":"59da5ba1-62fc-4eec-aeed-45e9c9388496","Type":"ContainerStarted","Data":"9b4fa0f7dc1a6eb277c16d36c6af6519d37d826e3ba2628fbb82e0de29bf1ab2"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.237983 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-4tpxz" event={"ID":"abdeec51-5ca9-442c-9937-67dd8f50d88d","Type":"ContainerStarted","Data":"94d0659901015f0ea11f5d4cea6a81ae850d06a6ad16de312df29496234f674d"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.252075 4835 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-ingress/router-default-5444994796-mljh4" event={"ID":"540880cb-fdb4-4672-81e3-60adfd584bdf","Type":"ContainerStarted","Data":"22dcfa0277296c303069569d71e94534ce59e5a90251808424b76582f392c86c"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.256756 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-m6s8l" event={"ID":"c32cf7e2-4523-41b3-b8a8-7564cca67f4a","Type":"ContainerStarted","Data":"7f826b21797f767ea24461f5577dab5a1f8e75b977689b0d8e294d4c8cc575e9"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.260342 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:33 crc kubenswrapper[4835]: E0319 09:26:33.262719 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:33.762705141 +0000 UTC m=+248.611303728 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.265387 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-49lsz" event={"ID":"1573ddf0-1418-4b22-afd8-eab0c11e0fe0","Type":"ContainerStarted","Data":"e32f2bbce8e102f97566da54db2e3a3db0ffc0e730dccdf73b57bdd184a02770"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.281628 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7sdz9" event={"ID":"1e9d6532-5a6a-4c10-90ac-92bcef610d29","Type":"ContainerStarted","Data":"338c991fbd786e08f8ca13ec05f2f3669d3445184d487199af83b366b79660c3"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.297191 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" event={"ID":"d111a8cb-8053-4b32-a9d0-8325de3c057f","Type":"ContainerStarted","Data":"f6f89675ccc43e38f5190c1534c4c3262a798fc5848a20049d240a562d766eda"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.298041 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.298879 4835 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-svxz5 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.28:6443/healthz\": dial tcp 10.217.0.28:6443: connect: connection refused" 
start-of-body= Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.298905 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" podUID="d111a8cb-8053-4b32-a9d0-8325de3c057f" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.28:6443/healthz\": dial tcp 10.217.0.28:6443: connect: connection refused" Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.325303 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-vzxqb" event={"ID":"80ebd3c0-282e-427f-bedd-f06c26bce30a","Type":"ContainerStarted","Data":"b904b126438ca44c7a7c78a55e4384ddc61fe34e71c47d0a9d04a9360439961d"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.325721 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-vzxqb" Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.336162 4835 patch_prober.go:28] interesting pod/downloads-7954f5f757-vzxqb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.336213 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vzxqb" podUID="80ebd3c0-282e-427f-bedd-f06c26bce30a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.344392 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q886g" event={"ID":"6d8b5f04-3b9c-4656-bf70-203d508949a0","Type":"ContainerStarted","Data":"78e27448d4a70c4f185bc5cafd3d89988a69b92d4092b9eb05218d4062cf594e"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 
09:26:33.344691 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q886g" Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.361248 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:33 crc kubenswrapper[4835]: E0319 09:26:33.361683 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:33.861646364 +0000 UTC m=+248.710244961 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.361934 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.362793 4835 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-q886g 
container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.362870 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q886g" podUID="6d8b5f04-3b9c-4656-bf70-203d508949a0" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.364024 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bg8tj" event={"ID":"49daa1c0-0b39-4ebf-9e81-38885e24bc4d","Type":"ContainerStarted","Data":"f5d58db8922ca3e2ae8c693adb9c241bd8f77b73109c68c57b84fa0f1e18ab23"} Mar 19 09:26:33 crc kubenswrapper[4835]: E0319 09:26:33.364214 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:33.864194515 +0000 UTC m=+248.712793212 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.365513 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" podStartSLOduration=194.365495981 podStartE2EDuration="3m14.365495981s" podCreationTimestamp="2026-03-19 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:33.323953294 +0000 UTC m=+248.172551881" watchObservedRunningTime="2026-03-19 09:26:33.365495981 +0000 UTC m=+248.214094568" Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.381416 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cvbbw" event={"ID":"20ff7170-fb4a-4481-9f62-1e93e95329f5","Type":"ContainerStarted","Data":"283961e5c31f437d0b3574d4e4591eca0a2b05a9fd1c263329741ab3f4ec04f6"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.389782 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-m6s8l" podStartSLOduration=194.389765036 podStartE2EDuration="3m14.389765036s" podCreationTimestamp="2026-03-19 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:33.364868513 +0000 UTC m=+248.213467100" watchObservedRunningTime="2026-03-19 09:26:33.389765036 +0000 UTC 
m=+248.238363623" Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.404297 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qnlkj" event={"ID":"c2fb87b5-ea97-4d21-9129-e1cf0db01fea","Type":"ContainerStarted","Data":"885d92e7c4db67c9f439ef6214f3cec3535227f755166766189c05614c96a9ad"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.404337 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qnlkj" event={"ID":"c2fb87b5-ea97-4d21-9129-e1cf0db01fea","Type":"ContainerStarted","Data":"288e205259e2813619dfc736845bb352ccc6d4e98746d96e128e24462016796b"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.405154 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qnlkj" Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.406714 4835 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-qnlkj container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.406752 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qnlkj" podUID="c2fb87b5-ea97-4d21-9129-e1cf0db01fea" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.414389 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565204-mwv2v" 
event={"ID":"930d85cd-ba00-4c27-b728-dbdeaab91ca5","Type":"ContainerStarted","Data":"a56a19988313ca9fac28e6354facd17b84fe2af0d4c29afa2bc9c9830761332c"} Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.426185 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7" podStartSLOduration=194.42617301 podStartE2EDuration="3m14.42617301s" podCreationTimestamp="2026-03-19 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:33.42331079 +0000 UTC m=+248.271909377" watchObservedRunningTime="2026-03-19 09:26:33.42617301 +0000 UTC m=+248.274771587" Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.444360 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-fbsvh" Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.463348 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:33 crc kubenswrapper[4835]: E0319 09:26:33.467444 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:33.967422528 +0000 UTC m=+248.816021115 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.553848 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hm9hx" podStartSLOduration=194.553830693 podStartE2EDuration="3m14.553830693s" podCreationTimestamp="2026-03-19 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:33.511423541 +0000 UTC m=+248.360022128" watchObservedRunningTime="2026-03-19 09:26:33.553830693 +0000 UTC m=+248.402429280" Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.570486 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:33 crc kubenswrapper[4835]: E0319 09:26:33.573023 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:34.073003885 +0000 UTC m=+248.921602472 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.580314 4835 ???:1] "http: TLS handshake error from 192.168.126.11:35712: no serving certificate available for the kubelet" Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.593173 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-djtzl" podStartSLOduration=195.593155797 podStartE2EDuration="3m15.593155797s" podCreationTimestamp="2026-03-19 09:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:33.555847928 +0000 UTC m=+248.404446515" watchObservedRunningTime="2026-03-19 09:26:33.593155797 +0000 UTC m=+248.441754384" Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.629499 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-cpfq7" podStartSLOduration=194.629480088 podStartE2EDuration="3m14.629480088s" podCreationTimestamp="2026-03-19 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:33.590930165 +0000 UTC m=+248.439528742" watchObservedRunningTime="2026-03-19 09:26:33.629480088 +0000 UTC m=+248.478078675" Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.629968 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-admission-controller-857f4d67dd-dzx2v" podStartSLOduration=194.629962021 podStartE2EDuration="3m14.629962021s" podCreationTimestamp="2026-03-19 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:33.629041635 +0000 UTC m=+248.477640242" watchObservedRunningTime="2026-03-19 09:26:33.629962021 +0000 UTC m=+248.478560608" Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.673167 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:33 crc kubenswrapper[4835]: E0319 09:26:33.673770 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:34.173750769 +0000 UTC m=+249.022349356 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.676779 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-mljh4" podStartSLOduration=194.676764483 podStartE2EDuration="3m14.676764483s" podCreationTimestamp="2026-03-19 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:33.674055877 +0000 UTC m=+248.522654464" watchObservedRunningTime="2026-03-19 09:26:33.676764483 +0000 UTC m=+248.525363070" Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.700164 4835 ???:1] "http: TLS handshake error from 192.168.126.11:35718: no serving certificate available for the kubelet" Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.759885 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-w457v" podStartSLOduration=195.759857345 podStartE2EDuration="3m15.759857345s" podCreationTimestamp="2026-03-19 09:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:33.735176158 +0000 UTC m=+248.583774745" watchObservedRunningTime="2026-03-19 09:26:33.759857345 +0000 UTC m=+248.608455952" Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.777485 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:33 crc kubenswrapper[4835]: E0319 09:26:33.777792 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:34.277780244 +0000 UTC m=+249.126378831 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.791372 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v55z2" podStartSLOduration=194.791357262 podStartE2EDuration="3m14.791357262s" podCreationTimestamp="2026-03-19 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:33.791275139 +0000 UTC m=+248.639873726" watchObservedRunningTime="2026-03-19 09:26:33.791357262 +0000 UTC m=+248.639955849" Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.875075 4835 ???:1] "http: TLS handshake error from 192.168.126.11:35728: no serving certificate available for the kubelet" Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.875651 4835 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-8sbc8" podStartSLOduration=5.875639007 podStartE2EDuration="5.875639007s" podCreationTimestamp="2026-03-19 09:26:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:33.874671541 +0000 UTC m=+248.723270128" watchObservedRunningTime="2026-03-19 09:26:33.875639007 +0000 UTC m=+248.724237594" Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.882153 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:33 crc kubenswrapper[4835]: E0319 09:26:33.882476 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:34.382463028 +0000 UTC m=+249.231061605 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.936982 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-fbsvh" podStartSLOduration=194.936968714 podStartE2EDuration="3m14.936968714s" podCreationTimestamp="2026-03-19 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:33.934577787 +0000 UTC m=+248.783176374" watchObservedRunningTime="2026-03-19 09:26:33.936968714 +0000 UTC m=+248.785567301" Mar 19 09:26:33 crc kubenswrapper[4835]: I0319 09:26:33.989580 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:33 crc kubenswrapper[4835]: E0319 09:26:33.989982 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:34.489966869 +0000 UTC m=+249.338565456 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.010946 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nq" podStartSLOduration=195.010928763 podStartE2EDuration="3m15.010928763s" podCreationTimestamp="2026-03-19 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:33.971057242 +0000 UTC m=+248.819655829" watchObservedRunningTime="2026-03-19 09:26:34.010928763 +0000 UTC m=+248.859527340" Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.057685 4835 ???:1] "http: TLS handshake error from 192.168.126.11:35744: no serving certificate available for the kubelet" Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.059001 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-vzxqb" podStartSLOduration=195.05898466 podStartE2EDuration="3m15.05898466s" podCreationTimestamp="2026-03-19 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:34.058051604 +0000 UTC m=+248.906650191" watchObservedRunningTime="2026-03-19 09:26:34.05898466 +0000 UTC m=+248.907583247" Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.091299 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:34 crc kubenswrapper[4835]: E0319 09:26:34.091723 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:34.591706291 +0000 UTC m=+249.440304878 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.105172 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qnlkj" podStartSLOduration=195.105155754 podStartE2EDuration="3m15.105155754s" podCreationTimestamp="2026-03-19 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:34.103129549 +0000 UTC m=+248.951728136" watchObservedRunningTime="2026-03-19 09:26:34.105155754 +0000 UTC m=+248.953754341" Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.176999 4835 ???:1] "http: TLS handshake error from 192.168.126.11:35756: no serving certificate available for the kubelet" Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.191651 4835 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" podStartSLOduration=196.191630041 podStartE2EDuration="3m16.191630041s" podCreationTimestamp="2026-03-19 09:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:34.160570677 +0000 UTC m=+249.009169264" watchObservedRunningTime="2026-03-19 09:26:34.191630041 +0000 UTC m=+249.040228628" Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.193059 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:34 crc kubenswrapper[4835]: E0319 09:26:34.193354 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:34.693344879 +0000 UTC m=+249.541943466 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.208927 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-mljh4" Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.218172 4835 patch_prober.go:28] interesting pod/router-default-5444994796-mljh4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 09:26:34 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Mar 19 09:26:34 crc kubenswrapper[4835]: [+]process-running ok Mar 19 09:26:34 crc kubenswrapper[4835]: healthz check failed Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.218216 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mljh4" podUID="540880cb-fdb4-4672-81e3-60adfd584bdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.229613 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q886g" podStartSLOduration=195.229594948 podStartE2EDuration="3m15.229594948s" podCreationTimestamp="2026-03-19 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:34.192215697 +0000 UTC m=+249.040814284" 
watchObservedRunningTime="2026-03-19 09:26:34.229594948 +0000 UTC m=+249.078193545" Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.230880 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bg8tj" podStartSLOduration=196.230873183 podStartE2EDuration="3m16.230873183s" podCreationTimestamp="2026-03-19 09:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:34.229252569 +0000 UTC m=+249.077851156" watchObservedRunningTime="2026-03-19 09:26:34.230873183 +0000 UTC m=+249.079471770" Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.258854 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-49lsz" podStartSLOduration=196.258837771 podStartE2EDuration="3m16.258837771s" podCreationTimestamp="2026-03-19 09:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:34.257259058 +0000 UTC m=+249.105857645" watchObservedRunningTime="2026-03-19 09:26:34.258837771 +0000 UTC m=+249.107436358" Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.291070 4835 ???:1] "http: TLS handshake error from 192.168.126.11:35758: no serving certificate available for the kubelet" Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.294498 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:34 crc kubenswrapper[4835]: E0319 09:26:34.294839 4835 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:34.794816983 +0000 UTC m=+249.643415570 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.397091 4835 ???:1] "http: TLS handshake error from 192.168.126.11:35768: no serving certificate available for the kubelet" Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.398253 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:34 crc kubenswrapper[4835]: E0319 09:26:34.398549 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:34.898537829 +0000 UTC m=+249.747136416 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.511010 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:34 crc kubenswrapper[4835]: E0319 09:26:34.511495 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:35.011477022 +0000 UTC m=+249.860075609 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.527399 4835 ???:1] "http: TLS handshake error from 192.168.126.11:35770: no serving certificate available for the kubelet" Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.551335 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-7mr2d" event={"ID":"31d3620d-57fe-40e0-b079-f567671475e9","Type":"ContainerStarted","Data":"99b3be1ae76c3fdbb719cc3e146882dcc2e9960a82cddc80093eb80c5546c87a"} Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.586174 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kj82v" event={"ID":"59da5ba1-62fc-4eec-aeed-45e9c9388496","Type":"ContainerStarted","Data":"219e98fe8cb023b04c9c82fde22147842ded36c9750b4b67ac8d5043512f0b4b"} Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.591271 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-7mr2d" podStartSLOduration=195.591245972 podStartE2EDuration="3m15.591245972s" podCreationTimestamp="2026-03-19 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:34.581300425 +0000 UTC m=+249.429899002" watchObservedRunningTime="2026-03-19 09:26:34.591245972 +0000 UTC m=+249.439844559" Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.613657 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:34 crc kubenswrapper[4835]: E0319 09:26:34.614349 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:35.114338214 +0000 UTC m=+249.962936801 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.630064 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sb8zm" event={"ID":"52a39d64-5b48-4b1f-b754-cee9a2843333","Type":"ContainerStarted","Data":"da3f7e9dc32a1050b4fcc724a48accf7a89bfcb48e54dc442edfd31cf29efafa"} Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.650087 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-4tpxz" event={"ID":"abdeec51-5ca9-442c-9937-67dd8f50d88d","Type":"ContainerStarted","Data":"8fdb826a4067ff1d6a6e33d4a4477be92713e2992ee5b89df4d524cc35f35842"} Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.650151 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console-operator/console-operator-58897d9998-4tpxz" Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.656084 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9phkm" event={"ID":"c3bd5062-5030-4847-a7c8-0671269debfe","Type":"ContainerStarted","Data":"017986f15bdd7c692c2b5fed074b1a22d0f4465441a5d565e797e9183f4ac558"} Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.663590 4835 patch_prober.go:28] interesting pod/console-operator-58897d9998-4tpxz container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.663652 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-4tpxz" podUID="abdeec51-5ca9-442c-9937-67dd8f50d88d" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.669155 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p9l9" event={"ID":"04d6a038-d68f-4f5a-ad4d-95cd686d1d24","Type":"ContainerStarted","Data":"519a33ea3671ef1c26714d1fd50cc076434963ba3b4b09dd167397c7c6b8c936"} Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.669198 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p9l9" event={"ID":"04d6a038-d68f-4f5a-ad4d-95cd686d1d24","Type":"ContainerStarted","Data":"798c97e9d339ea12c9500c887d9a02641735c3af4f9e19aa07cbe9dbdea54d84"} Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.676238 4835 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-4tpxz" podStartSLOduration=195.676222706 podStartE2EDuration="3m15.676222706s" podCreationTimestamp="2026-03-19 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:34.675360873 +0000 UTC m=+249.523959460" watchObservedRunningTime="2026-03-19 09:26:34.676222706 +0000 UTC m=+249.524821293" Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.699468 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-vzxqb" event={"ID":"80ebd3c0-282e-427f-bedd-f06c26bce30a","Type":"ContainerStarted","Data":"a03b5a7255f37de987779822f4de08254d91c2c021ae79fc3aed53494805c065"} Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.700100 4835 patch_prober.go:28] interesting pod/downloads-7954f5f757-vzxqb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.700140 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vzxqb" podUID="80ebd3c0-282e-427f-bedd-f06c26bce30a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.712344 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9phkm" podStartSLOduration=195.712328681 podStartE2EDuration="3m15.712328681s" podCreationTimestamp="2026-03-19 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-19 09:26:34.706062157 +0000 UTC m=+249.554660744" watchObservedRunningTime="2026-03-19 09:26:34.712328681 +0000 UTC m=+249.560927268" Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.718407 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:34 crc kubenswrapper[4835]: E0319 09:26:34.719634 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:35.219606484 +0000 UTC m=+250.068205071 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.729480 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tl8b7" event={"ID":"7b318ba6-6979-4547-ae35-0e8b29abf6e2","Type":"ContainerStarted","Data":"50c46214d5ae4c792a64f63b53aa608d5a14239b1bfbce4e214b67355517da76"} Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.761084 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7sdz9" 
event={"ID":"1e9d6532-5a6a-4c10-90ac-92bcef610d29","Type":"ContainerStarted","Data":"d53d0eeddc52c78fdc2e3585cc37312c0b570515a96bbd84d50495caff3c437a"} Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.763588 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-7sdz9" Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.773228 4835 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-7sdz9 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.773287 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-7sdz9" podUID="1e9d6532-5a6a-4c10-90ac-92bcef610d29" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.777852 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" event={"ID":"31ed1214-265d-4179-b4f4-f77dedb20eb4","Type":"ContainerStarted","Data":"046156f6c5d286c504e89b60d7e156b3861b63af7b8c87a9b9f3d6ba5ee8f115"} Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.787113 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-m6s8l" event={"ID":"c32cf7e2-4523-41b3-b8a8-7564cca67f4a","Type":"ContainerStarted","Data":"db9ce92eda68a698fdbda1c2f7a43623e4d0b29073bc429ae428d987a0d2c014"} Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.815823 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bg8tj" 
event={"ID":"49daa1c0-0b39-4ebf-9e81-38885e24bc4d","Type":"ContainerStarted","Data":"38873aff2deaaa1d9f72b8f964dd1c06eecd09b98c515a17cbb0e4f0a0b7e890"} Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.820462 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.821405 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cwlsr" event={"ID":"884a59ba-e2aa-42a5-87ba-cff1f2c1acea","Type":"ContainerStarted","Data":"352f4511033780e9a0473e9e4348d674662ade6066ee6b776069a6159776c273"} Mar 19 09:26:34 crc kubenswrapper[4835]: E0319 09:26:34.821932 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:35.321920361 +0000 UTC m=+250.170518948 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.842446 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ntrvm" event={"ID":"19327bb9-2207-4c03-a828-ed4d3b1df6bb","Type":"ContainerStarted","Data":"06275b5c98c116dda75beadecfd6281d1bdff36b9b56e3630d4f948703ea2da1"} Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.850979 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-7sdz9" podStartSLOduration=195.85096086 podStartE2EDuration="3m15.85096086s" podCreationTimestamp="2026-03-19 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:34.796752811 +0000 UTC m=+249.645351398" watchObservedRunningTime="2026-03-19 09:26:34.85096086 +0000 UTC m=+249.699559447" Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.855564 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-bpqll" event={"ID":"16938ea3-e31b-4c22-9e31-ca2be501fa58","Type":"ContainerStarted","Data":"be94a8f1f29b60b9cd9e20baa1dd37dcb0fb62077a8e5940c54f86741aa6790d"} Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.886133 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cwlsr" podStartSLOduration=195.886115417 
podStartE2EDuration="3m15.886115417s" podCreationTimestamp="2026-03-19 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:34.846180786 +0000 UTC m=+249.694779373" watchObservedRunningTime="2026-03-19 09:26:34.886115417 +0000 UTC m=+249.734714004" Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.887794 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ntrvm" podStartSLOduration=195.887781264 podStartE2EDuration="3m15.887781264s" podCreationTimestamp="2026-03-19 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:34.883970878 +0000 UTC m=+249.732569465" watchObservedRunningTime="2026-03-19 09:26:34.887781264 +0000 UTC m=+249.736379861" Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.891047 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-pw4cg" event={"ID":"4e16135b-13e3-4236-9fa7-94ceb99131e2","Type":"ContainerStarted","Data":"4a77adf474f1f8547d41197ea510bd19a363fd8c271bbcb2be03065f8e6b452c"} Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.916349 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cvbbw" event={"ID":"20ff7170-fb4a-4481-9f62-1e93e95329f5","Type":"ContainerStarted","Data":"723e17a91fcea512b93a3882a7e0972b4e9df0faafb0edf54ad4c6c65950fecb"} Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.921329 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cl8p7" 
event={"ID":"98617442-57ff-4996-b772-1639551ecc89","Type":"ContainerStarted","Data":"3d004d9cc3c1e56e7c930c17e5f44419bae46a77c90ebe68b0f0d2b1eb9d671a"} Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.921526 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:34 crc kubenswrapper[4835]: E0319 09:26:34.922696 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:35.422680675 +0000 UTC m=+250.271279262 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.946553 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gf5f4" event={"ID":"bd152343-688c-493c-8cee-9d498a043182","Type":"ContainerStarted","Data":"4a5a39bf252fe4895aa8149e593fd07548c8e3cb39bbc86f3912b4447335f0bc"} Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.959954 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-56v85" 
event={"ID":"f2ccd766-0aac-4194-a7cd-447941e75a46","Type":"ContainerStarted","Data":"86b9dbd913d0a422182ef55a224cbd10aae9873e2468e54ac18f3f32912f605d"} Mar 19 09:26:34 crc kubenswrapper[4835]: I0319 09:26:34.988074 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7wwdm" event={"ID":"39262985-f63c-4717-ba48-5c374565ce21","Type":"ContainerStarted","Data":"ca124af6489f05d12733105c89490f2d66e4e13d14681912e02a26b7e99c1769"} Mar 19 09:26:35 crc kubenswrapper[4835]: I0319 09:26:35.010759 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hzjn2" event={"ID":"981b2eee-0205-4314-b31a-d01123049b7b","Type":"ContainerStarted","Data":"2bf8f1c98da4ffbb048014dc4865045618920dd63e4dbee719d8139f4407de79"} Mar 19 09:26:35 crc kubenswrapper[4835]: I0319 09:26:35.012376 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-pw4cg" podStartSLOduration=196.01235275 podStartE2EDuration="3m16.01235275s" podCreationTimestamp="2026-03-19 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:34.967724468 +0000 UTC m=+249.816323055" watchObservedRunningTime="2026-03-19 09:26:35.01235275 +0000 UTC m=+249.860951337" Mar 19 09:26:35 crc kubenswrapper[4835]: I0319 09:26:35.026862 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:35 crc kubenswrapper[4835]: E0319 09:26:35.027437 4835 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:35.52742232 +0000 UTC m=+250.376020907 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:35 crc kubenswrapper[4835]: I0319 09:26:35.027848 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565195-xtph2" event={"ID":"23206b12-5e7f-4ce8-aa17-7eac028bd9fe","Type":"ContainerStarted","Data":"1dea12df495f57d14942b518f9ebba5090492ec871638a1d3188f7382ae3ed8b"} Mar 19 09:26:35 crc kubenswrapper[4835]: I0319 09:26:35.027884 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565195-xtph2" event={"ID":"23206b12-5e7f-4ce8-aa17-7eac028bd9fe","Type":"ContainerStarted","Data":"368b11ff3a430780c4c4acb2ff088fa7f6b524bcd681e77db95b74c684a1ff56"} Mar 19 09:26:35 crc kubenswrapper[4835]: I0319 09:26:35.039334 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" Mar 19 09:26:35 crc kubenswrapper[4835]: I0319 09:26:35.039536 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q886g" Mar 19 09:26:35 crc kubenswrapper[4835]: I0319 09:26:35.041656 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:26:35 crc kubenswrapper[4835]: I0319 09:26:35.062875 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qnlkj" Mar 19 09:26:35 crc kubenswrapper[4835]: I0319 09:26:35.078297 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-56v85" podStartSLOduration=196.078279106 podStartE2EDuration="3m16.078279106s" podCreationTimestamp="2026-03-19 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:35.013914184 +0000 UTC m=+249.862512761" watchObservedRunningTime="2026-03-19 09:26:35.078279106 +0000 UTC m=+249.926877693" Mar 19 09:26:35 crc kubenswrapper[4835]: I0319 09:26:35.106345 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gf5f4" podStartSLOduration=196.106325696 podStartE2EDuration="3m16.106325696s" podCreationTimestamp="2026-03-19 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:35.079045567 +0000 UTC m=+249.927644154" watchObservedRunningTime="2026-03-19 09:26:35.106325696 +0000 UTC m=+249.954924283" Mar 19 09:26:35 crc kubenswrapper[4835]: I0319 09:26:35.107285 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cvbbw" podStartSLOduration=196.107279942 podStartE2EDuration="3m16.107279942s" podCreationTimestamp="2026-03-19 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-19 09:26:35.101704167 +0000 UTC m=+249.950302754" watchObservedRunningTime="2026-03-19 09:26:35.107279942 +0000 UTC m=+249.955878529" Mar 19 09:26:35 crc kubenswrapper[4835]: I0319 09:26:35.130506 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:35 crc kubenswrapper[4835]: E0319 09:26:35.132988 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:35.632971097 +0000 UTC m=+250.481569684 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:35 crc kubenswrapper[4835]: I0319 09:26:35.140549 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7" Mar 19 09:26:35 crc kubenswrapper[4835]: I0319 09:26:35.180987 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29565195-xtph2" podStartSLOduration=197.180966622 podStartE2EDuration="3m17.180966622s" podCreationTimestamp="2026-03-19 09:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:35.132812503 +0000 UTC m=+249.981411090" watchObservedRunningTime="2026-03-19 09:26:35.180966622 +0000 UTC m=+250.029565209" Mar 19 09:26:35 crc kubenswrapper[4835]: I0319 09:26:35.208852 4835 patch_prober.go:28] interesting pod/router-default-5444994796-mljh4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 09:26:35 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Mar 19 09:26:35 crc kubenswrapper[4835]: [+]process-running ok Mar 19 09:26:35 crc kubenswrapper[4835]: healthz check failed Mar 19 09:26:35 crc kubenswrapper[4835]: I0319 09:26:35.208901 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mljh4" podUID="540880cb-fdb4-4672-81e3-60adfd584bdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 09:26:35 crc kubenswrapper[4835]: I0319 09:26:35.234966 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:35 crc kubenswrapper[4835]: E0319 09:26:35.235237 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:35.735224213 +0000 UTC m=+250.583822800 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:35 crc kubenswrapper[4835]: I0319 09:26:35.237105 4835 ???:1] "http: TLS handshake error from 192.168.126.11:35780: no serving certificate available for the kubelet" Mar 19 09:26:35 crc kubenswrapper[4835]: I0319 09:26:35.335842 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:35 crc kubenswrapper[4835]: E0319 09:26:35.336146 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:35.836114861 +0000 UTC m=+250.684713448 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:35 crc kubenswrapper[4835]: I0319 09:26:35.336796 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:35 crc kubenswrapper[4835]: E0319 09:26:35.337446 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:35.837433087 +0000 UTC m=+250.686031674 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:35 crc kubenswrapper[4835]: I0319 09:26:35.351909 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-7wwdm" podStartSLOduration=7.351889409 podStartE2EDuration="7.351889409s" podCreationTimestamp="2026-03-19 09:26:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:35.349232845 +0000 UTC m=+250.197831452" watchObservedRunningTime="2026-03-19 09:26:35.351889409 +0000 UTC m=+250.200487996" Mar 19 09:26:35 crc kubenswrapper[4835]: I0319 09:26:35.441256 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:35 crc kubenswrapper[4835]: E0319 09:26:35.441574 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:35.941560484 +0000 UTC m=+250.790159071 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:35 crc kubenswrapper[4835]: I0319 09:26:35.543326 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:35 crc kubenswrapper[4835]: E0319 09:26:35.543603 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:36.043592134 +0000 UTC m=+250.892190721 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:35 crc kubenswrapper[4835]: I0319 09:26:35.648205 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:35 crc kubenswrapper[4835]: E0319 09:26:35.648544 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:36.148511724 +0000 UTC m=+250.997110311 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:35 crc kubenswrapper[4835]: I0319 09:26:35.648670 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:35 crc kubenswrapper[4835]: E0319 09:26:35.648994 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:36.148982587 +0000 UTC m=+250.997581174 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:35 crc kubenswrapper[4835]: I0319 09:26:35.750494 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:35 crc kubenswrapper[4835]: E0319 09:26:35.750803 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:36.25077799 +0000 UTC m=+251.099376577 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:35 crc kubenswrapper[4835]: I0319 09:26:35.852208 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:35 crc kubenswrapper[4835]: E0319 09:26:35.852777 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:36.352766458 +0000 UTC m=+251.201365045 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:35 crc kubenswrapper[4835]: I0319 09:26:35.953421 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:35 crc kubenswrapper[4835]: E0319 09:26:35.953927 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:36.453912413 +0000 UTC m=+251.302511000 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.039706 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tl8b7" event={"ID":"7b318ba6-6979-4547-ae35-0e8b29abf6e2","Type":"ContainerStarted","Data":"d46fe096c1b64e0ccf004275ce4ea6d6b733dccad7523a2dcc3c77bf8a16e03e"} Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.044422 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrd5p" event={"ID":"b9eb63f6-7fbc-4f82-946b-8844751bb402","Type":"ContainerStarted","Data":"db6e8d3cea2b9b2c674821f1f9a16ccc01f1fc43347a9c815341347802094b82"} Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.044448 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrd5p" Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.046016 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hzjn2" event={"ID":"981b2eee-0205-4314-b31a-d01123049b7b","Type":"ContainerStarted","Data":"1ba840d59a37ad9e444335de80931f5e6167adf045431960b802ace18b39e414"} Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.054575 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:36 crc kubenswrapper[4835]: E0319 09:26:36.054864 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:36.554854122 +0000 UTC m=+251.403452709 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.065996 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p9l9" event={"ID":"04d6a038-d68f-4f5a-ad4d-95cd686d1d24","Type":"ContainerStarted","Data":"7c1a31926de6cf19e222bf12d48f0ab8ea1f4bf0174828fcd640c0b99b66d16f"} Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.066665 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p9l9" Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.076809 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cl8p7" event={"ID":"98617442-57ff-4996-b772-1639551ecc89","Type":"ContainerStarted","Data":"aab5ecc60d0ddbb4d09792c278d2ab26bb4ba9b747eaae3da00b1896bcdef5c4"} Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.077427 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-dns/dns-default-cl8p7" Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.088366 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" event={"ID":"31ed1214-265d-4179-b4f4-f77dedb20eb4","Type":"ContainerStarted","Data":"e87142baae5d2330a0f1b22767c4d290f01cf36254499a852930a6c0a804a9a1"} Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.105770 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-bpqll" event={"ID":"16938ea3-e31b-4c22-9e31-ca2be501fa58","Type":"ContainerStarted","Data":"370de97e92d236e8abadfd5d7c85d50a29924bdac5ab44de6546edfb857ce298"} Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.118835 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kj82v" event={"ID":"59da5ba1-62fc-4eec-aeed-45e9c9388496","Type":"ContainerStarted","Data":"33a32172379b5b426073496f41ec8faaafdd442c25d44c07855d8c5f85fd290b"} Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.135079 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m" event={"ID":"232fe661-d645-4591-864d-00acac127390","Type":"ContainerStarted","Data":"59b04f24376e5febe8bd880d4428ff9ca3438acf1c5c1fe8733fb71c57402591"} Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.144001 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sb8zm" event={"ID":"52a39d64-5b48-4b1f-b754-cee9a2843333","Type":"ContainerStarted","Data":"79e686e4feb6204653205ac41953fea270a6100a61294bed752a71f68791d893"} Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.155700 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-cwlsr" 
event={"ID":"884a59ba-e2aa-42a5-87ba-cff1f2c1acea","Type":"ContainerStarted","Data":"648a264ed0e101a1e56fcabc965353239e679e9bd56e7908a48252948dffd354"} Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.156219 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:36 crc kubenswrapper[4835]: E0319 09:26:36.157156 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:36.657130738 +0000 UTC m=+251.505729325 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.157704 4835 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-7sdz9 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.157773 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-7sdz9" podUID="1e9d6532-5a6a-4c10-90ac-92bcef610d29" 
containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.158151 4835 patch_prober.go:28] interesting pod/downloads-7954f5f757-vzxqb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.158197 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vzxqb" podUID="80ebd3c0-282e-427f-bedd-f06c26bce30a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.160170 4835 patch_prober.go:28] interesting pod/console-operator-58897d9998-4tpxz container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.160216 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-4tpxz" podUID="abdeec51-5ca9-442c-9937-67dd8f50d88d" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.164610 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tl8b7" podStartSLOduration=197.164590745 podStartE2EDuration="3m17.164590745s" podCreationTimestamp="2026-03-19 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:36.079304022 +0000 UTC m=+250.927902619" watchObservedRunningTime="2026-03-19 09:26:36.164590745 +0000 UTC m=+251.013189332" Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.165860 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-cl8p7" podStartSLOduration=8.165852731 podStartE2EDuration="8.165852731s" podCreationTimestamp="2026-03-19 09:26:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:36.154245048 +0000 UTC m=+251.002843635" watchObservedRunningTime="2026-03-19 09:26:36.165852731 +0000 UTC m=+251.014451318" Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.211949 4835 patch_prober.go:28] interesting pod/router-default-5444994796-mljh4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 09:26:36 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Mar 19 09:26:36 crc kubenswrapper[4835]: [+]process-running ok Mar 19 09:26:36 crc kubenswrapper[4835]: healthz check failed Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.212317 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mljh4" podUID="540880cb-fdb4-4672-81e3-60adfd584bdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.218357 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrd5p" podStartSLOduration=198.218341882 podStartE2EDuration="3m18.218341882s" podCreationTimestamp="2026-03-19 09:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:36.216184672 +0000 UTC m=+251.064783269" watchObservedRunningTime="2026-03-19 09:26:36.218341882 +0000 UTC m=+251.066940469" Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.258677 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:36 crc kubenswrapper[4835]: E0319 09:26:36.263763 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:36.763734405 +0000 UTC m=+251.612332992 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.286674 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p9l9" podStartSLOduration=197.286652023 podStartE2EDuration="3m17.286652023s" podCreationTimestamp="2026-03-19 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:36.285163172 +0000 UTC m=+251.133761759" watchObservedRunningTime="2026-03-19 09:26:36.286652023 +0000 UTC m=+251.135250610" Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.360607 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:36 crc kubenswrapper[4835]: E0319 09:26:36.361039 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:36.861023673 +0000 UTC m=+251.709622250 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.365216 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m" podStartSLOduration=197.365201048 podStartE2EDuration="3m17.365201048s" podCreationTimestamp="2026-03-19 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:36.365172998 +0000 UTC m=+251.213771595" watchObservedRunningTime="2026-03-19 09:26:36.365201048 +0000 UTC m=+251.213799635" Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.425183 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.425235 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.462058 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:36 crc kubenswrapper[4835]: E0319 09:26:36.462436 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:36.962418004 +0000 UTC m=+251.811016591 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.514288 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" podStartSLOduration=198.514261766 podStartE2EDuration="3m18.514261766s" podCreationTimestamp="2026-03-19 09:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:36.514230926 +0000 UTC m=+251.362829513" watchObservedRunningTime="2026-03-19 09:26:36.514261766 +0000 UTC m=+251.362860353" Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.515167 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kj82v" podStartSLOduration=197.515162482 podStartE2EDuration="3m17.515162482s" podCreationTimestamp="2026-03-19 09:23:19 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:36.486090982 +0000 UTC m=+251.334689569" watchObservedRunningTime="2026-03-19 09:26:36.515162482 +0000 UTC m=+251.363761069" Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.563364 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:36 crc kubenswrapper[4835]: E0319 09:26:36.563580 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:37.063550968 +0000 UTC m=+251.912149555 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.563708 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:36 crc kubenswrapper[4835]: E0319 09:26:36.564096 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:37.064068723 +0000 UTC m=+251.912667310 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.636169 4835 ???:1] "http: TLS handshake error from 192.168.126.11:35794: no serving certificate available for the kubelet" Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.640449 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-bpqll" podStartSLOduration=197.640432697 podStartE2EDuration="3m17.640432697s" podCreationTimestamp="2026-03-19 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:36.56934152 +0000 UTC m=+251.417940137" watchObservedRunningTime="2026-03-19 09:26:36.640432697 +0000 UTC m=+251.489031284" Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.642147 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-sb8zm" podStartSLOduration=197.642141835 podStartE2EDuration="3m17.642141835s" podCreationTimestamp="2026-03-19 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:36.639365228 +0000 UTC m=+251.487963815" watchObservedRunningTime="2026-03-19 09:26:36.642141835 +0000 UTC m=+251.490740422" Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.664473 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:36 crc kubenswrapper[4835]: E0319 09:26:36.664885 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:37.164866758 +0000 UTC m=+252.013465345 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.690412 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zgq9n"] Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.697689 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zgq9n" Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.701635 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.704594 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zgq9n"] Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.768775 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae7f5216-917f-4f78-925b-3d53dc945bdd-catalog-content\") pod \"certified-operators-zgq9n\" (UID: \"ae7f5216-917f-4f78-925b-3d53dc945bdd\") " pod="openshift-marketplace/certified-operators-zgq9n" Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.769002 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae7f5216-917f-4f78-925b-3d53dc945bdd-utilities\") pod \"certified-operators-zgq9n\" (UID: \"ae7f5216-917f-4f78-925b-3d53dc945bdd\") " pod="openshift-marketplace/certified-operators-zgq9n" Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.769102 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.769170 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmpts\" (UniqueName: 
\"kubernetes.io/projected/ae7f5216-917f-4f78-925b-3d53dc945bdd-kube-api-access-wmpts\") pod \"certified-operators-zgq9n\" (UID: \"ae7f5216-917f-4f78-925b-3d53dc945bdd\") " pod="openshift-marketplace/certified-operators-zgq9n" Mar 19 09:26:36 crc kubenswrapper[4835]: E0319 09:26:36.769547 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:37.2695322 +0000 UTC m=+252.118130787 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.875433 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.875803 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae7f5216-917f-4f78-925b-3d53dc945bdd-catalog-content\") pod \"certified-operators-zgq9n\" (UID: \"ae7f5216-917f-4f78-925b-3d53dc945bdd\") " pod="openshift-marketplace/certified-operators-zgq9n" Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.875861 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/ae7f5216-917f-4f78-925b-3d53dc945bdd-utilities\") pod \"certified-operators-zgq9n\" (UID: \"ae7f5216-917f-4f78-925b-3d53dc945bdd\") " pod="openshift-marketplace/certified-operators-zgq9n" Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.875917 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmpts\" (UniqueName: \"kubernetes.io/projected/ae7f5216-917f-4f78-925b-3d53dc945bdd-kube-api-access-wmpts\") pod \"certified-operators-zgq9n\" (UID: \"ae7f5216-917f-4f78-925b-3d53dc945bdd\") " pod="openshift-marketplace/certified-operators-zgq9n" Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.876562 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae7f5216-917f-4f78-925b-3d53dc945bdd-utilities\") pod \"certified-operators-zgq9n\" (UID: \"ae7f5216-917f-4f78-925b-3d53dc945bdd\") " pod="openshift-marketplace/certified-operators-zgq9n" Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.876571 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae7f5216-917f-4f78-925b-3d53dc945bdd-catalog-content\") pod \"certified-operators-zgq9n\" (UID: \"ae7f5216-917f-4f78-925b-3d53dc945bdd\") " pod="openshift-marketplace/certified-operators-zgq9n" Mar 19 09:26:36 crc kubenswrapper[4835]: E0319 09:26:36.876628 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:37.376616171 +0000 UTC m=+252.225214758 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.877714 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ds4dj"] Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.878645 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ds4dj" Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.881550 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.908404 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ds4dj"] Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.914109 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.920220 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmpts\" (UniqueName: \"kubernetes.io/projected/ae7f5216-917f-4f78-925b-3d53dc945bdd-kube-api-access-wmpts\") pod \"certified-operators-zgq9n\" (UID: \"ae7f5216-917f-4f78-925b-3d53dc945bdd\") " pod="openshift-marketplace/certified-operators-zgq9n" Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.980406 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/f1a86135-77ed-4bdf-874d-0b141bff59bb-catalog-content\") pod \"community-operators-ds4dj\" (UID: \"f1a86135-77ed-4bdf-874d-0b141bff59bb\") " pod="openshift-marketplace/community-operators-ds4dj" Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.980623 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26jz8\" (UniqueName: \"kubernetes.io/projected/f1a86135-77ed-4bdf-874d-0b141bff59bb-kube-api-access-26jz8\") pod \"community-operators-ds4dj\" (UID: \"f1a86135-77ed-4bdf-874d-0b141bff59bb\") " pod="openshift-marketplace/community-operators-ds4dj" Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.980798 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1a86135-77ed-4bdf-874d-0b141bff59bb-utilities\") pod \"community-operators-ds4dj\" (UID: \"f1a86135-77ed-4bdf-874d-0b141bff59bb\") " pod="openshift-marketplace/community-operators-ds4dj" Mar 19 09:26:36 crc kubenswrapper[4835]: I0319 09:26:36.981165 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:36 crc kubenswrapper[4835]: E0319 09:26:36.981684 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:37.481666884 +0000 UTC m=+252.330265581 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.046236 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zgq9n" Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.055431 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tclt5"] Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.056321 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tclt5" Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.082299 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.082611 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1a86135-77ed-4bdf-874d-0b141bff59bb-catalog-content\") pod \"community-operators-ds4dj\" (UID: \"f1a86135-77ed-4bdf-874d-0b141bff59bb\") " pod="openshift-marketplace/community-operators-ds4dj" Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.082707 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26jz8\" (UniqueName: 
\"kubernetes.io/projected/f1a86135-77ed-4bdf-874d-0b141bff59bb-kube-api-access-26jz8\") pod \"community-operators-ds4dj\" (UID: \"f1a86135-77ed-4bdf-874d-0b141bff59bb\") " pod="openshift-marketplace/community-operators-ds4dj" Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.082774 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1a86135-77ed-4bdf-874d-0b141bff59bb-utilities\") pod \"community-operators-ds4dj\" (UID: \"f1a86135-77ed-4bdf-874d-0b141bff59bb\") " pod="openshift-marketplace/community-operators-ds4dj" Mar 19 09:26:37 crc kubenswrapper[4835]: E0319 09:26:37.083182 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:37.583163398 +0000 UTC m=+252.431761985 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.084520 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1a86135-77ed-4bdf-874d-0b141bff59bb-catalog-content\") pod \"community-operators-ds4dj\" (UID: \"f1a86135-77ed-4bdf-874d-0b141bff59bb\") " pod="openshift-marketplace/community-operators-ds4dj" Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.084669 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1a86135-77ed-4bdf-874d-0b141bff59bb-utilities\") pod \"community-operators-ds4dj\" (UID: \"f1a86135-77ed-4bdf-874d-0b141bff59bb\") " pod="openshift-marketplace/community-operators-ds4dj" Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.089030 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tclt5"] Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.125798 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26jz8\" (UniqueName: \"kubernetes.io/projected/f1a86135-77ed-4bdf-874d-0b141bff59bb-kube-api-access-26jz8\") pod \"community-operators-ds4dj\" (UID: \"f1a86135-77ed-4bdf-874d-0b141bff59bb\") " pod="openshift-marketplace/community-operators-ds4dj" Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.183848 4835 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-7sdz9 container/marketplace-operator namespace/openshift-marketplace: 
Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.184073 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-7sdz9" podUID="1e9d6532-5a6a-4c10-90ac-92bcef610d29" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.199159 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb7424ec-f638-4894-8c3d-466b81f98c8a-catalog-content\") pod \"certified-operators-tclt5\" (UID: \"bb7424ec-f638-4894-8c3d-466b81f98c8a\") " pod="openshift-marketplace/certified-operators-tclt5" Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.199394 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.199444 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlpmv\" (UniqueName: \"kubernetes.io/projected/bb7424ec-f638-4894-8c3d-466b81f98c8a-kube-api-access-nlpmv\") pod \"certified-operators-tclt5\" (UID: \"bb7424ec-f638-4894-8c3d-466b81f98c8a\") " pod="openshift-marketplace/certified-operators-tclt5" Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.199483 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb7424ec-f638-4894-8c3d-466b81f98c8a-utilities\") pod \"certified-operators-tclt5\" (UID: \"bb7424ec-f638-4894-8c3d-466b81f98c8a\") " pod="openshift-marketplace/certified-operators-tclt5" Mar 19 09:26:37 crc kubenswrapper[4835]: E0319 09:26:37.200087 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:37.700072262 +0000 UTC m=+252.548670849 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.211888 4835 patch_prober.go:28] interesting pod/router-default-5444994796-mljh4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 09:26:37 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Mar 19 09:26:37 crc kubenswrapper[4835]: [+]process-running ok Mar 19 09:26:37 crc kubenswrapper[4835]: healthz check failed Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.211946 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mljh4" podUID="540880cb-fdb4-4672-81e3-60adfd584bdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.226930 4835 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-ds4dj" Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.302177 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.302958 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlpmv\" (UniqueName: \"kubernetes.io/projected/bb7424ec-f638-4894-8c3d-466b81f98c8a-kube-api-access-nlpmv\") pod \"certified-operators-tclt5\" (UID: \"bb7424ec-f638-4894-8c3d-466b81f98c8a\") " pod="openshift-marketplace/certified-operators-tclt5" Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.303022 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb7424ec-f638-4894-8c3d-466b81f98c8a-utilities\") pod \"certified-operators-tclt5\" (UID: \"bb7424ec-f638-4894-8c3d-466b81f98c8a\") " pod="openshift-marketplace/certified-operators-tclt5" Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.303142 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb7424ec-f638-4894-8c3d-466b81f98c8a-catalog-content\") pod \"certified-operators-tclt5\" (UID: \"bb7424ec-f638-4894-8c3d-466b81f98c8a\") " pod="openshift-marketplace/certified-operators-tclt5" Mar 19 09:26:37 crc kubenswrapper[4835]: E0319 09:26:37.306151 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-19 09:26:37.806132673 +0000 UTC m=+252.654731260 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.319714 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb7424ec-f638-4894-8c3d-466b81f98c8a-catalog-content\") pod \"certified-operators-tclt5\" (UID: \"bb7424ec-f638-4894-8c3d-466b81f98c8a\") " pod="openshift-marketplace/certified-operators-tclt5" Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.320904 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d6l6z"] Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.321471 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb7424ec-f638-4894-8c3d-466b81f98c8a-utilities\") pod \"certified-operators-tclt5\" (UID: \"bb7424ec-f638-4894-8c3d-466b81f98c8a\") " pod="openshift-marketplace/certified-operators-tclt5" Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.329480 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d6l6z" Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.344575 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlpmv\" (UniqueName: \"kubernetes.io/projected/bb7424ec-f638-4894-8c3d-466b81f98c8a-kube-api-access-nlpmv\") pod \"certified-operators-tclt5\" (UID: \"bb7424ec-f638-4894-8c3d-466b81f98c8a\") " pod="openshift-marketplace/certified-operators-tclt5" Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.358778 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d6l6z"] Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.409379 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.409436 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t4xf\" (UniqueName: \"kubernetes.io/projected/db372943-349a-417e-af5e-588162580a5a-kube-api-access-8t4xf\") pod \"community-operators-d6l6z\" (UID: \"db372943-349a-417e-af5e-588162580a5a\") " pod="openshift-marketplace/community-operators-d6l6z" Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.409467 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db372943-349a-417e-af5e-588162580a5a-catalog-content\") pod \"community-operators-d6l6z\" (UID: \"db372943-349a-417e-af5e-588162580a5a\") " pod="openshift-marketplace/community-operators-d6l6z" Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 
09:26:37.409508 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db372943-349a-417e-af5e-588162580a5a-utilities\") pod \"community-operators-d6l6z\" (UID: \"db372943-349a-417e-af5e-588162580a5a\") " pod="openshift-marketplace/community-operators-d6l6z" Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.419205 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tclt5" Mar 19 09:26:37 crc kubenswrapper[4835]: E0319 09:26:37.424577 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:37.924541628 +0000 UTC m=+252.773140215 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.519307 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.519528 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t4xf\" (UniqueName: 
\"kubernetes.io/projected/db372943-349a-417e-af5e-588162580a5a-kube-api-access-8t4xf\") pod \"community-operators-d6l6z\" (UID: \"db372943-349a-417e-af5e-588162580a5a\") " pod="openshift-marketplace/community-operators-d6l6z" Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.519560 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db372943-349a-417e-af5e-588162580a5a-catalog-content\") pod \"community-operators-d6l6z\" (UID: \"db372943-349a-417e-af5e-588162580a5a\") " pod="openshift-marketplace/community-operators-d6l6z" Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.519576 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db372943-349a-417e-af5e-588162580a5a-utilities\") pod \"community-operators-d6l6z\" (UID: \"db372943-349a-417e-af5e-588162580a5a\") " pod="openshift-marketplace/community-operators-d6l6z" Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.519982 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db372943-349a-417e-af5e-588162580a5a-utilities\") pod \"community-operators-d6l6z\" (UID: \"db372943-349a-417e-af5e-588162580a5a\") " pod="openshift-marketplace/community-operators-d6l6z" Mar 19 09:26:37 crc kubenswrapper[4835]: E0319 09:26:37.520049 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:38.020033835 +0000 UTC m=+252.868632422 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.520450 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db372943-349a-417e-af5e-588162580a5a-catalog-content\") pod \"community-operators-d6l6z\" (UID: \"db372943-349a-417e-af5e-588162580a5a\") " pod="openshift-marketplace/community-operators-d6l6z" Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.529427 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fbsvh"] Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.529612 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-fbsvh" podUID="9f443320-81b9-4f4e-a12b-62023d6074ef" containerName="controller-manager" containerID="cri-o://c6bfa0350a2801562aa3eed9f00675b536f1eb5d08aad8fe311625c7f237ff98" gracePeriod=30 Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.548407 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t4xf\" (UniqueName: \"kubernetes.io/projected/db372943-349a-417e-af5e-588162580a5a-kube-api-access-8t4xf\") pod \"community-operators-d6l6z\" (UID: \"db372943-349a-417e-af5e-588162580a5a\") " pod="openshift-marketplace/community-operators-d6l6z" Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.582969 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qnlkj"] Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.583153 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qnlkj" podUID="c2fb87b5-ea97-4d21-9129-e1cf0db01fea" containerName="route-controller-manager" containerID="cri-o://885d92e7c4db67c9f439ef6214f3cec3535227f755166766189c05614c96a9ad" gracePeriod=30 Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.622963 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:37 crc kubenswrapper[4835]: E0319 09:26:37.623283 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:38.123270849 +0000 UTC m=+252.971869436 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.724635 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d6l6z" Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.725590 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:37 crc kubenswrapper[4835]: E0319 09:26:37.725978 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:38.225964197 +0000 UTC m=+253.074562784 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.766479 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zgq9n"] Mar 19 09:26:37 crc kubenswrapper[4835]: E0319 09:26:37.791491 4835 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2fb87b5_ea97_4d21_9129_e1cf0db01fea.slice/crio-conmon-885d92e7c4db67c9f439ef6214f3cec3535227f755166766189c05614c96a9ad.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f443320_81b9_4f4e_a12b_62023d6074ef.slice/crio-c6bfa0350a2801562aa3eed9f00675b536f1eb5d08aad8fe311625c7f237ff98.scope\": RecentStats: unable to find data in memory cache]" Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.830036 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:37 crc kubenswrapper[4835]: E0319 09:26:37.830358 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:38.33031378 +0000 UTC m=+253.178912367 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:37 crc kubenswrapper[4835]: I0319 09:26:37.934267 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:37 crc kubenswrapper[4835]: E0319 09:26:37.934600 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:38.434586022 +0000 UTC m=+253.283184609 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.035776 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:38 crc kubenswrapper[4835]: E0319 09:26:38.036045 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:38.536033776 +0000 UTC m=+253.384632353 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.071411 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tclt5"] Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.137316 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:38 crc kubenswrapper[4835]: E0319 09:26:38.137473 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:38.637448238 +0000 UTC m=+253.486046825 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.137586 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:38 crc kubenswrapper[4835]: E0319 09:26:38.137914 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:38.63790257 +0000 UTC m=+253.486501147 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.183086 4835 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-vrd5p container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.183145 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrd5p" podUID="b9eb63f6-7fbc-4f82-946b-8844751bb402" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.214025 4835 patch_prober.go:28] interesting pod/router-default-5444994796-mljh4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 09:26:38 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Mar 19 09:26:38 crc kubenswrapper[4835]: [+]process-running ok Mar 19 09:26:38 crc kubenswrapper[4835]: healthz check failed Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.214068 4835 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-mljh4" podUID="540880cb-fdb4-4672-81e3-60adfd584bdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.218032 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hzjn2" event={"ID":"981b2eee-0205-4314-b31a-d01123049b7b","Type":"ContainerStarted","Data":"4ffc554d9a3ca18719c73038b4a42cd779f2730f0890c5e08f518fd070d0c499"} Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.226043 4835 generic.go:334] "Generic (PLEG): container finished" podID="9f443320-81b9-4f4e-a12b-62023d6074ef" containerID="c6bfa0350a2801562aa3eed9f00675b536f1eb5d08aad8fe311625c7f237ff98" exitCode=0 Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.226195 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fbsvh" event={"ID":"9f443320-81b9-4f4e-a12b-62023d6074ef","Type":"ContainerDied","Data":"c6bfa0350a2801562aa3eed9f00675b536f1eb5d08aad8fe311625c7f237ff98"} Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.238090 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgq9n" event={"ID":"ae7f5216-917f-4f78-925b-3d53dc945bdd","Type":"ContainerStarted","Data":"93cf176b7be290f4be4cc3d398c7a1e06281fd5637a121c6e052025a999f497a"} Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.240832 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:38 crc kubenswrapper[4835]: E0319 09:26:38.240994 4835 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:38.740969259 +0000 UTC m=+253.589567846 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.241098 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:38 crc kubenswrapper[4835]: E0319 09:26:38.241399 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:38.74138747 +0000 UTC m=+253.589986047 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.242146 4835 generic.go:334] "Generic (PLEG): container finished" podID="c2fb87b5-ea97-4d21-9129-e1cf0db01fea" containerID="885d92e7c4db67c9f439ef6214f3cec3535227f755166766189c05614c96a9ad" exitCode=0 Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.242207 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qnlkj" event={"ID":"c2fb87b5-ea97-4d21-9129-e1cf0db01fea","Type":"ContainerDied","Data":"885d92e7c4db67c9f439ef6214f3cec3535227f755166766189c05614c96a9ad"} Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.245834 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tclt5" event={"ID":"bb7424ec-f638-4894-8c3d-466b81f98c8a","Type":"ContainerStarted","Data":"2b18800640c597eb7ebd2eb578df404e062dd7c5e476a37bd9824fb1ab124317"} Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.316535 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ds4dj"] Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.344519 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:38 crc kubenswrapper[4835]: 
E0319 09:26:38.345488 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:38.845463876 +0000 UTC m=+253.694062463 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.348637 4835 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.349637 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:38 crc kubenswrapper[4835]: E0319 09:26:38.350409 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:38.850396814 +0000 UTC m=+253.698995401 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.388438 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fbsvh" Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.389402 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qnlkj" Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.399236 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d6l6z"] Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.450209 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2fb87b5-ea97-4d21-9129-e1cf0db01fea-client-ca\") pod \"c2fb87b5-ea97-4d21-9129-e1cf0db01fea\" (UID: \"c2fb87b5-ea97-4d21-9129-e1cf0db01fea\") " Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.450258 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f443320-81b9-4f4e-a12b-62023d6074ef-client-ca\") pod \"9f443320-81b9-4f4e-a12b-62023d6074ef\" (UID: \"9f443320-81b9-4f4e-a12b-62023d6074ef\") " Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.450299 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/9f443320-81b9-4f4e-a12b-62023d6074ef-proxy-ca-bundles\") pod \"9f443320-81b9-4f4e-a12b-62023d6074ef\" (UID: \"9f443320-81b9-4f4e-a12b-62023d6074ef\") " Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.450319 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c54b4\" (UniqueName: \"kubernetes.io/projected/c2fb87b5-ea97-4d21-9129-e1cf0db01fea-kube-api-access-c54b4\") pod \"c2fb87b5-ea97-4d21-9129-e1cf0db01fea\" (UID: \"c2fb87b5-ea97-4d21-9129-e1cf0db01fea\") " Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.450338 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f443320-81b9-4f4e-a12b-62023d6074ef-serving-cert\") pod \"9f443320-81b9-4f4e-a12b-62023d6074ef\" (UID: \"9f443320-81b9-4f4e-a12b-62023d6074ef\") " Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.450415 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.450448 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2fb87b5-ea97-4d21-9129-e1cf0db01fea-serving-cert\") pod \"c2fb87b5-ea97-4d21-9129-e1cf0db01fea\" (UID: \"c2fb87b5-ea97-4d21-9129-e1cf0db01fea\") " Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.450469 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2fb87b5-ea97-4d21-9129-e1cf0db01fea-config\") pod \"c2fb87b5-ea97-4d21-9129-e1cf0db01fea\" (UID: \"c2fb87b5-ea97-4d21-9129-e1cf0db01fea\") " Mar 19 09:26:38 crc 
kubenswrapper[4835]: I0319 09:26:38.450496 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f443320-81b9-4f4e-a12b-62023d6074ef-config\") pod \"9f443320-81b9-4f4e-a12b-62023d6074ef\" (UID: \"9f443320-81b9-4f4e-a12b-62023d6074ef\") " Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.450513 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hvpt\" (UniqueName: \"kubernetes.io/projected/9f443320-81b9-4f4e-a12b-62023d6074ef-kube-api-access-8hvpt\") pod \"9f443320-81b9-4f4e-a12b-62023d6074ef\" (UID: \"9f443320-81b9-4f4e-a12b-62023d6074ef\") " Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.451140 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2fb87b5-ea97-4d21-9129-e1cf0db01fea-client-ca" (OuterVolumeSpecName: "client-ca") pod "c2fb87b5-ea97-4d21-9129-e1cf0db01fea" (UID: "c2fb87b5-ea97-4d21-9129-e1cf0db01fea"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.451284 4835 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2fb87b5-ea97-4d21-9129-e1cf0db01fea-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 09:26:38 crc kubenswrapper[4835]: E0319 09:26:38.451348 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:38.951333133 +0000 UTC m=+253.799931720 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.452593 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f443320-81b9-4f4e-a12b-62023d6074ef-client-ca" (OuterVolumeSpecName: "client-ca") pod "9f443320-81b9-4f4e-a12b-62023d6074ef" (UID: "9f443320-81b9-4f4e-a12b-62023d6074ef"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.452711 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f443320-81b9-4f4e-a12b-62023d6074ef-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9f443320-81b9-4f4e-a12b-62023d6074ef" (UID: "9f443320-81b9-4f4e-a12b-62023d6074ef"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.453031 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f443320-81b9-4f4e-a12b-62023d6074ef-config" (OuterVolumeSpecName: "config") pod "9f443320-81b9-4f4e-a12b-62023d6074ef" (UID: "9f443320-81b9-4f4e-a12b-62023d6074ef"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.453500 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2fb87b5-ea97-4d21-9129-e1cf0db01fea-config" (OuterVolumeSpecName: "config") pod "c2fb87b5-ea97-4d21-9129-e1cf0db01fea" (UID: "c2fb87b5-ea97-4d21-9129-e1cf0db01fea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.456324 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2fb87b5-ea97-4d21-9129-e1cf0db01fea-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c2fb87b5-ea97-4d21-9129-e1cf0db01fea" (UID: "c2fb87b5-ea97-4d21-9129-e1cf0db01fea"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.456664 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f443320-81b9-4f4e-a12b-62023d6074ef-kube-api-access-8hvpt" (OuterVolumeSpecName: "kube-api-access-8hvpt") pod "9f443320-81b9-4f4e-a12b-62023d6074ef" (UID: "9f443320-81b9-4f4e-a12b-62023d6074ef"). InnerVolumeSpecName "kube-api-access-8hvpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.457146 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2fb87b5-ea97-4d21-9129-e1cf0db01fea-kube-api-access-c54b4" (OuterVolumeSpecName: "kube-api-access-c54b4") pod "c2fb87b5-ea97-4d21-9129-e1cf0db01fea" (UID: "c2fb87b5-ea97-4d21-9129-e1cf0db01fea"). InnerVolumeSpecName "kube-api-access-c54b4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.458509 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f443320-81b9-4f4e-a12b-62023d6074ef-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9f443320-81b9-4f4e-a12b-62023d6074ef" (UID: "9f443320-81b9-4f4e-a12b-62023d6074ef"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.552633 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.553014 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2fb87b5-ea97-4d21-9129-e1cf0db01fea-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.553032 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f443320-81b9-4f4e-a12b-62023d6074ef-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.553041 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hvpt\" (UniqueName: \"kubernetes.io/projected/9f443320-81b9-4f4e-a12b-62023d6074ef-kube-api-access-8hvpt\") on node \"crc\" DevicePath \"\"" Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.553050 4835 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f443320-81b9-4f4e-a12b-62023d6074ef-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 09:26:38 
crc kubenswrapper[4835]: I0319 09:26:38.553059 4835 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f443320-81b9-4f4e-a12b-62023d6074ef-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.553067 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c54b4\" (UniqueName: \"kubernetes.io/projected/c2fb87b5-ea97-4d21-9129-e1cf0db01fea-kube-api-access-c54b4\") on node \"crc\" DevicePath \"\"" Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.553075 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f443320-81b9-4f4e-a12b-62023d6074ef-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.553084 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2fb87b5-ea97-4d21-9129-e1cf0db01fea-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:26:38 crc kubenswrapper[4835]: E0319 09:26:38.553371 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:39.053355652 +0000 UTC m=+253.901954239 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.654255 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:38 crc kubenswrapper[4835]: E0319 09:26:38.654641 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:39.15462266 +0000 UTC m=+254.003221247 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.756033 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:38 crc kubenswrapper[4835]: E0319 09:26:38.756409 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:39.256394442 +0000 UTC m=+254.104993039 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.857295 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:38 crc kubenswrapper[4835]: E0319 09:26:38.857410 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 09:26:39.357383683 +0000 UTC m=+254.205982270 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.857880 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:38 crc kubenswrapper[4835]: E0319 09:26:38.858157 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 09:26:39.358145134 +0000 UTC m=+254.206743721 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-46rgq" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.901989 4835 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-19T09:26:38.348655395Z","Handler":null,"Name":""} Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.912284 4835 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.912315 4835 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.959575 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 09:26:38 crc kubenswrapper[4835]: I0319 09:26:38.963095 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: 
"8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.046094 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hk7rv"] Mar 19 09:26:39 crc kubenswrapper[4835]: E0319 09:26:39.046331 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2fb87b5-ea97-4d21-9129-e1cf0db01fea" containerName="route-controller-manager" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.046347 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2fb87b5-ea97-4d21-9129-e1cf0db01fea" containerName="route-controller-manager" Mar 19 09:26:39 crc kubenswrapper[4835]: E0319 09:26:39.046360 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f443320-81b9-4f4e-a12b-62023d6074ef" containerName="controller-manager" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.046367 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f443320-81b9-4f4e-a12b-62023d6074ef" containerName="controller-manager" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.046498 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f443320-81b9-4f4e-a12b-62023d6074ef" containerName="controller-manager" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.046520 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2fb87b5-ea97-4d21-9129-e1cf0db01fea" containerName="route-controller-manager" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.047444 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hk7rv" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.050306 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.071703 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.087811 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hk7rv"] Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.093027 4835 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.093143 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.116491 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-46rgq\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.175088 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d5a1f24-78ad-4c43-bf96-47e2e23c1996-utilities\") pod \"redhat-marketplace-hk7rv\" (UID: \"2d5a1f24-78ad-4c43-bf96-47e2e23c1996\") " pod="openshift-marketplace/redhat-marketplace-hk7rv" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.175153 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d5a1f24-78ad-4c43-bf96-47e2e23c1996-catalog-content\") pod \"redhat-marketplace-hk7rv\" (UID: \"2d5a1f24-78ad-4c43-bf96-47e2e23c1996\") " pod="openshift-marketplace/redhat-marketplace-hk7rv" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.175317 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt5lg\" (UniqueName: \"kubernetes.io/projected/2d5a1f24-78ad-4c43-bf96-47e2e23c1996-kube-api-access-qt5lg\") pod \"redhat-marketplace-hk7rv\" (UID: \"2d5a1f24-78ad-4c43-bf96-47e2e23c1996\") " pod="openshift-marketplace/redhat-marketplace-hk7rv" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.208527 4835 patch_prober.go:28] interesting pod/router-default-5444994796-mljh4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 09:26:39 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Mar 19 09:26:39 crc kubenswrapper[4835]: [+]process-running ok Mar 19 09:26:39 crc kubenswrapper[4835]: healthz check failed Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.208595 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mljh4" podUID="540880cb-fdb4-4672-81e3-60adfd584bdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.225368 4835 ???:1] "http: TLS handshake error from 192.168.126.11:51324: no serving certificate available for the kubelet" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.256828 4835 generic.go:334] "Generic (PLEG): container finished" podID="bb7424ec-f638-4894-8c3d-466b81f98c8a" containerID="9cadd727e4e1d4801688ff2ebd58315bfaeb2df9d6dbcbe91b106e6b73afe69d" exitCode=0 Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.256894 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tclt5" event={"ID":"bb7424ec-f638-4894-8c3d-466b81f98c8a","Type":"ContainerDied","Data":"9cadd727e4e1d4801688ff2ebd58315bfaeb2df9d6dbcbe91b106e6b73afe69d"} Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.261290 4835 
generic.go:334] "Generic (PLEG): container finished" podID="db372943-349a-417e-af5e-588162580a5a" containerID="0c5f73276b1f6430b25a4fce44a5f5d2b3abf19484db5dc9548c5e22dad82390" exitCode=0 Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.261373 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d6l6z" event={"ID":"db372943-349a-417e-af5e-588162580a5a","Type":"ContainerDied","Data":"0c5f73276b1f6430b25a4fce44a5f5d2b3abf19484db5dc9548c5e22dad82390"} Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.261402 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d6l6z" event={"ID":"db372943-349a-417e-af5e-588162580a5a","Type":"ContainerStarted","Data":"fe74eeba78b0abbd2785ee43305725167126b71d1216e5ee863e1bc0f24acc3e"} Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.263604 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hzjn2" event={"ID":"981b2eee-0205-4314-b31a-d01123049b7b","Type":"ContainerStarted","Data":"7b7184f3684969072dcdbfbb3ebe09a0852090d7b1dc87dffb7274efc6a58466"} Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.263636 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hzjn2" event={"ID":"981b2eee-0205-4314-b31a-d01123049b7b","Type":"ContainerStarted","Data":"3ad53a288f682ad1cf1b2b0aca9a45ca86119006c42e0864406b896014cd3e40"} Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.264434 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.266818 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fbsvh" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.267613 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fbsvh" event={"ID":"9f443320-81b9-4f4e-a12b-62023d6074ef","Type":"ContainerDied","Data":"aa33d29c9b78b9fe2f4d8ddd7c9137356ea07124c65c813bc3913738cf564c40"} Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.267645 4835 scope.go:117] "RemoveContainer" containerID="c6bfa0350a2801562aa3eed9f00675b536f1eb5d08aad8fe311625c7f237ff98" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.268692 4835 generic.go:334] "Generic (PLEG): container finished" podID="ae7f5216-917f-4f78-925b-3d53dc945bdd" containerID="da325e9c3ddb1c3e7d70dd909c01aafd3df1986d74e9ca5c1005e705df45b63f" exitCode=0 Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.268724 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgq9n" event={"ID":"ae7f5216-917f-4f78-925b-3d53dc945bdd","Type":"ContainerDied","Data":"da325e9c3ddb1c3e7d70dd909c01aafd3df1986d74e9ca5c1005e705df45b63f"} Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.270843 4835 generic.go:334] "Generic (PLEG): container finished" podID="f1a86135-77ed-4bdf-874d-0b141bff59bb" containerID="8155e80695967bdf797fb61c232c6326be152d9698a173cfc5bf4c4398d1ef1f" exitCode=0 Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.270880 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ds4dj" event={"ID":"f1a86135-77ed-4bdf-874d-0b141bff59bb","Type":"ContainerDied","Data":"8155e80695967bdf797fb61c232c6326be152d9698a173cfc5bf4c4398d1ef1f"} Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.270894 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ds4dj" 
event={"ID":"f1a86135-77ed-4bdf-874d-0b141bff59bb","Type":"ContainerStarted","Data":"5093aa0ef2691d4d4e59acaf51178b54ec7ada7a910e3482902627d95f0ece6a"} Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.277507 4835 generic.go:334] "Generic (PLEG): container finished" podID="23206b12-5e7f-4ce8-aa17-7eac028bd9fe" containerID="1dea12df495f57d14942b518f9ebba5090492ec871638a1d3188f7382ae3ed8b" exitCode=0 Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.277639 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565195-xtph2" event={"ID":"23206b12-5e7f-4ce8-aa17-7eac028bd9fe","Type":"ContainerDied","Data":"1dea12df495f57d14942b518f9ebba5090492ec871638a1d3188f7382ae3ed8b"} Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.278083 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d5a1f24-78ad-4c43-bf96-47e2e23c1996-utilities\") pod \"redhat-marketplace-hk7rv\" (UID: \"2d5a1f24-78ad-4c43-bf96-47e2e23c1996\") " pod="openshift-marketplace/redhat-marketplace-hk7rv" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.278149 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d5a1f24-78ad-4c43-bf96-47e2e23c1996-catalog-content\") pod \"redhat-marketplace-hk7rv\" (UID: \"2d5a1f24-78ad-4c43-bf96-47e2e23c1996\") " pod="openshift-marketplace/redhat-marketplace-hk7rv" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.278205 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt5lg\" (UniqueName: \"kubernetes.io/projected/2d5a1f24-78ad-4c43-bf96-47e2e23c1996-kube-api-access-qt5lg\") pod \"redhat-marketplace-hk7rv\" (UID: \"2d5a1f24-78ad-4c43-bf96-47e2e23c1996\") " pod="openshift-marketplace/redhat-marketplace-hk7rv" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.278484 
4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d5a1f24-78ad-4c43-bf96-47e2e23c1996-utilities\") pod \"redhat-marketplace-hk7rv\" (UID: \"2d5a1f24-78ad-4c43-bf96-47e2e23c1996\") " pod="openshift-marketplace/redhat-marketplace-hk7rv" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.278753 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d5a1f24-78ad-4c43-bf96-47e2e23c1996-catalog-content\") pod \"redhat-marketplace-hk7rv\" (UID: \"2d5a1f24-78ad-4c43-bf96-47e2e23c1996\") " pod="openshift-marketplace/redhat-marketplace-hk7rv" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.285549 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qnlkj" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.285553 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qnlkj" event={"ID":"c2fb87b5-ea97-4d21-9129-e1cf0db01fea","Type":"ContainerDied","Data":"288e205259e2813619dfc736845bb352ccc6d4e98746d96e128e24462016796b"} Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.301150 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt5lg\" (UniqueName: \"kubernetes.io/projected/2d5a1f24-78ad-4c43-bf96-47e2e23c1996-kube-api-access-qt5lg\") pod \"redhat-marketplace-hk7rv\" (UID: \"2d5a1f24-78ad-4c43-bf96-47e2e23c1996\") " pod="openshift-marketplace/redhat-marketplace-hk7rv" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.310798 4835 scope.go:117] "RemoveContainer" containerID="885d92e7c4db67c9f439ef6214f3cec3535227f755166766189c05614c96a9ad" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.339251 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="hostpath-provisioner/csi-hostpathplugin-hzjn2" podStartSLOduration=11.339232942 podStartE2EDuration="11.339232942s" podCreationTimestamp="2026-03-19 09:26:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:39.334951752 +0000 UTC m=+254.183550339" watchObservedRunningTime="2026-03-19 09:26:39.339232942 +0000 UTC m=+254.187831519" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.384191 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hk7rv" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.399481 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fbsvh"] Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.410179 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fbsvh"] Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.433381 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qnlkj"] Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.438805 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qnlkj"] Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.447980 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mr45m"] Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.448970 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mr45m" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.458091 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mr45m"] Mar 19 09:26:39 crc kubenswrapper[4835]: W0319 09:26:39.554298 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68957ece_d303_4cd7_9e45_cc960a83a7b0.slice/crio-2b72c853a4cb9cfa6b41621158b5f31b75aeb44a97c003f74c4cd687ce35e8b5 WatchSource:0}: Error finding container 2b72c853a4cb9cfa6b41621158b5f31b75aeb44a97c003f74c4cd687ce35e8b5: Status 404 returned error can't find the container with id 2b72c853a4cb9cfa6b41621158b5f31b75aeb44a97c003f74c4cd687ce35e8b5 Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.559303 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-46rgq"] Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.591951 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78fwt\" (UniqueName: \"kubernetes.io/projected/de9aaba1-3f3b-4c79-ba35-14f28fbbf657-kube-api-access-78fwt\") pod \"redhat-marketplace-mr45m\" (UID: \"de9aaba1-3f3b-4c79-ba35-14f28fbbf657\") " pod="openshift-marketplace/redhat-marketplace-mr45m" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.591993 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de9aaba1-3f3b-4c79-ba35-14f28fbbf657-catalog-content\") pod \"redhat-marketplace-mr45m\" (UID: \"de9aaba1-3f3b-4c79-ba35-14f28fbbf657\") " pod="openshift-marketplace/redhat-marketplace-mr45m" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.592020 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/de9aaba1-3f3b-4c79-ba35-14f28fbbf657-utilities\") pod \"redhat-marketplace-mr45m\" (UID: \"de9aaba1-3f3b-4c79-ba35-14f28fbbf657\") " pod="openshift-marketplace/redhat-marketplace-mr45m" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.645973 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hk7rv"] Mar 19 09:26:39 crc kubenswrapper[4835]: W0319 09:26:39.657321 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d5a1f24_78ad_4c43_bf96_47e2e23c1996.slice/crio-60e2f9a16777ca3a25bd9e83840baf6f3ea9cd278c10639b1e1252ea7e65d5a8 WatchSource:0}: Error finding container 60e2f9a16777ca3a25bd9e83840baf6f3ea9cd278c10639b1e1252ea7e65d5a8: Status 404 returned error can't find the container with id 60e2f9a16777ca3a25bd9e83840baf6f3ea9cd278c10639b1e1252ea7e65d5a8 Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.693122 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78fwt\" (UniqueName: \"kubernetes.io/projected/de9aaba1-3f3b-4c79-ba35-14f28fbbf657-kube-api-access-78fwt\") pod \"redhat-marketplace-mr45m\" (UID: \"de9aaba1-3f3b-4c79-ba35-14f28fbbf657\") " pod="openshift-marketplace/redhat-marketplace-mr45m" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.693164 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de9aaba1-3f3b-4c79-ba35-14f28fbbf657-catalog-content\") pod \"redhat-marketplace-mr45m\" (UID: \"de9aaba1-3f3b-4c79-ba35-14f28fbbf657\") " pod="openshift-marketplace/redhat-marketplace-mr45m" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.693192 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de9aaba1-3f3b-4c79-ba35-14f28fbbf657-utilities\") pod 
\"redhat-marketplace-mr45m\" (UID: \"de9aaba1-3f3b-4c79-ba35-14f28fbbf657\") " pod="openshift-marketplace/redhat-marketplace-mr45m" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.693596 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de9aaba1-3f3b-4c79-ba35-14f28fbbf657-utilities\") pod \"redhat-marketplace-mr45m\" (UID: \"de9aaba1-3f3b-4c79-ba35-14f28fbbf657\") " pod="openshift-marketplace/redhat-marketplace-mr45m" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.694859 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de9aaba1-3f3b-4c79-ba35-14f28fbbf657-catalog-content\") pod \"redhat-marketplace-mr45m\" (UID: \"de9aaba1-3f3b-4c79-ba35-14f28fbbf657\") " pod="openshift-marketplace/redhat-marketplace-mr45m" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.711126 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78fwt\" (UniqueName: \"kubernetes.io/projected/de9aaba1-3f3b-4c79-ba35-14f28fbbf657-kube-api-access-78fwt\") pod \"redhat-marketplace-mr45m\" (UID: \"de9aaba1-3f3b-4c79-ba35-14f28fbbf657\") " pod="openshift-marketplace/redhat-marketplace-mr45m" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.771991 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mr45m" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.845306 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-65z7x"] Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.846537 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-65z7x" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.850351 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.856388 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrd5p" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.864729 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-65z7x"] Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.986049 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mr45m"] Mar 19 09:26:39 crc kubenswrapper[4835]: W0319 09:26:39.988535 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde9aaba1_3f3b_4c79_ba35_14f28fbbf657.slice/crio-fa8adbcb4f64c9a6994d1ae6b3d335b7b24858032a4c1abeac46ecdd3bf23d15 WatchSource:0}: Error finding container fa8adbcb4f64c9a6994d1ae6b3d335b7b24858032a4c1abeac46ecdd3bf23d15: Status 404 returned error can't find the container with id fa8adbcb4f64c9a6994d1ae6b3d335b7b24858032a4c1abeac46ecdd3bf23d15 Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.996411 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35072c46-2bb8-4be2-9409-8780c5fa5717-utilities\") pod \"redhat-operators-65z7x\" (UID: \"35072c46-2bb8-4be2-9409-8780c5fa5717\") " pod="openshift-marketplace/redhat-operators-65z7x" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.996661 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gmkc\" (UniqueName: 
\"kubernetes.io/projected/35072c46-2bb8-4be2-9409-8780c5fa5717-kube-api-access-5gmkc\") pod \"redhat-operators-65z7x\" (UID: \"35072c46-2bb8-4be2-9409-8780c5fa5717\") " pod="openshift-marketplace/redhat-operators-65z7x" Mar 19 09:26:39 crc kubenswrapper[4835]: I0319 09:26:39.996690 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35072c46-2bb8-4be2-9409-8780c5fa5717-catalog-content\") pod \"redhat-operators-65z7x\" (UID: \"35072c46-2bb8-4be2-9409-8780c5fa5717\") " pod="openshift-marketplace/redhat-operators-65z7x" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.042917 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-65hbx"] Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.044227 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-65hbx" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.068575 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-65hbx"] Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.081525 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5b45df9cb7-8wg6b"] Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.082532 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5b45df9cb7-8wg6b" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.084356 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.086419 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55bb997f45-j4kms"] Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.086525 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.086960 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.087029 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.087155 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.087166 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.088151 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55bb997f45-j4kms" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.096454 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.097042 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.097884 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.120300 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.120910 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.120940 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gmkc\" (UniqueName: \"kubernetes.io/projected/35072c46-2bb8-4be2-9409-8780c5fa5717-kube-api-access-5gmkc\") pod \"redhat-operators-65z7x\" (UID: \"35072c46-2bb8-4be2-9409-8780c5fa5717\") " pod="openshift-marketplace/redhat-operators-65z7x" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.120991 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35072c46-2bb8-4be2-9409-8780c5fa5717-catalog-content\") pod \"redhat-operators-65z7x\" (UID: \"35072c46-2bb8-4be2-9409-8780c5fa5717\") " pod="openshift-marketplace/redhat-operators-65z7x" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.121046 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35072c46-2bb8-4be2-9409-8780c5fa5717-utilities\") pod \"redhat-operators-65z7x\" (UID: \"35072c46-2bb8-4be2-9409-8780c5fa5717\") " pod="openshift-marketplace/redhat-operators-65z7x" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.121124 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.121675 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35072c46-2bb8-4be2-9409-8780c5fa5717-utilities\") pod \"redhat-operators-65z7x\" (UID: \"35072c46-2bb8-4be2-9409-8780c5fa5717\") " pod="openshift-marketplace/redhat-operators-65z7x" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.121692 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35072c46-2bb8-4be2-9409-8780c5fa5717-catalog-content\") pod \"redhat-operators-65z7x\" (UID: \"35072c46-2bb8-4be2-9409-8780c5fa5717\") " pod="openshift-marketplace/redhat-operators-65z7x" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.123056 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.125104 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b45df9cb7-8wg6b"] Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.128358 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55bb997f45-j4kms"] Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.153307 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gmkc\" (UniqueName: 
\"kubernetes.io/projected/35072c46-2bb8-4be2-9409-8780c5fa5717-kube-api-access-5gmkc\") pod \"redhat-operators-65z7x\" (UID: \"35072c46-2bb8-4be2-9409-8780c5fa5717\") " pod="openshift-marketplace/redhat-operators-65z7x" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.181185 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-65z7x" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.208605 4835 patch_prober.go:28] interesting pod/router-default-5444994796-mljh4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 09:26:40 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Mar 19 09:26:40 crc kubenswrapper[4835]: [+]process-running ok Mar 19 09:26:40 crc kubenswrapper[4835]: healthz check failed Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.208682 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mljh4" podUID="540880cb-fdb4-4672-81e3-60adfd584bdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.222042 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/854bbde7-af9e-4222-8aef-19f4c684510a-client-ca\") pod \"route-controller-manager-55bb997f45-j4kms\" (UID: \"854bbde7-af9e-4222-8aef-19f4c684510a\") " pod="openshift-route-controller-manager/route-controller-manager-55bb997f45-j4kms" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.222095 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1918ad9e-8b0c-47e8-afa9-2770aa7eabd1-config\") pod \"controller-manager-5b45df9cb7-8wg6b\" (UID: 
\"1918ad9e-8b0c-47e8-afa9-2770aa7eabd1\") " pod="openshift-controller-manager/controller-manager-5b45df9cb7-8wg6b" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.222130 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1918ad9e-8b0c-47e8-afa9-2770aa7eabd1-serving-cert\") pod \"controller-manager-5b45df9cb7-8wg6b\" (UID: \"1918ad9e-8b0c-47e8-afa9-2770aa7eabd1\") " pod="openshift-controller-manager/controller-manager-5b45df9cb7-8wg6b" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.222235 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/854bbde7-af9e-4222-8aef-19f4c684510a-serving-cert\") pod \"route-controller-manager-55bb997f45-j4kms\" (UID: \"854bbde7-af9e-4222-8aef-19f4c684510a\") " pod="openshift-route-controller-manager/route-controller-manager-55bb997f45-j4kms" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.222279 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43fa0a19-2075-4d87-80cb-5e6c24854e5f-catalog-content\") pod \"redhat-operators-65hbx\" (UID: \"43fa0a19-2075-4d87-80cb-5e6c24854e5f\") " pod="openshift-marketplace/redhat-operators-65hbx" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.222315 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1918ad9e-8b0c-47e8-afa9-2770aa7eabd1-proxy-ca-bundles\") pod \"controller-manager-5b45df9cb7-8wg6b\" (UID: \"1918ad9e-8b0c-47e8-afa9-2770aa7eabd1\") " pod="openshift-controller-manager/controller-manager-5b45df9cb7-8wg6b" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.222334 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/854bbde7-af9e-4222-8aef-19f4c684510a-config\") pod \"route-controller-manager-55bb997f45-j4kms\" (UID: \"854bbde7-af9e-4222-8aef-19f4c684510a\") " pod="openshift-route-controller-manager/route-controller-manager-55bb997f45-j4kms" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.222371 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43fa0a19-2075-4d87-80cb-5e6c24854e5f-utilities\") pod \"redhat-operators-65hbx\" (UID: \"43fa0a19-2075-4d87-80cb-5e6c24854e5f\") " pod="openshift-marketplace/redhat-operators-65hbx" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.222395 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5jmm\" (UniqueName: \"kubernetes.io/projected/43fa0a19-2075-4d87-80cb-5e6c24854e5f-kube-api-access-h5jmm\") pod \"redhat-operators-65hbx\" (UID: \"43fa0a19-2075-4d87-80cb-5e6c24854e5f\") " pod="openshift-marketplace/redhat-operators-65hbx" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.222413 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1918ad9e-8b0c-47e8-afa9-2770aa7eabd1-client-ca\") pod \"controller-manager-5b45df9cb7-8wg6b\" (UID: \"1918ad9e-8b0c-47e8-afa9-2770aa7eabd1\") " pod="openshift-controller-manager/controller-manager-5b45df9cb7-8wg6b" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.222533 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrrw6\" (UniqueName: \"kubernetes.io/projected/854bbde7-af9e-4222-8aef-19f4c684510a-kube-api-access-lrrw6\") pod \"route-controller-manager-55bb997f45-j4kms\" (UID: \"854bbde7-af9e-4222-8aef-19f4c684510a\") " 
pod="openshift-route-controller-manager/route-controller-manager-55bb997f45-j4kms" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.222561 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bh58\" (UniqueName: \"kubernetes.io/projected/1918ad9e-8b0c-47e8-afa9-2770aa7eabd1-kube-api-access-6bh58\") pod \"controller-manager-5b45df9cb7-8wg6b\" (UID: \"1918ad9e-8b0c-47e8-afa9-2770aa7eabd1\") " pod="openshift-controller-manager/controller-manager-5b45df9cb7-8wg6b" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.305020 4835 generic.go:334] "Generic (PLEG): container finished" podID="de9aaba1-3f3b-4c79-ba35-14f28fbbf657" containerID="2f86417ed996ad9036a899c2bfb75ade718562815c409d3ad3d0478a6229ddcc" exitCode=0 Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.305123 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mr45m" event={"ID":"de9aaba1-3f3b-4c79-ba35-14f28fbbf657","Type":"ContainerDied","Data":"2f86417ed996ad9036a899c2bfb75ade718562815c409d3ad3d0478a6229ddcc"} Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.305202 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mr45m" event={"ID":"de9aaba1-3f3b-4c79-ba35-14f28fbbf657","Type":"ContainerStarted","Data":"fa8adbcb4f64c9a6994d1ae6b3d335b7b24858032a4c1abeac46ecdd3bf23d15"} Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.309547 4835 generic.go:334] "Generic (PLEG): container finished" podID="2d5a1f24-78ad-4c43-bf96-47e2e23c1996" containerID="5caf1a3cc7de0dfe65c12dbf06257a17b66567fe8a4327b5c66283101ef15b74" exitCode=0 Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.309628 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hk7rv" 
event={"ID":"2d5a1f24-78ad-4c43-bf96-47e2e23c1996","Type":"ContainerDied","Data":"5caf1a3cc7de0dfe65c12dbf06257a17b66567fe8a4327b5c66283101ef15b74"} Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.309654 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hk7rv" event={"ID":"2d5a1f24-78ad-4c43-bf96-47e2e23c1996","Type":"ContainerStarted","Data":"60e2f9a16777ca3a25bd9e83840baf6f3ea9cd278c10639b1e1252ea7e65d5a8"} Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.323386 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43fa0a19-2075-4d87-80cb-5e6c24854e5f-utilities\") pod \"redhat-operators-65hbx\" (UID: \"43fa0a19-2075-4d87-80cb-5e6c24854e5f\") " pod="openshift-marketplace/redhat-operators-65hbx" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.323418 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5jmm\" (UniqueName: \"kubernetes.io/projected/43fa0a19-2075-4d87-80cb-5e6c24854e5f-kube-api-access-h5jmm\") pod \"redhat-operators-65hbx\" (UID: \"43fa0a19-2075-4d87-80cb-5e6c24854e5f\") " pod="openshift-marketplace/redhat-operators-65hbx" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.323436 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1918ad9e-8b0c-47e8-afa9-2770aa7eabd1-client-ca\") pod \"controller-manager-5b45df9cb7-8wg6b\" (UID: \"1918ad9e-8b0c-47e8-afa9-2770aa7eabd1\") " pod="openshift-controller-manager/controller-manager-5b45df9cb7-8wg6b" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.323488 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrrw6\" (UniqueName: \"kubernetes.io/projected/854bbde7-af9e-4222-8aef-19f4c684510a-kube-api-access-lrrw6\") pod \"route-controller-manager-55bb997f45-j4kms\" (UID: 
\"854bbde7-af9e-4222-8aef-19f4c684510a\") " pod="openshift-route-controller-manager/route-controller-manager-55bb997f45-j4kms" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.323507 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bh58\" (UniqueName: \"kubernetes.io/projected/1918ad9e-8b0c-47e8-afa9-2770aa7eabd1-kube-api-access-6bh58\") pod \"controller-manager-5b45df9cb7-8wg6b\" (UID: \"1918ad9e-8b0c-47e8-afa9-2770aa7eabd1\") " pod="openshift-controller-manager/controller-manager-5b45df9cb7-8wg6b" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.323544 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/854bbde7-af9e-4222-8aef-19f4c684510a-client-ca\") pod \"route-controller-manager-55bb997f45-j4kms\" (UID: \"854bbde7-af9e-4222-8aef-19f4c684510a\") " pod="openshift-route-controller-manager/route-controller-manager-55bb997f45-j4kms" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.323563 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1918ad9e-8b0c-47e8-afa9-2770aa7eabd1-config\") pod \"controller-manager-5b45df9cb7-8wg6b\" (UID: \"1918ad9e-8b0c-47e8-afa9-2770aa7eabd1\") " pod="openshift-controller-manager/controller-manager-5b45df9cb7-8wg6b" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.323585 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1918ad9e-8b0c-47e8-afa9-2770aa7eabd1-serving-cert\") pod \"controller-manager-5b45df9cb7-8wg6b\" (UID: \"1918ad9e-8b0c-47e8-afa9-2770aa7eabd1\") " pod="openshift-controller-manager/controller-manager-5b45df9cb7-8wg6b" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.323608 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/854bbde7-af9e-4222-8aef-19f4c684510a-serving-cert\") pod \"route-controller-manager-55bb997f45-j4kms\" (UID: \"854bbde7-af9e-4222-8aef-19f4c684510a\") " pod="openshift-route-controller-manager/route-controller-manager-55bb997f45-j4kms" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.323623 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43fa0a19-2075-4d87-80cb-5e6c24854e5f-catalog-content\") pod \"redhat-operators-65hbx\" (UID: \"43fa0a19-2075-4d87-80cb-5e6c24854e5f\") " pod="openshift-marketplace/redhat-operators-65hbx" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.323646 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1918ad9e-8b0c-47e8-afa9-2770aa7eabd1-proxy-ca-bundles\") pod \"controller-manager-5b45df9cb7-8wg6b\" (UID: \"1918ad9e-8b0c-47e8-afa9-2770aa7eabd1\") " pod="openshift-controller-manager/controller-manager-5b45df9cb7-8wg6b" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.323676 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/854bbde7-af9e-4222-8aef-19f4c684510a-config\") pod \"route-controller-manager-55bb997f45-j4kms\" (UID: \"854bbde7-af9e-4222-8aef-19f4c684510a\") " pod="openshift-route-controller-manager/route-controller-manager-55bb997f45-j4kms" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.324614 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43fa0a19-2075-4d87-80cb-5e6c24854e5f-utilities\") pod \"redhat-operators-65hbx\" (UID: \"43fa0a19-2075-4d87-80cb-5e6c24854e5f\") " pod="openshift-marketplace/redhat-operators-65hbx" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.325661 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1918ad9e-8b0c-47e8-afa9-2770aa7eabd1-client-ca\") pod \"controller-manager-5b45df9cb7-8wg6b\" (UID: \"1918ad9e-8b0c-47e8-afa9-2770aa7eabd1\") " pod="openshift-controller-manager/controller-manager-5b45df9cb7-8wg6b" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.326039 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1918ad9e-8b0c-47e8-afa9-2770aa7eabd1-config\") pod \"controller-manager-5b45df9cb7-8wg6b\" (UID: \"1918ad9e-8b0c-47e8-afa9-2770aa7eabd1\") " pod="openshift-controller-manager/controller-manager-5b45df9cb7-8wg6b" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.326638 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/854bbde7-af9e-4222-8aef-19f4c684510a-client-ca\") pod \"route-controller-manager-55bb997f45-j4kms\" (UID: \"854bbde7-af9e-4222-8aef-19f4c684510a\") " pod="openshift-route-controller-manager/route-controller-manager-55bb997f45-j4kms" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.326919 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/854bbde7-af9e-4222-8aef-19f4c684510a-config\") pod \"route-controller-manager-55bb997f45-j4kms\" (UID: \"854bbde7-af9e-4222-8aef-19f4c684510a\") " pod="openshift-route-controller-manager/route-controller-manager-55bb997f45-j4kms" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.326952 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" event={"ID":"68957ece-d303-4cd7-9e45-cc960a83a7b0","Type":"ContainerStarted","Data":"c91f34d7ba02037f3446f7dfd3fff27ad58d15499dde9c494068d6a0eda481a1"} Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.326979 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" event={"ID":"68957ece-d303-4cd7-9e45-cc960a83a7b0","Type":"ContainerStarted","Data":"2b72c853a4cb9cfa6b41621158b5f31b75aeb44a97c003f74c4cd687ce35e8b5"} Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.327116 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.327228 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1918ad9e-8b0c-47e8-afa9-2770aa7eabd1-proxy-ca-bundles\") pod \"controller-manager-5b45df9cb7-8wg6b\" (UID: \"1918ad9e-8b0c-47e8-afa9-2770aa7eabd1\") " pod="openshift-controller-manager/controller-manager-5b45df9cb7-8wg6b" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.332101 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/854bbde7-af9e-4222-8aef-19f4c684510a-serving-cert\") pod \"route-controller-manager-55bb997f45-j4kms\" (UID: \"854bbde7-af9e-4222-8aef-19f4c684510a\") " pod="openshift-route-controller-manager/route-controller-manager-55bb997f45-j4kms" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.336429 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1918ad9e-8b0c-47e8-afa9-2770aa7eabd1-serving-cert\") pod \"controller-manager-5b45df9cb7-8wg6b\" (UID: \"1918ad9e-8b0c-47e8-afa9-2770aa7eabd1\") " pod="openshift-controller-manager/controller-manager-5b45df9cb7-8wg6b" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.337447 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43fa0a19-2075-4d87-80cb-5e6c24854e5f-catalog-content\") pod \"redhat-operators-65hbx\" (UID: \"43fa0a19-2075-4d87-80cb-5e6c24854e5f\") " 
pod="openshift-marketplace/redhat-operators-65hbx" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.341496 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5jmm\" (UniqueName: \"kubernetes.io/projected/43fa0a19-2075-4d87-80cb-5e6c24854e5f-kube-api-access-h5jmm\") pod \"redhat-operators-65hbx\" (UID: \"43fa0a19-2075-4d87-80cb-5e6c24854e5f\") " pod="openshift-marketplace/redhat-operators-65hbx" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.346517 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrrw6\" (UniqueName: \"kubernetes.io/projected/854bbde7-af9e-4222-8aef-19f4c684510a-kube-api-access-lrrw6\") pod \"route-controller-manager-55bb997f45-j4kms\" (UID: \"854bbde7-af9e-4222-8aef-19f4c684510a\") " pod="openshift-route-controller-manager/route-controller-manager-55bb997f45-j4kms" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.347519 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bh58\" (UniqueName: \"kubernetes.io/projected/1918ad9e-8b0c-47e8-afa9-2770aa7eabd1-kube-api-access-6bh58\") pod \"controller-manager-5b45df9cb7-8wg6b\" (UID: \"1918ad9e-8b0c-47e8-afa9-2770aa7eabd1\") " pod="openshift-controller-manager/controller-manager-5b45df9cb7-8wg6b" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.371405 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-65z7x"] Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.376588 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" podStartSLOduration=201.376570039 podStartE2EDuration="3m21.376570039s" podCreationTimestamp="2026-03-19 09:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:40.369241545 +0000 UTC m=+255.217840132" 
watchObservedRunningTime="2026-03-19 09:26:40.376570039 +0000 UTC m=+255.225168616" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.381760 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-65hbx" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.414174 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.414848 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f443320-81b9-4f4e-a12b-62023d6074ef" path="/var/lib/kubelet/pods/9f443320-81b9-4f4e-a12b-62023d6074ef/volumes" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.415386 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2fb87b5-ea97-4d21-9129-e1cf0db01fea" path="/var/lib/kubelet/pods/c2fb87b5-ea97-4d21-9129-e1cf0db01fea/volumes" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.436245 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b45df9cb7-8wg6b" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.450337 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55bb997f45-j4kms" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.584952 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-65hbx"] Mar 19 09:26:40 crc kubenswrapper[4835]: W0319 09:26:40.603357 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43fa0a19_2075_4d87_80cb_5e6c24854e5f.slice/crio-227137ec04f02a194e7741f3ca9b518a76545afcfe38d7bf771c0f04aff08907 WatchSource:0}: Error finding container 227137ec04f02a194e7741f3ca9b518a76545afcfe38d7bf771c0f04aff08907: Status 404 returned error can't find the container with id 227137ec04f02a194e7741f3ca9b518a76545afcfe38d7bf771c0f04aff08907 Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.624441 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565195-xtph2" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.724785 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.725313 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.733145 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67ldf\" (UniqueName: \"kubernetes.io/projected/23206b12-5e7f-4ce8-aa17-7eac028bd9fe-kube-api-access-67ldf\") pod \"23206b12-5e7f-4ce8-aa17-7eac028bd9fe\" (UID: \"23206b12-5e7f-4ce8-aa17-7eac028bd9fe\") " Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.733329 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/23206b12-5e7f-4ce8-aa17-7eac028bd9fe-config-volume\") pod \"23206b12-5e7f-4ce8-aa17-7eac028bd9fe\" (UID: \"23206b12-5e7f-4ce8-aa17-7eac028bd9fe\") " Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.733358 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23206b12-5e7f-4ce8-aa17-7eac028bd9fe-secret-volume\") pod \"23206b12-5e7f-4ce8-aa17-7eac028bd9fe\" (UID: \"23206b12-5e7f-4ce8-aa17-7eac028bd9fe\") " Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.738795 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23206b12-5e7f-4ce8-aa17-7eac028bd9fe-config-volume" (OuterVolumeSpecName: "config-volume") pod "23206b12-5e7f-4ce8-aa17-7eac028bd9fe" (UID: "23206b12-5e7f-4ce8-aa17-7eac028bd9fe"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.741365 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23206b12-5e7f-4ce8-aa17-7eac028bd9fe-kube-api-access-67ldf" (OuterVolumeSpecName: "kube-api-access-67ldf") pod "23206b12-5e7f-4ce8-aa17-7eac028bd9fe" (UID: "23206b12-5e7f-4ce8-aa17-7eac028bd9fe"). InnerVolumeSpecName "kube-api-access-67ldf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.741468 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23206b12-5e7f-4ce8-aa17-7eac028bd9fe-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "23206b12-5e7f-4ce8-aa17-7eac028bd9fe" (UID: "23206b12-5e7f-4ce8-aa17-7eac028bd9fe"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.742964 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.743159 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.743863 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.759509 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.771495 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b45df9cb7-8wg6b"] Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.815463 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55bb997f45-j4kms"] Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.837526 4835 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23206b12-5e7f-4ce8-aa17-7eac028bd9fe-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.837557 4835 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23206b12-5e7f-4ce8-aa17-7eac028bd9fe-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 09:26:40 crc kubenswrapper[4835]: I0319 09:26:40.837587 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67ldf\" (UniqueName: \"kubernetes.io/projected/23206b12-5e7f-4ce8-aa17-7eac028bd9fe-kube-api-access-67ldf\") 
on node \"crc\" DevicePath \"\"" Mar 19 09:26:40 crc kubenswrapper[4835]: W0319 09:26:40.838145 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod854bbde7_af9e_4222_8aef_19f4c684510a.slice/crio-f4c6a3a4f33bd23bc745335320f01397fcf9d7082500ce5cabb27a3e17c315a0 WatchSource:0}: Error finding container f4c6a3a4f33bd23bc745335320f01397fcf9d7082500ce5cabb27a3e17c315a0: Status 404 returned error can't find the container with id f4c6a3a4f33bd23bc745335320f01397fcf9d7082500ce5cabb27a3e17c315a0 Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.176336 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.182046 4835 patch_prober.go:28] interesting pod/downloads-7954f5f757-vzxqb container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.182086 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-vzxqb" podUID="80ebd3c0-282e-427f-bedd-f06c26bce30a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.182565 4835 patch_prober.go:28] interesting pod/downloads-7954f5f757-vzxqb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.182583 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vzxqb" podUID="80ebd3c0-282e-427f-bedd-f06c26bce30a" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 19 09:26:41 crc kubenswrapper[4835]: E0319 09:26:41.182676 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23206b12-5e7f-4ce8-aa17-7eac028bd9fe" containerName="collect-profiles" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.182692 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="23206b12-5e7f-4ce8-aa17-7eac028bd9fe" containerName="collect-profiles" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.182886 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="23206b12-5e7f-4ce8-aa17-7eac028bd9fe" containerName="collect-profiles" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.183263 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.185206 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.185375 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.207059 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-mljh4" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.209976 4835 patch_prober.go:28] interesting pod/router-default-5444994796-mljh4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 09:26:41 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Mar 19 09:26:41 crc kubenswrapper[4835]: [+]process-running ok Mar 19 09:26:41 
crc kubenswrapper[4835]: healthz check failed Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.210006 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mljh4" podUID="540880cb-fdb4-4672-81e3-60adfd584bdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.234196 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.335335 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b45df9cb7-8wg6b" event={"ID":"1918ad9e-8b0c-47e8-afa9-2770aa7eabd1","Type":"ContainerStarted","Data":"6a1cdb2db91d182c1de166db4dfcd7dd21b494e44530184a8725ae9ce8393e4d"} Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.335679 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5b45df9cb7-8wg6b" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.335912 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b45df9cb7-8wg6b" event={"ID":"1918ad9e-8b0c-47e8-afa9-2770aa7eabd1","Type":"ContainerStarted","Data":"d2b54dfd4fed313442447b7042924a48ccea48c993b24d1f358aec5805ccfd80"} Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.340799 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5b45df9cb7-8wg6b" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.343971 4835 generic.go:334] "Generic (PLEG): container finished" podID="35072c46-2bb8-4be2-9409-8780c5fa5717" containerID="87ed7e16cbe4fa3c192a1a418ea3f4e8843c794a19940be32e7cf5297acc090b" exitCode=0 Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.344026 4835 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-operators-65z7x" event={"ID":"35072c46-2bb8-4be2-9409-8780c5fa5717","Type":"ContainerDied","Data":"87ed7e16cbe4fa3c192a1a418ea3f4e8843c794a19940be32e7cf5297acc090b"} Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.344042 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-65z7x" event={"ID":"35072c46-2bb8-4be2-9409-8780c5fa5717","Type":"ContainerStarted","Data":"127dd30f2ec3344f70f2c283dd505f0c065a47ff386fe0849f7a19580dadf529"} Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.346782 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7bc9ad36-9d27-4004-8588-e1e0f61dd3bd-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7bc9ad36-9d27-4004-8588-e1e0f61dd3bd\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.346833 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7bc9ad36-9d27-4004-8588-e1e0f61dd3bd-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7bc9ad36-9d27-4004-8588-e1e0f61dd3bd\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.358344 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5b45df9cb7-8wg6b" podStartSLOduration=3.3583225 podStartE2EDuration="3.3583225s" podCreationTimestamp="2026-03-19 09:26:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:41.357798005 +0000 UTC m=+256.206396592" watchObservedRunningTime="2026-03-19 09:26:41.3583225 +0000 UTC m=+256.206921087" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 
09:26:41.358715 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55bb997f45-j4kms" event={"ID":"854bbde7-af9e-4222-8aef-19f4c684510a","Type":"ContainerStarted","Data":"0701c836ac2f8ae5b7ede7658193e3172079a5ff756c5f55aa97b208e9a6b3f8"} Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.358784 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55bb997f45-j4kms" event={"ID":"854bbde7-af9e-4222-8aef-19f4c684510a","Type":"ContainerStarted","Data":"f4c6a3a4f33bd23bc745335320f01397fcf9d7082500ce5cabb27a3e17c315a0"} Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.359696 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-55bb997f45-j4kms" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.363527 4835 generic.go:334] "Generic (PLEG): container finished" podID="43fa0a19-2075-4d87-80cb-5e6c24854e5f" containerID="374849e46ffb89a75259b1aea5b268c3cb82f53375459d67e762bccba0522c12" exitCode=0 Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.363579 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-65hbx" event={"ID":"43fa0a19-2075-4d87-80cb-5e6c24854e5f","Type":"ContainerDied","Data":"374849e46ffb89a75259b1aea5b268c3cb82f53375459d67e762bccba0522c12"} Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.363597 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-65hbx" event={"ID":"43fa0a19-2075-4d87-80cb-5e6c24854e5f","Type":"ContainerStarted","Data":"227137ec04f02a194e7741f3ca9b518a76545afcfe38d7bf771c0f04aff08907"} Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.390040 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565195-xtph2" 
event={"ID":"23206b12-5e7f-4ce8-aa17-7eac028bd9fe","Type":"ContainerDied","Data":"368b11ff3a430780c4c4acb2ff088fa7f6b524bcd681e77db95b74c684a1ff56"} Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.390082 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="368b11ff3a430780c4c4acb2ff088fa7f6b524bcd681e77db95b74c684a1ff56" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.390150 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565195-xtph2" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.395644 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-gg2zr" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.397242 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pww7m" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.444243 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-55bb997f45-j4kms" podStartSLOduration=3.44422123 podStartE2EDuration="3.44422123s" podCreationTimestamp="2026-03-19 09:26:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:26:41.435545679 +0000 UTC m=+256.284144266" watchObservedRunningTime="2026-03-19 09:26:41.44422123 +0000 UTC m=+256.292819817" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.451601 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7bc9ad36-9d27-4004-8588-e1e0f61dd3bd-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7bc9ad36-9d27-4004-8588-e1e0f61dd3bd\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 09:26:41 crc 
kubenswrapper[4835]: I0319 09:26:41.451720 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7bc9ad36-9d27-4004-8588-e1e0f61dd3bd-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7bc9ad36-9d27-4004-8588-e1e0f61dd3bd\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.454895 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7bc9ad36-9d27-4004-8588-e1e0f61dd3bd-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7bc9ad36-9d27-4004-8588-e1e0f61dd3bd\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.494238 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-m6s8l" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.494511 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-m6s8l" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.496359 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7bc9ad36-9d27-4004-8588-e1e0f61dd3bd-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7bc9ad36-9d27-4004-8588-e1e0f61dd3bd\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.505589 4835 patch_prober.go:28] interesting pod/console-f9d7485db-m6s8l container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.17:8443/health\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.505634 4835 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-console/console-f9d7485db-m6s8l" podUID="c32cf7e2-4523-41b3-b8a8-7564cca67f4a" containerName="console" probeResult="failure" output="Get \"https://10.217.0.17:8443/health\": dial tcp 10.217.0.17:8443: connect: connection refused" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.539333 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-4tpxz" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.558820 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.684825 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.686140 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.690656 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.710437 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-7sdz9" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.739446 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.740209 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.765496 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/9f043600-c25a-4e48-9385-0e031efeaa82-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"9f043600-c25a-4e48-9385-0e031efeaa82\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.765550 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9f043600-c25a-4e48-9385-0e031efeaa82-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"9f043600-c25a-4e48-9385-0e031efeaa82\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.846147 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-55bb997f45-j4kms" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.866736 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9f043600-c25a-4e48-9385-0e031efeaa82-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"9f043600-c25a-4e48-9385-0e031efeaa82\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.866800 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9f043600-c25a-4e48-9385-0e031efeaa82-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"9f043600-c25a-4e48-9385-0e031efeaa82\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.867488 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9f043600-c25a-4e48-9385-0e031efeaa82-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"9f043600-c25a-4e48-9385-0e031efeaa82\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 
09:26:41 crc kubenswrapper[4835]: I0319 09:26:41.907801 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9f043600-c25a-4e48-9385-0e031efeaa82-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"9f043600-c25a-4e48-9385-0e031efeaa82\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 09:26:42 crc kubenswrapper[4835]: I0319 09:26:42.082635 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 09:26:42 crc kubenswrapper[4835]: I0319 09:26:42.213902 4835 patch_prober.go:28] interesting pod/router-default-5444994796-mljh4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 09:26:42 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Mar 19 09:26:42 crc kubenswrapper[4835]: [+]process-running ok Mar 19 09:26:42 crc kubenswrapper[4835]: healthz check failed Mar 19 09:26:42 crc kubenswrapper[4835]: I0319 09:26:42.213949 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mljh4" podUID="540880cb-fdb4-4672-81e3-60adfd584bdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 09:26:43 crc kubenswrapper[4835]: I0319 09:26:43.209057 4835 patch_prober.go:28] interesting pod/router-default-5444994796-mljh4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 09:26:43 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Mar 19 09:26:43 crc kubenswrapper[4835]: [+]process-running ok Mar 19 09:26:43 crc kubenswrapper[4835]: healthz check failed Mar 19 09:26:43 crc kubenswrapper[4835]: I0319 09:26:43.209158 4835 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mljh4" podUID="540880cb-fdb4-4672-81e3-60adfd584bdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 09:26:43 crc kubenswrapper[4835]: I0319 09:26:43.225384 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-cl8p7" Mar 19 09:26:44 crc kubenswrapper[4835]: I0319 09:26:44.207876 4835 patch_prober.go:28] interesting pod/router-default-5444994796-mljh4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 09:26:44 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Mar 19 09:26:44 crc kubenswrapper[4835]: [+]process-running ok Mar 19 09:26:44 crc kubenswrapper[4835]: healthz check failed Mar 19 09:26:44 crc kubenswrapper[4835]: I0319 09:26:44.208182 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mljh4" podUID="540880cb-fdb4-4672-81e3-60adfd584bdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 09:26:44 crc kubenswrapper[4835]: I0319 09:26:44.365021 4835 ???:1] "http: TLS handshake error from 192.168.126.11:51328: no serving certificate available for the kubelet" Mar 19 09:26:44 crc kubenswrapper[4835]: I0319 09:26:44.553469 4835 ???:1] "http: TLS handshake error from 192.168.126.11:51336: no serving certificate available for the kubelet" Mar 19 09:26:45 crc kubenswrapper[4835]: I0319 09:26:45.211806 4835 patch_prober.go:28] interesting pod/router-default-5444994796-mljh4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 09:26:45 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Mar 19 
09:26:45 crc kubenswrapper[4835]: [+]process-running ok Mar 19 09:26:45 crc kubenswrapper[4835]: healthz check failed Mar 19 09:26:45 crc kubenswrapper[4835]: I0319 09:26:45.211883 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mljh4" podUID="540880cb-fdb4-4672-81e3-60adfd584bdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 09:26:45 crc kubenswrapper[4835]: I0319 09:26:45.397803 4835 ???:1] "http: TLS handshake error from 192.168.126.11:51342: no serving certificate available for the kubelet" Mar 19 09:26:45 crc kubenswrapper[4835]: I0319 09:26:45.420667 4835 ???:1] "http: TLS handshake error from 192.168.126.11:51358: no serving certificate available for the kubelet" Mar 19 09:26:46 crc kubenswrapper[4835]: I0319 09:26:46.208345 4835 patch_prober.go:28] interesting pod/router-default-5444994796-mljh4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 09:26:46 crc kubenswrapper[4835]: [-]has-synced failed: reason withheld Mar 19 09:26:46 crc kubenswrapper[4835]: [+]process-running ok Mar 19 09:26:46 crc kubenswrapper[4835]: healthz check failed Mar 19 09:26:46 crc kubenswrapper[4835]: I0319 09:26:46.208647 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mljh4" podUID="540880cb-fdb4-4672-81e3-60adfd584bdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 09:26:47 crc kubenswrapper[4835]: I0319 09:26:47.207524 4835 patch_prober.go:28] interesting pod/router-default-5444994796-mljh4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 09:26:47 crc kubenswrapper[4835]: [+]has-synced ok Mar 19 
09:26:47 crc kubenswrapper[4835]: [+]process-running ok Mar 19 09:26:47 crc kubenswrapper[4835]: healthz check failed Mar 19 09:26:47 crc kubenswrapper[4835]: I0319 09:26:47.207582 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mljh4" podUID="540880cb-fdb4-4672-81e3-60adfd584bdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 09:26:48 crc kubenswrapper[4835]: I0319 09:26:48.209663 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-mljh4" Mar 19 09:26:48 crc kubenswrapper[4835]: I0319 09:26:48.218378 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-mljh4" Mar 19 09:26:51 crc kubenswrapper[4835]: I0319 09:26:51.152482 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-bg8tj_49daa1c0-0b39-4ebf-9e81-38885e24bc4d/cluster-samples-operator/0.log" Mar 19 09:26:51 crc kubenswrapper[4835]: I0319 09:26:51.153073 4835 generic.go:334] "Generic (PLEG): container finished" podID="49daa1c0-0b39-4ebf-9e81-38885e24bc4d" containerID="f5d58db8922ca3e2ae8c693adb9c241bd8f77b73109c68c57b84fa0f1e18ab23" exitCode=2 Mar 19 09:26:51 crc kubenswrapper[4835]: I0319 09:26:51.153134 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bg8tj" event={"ID":"49daa1c0-0b39-4ebf-9e81-38885e24bc4d","Type":"ContainerDied","Data":"f5d58db8922ca3e2ae8c693adb9c241bd8f77b73109c68c57b84fa0f1e18ab23"} Mar 19 09:26:51 crc kubenswrapper[4835]: I0319 09:26:51.154092 4835 scope.go:117] "RemoveContainer" containerID="f5d58db8922ca3e2ae8c693adb9c241bd8f77b73109c68c57b84fa0f1e18ab23" Mar 19 09:26:51 crc kubenswrapper[4835]: I0319 09:26:51.180642 4835 patch_prober.go:28] interesting 
pod/downloads-7954f5f757-vzxqb container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 19 09:26:51 crc kubenswrapper[4835]: I0319 09:26:51.180694 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-vzxqb" podUID="80ebd3c0-282e-427f-bedd-f06c26bce30a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 19 09:26:51 crc kubenswrapper[4835]: I0319 09:26:51.180642 4835 patch_prober.go:28] interesting pod/downloads-7954f5f757-vzxqb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Mar 19 09:26:51 crc kubenswrapper[4835]: I0319 09:26:51.180757 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vzxqb" podUID="80ebd3c0-282e-427f-bedd-f06c26bce30a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Mar 19 09:26:51 crc kubenswrapper[4835]: I0319 09:26:51.498529 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-m6s8l" Mar 19 09:26:51 crc kubenswrapper[4835]: I0319 09:26:51.502673 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-m6s8l" Mar 19 09:26:54 crc kubenswrapper[4835]: I0319 09:26:54.636972 4835 ???:1] "http: TLS handshake error from 192.168.126.11:52830: no serving certificate available for the kubelet" Mar 19 09:26:57 crc kubenswrapper[4835]: I0319 09:26:57.036203 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-5b45df9cb7-8wg6b"] Mar 19 09:26:57 crc kubenswrapper[4835]: I0319 09:26:57.036666 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5b45df9cb7-8wg6b" podUID="1918ad9e-8b0c-47e8-afa9-2770aa7eabd1" containerName="controller-manager" containerID="cri-o://6a1cdb2db91d182c1de166db4dfcd7dd21b494e44530184a8725ae9ce8393e4d" gracePeriod=30 Mar 19 09:26:57 crc kubenswrapper[4835]: I0319 09:26:57.050280 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55bb997f45-j4kms"] Mar 19 09:26:57 crc kubenswrapper[4835]: I0319 09:26:57.050466 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-55bb997f45-j4kms" podUID="854bbde7-af9e-4222-8aef-19f4c684510a" containerName="route-controller-manager" containerID="cri-o://0701c836ac2f8ae5b7ede7658193e3172079a5ff756c5f55aa97b208e9a6b3f8" gracePeriod=30 Mar 19 09:26:57 crc kubenswrapper[4835]: I0319 09:26:57.189911 4835 generic.go:334] "Generic (PLEG): container finished" podID="1918ad9e-8b0c-47e8-afa9-2770aa7eabd1" containerID="6a1cdb2db91d182c1de166db4dfcd7dd21b494e44530184a8725ae9ce8393e4d" exitCode=0 Mar 19 09:26:57 crc kubenswrapper[4835]: I0319 09:26:57.189967 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b45df9cb7-8wg6b" event={"ID":"1918ad9e-8b0c-47e8-afa9-2770aa7eabd1","Type":"ContainerDied","Data":"6a1cdb2db91d182c1de166db4dfcd7dd21b494e44530184a8725ae9ce8393e4d"} Mar 19 09:26:57 crc kubenswrapper[4835]: I0319 09:26:57.192127 4835 generic.go:334] "Generic (PLEG): container finished" podID="854bbde7-af9e-4222-8aef-19f4c684510a" containerID="0701c836ac2f8ae5b7ede7658193e3172079a5ff756c5f55aa97b208e9a6b3f8" exitCode=0 Mar 19 09:26:57 crc kubenswrapper[4835]: I0319 09:26:57.192164 4835 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55bb997f45-j4kms" event={"ID":"854bbde7-af9e-4222-8aef-19f4c684510a","Type":"ContainerDied","Data":"0701c836ac2f8ae5b7ede7658193e3172079a5ff756c5f55aa97b208e9a6b3f8"} Mar 19 09:26:59 crc kubenswrapper[4835]: I0319 09:26:59.269190 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:27:00 crc kubenswrapper[4835]: I0319 09:27:00.437067 4835 patch_prober.go:28] interesting pod/controller-manager-5b45df9cb7-8wg6b container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.54:8443/healthz\": dial tcp 10.217.0.54:8443: connect: connection refused" start-of-body= Mar 19 09:27:00 crc kubenswrapper[4835]: I0319 09:27:00.437118 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5b45df9cb7-8wg6b" podUID="1918ad9e-8b0c-47e8-afa9-2770aa7eabd1" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.54:8443/healthz\": dial tcp 10.217.0.54:8443: connect: connection refused" Mar 19 09:27:00 crc kubenswrapper[4835]: I0319 09:27:00.451863 4835 patch_prober.go:28] interesting pod/route-controller-manager-55bb997f45-j4kms container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" start-of-body= Mar 19 09:27:00 crc kubenswrapper[4835]: I0319 09:27:00.451902 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-55bb997f45-j4kms" podUID="854bbde7-af9e-4222-8aef-19f4c684510a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 
10.217.0.55:8443: connect: connection refused" Mar 19 09:27:01 crc kubenswrapper[4835]: I0319 09:27:01.201876 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-vzxqb" Mar 19 09:27:02 crc kubenswrapper[4835]: E0319 09:27:02.534380 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 19 09:27:02 crc kubenswrapper[4835]: E0319 09:27:02.534894 4835 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 09:27:02 crc kubenswrapper[4835]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 19 09:27:02 crc kubenswrapper[4835]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6cpm5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29565204-mwv2v_openshift-infra(930d85cd-ba00-4c27-b728-dbdeaab91ca5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 19 09:27:02 crc kubenswrapper[4835]: > logger="UnhandledError" Mar 19 
09:27:02 crc kubenswrapper[4835]: E0319 09:27:02.536253 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29565204-mwv2v" podUID="930d85cd-ba00-4c27-b728-dbdeaab91ca5" Mar 19 09:27:03 crc kubenswrapper[4835]: E0319 09:27:03.008180 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 19 09:27:03 crc kubenswrapper[4835]: E0319 09:27:03.008354 4835 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 09:27:03 crc kubenswrapper[4835]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 19 09:27:03 crc kubenswrapper[4835]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ql2cr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29565206-wgvsr_openshift-infra(c232f12f-f539-42a1-9eec-965d59127ca0): ErrImagePull: rpc error: code = Canceled desc = 
copying system image from manifest list: copying config: context canceled Mar 19 09:27:03 crc kubenswrapper[4835]: > logger="UnhandledError" Mar 19 09:27:03 crc kubenswrapper[4835]: E0319 09:27:03.009513 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29565206-wgvsr" podUID="c232f12f-f539-42a1-9eec-965d59127ca0" Mar 19 09:27:03 crc kubenswrapper[4835]: E0319 09:27:03.234648 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29565204-mwv2v" podUID="930d85cd-ba00-4c27-b728-dbdeaab91ca5" Mar 19 09:27:03 crc kubenswrapper[4835]: E0319 09:27:03.234671 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29565206-wgvsr" podUID="c232f12f-f539-42a1-9eec-965d59127ca0" Mar 19 09:27:06 crc kubenswrapper[4835]: I0319 09:27:06.422573 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 09:27:06 crc kubenswrapper[4835]: I0319 09:27:06.422974 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Mar 19 09:27:11 crc kubenswrapper[4835]: I0319 09:27:11.437679 4835 patch_prober.go:28] interesting pod/controller-manager-5b45df9cb7-8wg6b container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.54:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 09:27:11 crc kubenswrapper[4835]: I0319 09:27:11.437761 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5b45df9cb7-8wg6b" podUID="1918ad9e-8b0c-47e8-afa9-2770aa7eabd1" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.54:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 09:27:11 crc kubenswrapper[4835]: I0319 09:27:11.451435 4835 patch_prober.go:28] interesting pod/route-controller-manager-55bb997f45-j4kms container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 09:27:11 crc kubenswrapper[4835]: I0319 09:27:11.451493 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-55bb997f45-j4kms" podUID="854bbde7-af9e-4222-8aef-19f4c684510a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 09:27:11 crc kubenswrapper[4835]: I0319 09:27:11.762871 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p9l9" Mar 19 09:27:12 crc kubenswrapper[4835]: I0319 09:27:12.014405 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 19 09:27:12 crc kubenswrapper[4835]: I0319 09:27:12.016149 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 09:27:12 crc kubenswrapper[4835]: I0319 09:27:12.018142 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 19 09:27:12 crc kubenswrapper[4835]: E0319 09:27:12.049758 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 19 09:27:12 crc kubenswrapper[4835]: E0319 09:27:12.049943 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-26jz8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-ds4dj_openshift-marketplace(f1a86135-77ed-4bdf-874d-0b141bff59bb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 09:27:12 crc kubenswrapper[4835]: E0319 09:27:12.051121 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-ds4dj" podUID="f1a86135-77ed-4bdf-874d-0b141bff59bb" Mar 19 09:27:12 crc 
kubenswrapper[4835]: I0319 09:27:12.126432 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/134bb67e-3c0f-40e2-b87e-32d6dc6fd642-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"134bb67e-3c0f-40e2-b87e-32d6dc6fd642\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 09:27:12 crc kubenswrapper[4835]: I0319 09:27:12.126781 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/134bb67e-3c0f-40e2-b87e-32d6dc6fd642-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"134bb67e-3c0f-40e2-b87e-32d6dc6fd642\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 09:27:12 crc kubenswrapper[4835]: E0319 09:27:12.190885 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 19 09:27:12 crc kubenswrapper[4835]: E0319 09:27:12.191088 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nlpmv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-tclt5_openshift-marketplace(bb7424ec-f638-4894-8c3d-466b81f98c8a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 09:27:12 crc kubenswrapper[4835]: E0319 09:27:12.192409 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-tclt5" podUID="bb7424ec-f638-4894-8c3d-466b81f98c8a" Mar 19 09:27:12 crc 
kubenswrapper[4835]: I0319 09:27:12.228286 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/134bb67e-3c0f-40e2-b87e-32d6dc6fd642-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"134bb67e-3c0f-40e2-b87e-32d6dc6fd642\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 09:27:12 crc kubenswrapper[4835]: I0319 09:27:12.228325 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/134bb67e-3c0f-40e2-b87e-32d6dc6fd642-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"134bb67e-3c0f-40e2-b87e-32d6dc6fd642\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 09:27:12 crc kubenswrapper[4835]: I0319 09:27:12.228452 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/134bb67e-3c0f-40e2-b87e-32d6dc6fd642-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"134bb67e-3c0f-40e2-b87e-32d6dc6fd642\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 09:27:12 crc kubenswrapper[4835]: I0319 09:27:12.259670 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/134bb67e-3c0f-40e2-b87e-32d6dc6fd642-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"134bb67e-3c0f-40e2-b87e-32d6dc6fd642\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 09:27:12 crc kubenswrapper[4835]: I0319 09:27:12.341078 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 09:27:12 crc kubenswrapper[4835]: E0319 09:27:12.703047 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 19 09:27:12 crc kubenswrapper[4835]: E0319 09:27:12.703349 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8t4xf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Co
ntainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-d6l6z_openshift-marketplace(db372943-349a-417e-af5e-588162580a5a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 09:27:12 crc kubenswrapper[4835]: E0319 09:27:12.707449 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-d6l6z" podUID="db372943-349a-417e-af5e-588162580a5a" Mar 19 09:27:14 crc kubenswrapper[4835]: E0319 09:27:14.388777 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-d6l6z" podUID="db372943-349a-417e-af5e-588162580a5a" Mar 19 09:27:14 crc kubenswrapper[4835]: E0319 09:27:14.388782 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-tclt5" podUID="bb7424ec-f638-4894-8c3d-466b81f98c8a" Mar 19 09:27:14 crc kubenswrapper[4835]: E0319 09:27:14.388787 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-ds4dj" podUID="f1a86135-77ed-4bdf-874d-0b141bff59bb" Mar 19 09:27:14 crc kubenswrapper[4835]: E0319 09:27:14.421683 4835 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 19 09:27:14 crc kubenswrapper[4835]: E0319 09:27:14.421859 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5gmkc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-65z7x_openshift-marketplace(35072c46-2bb8-4be2-9409-8780c5fa5717): ErrImagePull: rpc error: code = Canceled desc 
= copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 09:27:14 crc kubenswrapper[4835]: E0319 09:27:14.423221 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-65z7x" podUID="35072c46-2bb8-4be2-9409-8780c5fa5717" Mar 19 09:27:17 crc kubenswrapper[4835]: I0319 09:27:17.800823 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 19 09:27:17 crc kubenswrapper[4835]: I0319 09:27:17.802490 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 19 09:27:17 crc kubenswrapper[4835]: I0319 09:27:17.813920 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 19 09:27:17 crc kubenswrapper[4835]: I0319 09:27:17.913465 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0f3c01a2-88d9-432d-9f82-b1f7e2184ec2-var-lock\") pod \"installer-9-crc\" (UID: \"0f3c01a2-88d9-432d-9f82-b1f7e2184ec2\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 09:27:17 crc kubenswrapper[4835]: I0319 09:27:17.913570 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0f3c01a2-88d9-432d-9f82-b1f7e2184ec2-kubelet-dir\") pod \"installer-9-crc\" (UID: \"0f3c01a2-88d9-432d-9f82-b1f7e2184ec2\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 09:27:17 crc kubenswrapper[4835]: I0319 09:27:17.913606 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0f3c01a2-88d9-432d-9f82-b1f7e2184ec2-kube-api-access\") pod \"installer-9-crc\" (UID: \"0f3c01a2-88d9-432d-9f82-b1f7e2184ec2\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 09:27:18 crc kubenswrapper[4835]: I0319 09:27:18.014953 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0f3c01a2-88d9-432d-9f82-b1f7e2184ec2-kubelet-dir\") pod \"installer-9-crc\" (UID: \"0f3c01a2-88d9-432d-9f82-b1f7e2184ec2\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 09:27:18 crc kubenswrapper[4835]: I0319 09:27:18.015014 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0f3c01a2-88d9-432d-9f82-b1f7e2184ec2-kube-api-access\") pod \"installer-9-crc\" (UID: \"0f3c01a2-88d9-432d-9f82-b1f7e2184ec2\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 09:27:18 crc kubenswrapper[4835]: I0319 09:27:18.015033 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0f3c01a2-88d9-432d-9f82-b1f7e2184ec2-var-lock\") pod \"installer-9-crc\" (UID: \"0f3c01a2-88d9-432d-9f82-b1f7e2184ec2\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 09:27:18 crc kubenswrapper[4835]: I0319 09:27:18.015115 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0f3c01a2-88d9-432d-9f82-b1f7e2184ec2-var-lock\") pod \"installer-9-crc\" (UID: \"0f3c01a2-88d9-432d-9f82-b1f7e2184ec2\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 09:27:18 crc kubenswrapper[4835]: I0319 09:27:18.015153 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0f3c01a2-88d9-432d-9f82-b1f7e2184ec2-kubelet-dir\") pod \"installer-9-crc\" (UID: \"0f3c01a2-88d9-432d-9f82-b1f7e2184ec2\") " 
pod="openshift-kube-apiserver/installer-9-crc" Mar 19 09:27:18 crc kubenswrapper[4835]: I0319 09:27:18.040145 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0f3c01a2-88d9-432d-9f82-b1f7e2184ec2-kube-api-access\") pod \"installer-9-crc\" (UID: \"0f3c01a2-88d9-432d-9f82-b1f7e2184ec2\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 09:27:18 crc kubenswrapper[4835]: I0319 09:27:18.165003 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 19 09:27:18 crc kubenswrapper[4835]: E0319 09:27:18.815768 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-65z7x" podUID="35072c46-2bb8-4be2-9409-8780c5fa5717" Mar 19 09:27:18 crc kubenswrapper[4835]: E0319 09:27:18.926804 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 19 09:27:18 crc kubenswrapper[4835]: E0319 09:27:18.927136 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-78fwt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-mr45m_openshift-marketplace(de9aaba1-3f3b-4c79-ba35-14f28fbbf657): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 09:27:18 crc kubenswrapper[4835]: E0319 09:27:18.928417 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-mr45m" podUID="de9aaba1-3f3b-4c79-ba35-14f28fbbf657" Mar 19 09:27:18 crc 
kubenswrapper[4835]: E0319 09:27:18.936308 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 19 09:27:18 crc kubenswrapper[4835]: E0319 09:27:18.936459 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qt5lg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-hk7rv_openshift-marketplace(2d5a1f24-78ad-4c43-bf96-47e2e23c1996): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 09:27:18 crc kubenswrapper[4835]: E0319 09:27:18.937985 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-hk7rv" podUID="2d5a1f24-78ad-4c43-bf96-47e2e23c1996" Mar 19 09:27:18 crc kubenswrapper[4835]: E0319 09:27:18.950278 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 19 09:27:18 crc kubenswrapper[4835]: E0319 09:27:18.950496 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h5jmm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-65hbx_openshift-marketplace(43fa0a19-2075-4d87-80cb-5e6c24854e5f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 09:27:18 crc kubenswrapper[4835]: E0319 09:27:18.951603 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-65hbx" podUID="43fa0a19-2075-4d87-80cb-5e6c24854e5f" Mar 19 09:27:18 crc 
kubenswrapper[4835]: I0319 09:27:18.979799 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55bb997f45-j4kms" Mar 19 09:27:18 crc kubenswrapper[4835]: I0319 09:27:18.984115 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b45df9cb7-8wg6b" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.032135 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f8d8897d4-cgvrh"] Mar 19 09:27:19 crc kubenswrapper[4835]: E0319 09:27:19.032356 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="854bbde7-af9e-4222-8aef-19f4c684510a" containerName="route-controller-manager" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.032373 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="854bbde7-af9e-4222-8aef-19f4c684510a" containerName="route-controller-manager" Mar 19 09:27:19 crc kubenswrapper[4835]: E0319 09:27:19.032384 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1918ad9e-8b0c-47e8-afa9-2770aa7eabd1" containerName="controller-manager" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.032390 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1918ad9e-8b0c-47e8-afa9-2770aa7eabd1" containerName="controller-manager" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.032479 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="854bbde7-af9e-4222-8aef-19f4c684510a" containerName="route-controller-manager" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.032493 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="1918ad9e-8b0c-47e8-afa9-2770aa7eabd1" containerName="controller-manager" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.032857 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f8d8897d4-cgvrh" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.034138 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/854bbde7-af9e-4222-8aef-19f4c684510a-serving-cert\") pod \"854bbde7-af9e-4222-8aef-19f4c684510a\" (UID: \"854bbde7-af9e-4222-8aef-19f4c684510a\") " Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.034201 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1918ad9e-8b0c-47e8-afa9-2770aa7eabd1-client-ca\") pod \"1918ad9e-8b0c-47e8-afa9-2770aa7eabd1\" (UID: \"1918ad9e-8b0c-47e8-afa9-2770aa7eabd1\") " Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.034220 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1918ad9e-8b0c-47e8-afa9-2770aa7eabd1-serving-cert\") pod \"1918ad9e-8b0c-47e8-afa9-2770aa7eabd1\" (UID: \"1918ad9e-8b0c-47e8-afa9-2770aa7eabd1\") " Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.034270 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/854bbde7-af9e-4222-8aef-19f4c684510a-config\") pod \"854bbde7-af9e-4222-8aef-19f4c684510a\" (UID: \"854bbde7-af9e-4222-8aef-19f4c684510a\") " Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.034310 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/854bbde7-af9e-4222-8aef-19f4c684510a-client-ca\") pod \"854bbde7-af9e-4222-8aef-19f4c684510a\" (UID: \"854bbde7-af9e-4222-8aef-19f4c684510a\") " Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.034328 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1918ad9e-8b0c-47e8-afa9-2770aa7eabd1-config\") pod \"1918ad9e-8b0c-47e8-afa9-2770aa7eabd1\" (UID: \"1918ad9e-8b0c-47e8-afa9-2770aa7eabd1\") " Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.034349 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1918ad9e-8b0c-47e8-afa9-2770aa7eabd1-proxy-ca-bundles\") pod \"1918ad9e-8b0c-47e8-afa9-2770aa7eabd1\" (UID: \"1918ad9e-8b0c-47e8-afa9-2770aa7eabd1\") " Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.034371 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrrw6\" (UniqueName: \"kubernetes.io/projected/854bbde7-af9e-4222-8aef-19f4c684510a-kube-api-access-lrrw6\") pod \"854bbde7-af9e-4222-8aef-19f4c684510a\" (UID: \"854bbde7-af9e-4222-8aef-19f4c684510a\") " Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.034398 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bh58\" (UniqueName: \"kubernetes.io/projected/1918ad9e-8b0c-47e8-afa9-2770aa7eabd1-kube-api-access-6bh58\") pod \"1918ad9e-8b0c-47e8-afa9-2770aa7eabd1\" (UID: \"1918ad9e-8b0c-47e8-afa9-2770aa7eabd1\") " Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.034808 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1918ad9e-8b0c-47e8-afa9-2770aa7eabd1-client-ca" (OuterVolumeSpecName: "client-ca") pod "1918ad9e-8b0c-47e8-afa9-2770aa7eabd1" (UID: "1918ad9e-8b0c-47e8-afa9-2770aa7eabd1"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.038840 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f8d8897d4-cgvrh"] Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.039327 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1918ad9e-8b0c-47e8-afa9-2770aa7eabd1-config" (OuterVolumeSpecName: "config") pod "1918ad9e-8b0c-47e8-afa9-2770aa7eabd1" (UID: "1918ad9e-8b0c-47e8-afa9-2770aa7eabd1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.039791 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1918ad9e-8b0c-47e8-afa9-2770aa7eabd1-kube-api-access-6bh58" (OuterVolumeSpecName: "kube-api-access-6bh58") pod "1918ad9e-8b0c-47e8-afa9-2770aa7eabd1" (UID: "1918ad9e-8b0c-47e8-afa9-2770aa7eabd1"). InnerVolumeSpecName "kube-api-access-6bh58". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.040084 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/854bbde7-af9e-4222-8aef-19f4c684510a-client-ca" (OuterVolumeSpecName: "client-ca") pod "854bbde7-af9e-4222-8aef-19f4c684510a" (UID: "854bbde7-af9e-4222-8aef-19f4c684510a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.040101 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1918ad9e-8b0c-47e8-afa9-2770aa7eabd1-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1918ad9e-8b0c-47e8-afa9-2770aa7eabd1" (UID: "1918ad9e-8b0c-47e8-afa9-2770aa7eabd1"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.043724 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1918ad9e-8b0c-47e8-afa9-2770aa7eabd1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1918ad9e-8b0c-47e8-afa9-2770aa7eabd1" (UID: "1918ad9e-8b0c-47e8-afa9-2770aa7eabd1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.047179 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/854bbde7-af9e-4222-8aef-19f4c684510a-config" (OuterVolumeSpecName: "config") pod "854bbde7-af9e-4222-8aef-19f4c684510a" (UID: "854bbde7-af9e-4222-8aef-19f4c684510a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.047695 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/854bbde7-af9e-4222-8aef-19f4c684510a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "854bbde7-af9e-4222-8aef-19f4c684510a" (UID: "854bbde7-af9e-4222-8aef-19f4c684510a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.051250 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/854bbde7-af9e-4222-8aef-19f4c684510a-kube-api-access-lrrw6" (OuterVolumeSpecName: "kube-api-access-lrrw6") pod "854bbde7-af9e-4222-8aef-19f4c684510a" (UID: "854bbde7-af9e-4222-8aef-19f4c684510a"). InnerVolumeSpecName "kube-api-access-lrrw6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.135304 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ebcf04c6-80ef-4794-8426-429ebb561d17-client-ca\") pod \"route-controller-manager-f8d8897d4-cgvrh\" (UID: \"ebcf04c6-80ef-4794-8426-429ebb561d17\") " pod="openshift-route-controller-manager/route-controller-manager-f8d8897d4-cgvrh" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.135411 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsjsz\" (UniqueName: \"kubernetes.io/projected/ebcf04c6-80ef-4794-8426-429ebb561d17-kube-api-access-rsjsz\") pod \"route-controller-manager-f8d8897d4-cgvrh\" (UID: \"ebcf04c6-80ef-4794-8426-429ebb561d17\") " pod="openshift-route-controller-manager/route-controller-manager-f8d8897d4-cgvrh" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.135530 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebcf04c6-80ef-4794-8426-429ebb561d17-serving-cert\") pod \"route-controller-manager-f8d8897d4-cgvrh\" (UID: \"ebcf04c6-80ef-4794-8426-429ebb561d17\") " pod="openshift-route-controller-manager/route-controller-manager-f8d8897d4-cgvrh" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.135620 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebcf04c6-80ef-4794-8426-429ebb561d17-config\") pod \"route-controller-manager-f8d8897d4-cgvrh\" (UID: \"ebcf04c6-80ef-4794-8426-429ebb561d17\") " pod="openshift-route-controller-manager/route-controller-manager-f8d8897d4-cgvrh" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.135795 4835 reconciler_common.go:293] "Volume detached for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/854bbde7-af9e-4222-8aef-19f4c684510a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.135817 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1918ad9e-8b0c-47e8-afa9-2770aa7eabd1-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.135830 4835 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1918ad9e-8b0c-47e8-afa9-2770aa7eabd1-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.135845 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrrw6\" (UniqueName: \"kubernetes.io/projected/854bbde7-af9e-4222-8aef-19f4c684510a-kube-api-access-lrrw6\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.135856 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bh58\" (UniqueName: \"kubernetes.io/projected/1918ad9e-8b0c-47e8-afa9-2770aa7eabd1-kube-api-access-6bh58\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.135868 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/854bbde7-af9e-4222-8aef-19f4c684510a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.135880 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1918ad9e-8b0c-47e8-afa9-2770aa7eabd1-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.135890 4835 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1918ad9e-8b0c-47e8-afa9-2770aa7eabd1-client-ca\") on node \"crc\" DevicePath 
\"\"" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.135901 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/854bbde7-af9e-4222-8aef-19f4c684510a-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.236401 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsjsz\" (UniqueName: \"kubernetes.io/projected/ebcf04c6-80ef-4794-8426-429ebb561d17-kube-api-access-rsjsz\") pod \"route-controller-manager-f8d8897d4-cgvrh\" (UID: \"ebcf04c6-80ef-4794-8426-429ebb561d17\") " pod="openshift-route-controller-manager/route-controller-manager-f8d8897d4-cgvrh" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.236665 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebcf04c6-80ef-4794-8426-429ebb561d17-serving-cert\") pod \"route-controller-manager-f8d8897d4-cgvrh\" (UID: \"ebcf04c6-80ef-4794-8426-429ebb561d17\") " pod="openshift-route-controller-manager/route-controller-manager-f8d8897d4-cgvrh" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.236691 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebcf04c6-80ef-4794-8426-429ebb561d17-config\") pod \"route-controller-manager-f8d8897d4-cgvrh\" (UID: \"ebcf04c6-80ef-4794-8426-429ebb561d17\") " pod="openshift-route-controller-manager/route-controller-manager-f8d8897d4-cgvrh" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.236762 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ebcf04c6-80ef-4794-8426-429ebb561d17-client-ca\") pod \"route-controller-manager-f8d8897d4-cgvrh\" (UID: \"ebcf04c6-80ef-4794-8426-429ebb561d17\") " pod="openshift-route-controller-manager/route-controller-manager-f8d8897d4-cgvrh" Mar 19 
09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.238270 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ebcf04c6-80ef-4794-8426-429ebb561d17-client-ca\") pod \"route-controller-manager-f8d8897d4-cgvrh\" (UID: \"ebcf04c6-80ef-4794-8426-429ebb561d17\") " pod="openshift-route-controller-manager/route-controller-manager-f8d8897d4-cgvrh" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.238385 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebcf04c6-80ef-4794-8426-429ebb561d17-config\") pod \"route-controller-manager-f8d8897d4-cgvrh\" (UID: \"ebcf04c6-80ef-4794-8426-429ebb561d17\") " pod="openshift-route-controller-manager/route-controller-manager-f8d8897d4-cgvrh" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.241050 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebcf04c6-80ef-4794-8426-429ebb561d17-serving-cert\") pod \"route-controller-manager-f8d8897d4-cgvrh\" (UID: \"ebcf04c6-80ef-4794-8426-429ebb561d17\") " pod="openshift-route-controller-manager/route-controller-manager-f8d8897d4-cgvrh" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.253345 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsjsz\" (UniqueName: \"kubernetes.io/projected/ebcf04c6-80ef-4794-8426-429ebb561d17-kube-api-access-rsjsz\") pod \"route-controller-manager-f8d8897d4-cgvrh\" (UID: \"ebcf04c6-80ef-4794-8426-429ebb561d17\") " pod="openshift-route-controller-manager/route-controller-manager-f8d8897d4-cgvrh" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.281979 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 19 09:27:19 crc kubenswrapper[4835]: W0319 09:27:19.291493 4835 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-pod7bc9ad36_9d27_4004_8588_e1e0f61dd3bd.slice/crio-388becf052c201aa59dcd9732dbf0442d72291d6eb473327246cdd5b4eab58aa WatchSource:0}: Error finding container 388becf052c201aa59dcd9732dbf0442d72291d6eb473327246cdd5b4eab58aa: Status 404 returned error can't find the container with id 388becf052c201aa59dcd9732dbf0442d72291d6eb473327246cdd5b4eab58aa Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.346658 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b45df9cb7-8wg6b" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.347007 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b45df9cb7-8wg6b" event={"ID":"1918ad9e-8b0c-47e8-afa9-2770aa7eabd1","Type":"ContainerDied","Data":"d2b54dfd4fed313442447b7042924a48ccea48c993b24d1f358aec5805ccfd80"} Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.347071 4835 scope.go:117] "RemoveContainer" containerID="6a1cdb2db91d182c1de166db4dfcd7dd21b494e44530184a8725ae9ce8393e4d" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.359709 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgq9n" event={"ID":"ae7f5216-917f-4f78-925b-3d53dc945bdd","Type":"ContainerStarted","Data":"0389523a75ed446c16f9b3cd234caf928c4d2c1689a71dcd0e02adc13589029e"} Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.371691 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.386014 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55bb997f45-j4kms" event={"ID":"854bbde7-af9e-4222-8aef-19f4c684510a","Type":"ContainerDied","Data":"f4c6a3a4f33bd23bc745335320f01397fcf9d7082500ce5cabb27a3e17c315a0"} Mar 19 09:27:19 crc 
kubenswrapper[4835]: I0319 09:27:19.386253 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55bb997f45-j4kms" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.387539 4835 scope.go:117] "RemoveContainer" containerID="0701c836ac2f8ae5b7ede7658193e3172079a5ff756c5f55aa97b208e9a6b3f8" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.401985 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-bg8tj_49daa1c0-0b39-4ebf-9e81-38885e24bc4d/cluster-samples-operator/0.log" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.402048 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bg8tj" event={"ID":"49daa1c0-0b39-4ebf-9e81-38885e24bc4d","Type":"ContainerStarted","Data":"617f447b1efce6a7d10a6b35cdcc418d3cdf35144e3e15ca531b85612d3e08c6"} Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.404298 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5b45df9cb7-8wg6b"] Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.412016 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"7bc9ad36-9d27-4004-8588-e1e0f61dd3bd","Type":"ContainerStarted","Data":"388becf052c201aa59dcd9732dbf0442d72291d6eb473327246cdd5b4eab58aa"} Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.413320 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.420064 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5b45df9cb7-8wg6b"] Mar 19 09:27:19 crc kubenswrapper[4835]: E0319 09:27:19.434017 4835 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-hk7rv" podUID="2d5a1f24-78ad-4c43-bf96-47e2e23c1996" Mar 19 09:27:19 crc kubenswrapper[4835]: E0319 09:27:19.434108 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-mr45m" podUID="de9aaba1-3f3b-4c79-ba35-14f28fbbf657" Mar 19 09:27:19 crc kubenswrapper[4835]: E0319 09:27:19.440600 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-65hbx" podUID="43fa0a19-2075-4d87-80cb-5e6c24854e5f" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.441706 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.443240 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f8d8897d4-cgvrh" Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.453998 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55bb997f45-j4kms"] Mar 19 09:27:19 crc kubenswrapper[4835]: W0319 09:27:19.455764 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0f3c01a2_88d9_432d_9f82_b1f7e2184ec2.slice/crio-cee1adb26d80d8dcd28913d45b924770ffc093c8c04adb0d6cf13cc04b18f2c8 WatchSource:0}: Error finding container cee1adb26d80d8dcd28913d45b924770ffc093c8c04adb0d6cf13cc04b18f2c8: Status 404 returned error can't find the container with id cee1adb26d80d8dcd28913d45b924770ffc093c8c04adb0d6cf13cc04b18f2c8 Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.460623 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55bb997f45-j4kms"] Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.872375 4835 csr.go:261] certificate signing request csr-99ddc is approved, waiting to be issued Mar 19 09:27:19 crc kubenswrapper[4835]: I0319 09:27:19.877180 4835 csr.go:257] certificate signing request csr-99ddc is issued Mar 19 09:27:20 crc kubenswrapper[4835]: I0319 09:27:20.016947 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f8d8897d4-cgvrh"] Mar 19 09:27:20 crc kubenswrapper[4835]: W0319 09:27:20.040935 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebcf04c6_80ef_4794_8426_429ebb561d17.slice/crio-f02e616d22fd4ebff2d919c1dd2ab5277369594cdd9438a4218ed69a6f27ec92 WatchSource:0}: Error finding container f02e616d22fd4ebff2d919c1dd2ab5277369594cdd9438a4218ed69a6f27ec92: Status 404 returned error can't find the container with id 
f02e616d22fd4ebff2d919c1dd2ab5277369594cdd9438a4218ed69a6f27ec92 Mar 19 09:27:20 crc kubenswrapper[4835]: I0319 09:27:20.409937 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1918ad9e-8b0c-47e8-afa9-2770aa7eabd1" path="/var/lib/kubelet/pods/1918ad9e-8b0c-47e8-afa9-2770aa7eabd1/volumes" Mar 19 09:27:20 crc kubenswrapper[4835]: I0319 09:27:20.411187 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="854bbde7-af9e-4222-8aef-19f4c684510a" path="/var/lib/kubelet/pods/854bbde7-af9e-4222-8aef-19f4c684510a/volumes" Mar 19 09:27:20 crc kubenswrapper[4835]: I0319 09:27:20.419181 4835 generic.go:334] "Generic (PLEG): container finished" podID="930d85cd-ba00-4c27-b728-dbdeaab91ca5" containerID="e43b96a28373ef47a4d5307c6c19b78b1956bf0cdd3fb28db4161a622dd8fe28" exitCode=0 Mar 19 09:27:20 crc kubenswrapper[4835]: I0319 09:27:20.419299 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565204-mwv2v" event={"ID":"930d85cd-ba00-4c27-b728-dbdeaab91ca5","Type":"ContainerDied","Data":"e43b96a28373ef47a4d5307c6c19b78b1956bf0cdd3fb28db4161a622dd8fe28"} Mar 19 09:27:20 crc kubenswrapper[4835]: I0319 09:27:20.420482 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f8d8897d4-cgvrh" event={"ID":"ebcf04c6-80ef-4794-8426-429ebb561d17","Type":"ContainerStarted","Data":"f8f0ce72a87056ec0d45655f687c0c0f8fecd0440ef332fc56a3881ffff3767c"} Mar 19 09:27:20 crc kubenswrapper[4835]: I0319 09:27:20.420533 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f8d8897d4-cgvrh" event={"ID":"ebcf04c6-80ef-4794-8426-429ebb561d17","Type":"ContainerStarted","Data":"f02e616d22fd4ebff2d919c1dd2ab5277369594cdd9438a4218ed69a6f27ec92"} Mar 19 09:27:20 crc kubenswrapper[4835]: I0319 09:27:20.420753 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-f8d8897d4-cgvrh" Mar 19 09:27:20 crc kubenswrapper[4835]: I0319 09:27:20.422829 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"0f3c01a2-88d9-432d-9f82-b1f7e2184ec2","Type":"ContainerStarted","Data":"736547d7309147a69b7b4a2bed1bb13887d098f1bbd7a77222bb67dbf396ec08"} Mar 19 09:27:20 crc kubenswrapper[4835]: I0319 09:27:20.422876 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"0f3c01a2-88d9-432d-9f82-b1f7e2184ec2","Type":"ContainerStarted","Data":"cee1adb26d80d8dcd28913d45b924770ffc093c8c04adb0d6cf13cc04b18f2c8"} Mar 19 09:27:20 crc kubenswrapper[4835]: I0319 09:27:20.425586 4835 generic.go:334] "Generic (PLEG): container finished" podID="ae7f5216-917f-4f78-925b-3d53dc945bdd" containerID="0389523a75ed446c16f9b3cd234caf928c4d2c1689a71dcd0e02adc13589029e" exitCode=0 Mar 19 09:27:20 crc kubenswrapper[4835]: I0319 09:27:20.425637 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgq9n" event={"ID":"ae7f5216-917f-4f78-925b-3d53dc945bdd","Type":"ContainerDied","Data":"0389523a75ed446c16f9b3cd234caf928c4d2c1689a71dcd0e02adc13589029e"} Mar 19 09:27:20 crc kubenswrapper[4835]: I0319 09:27:20.429444 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9f043600-c25a-4e48-9385-0e031efeaa82","Type":"ContainerStarted","Data":"45bfcabea5298c75bb043dd29db1b2e404b2768d76485516f862b1953a24c8b0"} Mar 19 09:27:20 crc kubenswrapper[4835]: I0319 09:27:20.430905 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9f043600-c25a-4e48-9385-0e031efeaa82","Type":"ContainerStarted","Data":"4292c2d254ae8a0f654c7269bc4e7375078e8b6ea8383c865660d0cc52ad8c61"} Mar 19 09:27:20 crc kubenswrapper[4835]: I0319 
09:27:20.434319 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"134bb67e-3c0f-40e2-b87e-32d6dc6fd642","Type":"ContainerStarted","Data":"88a439eed5dbecd8aba679d256e6ac6d4660c3032dfba95331f837b4bdbc7f93"} Mar 19 09:27:20 crc kubenswrapper[4835]: I0319 09:27:20.434371 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"134bb67e-3c0f-40e2-b87e-32d6dc6fd642","Type":"ContainerStarted","Data":"3d009a63c3bc48755843eefc01ff3143177ed2708e3a46c633de68bf03b03aba"} Mar 19 09:27:20 crc kubenswrapper[4835]: I0319 09:27:20.436553 4835 generic.go:334] "Generic (PLEG): container finished" podID="7bc9ad36-9d27-4004-8588-e1e0f61dd3bd" containerID="6ac385af619ba57d8613b9fc323ab9e46ad21e848a4ed7445a3b6d6ab1b0b3ca" exitCode=0 Mar 19 09:27:20 crc kubenswrapper[4835]: I0319 09:27:20.442014 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"7bc9ad36-9d27-4004-8588-e1e0f61dd3bd","Type":"ContainerDied","Data":"6ac385af619ba57d8613b9fc323ab9e46ad21e848a4ed7445a3b6d6ab1b0b3ca"} Mar 19 09:27:20 crc kubenswrapper[4835]: I0319 09:27:20.447583 4835 generic.go:334] "Generic (PLEG): container finished" podID="c232f12f-f539-42a1-9eec-965d59127ca0" containerID="78cef7c0223fab439cfa8e464d11640af709c74e290ea254dac785acc760ef48" exitCode=0 Mar 19 09:27:20 crc kubenswrapper[4835]: I0319 09:27:20.447650 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565206-wgvsr" event={"ID":"c232f12f-f539-42a1-9eec-965d59127ca0","Type":"ContainerDied","Data":"78cef7c0223fab439cfa8e464d11640af709c74e290ea254dac785acc760ef48"} Mar 19 09:27:20 crc kubenswrapper[4835]: I0319 09:27:20.456921 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=39.45689945 
podStartE2EDuration="39.45689945s" podCreationTimestamp="2026-03-19 09:26:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:27:20.454108243 +0000 UTC m=+295.302706890" watchObservedRunningTime="2026-03-19 09:27:20.45689945 +0000 UTC m=+295.305498047" Mar 19 09:27:20 crc kubenswrapper[4835]: I0319 09:27:20.493977 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.493959353 podStartE2EDuration="3.493959353s" podCreationTimestamp="2026-03-19 09:27:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:27:20.491923496 +0000 UTC m=+295.340522113" watchObservedRunningTime="2026-03-19 09:27:20.493959353 +0000 UTC m=+295.342557950" Mar 19 09:27:20 crc kubenswrapper[4835]: I0319 09:27:20.536449 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=9.536424684 podStartE2EDuration="9.536424684s" podCreationTimestamp="2026-03-19 09:27:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:27:20.531923779 +0000 UTC m=+295.380522366" watchObservedRunningTime="2026-03-19 09:27:20.536424684 +0000 UTC m=+295.385023261" Mar 19 09:27:20 crc kubenswrapper[4835]: I0319 09:27:20.563065 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-f8d8897d4-cgvrh" podStartSLOduration=3.563043914 podStartE2EDuration="3.563043914s" podCreationTimestamp="2026-03-19 09:27:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:27:20.559071754 +0000 
UTC m=+295.407670421" watchObservedRunningTime="2026-03-19 09:27:20.563043914 +0000 UTC m=+295.411642501" Mar 19 09:27:20 crc kubenswrapper[4835]: I0319 09:27:20.879006 4835 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-19 01:29:54.216461619 +0000 UTC Mar 19 09:27:20 crc kubenswrapper[4835]: I0319 09:27:20.879854 4835 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7336h2m33.336612998s for next certificate rotation Mar 19 09:27:20 crc kubenswrapper[4835]: I0319 09:27:20.950854 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-f8d8897d4-cgvrh" Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.140966 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5bd4f58d68-spxct"] Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.141593 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5bd4f58d68-spxct" Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.144255 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.144640 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.144856 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.145130 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.145874 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.147899 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.152602 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5bd4f58d68-spxct"] Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.154313 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.189763 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/694d23f4-8a5b-4ca2-8ad3-212dd8b53933-config\") pod \"controller-manager-5bd4f58d68-spxct\" (UID: \"694d23f4-8a5b-4ca2-8ad3-212dd8b53933\") " 
pod="openshift-controller-manager/controller-manager-5bd4f58d68-spxct" Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.189801 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/694d23f4-8a5b-4ca2-8ad3-212dd8b53933-client-ca\") pod \"controller-manager-5bd4f58d68-spxct\" (UID: \"694d23f4-8a5b-4ca2-8ad3-212dd8b53933\") " pod="openshift-controller-manager/controller-manager-5bd4f58d68-spxct" Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.189834 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5khw\" (UniqueName: \"kubernetes.io/projected/694d23f4-8a5b-4ca2-8ad3-212dd8b53933-kube-api-access-k5khw\") pod \"controller-manager-5bd4f58d68-spxct\" (UID: \"694d23f4-8a5b-4ca2-8ad3-212dd8b53933\") " pod="openshift-controller-manager/controller-manager-5bd4f58d68-spxct" Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.189886 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/694d23f4-8a5b-4ca2-8ad3-212dd8b53933-proxy-ca-bundles\") pod \"controller-manager-5bd4f58d68-spxct\" (UID: \"694d23f4-8a5b-4ca2-8ad3-212dd8b53933\") " pod="openshift-controller-manager/controller-manager-5bd4f58d68-spxct" Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.189913 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/694d23f4-8a5b-4ca2-8ad3-212dd8b53933-serving-cert\") pod \"controller-manager-5bd4f58d68-spxct\" (UID: \"694d23f4-8a5b-4ca2-8ad3-212dd8b53933\") " pod="openshift-controller-manager/controller-manager-5bd4f58d68-spxct" Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.290564 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/694d23f4-8a5b-4ca2-8ad3-212dd8b53933-proxy-ca-bundles\") pod \"controller-manager-5bd4f58d68-spxct\" (UID: \"694d23f4-8a5b-4ca2-8ad3-212dd8b53933\") " pod="openshift-controller-manager/controller-manager-5bd4f58d68-spxct" Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.290620 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/694d23f4-8a5b-4ca2-8ad3-212dd8b53933-serving-cert\") pod \"controller-manager-5bd4f58d68-spxct\" (UID: \"694d23f4-8a5b-4ca2-8ad3-212dd8b53933\") " pod="openshift-controller-manager/controller-manager-5bd4f58d68-spxct" Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.290667 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/694d23f4-8a5b-4ca2-8ad3-212dd8b53933-config\") pod \"controller-manager-5bd4f58d68-spxct\" (UID: \"694d23f4-8a5b-4ca2-8ad3-212dd8b53933\") " pod="openshift-controller-manager/controller-manager-5bd4f58d68-spxct" Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.290690 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/694d23f4-8a5b-4ca2-8ad3-212dd8b53933-client-ca\") pod \"controller-manager-5bd4f58d68-spxct\" (UID: \"694d23f4-8a5b-4ca2-8ad3-212dd8b53933\") " pod="openshift-controller-manager/controller-manager-5bd4f58d68-spxct" Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.290729 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5khw\" (UniqueName: \"kubernetes.io/projected/694d23f4-8a5b-4ca2-8ad3-212dd8b53933-kube-api-access-k5khw\") pod \"controller-manager-5bd4f58d68-spxct\" (UID: \"694d23f4-8a5b-4ca2-8ad3-212dd8b53933\") " pod="openshift-controller-manager/controller-manager-5bd4f58d68-spxct" Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 
09:27:21.291983 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/694d23f4-8a5b-4ca2-8ad3-212dd8b53933-proxy-ca-bundles\") pod \"controller-manager-5bd4f58d68-spxct\" (UID: \"694d23f4-8a5b-4ca2-8ad3-212dd8b53933\") " pod="openshift-controller-manager/controller-manager-5bd4f58d68-spxct" Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.292119 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/694d23f4-8a5b-4ca2-8ad3-212dd8b53933-client-ca\") pod \"controller-manager-5bd4f58d68-spxct\" (UID: \"694d23f4-8a5b-4ca2-8ad3-212dd8b53933\") " pod="openshift-controller-manager/controller-manager-5bd4f58d68-spxct" Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.292466 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/694d23f4-8a5b-4ca2-8ad3-212dd8b53933-config\") pod \"controller-manager-5bd4f58d68-spxct\" (UID: \"694d23f4-8a5b-4ca2-8ad3-212dd8b53933\") " pod="openshift-controller-manager/controller-manager-5bd4f58d68-spxct" Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.298342 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/694d23f4-8a5b-4ca2-8ad3-212dd8b53933-serving-cert\") pod \"controller-manager-5bd4f58d68-spxct\" (UID: \"694d23f4-8a5b-4ca2-8ad3-212dd8b53933\") " pod="openshift-controller-manager/controller-manager-5bd4f58d68-spxct" Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.307978 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5khw\" (UniqueName: \"kubernetes.io/projected/694d23f4-8a5b-4ca2-8ad3-212dd8b53933-kube-api-access-k5khw\") pod \"controller-manager-5bd4f58d68-spxct\" (UID: \"694d23f4-8a5b-4ca2-8ad3-212dd8b53933\") " 
pod="openshift-controller-manager/controller-manager-5bd4f58d68-spxct" Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.456217 4835 generic.go:334] "Generic (PLEG): container finished" podID="9f043600-c25a-4e48-9385-0e031efeaa82" containerID="45bfcabea5298c75bb043dd29db1b2e404b2768d76485516f862b1953a24c8b0" exitCode=0 Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.456274 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9f043600-c25a-4e48-9385-0e031efeaa82","Type":"ContainerDied","Data":"45bfcabea5298c75bb043dd29db1b2e404b2768d76485516f862b1953a24c8b0"} Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.458948 4835 generic.go:334] "Generic (PLEG): container finished" podID="134bb67e-3c0f-40e2-b87e-32d6dc6fd642" containerID="88a439eed5dbecd8aba679d256e6ac6d4660c3032dfba95331f837b4bdbc7f93" exitCode=0 Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.459023 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"134bb67e-3c0f-40e2-b87e-32d6dc6fd642","Type":"ContainerDied","Data":"88a439eed5dbecd8aba679d256e6ac6d4660c3032dfba95331f837b4bdbc7f93"} Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.462656 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgq9n" event={"ID":"ae7f5216-917f-4f78-925b-3d53dc945bdd","Type":"ContainerStarted","Data":"5d9b5f5c796c6c757e20615c61bbc48fc9a2dcf67e4b1aca01e090a734bf855d"} Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.463704 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5bd4f58d68-spxct" Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.517784 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zgq9n" podStartSLOduration=3.9514242790000003 podStartE2EDuration="45.517767083s" podCreationTimestamp="2026-03-19 09:26:36 +0000 UTC" firstStartedPulling="2026-03-19 09:26:39.28021975 +0000 UTC m=+254.128818337" lastFinishedPulling="2026-03-19 09:27:20.846562554 +0000 UTC m=+295.695161141" observedRunningTime="2026-03-19 09:27:21.516521929 +0000 UTC m=+296.365120526" watchObservedRunningTime="2026-03-19 09:27:21.517767083 +0000 UTC m=+296.366365670" Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.708881 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.779646 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565206-wgvsr" Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.782032 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565204-mwv2v" Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.815839 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5bd4f58d68-spxct"] Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.880178 4835 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-23 06:39:42.125962614 +0000 UTC Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.880421 4835 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 5973h12m20.245544229s for next certificate rotation Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.900876 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7bc9ad36-9d27-4004-8588-e1e0f61dd3bd-kubelet-dir\") pod \"7bc9ad36-9d27-4004-8588-e1e0f61dd3bd\" (UID: \"7bc9ad36-9d27-4004-8588-e1e0f61dd3bd\") " Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.900952 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cpm5\" (UniqueName: \"kubernetes.io/projected/930d85cd-ba00-4c27-b728-dbdeaab91ca5-kube-api-access-6cpm5\") pod \"930d85cd-ba00-4c27-b728-dbdeaab91ca5\" (UID: \"930d85cd-ba00-4c27-b728-dbdeaab91ca5\") " Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.900959 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7bc9ad36-9d27-4004-8588-e1e0f61dd3bd-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7bc9ad36-9d27-4004-8588-e1e0f61dd3bd" (UID: "7bc9ad36-9d27-4004-8588-e1e0f61dd3bd"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.900985 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7bc9ad36-9d27-4004-8588-e1e0f61dd3bd-kube-api-access\") pod \"7bc9ad36-9d27-4004-8588-e1e0f61dd3bd\" (UID: \"7bc9ad36-9d27-4004-8588-e1e0f61dd3bd\") " Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.901055 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql2cr\" (UniqueName: \"kubernetes.io/projected/c232f12f-f539-42a1-9eec-965d59127ca0-kube-api-access-ql2cr\") pod \"c232f12f-f539-42a1-9eec-965d59127ca0\" (UID: \"c232f12f-f539-42a1-9eec-965d59127ca0\") " Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.901495 4835 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7bc9ad36-9d27-4004-8588-e1e0f61dd3bd-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.906423 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/930d85cd-ba00-4c27-b728-dbdeaab91ca5-kube-api-access-6cpm5" (OuterVolumeSpecName: "kube-api-access-6cpm5") pod "930d85cd-ba00-4c27-b728-dbdeaab91ca5" (UID: "930d85cd-ba00-4c27-b728-dbdeaab91ca5"). InnerVolumeSpecName "kube-api-access-6cpm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.909835 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bc9ad36-9d27-4004-8588-e1e0f61dd3bd-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7bc9ad36-9d27-4004-8588-e1e0f61dd3bd" (UID: "7bc9ad36-9d27-4004-8588-e1e0f61dd3bd"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:27:21 crc kubenswrapper[4835]: I0319 09:27:21.911304 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c232f12f-f539-42a1-9eec-965d59127ca0-kube-api-access-ql2cr" (OuterVolumeSpecName: "kube-api-access-ql2cr") pod "c232f12f-f539-42a1-9eec-965d59127ca0" (UID: "c232f12f-f539-42a1-9eec-965d59127ca0"). InnerVolumeSpecName "kube-api-access-ql2cr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:27:22 crc kubenswrapper[4835]: I0319 09:27:22.002081 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7bc9ad36-9d27-4004-8588-e1e0f61dd3bd-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:22 crc kubenswrapper[4835]: I0319 09:27:22.002384 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql2cr\" (UniqueName: \"kubernetes.io/projected/c232f12f-f539-42a1-9eec-965d59127ca0-kube-api-access-ql2cr\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:22 crc kubenswrapper[4835]: I0319 09:27:22.002398 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cpm5\" (UniqueName: \"kubernetes.io/projected/930d85cd-ba00-4c27-b728-dbdeaab91ca5-kube-api-access-6cpm5\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:22 crc kubenswrapper[4835]: I0319 09:27:22.466724 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"7bc9ad36-9d27-4004-8588-e1e0f61dd3bd","Type":"ContainerDied","Data":"388becf052c201aa59dcd9732dbf0442d72291d6eb473327246cdd5b4eab58aa"} Mar 19 09:27:22 crc kubenswrapper[4835]: I0319 09:27:22.466879 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="388becf052c201aa59dcd9732dbf0442d72291d6eb473327246cdd5b4eab58aa" Mar 19 09:27:22 crc kubenswrapper[4835]: I0319 09:27:22.466967 4835 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 09:27:22 crc kubenswrapper[4835]: I0319 09:27:22.477421 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565206-wgvsr" event={"ID":"c232f12f-f539-42a1-9eec-965d59127ca0","Type":"ContainerDied","Data":"bed873ab9dff6c3747ab5baa078ee2bd060ed41ec9634631a0fe89e5794d3672"} Mar 19 09:27:22 crc kubenswrapper[4835]: I0319 09:27:22.477868 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bed873ab9dff6c3747ab5baa078ee2bd060ed41ec9634631a0fe89e5794d3672" Mar 19 09:27:22 crc kubenswrapper[4835]: I0319 09:27:22.477429 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565206-wgvsr" Mar 19 09:27:22 crc kubenswrapper[4835]: I0319 09:27:22.480076 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565204-mwv2v" event={"ID":"930d85cd-ba00-4c27-b728-dbdeaab91ca5","Type":"ContainerDied","Data":"a56a19988313ca9fac28e6354facd17b84fe2af0d4c29afa2bc9c9830761332c"} Mar 19 09:27:22 crc kubenswrapper[4835]: I0319 09:27:22.480097 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a56a19988313ca9fac28e6354facd17b84fe2af0d4c29afa2bc9c9830761332c" Mar 19 09:27:22 crc kubenswrapper[4835]: I0319 09:27:22.480136 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565204-mwv2v" Mar 19 09:27:22 crc kubenswrapper[4835]: I0319 09:27:22.484307 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bd4f58d68-spxct" event={"ID":"694d23f4-8a5b-4ca2-8ad3-212dd8b53933","Type":"ContainerStarted","Data":"8bc299fe9a996322bfabb0614e3ec2310c86e027196cd9ac64c562d37fd01a41"} Mar 19 09:27:22 crc kubenswrapper[4835]: I0319 09:27:22.484331 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bd4f58d68-spxct" event={"ID":"694d23f4-8a5b-4ca2-8ad3-212dd8b53933","Type":"ContainerStarted","Data":"f520a60ab8a14f1f776ea9f6716d260369a75382e95c3fef0d7ddd952ced0a92"} Mar 19 09:27:22 crc kubenswrapper[4835]: I0319 09:27:22.484346 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5bd4f58d68-spxct" Mar 19 09:27:22 crc kubenswrapper[4835]: I0319 09:27:22.490906 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5bd4f58d68-spxct" Mar 19 09:27:22 crc kubenswrapper[4835]: I0319 09:27:22.539500 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5bd4f58d68-spxct" podStartSLOduration=5.539478976 podStartE2EDuration="5.539478976s" podCreationTimestamp="2026-03-19 09:27:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:27:22.517234797 +0000 UTC m=+297.365833384" watchObservedRunningTime="2026-03-19 09:27:22.539478976 +0000 UTC m=+297.388077573" Mar 19 09:27:22 crc kubenswrapper[4835]: I0319 09:27:22.788871 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 09:27:22 crc kubenswrapper[4835]: I0319 09:27:22.793825 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 09:27:22 crc kubenswrapper[4835]: I0319 09:27:22.912564 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/134bb67e-3c0f-40e2-b87e-32d6dc6fd642-kubelet-dir\") pod \"134bb67e-3c0f-40e2-b87e-32d6dc6fd642\" (UID: \"134bb67e-3c0f-40e2-b87e-32d6dc6fd642\") " Mar 19 09:27:22 crc kubenswrapper[4835]: I0319 09:27:22.912683 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/134bb67e-3c0f-40e2-b87e-32d6dc6fd642-kube-api-access\") pod \"134bb67e-3c0f-40e2-b87e-32d6dc6fd642\" (UID: \"134bb67e-3c0f-40e2-b87e-32d6dc6fd642\") " Mar 19 09:27:22 crc kubenswrapper[4835]: I0319 09:27:22.912705 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/134bb67e-3c0f-40e2-b87e-32d6dc6fd642-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "134bb67e-3c0f-40e2-b87e-32d6dc6fd642" (UID: "134bb67e-3c0f-40e2-b87e-32d6dc6fd642"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:27:22 crc kubenswrapper[4835]: I0319 09:27:22.912789 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f043600-c25a-4e48-9385-0e031efeaa82-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9f043600-c25a-4e48-9385-0e031efeaa82" (UID: "9f043600-c25a-4e48-9385-0e031efeaa82"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:27:22 crc kubenswrapper[4835]: I0319 09:27:22.912727 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9f043600-c25a-4e48-9385-0e031efeaa82-kubelet-dir\") pod \"9f043600-c25a-4e48-9385-0e031efeaa82\" (UID: \"9f043600-c25a-4e48-9385-0e031efeaa82\") " Mar 19 09:27:22 crc kubenswrapper[4835]: I0319 09:27:22.912885 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9f043600-c25a-4e48-9385-0e031efeaa82-kube-api-access\") pod \"9f043600-c25a-4e48-9385-0e031efeaa82\" (UID: \"9f043600-c25a-4e48-9385-0e031efeaa82\") " Mar 19 09:27:22 crc kubenswrapper[4835]: I0319 09:27:22.913105 4835 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/134bb67e-3c0f-40e2-b87e-32d6dc6fd642-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:22 crc kubenswrapper[4835]: I0319 09:27:22.913120 4835 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9f043600-c25a-4e48-9385-0e031efeaa82-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:22 crc kubenswrapper[4835]: I0319 09:27:22.923903 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/134bb67e-3c0f-40e2-b87e-32d6dc6fd642-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "134bb67e-3c0f-40e2-b87e-32d6dc6fd642" (UID: "134bb67e-3c0f-40e2-b87e-32d6dc6fd642"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:27:22 crc kubenswrapper[4835]: I0319 09:27:22.923950 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f043600-c25a-4e48-9385-0e031efeaa82-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9f043600-c25a-4e48-9385-0e031efeaa82" (UID: "9f043600-c25a-4e48-9385-0e031efeaa82"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:27:23 crc kubenswrapper[4835]: I0319 09:27:23.014052 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9f043600-c25a-4e48-9385-0e031efeaa82-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:23 crc kubenswrapper[4835]: I0319 09:27:23.014090 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/134bb67e-3c0f-40e2-b87e-32d6dc6fd642-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:23 crc kubenswrapper[4835]: I0319 09:27:23.494035 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 09:27:23 crc kubenswrapper[4835]: I0319 09:27:23.494032 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9f043600-c25a-4e48-9385-0e031efeaa82","Type":"ContainerDied","Data":"4292c2d254ae8a0f654c7269bc4e7375078e8b6ea8383c865660d0cc52ad8c61"} Mar 19 09:27:23 crc kubenswrapper[4835]: I0319 09:27:23.494621 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4292c2d254ae8a0f654c7269bc4e7375078e8b6ea8383c865660d0cc52ad8c61" Mar 19 09:27:23 crc kubenswrapper[4835]: I0319 09:27:23.496072 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"134bb67e-3c0f-40e2-b87e-32d6dc6fd642","Type":"ContainerDied","Data":"3d009a63c3bc48755843eefc01ff3143177ed2708e3a46c633de68bf03b03aba"} Mar 19 09:27:23 crc kubenswrapper[4835]: I0319 09:27:23.496106 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 09:27:23 crc kubenswrapper[4835]: I0319 09:27:23.496132 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d009a63c3bc48755843eefc01ff3143177ed2708e3a46c633de68bf03b03aba" Mar 19 09:27:27 crc kubenswrapper[4835]: I0319 09:27:27.047564 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zgq9n" Mar 19 09:27:27 crc kubenswrapper[4835]: I0319 09:27:27.048189 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zgq9n" Mar 19 09:27:27 crc kubenswrapper[4835]: I0319 09:27:27.239388 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zgq9n" Mar 19 09:27:27 crc kubenswrapper[4835]: I0319 09:27:27.519366 4835 generic.go:334] "Generic (PLEG): container finished" podID="f1a86135-77ed-4bdf-874d-0b141bff59bb" containerID="31603231dbf4226cced0fd1deed512b572edab30fdf4beb649d4cf055e1a0790" exitCode=0 Mar 19 09:27:27 crc kubenswrapper[4835]: I0319 09:27:27.519406 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ds4dj" event={"ID":"f1a86135-77ed-4bdf-874d-0b141bff59bb","Type":"ContainerDied","Data":"31603231dbf4226cced0fd1deed512b572edab30fdf4beb649d4cf055e1a0790"} Mar 19 09:27:27 crc kubenswrapper[4835]: I0319 09:27:27.522220 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d6l6z" event={"ID":"db372943-349a-417e-af5e-588162580a5a","Type":"ContainerStarted","Data":"45d37c159a8454244d939d7d493d482b5f1dfabd1c505f09b9dcfff458ba56bb"} Mar 19 09:27:27 crc kubenswrapper[4835]: I0319 09:27:27.569930 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zgq9n" Mar 19 09:27:28 crc kubenswrapper[4835]: 
I0319 09:27:28.529954 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ds4dj" event={"ID":"f1a86135-77ed-4bdf-874d-0b141bff59bb","Type":"ContainerStarted","Data":"bceb0df4f0ff5c3cd394a6fb9021c0878cf95f910d355655976181076d286d94"} Mar 19 09:27:28 crc kubenswrapper[4835]: I0319 09:27:28.532254 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d6l6z" event={"ID":"db372943-349a-417e-af5e-588162580a5a","Type":"ContainerDied","Data":"45d37c159a8454244d939d7d493d482b5f1dfabd1c505f09b9dcfff458ba56bb"} Mar 19 09:27:28 crc kubenswrapper[4835]: I0319 09:27:28.532240 4835 generic.go:334] "Generic (PLEG): container finished" podID="db372943-349a-417e-af5e-588162580a5a" containerID="45d37c159a8454244d939d7d493d482b5f1dfabd1c505f09b9dcfff458ba56bb" exitCode=0 Mar 19 09:27:28 crc kubenswrapper[4835]: I0319 09:27:28.549540 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ds4dj" podStartSLOduration=3.857919196 podStartE2EDuration="52.549519388s" podCreationTimestamp="2026-03-19 09:26:36 +0000 UTC" firstStartedPulling="2026-03-19 09:26:39.279375816 +0000 UTC m=+254.127974403" lastFinishedPulling="2026-03-19 09:27:27.970976008 +0000 UTC m=+302.819574595" observedRunningTime="2026-03-19 09:27:28.548167561 +0000 UTC m=+303.396766148" watchObservedRunningTime="2026-03-19 09:27:28.549519388 +0000 UTC m=+303.398117985" Mar 19 09:27:30 crc kubenswrapper[4835]: I0319 09:27:30.546084 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d6l6z" event={"ID":"db372943-349a-417e-af5e-588162580a5a","Type":"ContainerStarted","Data":"85c1491e64077076bf435c1d7650956199cce6e4f844ab02701fd6e0494902a9"} Mar 19 09:27:30 crc kubenswrapper[4835]: I0319 09:27:30.565868 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-d6l6z" podStartSLOduration=3.28674648 podStartE2EDuration="53.565843161s" podCreationTimestamp="2026-03-19 09:26:37 +0000 UTC" firstStartedPulling="2026-03-19 09:26:39.265347995 +0000 UTC m=+254.113946582" lastFinishedPulling="2026-03-19 09:27:29.544444676 +0000 UTC m=+304.393043263" observedRunningTime="2026-03-19 09:27:30.562625361 +0000 UTC m=+305.411223948" watchObservedRunningTime="2026-03-19 09:27:30.565843161 +0000 UTC m=+305.414441768" Mar 19 09:27:31 crc kubenswrapper[4835]: I0319 09:27:31.569292 4835 generic.go:334] "Generic (PLEG): container finished" podID="de9aaba1-3f3b-4c79-ba35-14f28fbbf657" containerID="dd79cc9a6c395770af0f4dbbf672b67c9aa2eb05852ec299bbd8d6188fbaebae" exitCode=0 Mar 19 09:27:31 crc kubenswrapper[4835]: I0319 09:27:31.570538 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mr45m" event={"ID":"de9aaba1-3f3b-4c79-ba35-14f28fbbf657","Type":"ContainerDied","Data":"dd79cc9a6c395770af0f4dbbf672b67c9aa2eb05852ec299bbd8d6188fbaebae"} Mar 19 09:27:31 crc kubenswrapper[4835]: I0319 09:27:31.585004 4835 generic.go:334] "Generic (PLEG): container finished" podID="bb7424ec-f638-4894-8c3d-466b81f98c8a" containerID="c128ef2d7d571646b1f660a77503f74254ca565dabef81148de41253be5abee4" exitCode=0 Mar 19 09:27:31 crc kubenswrapper[4835]: I0319 09:27:31.585101 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tclt5" event={"ID":"bb7424ec-f638-4894-8c3d-466b81f98c8a","Type":"ContainerDied","Data":"c128ef2d7d571646b1f660a77503f74254ca565dabef81148de41253be5abee4"} Mar 19 09:27:31 crc kubenswrapper[4835]: I0319 09:27:31.597711 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-65z7x" event={"ID":"35072c46-2bb8-4be2-9409-8780c5fa5717","Type":"ContainerStarted","Data":"165342f6be685fa7b91594bd871da387963a7ecf6bdbc50a11f14fdf0a8496b9"} Mar 19 09:27:32 crc 
kubenswrapper[4835]: I0319 09:27:32.604021 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mr45m" event={"ID":"de9aaba1-3f3b-4c79-ba35-14f28fbbf657","Type":"ContainerStarted","Data":"d679c2f4a7177052049b575eecf0120daafa85b8d555bea3d30d6446ddb281f5"} Mar 19 09:27:32 crc kubenswrapper[4835]: I0319 09:27:32.606350 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tclt5" event={"ID":"bb7424ec-f638-4894-8c3d-466b81f98c8a","Type":"ContainerStarted","Data":"f768e9967664a2db9edf53c204063d17d8a8b05cd0cfe4b11f0231ff328b8f90"} Mar 19 09:27:32 crc kubenswrapper[4835]: I0319 09:27:32.608266 4835 generic.go:334] "Generic (PLEG): container finished" podID="35072c46-2bb8-4be2-9409-8780c5fa5717" containerID="165342f6be685fa7b91594bd871da387963a7ecf6bdbc50a11f14fdf0a8496b9" exitCode=0 Mar 19 09:27:32 crc kubenswrapper[4835]: I0319 09:27:32.608331 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-65z7x" event={"ID":"35072c46-2bb8-4be2-9409-8780c5fa5717","Type":"ContainerDied","Data":"165342f6be685fa7b91594bd871da387963a7ecf6bdbc50a11f14fdf0a8496b9"} Mar 19 09:27:32 crc kubenswrapper[4835]: I0319 09:27:32.631203 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mr45m" podStartSLOduration=1.9418526 podStartE2EDuration="53.631182986s" podCreationTimestamp="2026-03-19 09:26:39 +0000 UTC" firstStartedPulling="2026-03-19 09:26:40.307727833 +0000 UTC m=+255.156326420" lastFinishedPulling="2026-03-19 09:27:31.997058229 +0000 UTC m=+306.845656806" observedRunningTime="2026-03-19 09:27:32.628863841 +0000 UTC m=+307.477462458" watchObservedRunningTime="2026-03-19 09:27:32.631182986 +0000 UTC m=+307.479781583" Mar 19 09:27:32 crc kubenswrapper[4835]: I0319 09:27:32.670928 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-tclt5" podStartSLOduration=2.890170552 podStartE2EDuration="55.670911881s" podCreationTimestamp="2026-03-19 09:26:37 +0000 UTC" firstStartedPulling="2026-03-19 09:26:39.258660069 +0000 UTC m=+254.107258656" lastFinishedPulling="2026-03-19 09:27:32.039401398 +0000 UTC m=+306.887999985" observedRunningTime="2026-03-19 09:27:32.667588639 +0000 UTC m=+307.516187226" watchObservedRunningTime="2026-03-19 09:27:32.670911881 +0000 UTC m=+307.519510468" Mar 19 09:27:33 crc kubenswrapper[4835]: I0319 09:27:33.614621 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-65hbx" event={"ID":"43fa0a19-2075-4d87-80cb-5e6c24854e5f","Type":"ContainerStarted","Data":"6d7dce15690c21a0ed60cd0b6e54b4eb8d043436f9af5ec5729128d6fad9b6eb"} Mar 19 09:27:33 crc kubenswrapper[4835]: I0319 09:27:33.617377 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hk7rv" event={"ID":"2d5a1f24-78ad-4c43-bf96-47e2e23c1996","Type":"ContainerStarted","Data":"78d55ab0edafef315835a2b90118a5819e968992ac6dd6fc70d7385a14f480fd"} Mar 19 09:27:33 crc kubenswrapper[4835]: I0319 09:27:33.617538 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-65z7x" event={"ID":"35072c46-2bb8-4be2-9409-8780c5fa5717","Type":"ContainerStarted","Data":"eb23950c6fcf76c218d03032a78defbd91a6d9752c3ae63fcbf56c68af79a439"} Mar 19 09:27:33 crc kubenswrapper[4835]: I0319 09:27:33.657900 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-65z7x" podStartSLOduration=12.697617538 podStartE2EDuration="54.657881668s" podCreationTimestamp="2026-03-19 09:26:39 +0000 UTC" firstStartedPulling="2026-03-19 09:26:51.123626195 +0000 UTC m=+265.972224782" lastFinishedPulling="2026-03-19 09:27:33.083890285 +0000 UTC m=+307.932488912" observedRunningTime="2026-03-19 09:27:33.655732708 +0000 UTC 
m=+308.504331305" watchObservedRunningTime="2026-03-19 09:27:33.657881668 +0000 UTC m=+308.506480255" Mar 19 09:27:34 crc kubenswrapper[4835]: I0319 09:27:34.625056 4835 generic.go:334] "Generic (PLEG): container finished" podID="43fa0a19-2075-4d87-80cb-5e6c24854e5f" containerID="6d7dce15690c21a0ed60cd0b6e54b4eb8d043436f9af5ec5729128d6fad9b6eb" exitCode=0 Mar 19 09:27:34 crc kubenswrapper[4835]: I0319 09:27:34.625139 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-65hbx" event={"ID":"43fa0a19-2075-4d87-80cb-5e6c24854e5f","Type":"ContainerDied","Data":"6d7dce15690c21a0ed60cd0b6e54b4eb8d043436f9af5ec5729128d6fad9b6eb"} Mar 19 09:27:34 crc kubenswrapper[4835]: I0319 09:27:34.628198 4835 generic.go:334] "Generic (PLEG): container finished" podID="2d5a1f24-78ad-4c43-bf96-47e2e23c1996" containerID="78d55ab0edafef315835a2b90118a5819e968992ac6dd6fc70d7385a14f480fd" exitCode=0 Mar 19 09:27:34 crc kubenswrapper[4835]: I0319 09:27:34.628241 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hk7rv" event={"ID":"2d5a1f24-78ad-4c43-bf96-47e2e23c1996","Type":"ContainerDied","Data":"78d55ab0edafef315835a2b90118a5819e968992ac6dd6fc70d7385a14f480fd"} Mar 19 09:27:35 crc kubenswrapper[4835]: I0319 09:27:35.638187 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-65hbx" event={"ID":"43fa0a19-2075-4d87-80cb-5e6c24854e5f","Type":"ContainerStarted","Data":"312dde906229d6232e00141b10af71274f08c678b019de26115e928a9426168a"} Mar 19 09:27:35 crc kubenswrapper[4835]: I0319 09:27:35.640659 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hk7rv" event={"ID":"2d5a1f24-78ad-4c43-bf96-47e2e23c1996","Type":"ContainerStarted","Data":"1b0e1bc835693f869857e64ed1b8955657719d3ddf054fa52d8455358e58fae7"} Mar 19 09:27:35 crc kubenswrapper[4835]: I0319 09:27:35.655247 4835 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/redhat-operators-65hbx" podStartSLOduration=11.407763885 podStartE2EDuration="55.655233502s" podCreationTimestamp="2026-03-19 09:26:40 +0000 UTC" firstStartedPulling="2026-03-19 09:26:51.124078939 +0000 UTC m=+265.972677526" lastFinishedPulling="2026-03-19 09:27:35.371548546 +0000 UTC m=+310.220147143" observedRunningTime="2026-03-19 09:27:35.653882613 +0000 UTC m=+310.502481200" watchObservedRunningTime="2026-03-19 09:27:35.655233502 +0000 UTC m=+310.503832089" Mar 19 09:27:35 crc kubenswrapper[4835]: I0319 09:27:35.671780 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hk7rv" podStartSLOduration=1.957433285 podStartE2EDuration="56.671764432s" podCreationTimestamp="2026-03-19 09:26:39 +0000 UTC" firstStartedPulling="2026-03-19 09:26:40.311614992 +0000 UTC m=+255.160213569" lastFinishedPulling="2026-03-19 09:27:35.025946129 +0000 UTC m=+309.874544716" observedRunningTime="2026-03-19 09:27:35.669771156 +0000 UTC m=+310.518369743" watchObservedRunningTime="2026-03-19 09:27:35.671764432 +0000 UTC m=+310.520363019" Mar 19 09:27:36 crc kubenswrapper[4835]: I0319 09:27:36.421970 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 09:27:36 crc kubenswrapper[4835]: I0319 09:27:36.422233 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 09:27:36 crc kubenswrapper[4835]: I0319 09:27:36.422272 4835 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" Mar 19 09:27:36 crc kubenswrapper[4835]: I0319 09:27:36.422770 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff1532a133b8cba8d5b03818133c3"} pod="openshift-machine-config-operator/machine-config-daemon-bk84k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 09:27:36 crc kubenswrapper[4835]: I0319 09:27:36.422831 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" containerID="cri-o://685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff1532a133b8cba8d5b03818133c3" gracePeriod=600 Mar 19 09:27:36 crc kubenswrapper[4835]: I0319 09:27:36.650413 4835 generic.go:334] "Generic (PLEG): container finished" podID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerID="685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff1532a133b8cba8d5b03818133c3" exitCode=0 Mar 19 09:27:36 crc kubenswrapper[4835]: I0319 09:27:36.650463 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerDied","Data":"685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff1532a133b8cba8d5b03818133c3"} Mar 19 09:27:37 crc kubenswrapper[4835]: I0319 09:27:37.073165 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5bd4f58d68-spxct"] Mar 19 09:27:37 crc kubenswrapper[4835]: I0319 09:27:37.073719 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5bd4f58d68-spxct" podUID="694d23f4-8a5b-4ca2-8ad3-212dd8b53933" 
containerName="controller-manager" containerID="cri-o://8bc299fe9a996322bfabb0614e3ec2310c86e027196cd9ac64c562d37fd01a41" gracePeriod=30 Mar 19 09:27:37 crc kubenswrapper[4835]: I0319 09:27:37.104911 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f8d8897d4-cgvrh"] Mar 19 09:27:37 crc kubenswrapper[4835]: I0319 09:27:37.105500 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-f8d8897d4-cgvrh" podUID="ebcf04c6-80ef-4794-8426-429ebb561d17" containerName="route-controller-manager" containerID="cri-o://f8f0ce72a87056ec0d45655f687c0c0f8fecd0440ef332fc56a3881ffff3767c" gracePeriod=30 Mar 19 09:27:37 crc kubenswrapper[4835]: I0319 09:27:37.227732 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ds4dj" Mar 19 09:27:37 crc kubenswrapper[4835]: I0319 09:27:37.228007 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ds4dj" Mar 19 09:27:37 crc kubenswrapper[4835]: I0319 09:27:37.265259 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ds4dj" Mar 19 09:27:37 crc kubenswrapper[4835]: I0319 09:27:37.420108 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tclt5" Mar 19 09:27:37 crc kubenswrapper[4835]: I0319 09:27:37.420200 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tclt5" Mar 19 09:27:37 crc kubenswrapper[4835]: I0319 09:27:37.471935 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tclt5" Mar 19 09:27:37 crc kubenswrapper[4835]: I0319 09:27:37.656119 4835 generic.go:334] "Generic (PLEG): container 
finished" podID="ebcf04c6-80ef-4794-8426-429ebb561d17" containerID="f8f0ce72a87056ec0d45655f687c0c0f8fecd0440ef332fc56a3881ffff3767c" exitCode=0 Mar 19 09:27:37 crc kubenswrapper[4835]: I0319 09:27:37.656196 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f8d8897d4-cgvrh" event={"ID":"ebcf04c6-80ef-4794-8426-429ebb561d17","Type":"ContainerDied","Data":"f8f0ce72a87056ec0d45655f687c0c0f8fecd0440ef332fc56a3881ffff3767c"} Mar 19 09:27:37 crc kubenswrapper[4835]: I0319 09:27:37.657719 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerStarted","Data":"b3588ff04f6b0991e2aff452983f8dc381373f64e85dce65f122043f0fbe4294"} Mar 19 09:27:37 crc kubenswrapper[4835]: I0319 09:27:37.658883 4835 generic.go:334] "Generic (PLEG): container finished" podID="694d23f4-8a5b-4ca2-8ad3-212dd8b53933" containerID="8bc299fe9a996322bfabb0614e3ec2310c86e027196cd9ac64c562d37fd01a41" exitCode=0 Mar 19 09:27:37 crc kubenswrapper[4835]: I0319 09:27:37.659358 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bd4f58d68-spxct" event={"ID":"694d23f4-8a5b-4ca2-8ad3-212dd8b53933","Type":"ContainerDied","Data":"8bc299fe9a996322bfabb0614e3ec2310c86e027196cd9ac64c562d37fd01a41"} Mar 19 09:27:37 crc kubenswrapper[4835]: I0319 09:27:37.699935 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ds4dj" Mar 19 09:27:37 crc kubenswrapper[4835]: I0319 09:27:37.700778 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tclt5" Mar 19 09:27:37 crc kubenswrapper[4835]: I0319 09:27:37.726442 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-d6l6z" Mar 19 09:27:37 crc kubenswrapper[4835]: I0319 09:27:37.726488 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d6l6z" Mar 19 09:27:37 crc kubenswrapper[4835]: I0319 09:27:37.775540 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d6l6z" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.128980 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f8d8897d4-cgvrh" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.159794 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-845749c45c-mns8w"] Mar 19 09:27:38 crc kubenswrapper[4835]: E0319 09:27:38.160039 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c232f12f-f539-42a1-9eec-965d59127ca0" containerName="oc" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.160060 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c232f12f-f539-42a1-9eec-965d59127ca0" containerName="oc" Mar 19 09:27:38 crc kubenswrapper[4835]: E0319 09:27:38.160079 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="134bb67e-3c0f-40e2-b87e-32d6dc6fd642" containerName="pruner" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.160278 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="134bb67e-3c0f-40e2-b87e-32d6dc6fd642" containerName="pruner" Mar 19 09:27:38 crc kubenswrapper[4835]: E0319 09:27:38.160291 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebcf04c6-80ef-4794-8426-429ebb561d17" containerName="route-controller-manager" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.160300 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebcf04c6-80ef-4794-8426-429ebb561d17" 
containerName="route-controller-manager" Mar 19 09:27:38 crc kubenswrapper[4835]: E0319 09:27:38.160316 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="930d85cd-ba00-4c27-b728-dbdeaab91ca5" containerName="oc" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.160324 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="930d85cd-ba00-4c27-b728-dbdeaab91ca5" containerName="oc" Mar 19 09:27:38 crc kubenswrapper[4835]: E0319 09:27:38.160338 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f043600-c25a-4e48-9385-0e031efeaa82" containerName="pruner" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.160346 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f043600-c25a-4e48-9385-0e031efeaa82" containerName="pruner" Mar 19 09:27:38 crc kubenswrapper[4835]: E0319 09:27:38.160362 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bc9ad36-9d27-4004-8588-e1e0f61dd3bd" containerName="pruner" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.160370 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bc9ad36-9d27-4004-8588-e1e0f61dd3bd" containerName="pruner" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.160485 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="134bb67e-3c0f-40e2-b87e-32d6dc6fd642" containerName="pruner" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.160498 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="930d85cd-ba00-4c27-b728-dbdeaab91ca5" containerName="oc" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.160507 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bc9ad36-9d27-4004-8588-e1e0f61dd3bd" containerName="pruner" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.160529 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f043600-c25a-4e48-9385-0e031efeaa82" containerName="pruner" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 
09:27:38.160539 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebcf04c6-80ef-4794-8426-429ebb561d17" containerName="route-controller-manager" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.160550 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="c232f12f-f539-42a1-9eec-965d59127ca0" containerName="oc" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.160966 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-845749c45c-mns8w" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.181790 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-845749c45c-mns8w"] Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.182958 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5bd4f58d68-spxct" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.302180 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebcf04c6-80ef-4794-8426-429ebb561d17-serving-cert\") pod \"ebcf04c6-80ef-4794-8426-429ebb561d17\" (UID: \"ebcf04c6-80ef-4794-8426-429ebb561d17\") " Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.302264 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/694d23f4-8a5b-4ca2-8ad3-212dd8b53933-client-ca\") pod \"694d23f4-8a5b-4ca2-8ad3-212dd8b53933\" (UID: \"694d23f4-8a5b-4ca2-8ad3-212dd8b53933\") " Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.302297 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/694d23f4-8a5b-4ca2-8ad3-212dd8b53933-config\") pod \"694d23f4-8a5b-4ca2-8ad3-212dd8b53933\" (UID: 
\"694d23f4-8a5b-4ca2-8ad3-212dd8b53933\") " Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.302316 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/694d23f4-8a5b-4ca2-8ad3-212dd8b53933-serving-cert\") pod \"694d23f4-8a5b-4ca2-8ad3-212dd8b53933\" (UID: \"694d23f4-8a5b-4ca2-8ad3-212dd8b53933\") " Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.302332 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/694d23f4-8a5b-4ca2-8ad3-212dd8b53933-proxy-ca-bundles\") pod \"694d23f4-8a5b-4ca2-8ad3-212dd8b53933\" (UID: \"694d23f4-8a5b-4ca2-8ad3-212dd8b53933\") " Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.302357 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5khw\" (UniqueName: \"kubernetes.io/projected/694d23f4-8a5b-4ca2-8ad3-212dd8b53933-kube-api-access-k5khw\") pod \"694d23f4-8a5b-4ca2-8ad3-212dd8b53933\" (UID: \"694d23f4-8a5b-4ca2-8ad3-212dd8b53933\") " Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.302384 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsjsz\" (UniqueName: \"kubernetes.io/projected/ebcf04c6-80ef-4794-8426-429ebb561d17-kube-api-access-rsjsz\") pod \"ebcf04c6-80ef-4794-8426-429ebb561d17\" (UID: \"ebcf04c6-80ef-4794-8426-429ebb561d17\") " Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.302414 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebcf04c6-80ef-4794-8426-429ebb561d17-config\") pod \"ebcf04c6-80ef-4794-8426-429ebb561d17\" (UID: \"ebcf04c6-80ef-4794-8426-429ebb561d17\") " Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.302433 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/ebcf04c6-80ef-4794-8426-429ebb561d17-client-ca\") pod \"ebcf04c6-80ef-4794-8426-429ebb561d17\" (UID: \"ebcf04c6-80ef-4794-8426-429ebb561d17\") " Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.302654 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbk5w\" (UniqueName: \"kubernetes.io/projected/9422c9c6-1386-4362-ba32-a791956ada58-kube-api-access-cbk5w\") pod \"route-controller-manager-845749c45c-mns8w\" (UID: \"9422c9c6-1386-4362-ba32-a791956ada58\") " pod="openshift-route-controller-manager/route-controller-manager-845749c45c-mns8w" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.302692 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9422c9c6-1386-4362-ba32-a791956ada58-config\") pod \"route-controller-manager-845749c45c-mns8w\" (UID: \"9422c9c6-1386-4362-ba32-a791956ada58\") " pod="openshift-route-controller-manager/route-controller-manager-845749c45c-mns8w" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.302713 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9422c9c6-1386-4362-ba32-a791956ada58-client-ca\") pod \"route-controller-manager-845749c45c-mns8w\" (UID: \"9422c9c6-1386-4362-ba32-a791956ada58\") " pod="openshift-route-controller-manager/route-controller-manager-845749c45c-mns8w" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.302738 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9422c9c6-1386-4362-ba32-a791956ada58-serving-cert\") pod \"route-controller-manager-845749c45c-mns8w\" (UID: \"9422c9c6-1386-4362-ba32-a791956ada58\") " pod="openshift-route-controller-manager/route-controller-manager-845749c45c-mns8w" Mar 19 
09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.303012 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/694d23f4-8a5b-4ca2-8ad3-212dd8b53933-client-ca" (OuterVolumeSpecName: "client-ca") pod "694d23f4-8a5b-4ca2-8ad3-212dd8b53933" (UID: "694d23f4-8a5b-4ca2-8ad3-212dd8b53933"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.303156 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/694d23f4-8a5b-4ca2-8ad3-212dd8b53933-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "694d23f4-8a5b-4ca2-8ad3-212dd8b53933" (UID: "694d23f4-8a5b-4ca2-8ad3-212dd8b53933"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.303220 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebcf04c6-80ef-4794-8426-429ebb561d17-config" (OuterVolumeSpecName: "config") pod "ebcf04c6-80ef-4794-8426-429ebb561d17" (UID: "ebcf04c6-80ef-4794-8426-429ebb561d17"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.303251 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebcf04c6-80ef-4794-8426-429ebb561d17-client-ca" (OuterVolumeSpecName: "client-ca") pod "ebcf04c6-80ef-4794-8426-429ebb561d17" (UID: "ebcf04c6-80ef-4794-8426-429ebb561d17"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.303662 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/694d23f4-8a5b-4ca2-8ad3-212dd8b53933-config" (OuterVolumeSpecName: "config") pod "694d23f4-8a5b-4ca2-8ad3-212dd8b53933" (UID: "694d23f4-8a5b-4ca2-8ad3-212dd8b53933"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.307882 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebcf04c6-80ef-4794-8426-429ebb561d17-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ebcf04c6-80ef-4794-8426-429ebb561d17" (UID: "ebcf04c6-80ef-4794-8426-429ebb561d17"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.307882 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/694d23f4-8a5b-4ca2-8ad3-212dd8b53933-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "694d23f4-8a5b-4ca2-8ad3-212dd8b53933" (UID: "694d23f4-8a5b-4ca2-8ad3-212dd8b53933"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.307930 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebcf04c6-80ef-4794-8426-429ebb561d17-kube-api-access-rsjsz" (OuterVolumeSpecName: "kube-api-access-rsjsz") pod "ebcf04c6-80ef-4794-8426-429ebb561d17" (UID: "ebcf04c6-80ef-4794-8426-429ebb561d17"). InnerVolumeSpecName "kube-api-access-rsjsz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.308497 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/694d23f4-8a5b-4ca2-8ad3-212dd8b53933-kube-api-access-k5khw" (OuterVolumeSpecName: "kube-api-access-k5khw") pod "694d23f4-8a5b-4ca2-8ad3-212dd8b53933" (UID: "694d23f4-8a5b-4ca2-8ad3-212dd8b53933"). InnerVolumeSpecName "kube-api-access-k5khw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.403461 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9422c9c6-1386-4362-ba32-a791956ada58-config\") pod \"route-controller-manager-845749c45c-mns8w\" (UID: \"9422c9c6-1386-4362-ba32-a791956ada58\") " pod="openshift-route-controller-manager/route-controller-manager-845749c45c-mns8w" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.403513 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9422c9c6-1386-4362-ba32-a791956ada58-client-ca\") pod \"route-controller-manager-845749c45c-mns8w\" (UID: \"9422c9c6-1386-4362-ba32-a791956ada58\") " pod="openshift-route-controller-manager/route-controller-manager-845749c45c-mns8w" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.403775 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9422c9c6-1386-4362-ba32-a791956ada58-serving-cert\") pod \"route-controller-manager-845749c45c-mns8w\" (UID: \"9422c9c6-1386-4362-ba32-a791956ada58\") " pod="openshift-route-controller-manager/route-controller-manager-845749c45c-mns8w" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.404096 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbk5w\" (UniqueName: 
\"kubernetes.io/projected/9422c9c6-1386-4362-ba32-a791956ada58-kube-api-access-cbk5w\") pod \"route-controller-manager-845749c45c-mns8w\" (UID: \"9422c9c6-1386-4362-ba32-a791956ada58\") " pod="openshift-route-controller-manager/route-controller-manager-845749c45c-mns8w" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.404202 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebcf04c6-80ef-4794-8426-429ebb561d17-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.404224 4835 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/694d23f4-8a5b-4ca2-8ad3-212dd8b53933-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.404238 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/694d23f4-8a5b-4ca2-8ad3-212dd8b53933-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.404249 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/694d23f4-8a5b-4ca2-8ad3-212dd8b53933-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.404261 4835 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/694d23f4-8a5b-4ca2-8ad3-212dd8b53933-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.404274 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5khw\" (UniqueName: \"kubernetes.io/projected/694d23f4-8a5b-4ca2-8ad3-212dd8b53933-kube-api-access-k5khw\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.404286 4835 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-rsjsz\" (UniqueName: \"kubernetes.io/projected/ebcf04c6-80ef-4794-8426-429ebb561d17-kube-api-access-rsjsz\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.404299 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebcf04c6-80ef-4794-8426-429ebb561d17-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.404309 4835 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ebcf04c6-80ef-4794-8426-429ebb561d17-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.404582 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9422c9c6-1386-4362-ba32-a791956ada58-client-ca\") pod \"route-controller-manager-845749c45c-mns8w\" (UID: \"9422c9c6-1386-4362-ba32-a791956ada58\") " pod="openshift-route-controller-manager/route-controller-manager-845749c45c-mns8w" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.404934 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9422c9c6-1386-4362-ba32-a791956ada58-config\") pod \"route-controller-manager-845749c45c-mns8w\" (UID: \"9422c9c6-1386-4362-ba32-a791956ada58\") " pod="openshift-route-controller-manager/route-controller-manager-845749c45c-mns8w" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.411314 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9422c9c6-1386-4362-ba32-a791956ada58-serving-cert\") pod \"route-controller-manager-845749c45c-mns8w\" (UID: \"9422c9c6-1386-4362-ba32-a791956ada58\") " pod="openshift-route-controller-manager/route-controller-manager-845749c45c-mns8w" Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.428763 
4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbk5w\" (UniqueName: \"kubernetes.io/projected/9422c9c6-1386-4362-ba32-a791956ada58-kube-api-access-cbk5w\") pod \"route-controller-manager-845749c45c-mns8w\" (UID: \"9422c9c6-1386-4362-ba32-a791956ada58\") " pod="openshift-route-controller-manager/route-controller-manager-845749c45c-mns8w"
Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.500728 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-845749c45c-mns8w"
Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.666286 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bd4f58d68-spxct" event={"ID":"694d23f4-8a5b-4ca2-8ad3-212dd8b53933","Type":"ContainerDied","Data":"f520a60ab8a14f1f776ea9f6716d260369a75382e95c3fef0d7ddd952ced0a92"}
Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.666337 4835 scope.go:117] "RemoveContainer" containerID="8bc299fe9a996322bfabb0614e3ec2310c86e027196cd9ac64c562d37fd01a41"
Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.666455 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5bd4f58d68-spxct"
Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.669948 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f8d8897d4-cgvrh" event={"ID":"ebcf04c6-80ef-4794-8426-429ebb561d17","Type":"ContainerDied","Data":"f02e616d22fd4ebff2d919c1dd2ab5277369594cdd9438a4218ed69a6f27ec92"}
Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.670033 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f8d8897d4-cgvrh"
Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.708142 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5bd4f58d68-spxct"]
Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.715950 4835 scope.go:117] "RemoveContainer" containerID="f8f0ce72a87056ec0d45655f687c0c0f8fecd0440ef332fc56a3881ffff3767c"
Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.717157 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5bd4f58d68-spxct"]
Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.731891 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d6l6z"
Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.735428 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f8d8897d4-cgvrh"]
Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.742840 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f8d8897d4-cgvrh"]
Mar 19 09:27:38 crc kubenswrapper[4835]: I0319 09:27:38.753612 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-845749c45c-mns8w"]
Mar 19 09:27:39 crc kubenswrapper[4835]: I0319 09:27:39.385648 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hk7rv"
Mar 19 09:27:39 crc kubenswrapper[4835]: I0319 09:27:39.385699 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hk7rv"
Mar 19 09:27:39 crc kubenswrapper[4835]: I0319 09:27:39.434002 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hk7rv"
Mar 19 09:27:39 crc kubenswrapper[4835]: I0319 09:27:39.636566 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tclt5"]
Mar 19 09:27:39 crc kubenswrapper[4835]: I0319 09:27:39.681373 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-845749c45c-mns8w" event={"ID":"9422c9c6-1386-4362-ba32-a791956ada58","Type":"ContainerStarted","Data":"8a363152f09eb7f93e85c8480e4cefa5169e3e8dc93b46d53905712e572f0699"}
Mar 19 09:27:39 crc kubenswrapper[4835]: I0319 09:27:39.681434 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-845749c45c-mns8w" event={"ID":"9422c9c6-1386-4362-ba32-a791956ada58","Type":"ContainerStarted","Data":"bb17254bae23e3abf0bb7ad99cc50542f763a228c07df81bd5bd2932832f270f"}
Mar 19 09:27:39 crc kubenswrapper[4835]: I0319 09:27:39.681535 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tclt5" podUID="bb7424ec-f638-4894-8c3d-466b81f98c8a" containerName="registry-server" containerID="cri-o://f768e9967664a2db9edf53c204063d17d8a8b05cd0cfe4b11f0231ff328b8f90" gracePeriod=2
Mar 19 09:27:39 crc kubenswrapper[4835]: I0319 09:27:39.706619 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-845749c45c-mns8w" podStartSLOduration=2.7065886949999998 podStartE2EDuration="2.706588695s" podCreationTimestamp="2026-03-19 09:27:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:27:39.702813009 +0000 UTC m=+314.551411656" watchObservedRunningTime="2026-03-19 09:27:39.706588695 +0000 UTC m=+314.555187322"
Mar 19 09:27:39 crc kubenswrapper[4835]: I0319 09:27:39.772878 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mr45m"
Mar 19 09:27:39 crc kubenswrapper[4835]: I0319 09:27:39.773146 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mr45m"
Mar 19 09:27:39 crc kubenswrapper[4835]: I0319 09:27:39.812759 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mr45m"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.089804 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tclt5"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.126626 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlpmv\" (UniqueName: \"kubernetes.io/projected/bb7424ec-f638-4894-8c3d-466b81f98c8a-kube-api-access-nlpmv\") pod \"bb7424ec-f638-4894-8c3d-466b81f98c8a\" (UID: \"bb7424ec-f638-4894-8c3d-466b81f98c8a\") "
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.126677 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb7424ec-f638-4894-8c3d-466b81f98c8a-utilities\") pod \"bb7424ec-f638-4894-8c3d-466b81f98c8a\" (UID: \"bb7424ec-f638-4894-8c3d-466b81f98c8a\") "
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.126726 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb7424ec-f638-4894-8c3d-466b81f98c8a-catalog-content\") pod \"bb7424ec-f638-4894-8c3d-466b81f98c8a\" (UID: \"bb7424ec-f638-4894-8c3d-466b81f98c8a\") "
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.128547 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb7424ec-f638-4894-8c3d-466b81f98c8a-utilities" (OuterVolumeSpecName: "utilities") pod "bb7424ec-f638-4894-8c3d-466b81f98c8a" (UID: "bb7424ec-f638-4894-8c3d-466b81f98c8a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.133709 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb7424ec-f638-4894-8c3d-466b81f98c8a-kube-api-access-nlpmv" (OuterVolumeSpecName: "kube-api-access-nlpmv") pod "bb7424ec-f638-4894-8c3d-466b81f98c8a" (UID: "bb7424ec-f638-4894-8c3d-466b81f98c8a"). InnerVolumeSpecName "kube-api-access-nlpmv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.165734 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp"]
Mar 19 09:27:40 crc kubenswrapper[4835]: E0319 09:27:40.165995 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb7424ec-f638-4894-8c3d-466b81f98c8a" containerName="extract-utilities"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.166008 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb7424ec-f638-4894-8c3d-466b81f98c8a" containerName="extract-utilities"
Mar 19 09:27:40 crc kubenswrapper[4835]: E0319 09:27:40.166019 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb7424ec-f638-4894-8c3d-466b81f98c8a" containerName="registry-server"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.166025 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb7424ec-f638-4894-8c3d-466b81f98c8a" containerName="registry-server"
Mar 19 09:27:40 crc kubenswrapper[4835]: E0319 09:27:40.166044 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="694d23f4-8a5b-4ca2-8ad3-212dd8b53933" containerName="controller-manager"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.166052 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="694d23f4-8a5b-4ca2-8ad3-212dd8b53933" containerName="controller-manager"
Mar 19 09:27:40 crc kubenswrapper[4835]: E0319 09:27:40.166062 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb7424ec-f638-4894-8c3d-466b81f98c8a" containerName="extract-content"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.166069 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb7424ec-f638-4894-8c3d-466b81f98c8a" containerName="extract-content"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.166153 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="694d23f4-8a5b-4ca2-8ad3-212dd8b53933" containerName="controller-manager"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.166165 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb7424ec-f638-4894-8c3d-466b81f98c8a" containerName="registry-server"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.166550 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.168933 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.171181 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp"]
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.173732 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.174147 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.174349 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.174673 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.175662 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.180670 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.182561 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-65z7x"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.182614 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-65z7x"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.206957 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb7424ec-f638-4894-8c3d-466b81f98c8a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb7424ec-f638-4894-8c3d-466b81f98c8a" (UID: "bb7424ec-f638-4894-8c3d-466b81f98c8a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.222573 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-65z7x"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.228096 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb7424ec-f638-4894-8c3d-466b81f98c8a-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.228124 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlpmv\" (UniqueName: \"kubernetes.io/projected/bb7424ec-f638-4894-8c3d-466b81f98c8a-kube-api-access-nlpmv\") on node \"crc\" DevicePath \"\""
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.228137 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb7424ec-f638-4894-8c3d-466b81f98c8a-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.328886 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56945271-3324-4200-a0e8-36fcef33cd2a-config\") pod \"controller-manager-5b7f7f46b7-xt9dp\" (UID: \"56945271-3324-4200-a0e8-36fcef33cd2a\") " pod="openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.329010 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv4sn\" (UniqueName: \"kubernetes.io/projected/56945271-3324-4200-a0e8-36fcef33cd2a-kube-api-access-dv4sn\") pod \"controller-manager-5b7f7f46b7-xt9dp\" (UID: \"56945271-3324-4200-a0e8-36fcef33cd2a\") " pod="openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.329462 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/56945271-3324-4200-a0e8-36fcef33cd2a-client-ca\") pod \"controller-manager-5b7f7f46b7-xt9dp\" (UID: \"56945271-3324-4200-a0e8-36fcef33cd2a\") " pod="openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.329546 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56945271-3324-4200-a0e8-36fcef33cd2a-serving-cert\") pod \"controller-manager-5b7f7f46b7-xt9dp\" (UID: \"56945271-3324-4200-a0e8-36fcef33cd2a\") " pod="openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.329574 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/56945271-3324-4200-a0e8-36fcef33cd2a-proxy-ca-bundles\") pod \"controller-manager-5b7f7f46b7-xt9dp\" (UID: \"56945271-3324-4200-a0e8-36fcef33cd2a\") " pod="openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.381906 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-65hbx"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.382166 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-65hbx"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.410369 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="694d23f4-8a5b-4ca2-8ad3-212dd8b53933" path="/var/lib/kubelet/pods/694d23f4-8a5b-4ca2-8ad3-212dd8b53933/volumes"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.411798 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebcf04c6-80ef-4794-8426-429ebb561d17" path="/var/lib/kubelet/pods/ebcf04c6-80ef-4794-8426-429ebb561d17/volumes"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.431070 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv4sn\" (UniqueName: \"kubernetes.io/projected/56945271-3324-4200-a0e8-36fcef33cd2a-kube-api-access-dv4sn\") pod \"controller-manager-5b7f7f46b7-xt9dp\" (UID: \"56945271-3324-4200-a0e8-36fcef33cd2a\") " pod="openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.431137 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/56945271-3324-4200-a0e8-36fcef33cd2a-client-ca\") pod \"controller-manager-5b7f7f46b7-xt9dp\" (UID: \"56945271-3324-4200-a0e8-36fcef33cd2a\") " pod="openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.431168 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56945271-3324-4200-a0e8-36fcef33cd2a-serving-cert\") pod \"controller-manager-5b7f7f46b7-xt9dp\" (UID: \"56945271-3324-4200-a0e8-36fcef33cd2a\") " pod="openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.431186 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/56945271-3324-4200-a0e8-36fcef33cd2a-proxy-ca-bundles\") pod \"controller-manager-5b7f7f46b7-xt9dp\" (UID: \"56945271-3324-4200-a0e8-36fcef33cd2a\") " pod="openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.431221 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56945271-3324-4200-a0e8-36fcef33cd2a-config\") pod \"controller-manager-5b7f7f46b7-xt9dp\" (UID: \"56945271-3324-4200-a0e8-36fcef33cd2a\") " pod="openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.432270 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/56945271-3324-4200-a0e8-36fcef33cd2a-client-ca\") pod \"controller-manager-5b7f7f46b7-xt9dp\" (UID: \"56945271-3324-4200-a0e8-36fcef33cd2a\") " pod="openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.432560 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56945271-3324-4200-a0e8-36fcef33cd2a-config\") pod \"controller-manager-5b7f7f46b7-xt9dp\" (UID: \"56945271-3324-4200-a0e8-36fcef33cd2a\") " pod="openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.433366 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/56945271-3324-4200-a0e8-36fcef33cd2a-proxy-ca-bundles\") pod \"controller-manager-5b7f7f46b7-xt9dp\" (UID: \"56945271-3324-4200-a0e8-36fcef33cd2a\") " pod="openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.442643 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56945271-3324-4200-a0e8-36fcef33cd2a-serving-cert\") pod \"controller-manager-5b7f7f46b7-xt9dp\" (UID: \"56945271-3324-4200-a0e8-36fcef33cd2a\") " pod="openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.444989 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv4sn\" (UniqueName: \"kubernetes.io/projected/56945271-3324-4200-a0e8-36fcef33cd2a-kube-api-access-dv4sn\") pod \"controller-manager-5b7f7f46b7-xt9dp\" (UID: \"56945271-3324-4200-a0e8-36fcef33cd2a\") " pod="openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.492468 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.691276 4835 generic.go:334] "Generic (PLEG): container finished" podID="bb7424ec-f638-4894-8c3d-466b81f98c8a" containerID="f768e9967664a2db9edf53c204063d17d8a8b05cd0cfe4b11f0231ff328b8f90" exitCode=0
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.691866 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tclt5" event={"ID":"bb7424ec-f638-4894-8c3d-466b81f98c8a","Type":"ContainerDied","Data":"f768e9967664a2db9edf53c204063d17d8a8b05cd0cfe4b11f0231ff328b8f90"}
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.691921 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tclt5" event={"ID":"bb7424ec-f638-4894-8c3d-466b81f98c8a","Type":"ContainerDied","Data":"2b18800640c597eb7ebd2eb578df404e062dd7c5e476a37bd9824fb1ab124317"}
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.691941 4835 scope.go:117] "RemoveContainer" containerID="f768e9967664a2db9edf53c204063d17d8a8b05cd0cfe4b11f0231ff328b8f90"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.692098 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tclt5"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.693539 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-845749c45c-mns8w"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.702625 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-845749c45c-mns8w"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.712530 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tclt5"]
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.724016 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tclt5"]
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.730428 4835 scope.go:117] "RemoveContainer" containerID="c128ef2d7d571646b1f660a77503f74254ca565dabef81148de41253be5abee4"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.733799 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp"]
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.758306 4835 scope.go:117] "RemoveContainer" containerID="9cadd727e4e1d4801688ff2ebd58315bfaeb2df9d6dbcbe91b106e6b73afe69d"
Mar 19 09:27:40 crc kubenswrapper[4835]: W0319 09:27:40.761103 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56945271_3324_4200_a0e8_36fcef33cd2a.slice/crio-1d6e5d558b5e0e5eb83b6e0571866c68a0f9c63c910e3ec2a9e1c9256db50d15 WatchSource:0}: Error finding container 1d6e5d558b5e0e5eb83b6e0571866c68a0f9c63c910e3ec2a9e1c9256db50d15: Status 404 returned error can't find the container with id 1d6e5d558b5e0e5eb83b6e0571866c68a0f9c63c910e3ec2a9e1c9256db50d15
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.761310 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-65z7x"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.781468 4835 scope.go:117] "RemoveContainer" containerID="f768e9967664a2db9edf53c204063d17d8a8b05cd0cfe4b11f0231ff328b8f90"
Mar 19 09:27:40 crc kubenswrapper[4835]: E0319 09:27:40.782668 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f768e9967664a2db9edf53c204063d17d8a8b05cd0cfe4b11f0231ff328b8f90\": container with ID starting with f768e9967664a2db9edf53c204063d17d8a8b05cd0cfe4b11f0231ff328b8f90 not found: ID does not exist" containerID="f768e9967664a2db9edf53c204063d17d8a8b05cd0cfe4b11f0231ff328b8f90"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.782690 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f768e9967664a2db9edf53c204063d17d8a8b05cd0cfe4b11f0231ff328b8f90"} err="failed to get container status \"f768e9967664a2db9edf53c204063d17d8a8b05cd0cfe4b11f0231ff328b8f90\": rpc error: code = NotFound desc = could not find container \"f768e9967664a2db9edf53c204063d17d8a8b05cd0cfe4b11f0231ff328b8f90\": container with ID starting with f768e9967664a2db9edf53c204063d17d8a8b05cd0cfe4b11f0231ff328b8f90 not found: ID does not exist"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.782708 4835 scope.go:117] "RemoveContainer" containerID="c128ef2d7d571646b1f660a77503f74254ca565dabef81148de41253be5abee4"
Mar 19 09:27:40 crc kubenswrapper[4835]: E0319 09:27:40.783078 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c128ef2d7d571646b1f660a77503f74254ca565dabef81148de41253be5abee4\": container with ID starting with c128ef2d7d571646b1f660a77503f74254ca565dabef81148de41253be5abee4 not found: ID does not exist" containerID="c128ef2d7d571646b1f660a77503f74254ca565dabef81148de41253be5abee4"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.783096 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c128ef2d7d571646b1f660a77503f74254ca565dabef81148de41253be5abee4"} err="failed to get container status \"c128ef2d7d571646b1f660a77503f74254ca565dabef81148de41253be5abee4\": rpc error: code = NotFound desc = could not find container \"c128ef2d7d571646b1f660a77503f74254ca565dabef81148de41253be5abee4\": container with ID starting with c128ef2d7d571646b1f660a77503f74254ca565dabef81148de41253be5abee4 not found: ID does not exist"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.783106 4835 scope.go:117] "RemoveContainer" containerID="9cadd727e4e1d4801688ff2ebd58315bfaeb2df9d6dbcbe91b106e6b73afe69d"
Mar 19 09:27:40 crc kubenswrapper[4835]: E0319 09:27:40.783292 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cadd727e4e1d4801688ff2ebd58315bfaeb2df9d6dbcbe91b106e6b73afe69d\": container with ID starting with 9cadd727e4e1d4801688ff2ebd58315bfaeb2df9d6dbcbe91b106e6b73afe69d not found: ID does not exist" containerID="9cadd727e4e1d4801688ff2ebd58315bfaeb2df9d6dbcbe91b106e6b73afe69d"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.783310 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cadd727e4e1d4801688ff2ebd58315bfaeb2df9d6dbcbe91b106e6b73afe69d"} err="failed to get container status \"9cadd727e4e1d4801688ff2ebd58315bfaeb2df9d6dbcbe91b106e6b73afe69d\": rpc error: code = NotFound desc = could not find container \"9cadd727e4e1d4801688ff2ebd58315bfaeb2df9d6dbcbe91b106e6b73afe69d\": container with ID starting with 9cadd727e4e1d4801688ff2ebd58315bfaeb2df9d6dbcbe91b106e6b73afe69d not found: ID does not exist"
Mar 19 09:27:40 crc kubenswrapper[4835]: I0319 09:27:40.783878 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mr45m"
Mar 19 09:27:41 crc kubenswrapper[4835]: I0319 09:27:41.431469 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-65hbx" podUID="43fa0a19-2075-4d87-80cb-5e6c24854e5f" containerName="registry-server" probeResult="failure" output=<
Mar 19 09:27:41 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s
Mar 19 09:27:41 crc kubenswrapper[4835]: >
Mar 19 09:27:41 crc kubenswrapper[4835]: I0319 09:27:41.704464 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp" event={"ID":"56945271-3324-4200-a0e8-36fcef33cd2a","Type":"ContainerStarted","Data":"a873e42ae660f76017065aabdd0db796ab04d30dae06a9b2a994c85ec8e03750"}
Mar 19 09:27:41 crc kubenswrapper[4835]: I0319 09:27:41.704494 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp" event={"ID":"56945271-3324-4200-a0e8-36fcef33cd2a","Type":"ContainerStarted","Data":"1d6e5d558b5e0e5eb83b6e0571866c68a0f9c63c910e3ec2a9e1c9256db50d15"}
Mar 19 09:27:41 crc kubenswrapper[4835]: I0319 09:27:41.705077 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp"
Mar 19 09:27:41 crc kubenswrapper[4835]: I0319 09:27:41.713176 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp"
Mar 19 09:27:41 crc kubenswrapper[4835]: I0319 09:27:41.745126 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp" podStartSLOduration=4.745108905 podStartE2EDuration="4.745108905s" podCreationTimestamp="2026-03-19 09:27:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:27:41.728017538 +0000 UTC m=+316.576616135" watchObservedRunningTime="2026-03-19 09:27:41.745108905 +0000 UTC m=+316.593707492"
Mar 19 09:27:41 crc kubenswrapper[4835]: I0319 09:27:41.835326 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d6l6z"]
Mar 19 09:27:41 crc kubenswrapper[4835]: I0319 09:27:41.835543 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d6l6z" podUID="db372943-349a-417e-af5e-588162580a5a" containerName="registry-server" containerID="cri-o://85c1491e64077076bf435c1d7650956199cce6e4f844ab02701fd6e0494902a9" gracePeriod=2
Mar 19 09:27:42 crc kubenswrapper[4835]: I0319 09:27:42.349186 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d6l6z"
Mar 19 09:27:42 crc kubenswrapper[4835]: I0319 09:27:42.411062 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb7424ec-f638-4894-8c3d-466b81f98c8a" path="/var/lib/kubelet/pods/bb7424ec-f638-4894-8c3d-466b81f98c8a/volumes"
Mar 19 09:27:42 crc kubenswrapper[4835]: I0319 09:27:42.452600 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db372943-349a-417e-af5e-588162580a5a-utilities\") pod \"db372943-349a-417e-af5e-588162580a5a\" (UID: \"db372943-349a-417e-af5e-588162580a5a\") "
Mar 19 09:27:42 crc kubenswrapper[4835]: I0319 09:27:42.452641 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db372943-349a-417e-af5e-588162580a5a-catalog-content\") pod \"db372943-349a-417e-af5e-588162580a5a\" (UID: \"db372943-349a-417e-af5e-588162580a5a\") "
Mar 19 09:27:42 crc kubenswrapper[4835]: I0319 09:27:42.452678 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8t4xf\" (UniqueName: \"kubernetes.io/projected/db372943-349a-417e-af5e-588162580a5a-kube-api-access-8t4xf\") pod \"db372943-349a-417e-af5e-588162580a5a\" (UID: \"db372943-349a-417e-af5e-588162580a5a\") "
Mar 19 09:27:42 crc kubenswrapper[4835]: I0319 09:27:42.454641 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db372943-349a-417e-af5e-588162580a5a-utilities" (OuterVolumeSpecName: "utilities") pod "db372943-349a-417e-af5e-588162580a5a" (UID: "db372943-349a-417e-af5e-588162580a5a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 09:27:42 crc kubenswrapper[4835]: I0319 09:27:42.461575 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db372943-349a-417e-af5e-588162580a5a-kube-api-access-8t4xf" (OuterVolumeSpecName: "kube-api-access-8t4xf") pod "db372943-349a-417e-af5e-588162580a5a" (UID: "db372943-349a-417e-af5e-588162580a5a"). InnerVolumeSpecName "kube-api-access-8t4xf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:27:42 crc kubenswrapper[4835]: I0319 09:27:42.534059 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db372943-349a-417e-af5e-588162580a5a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "db372943-349a-417e-af5e-588162580a5a" (UID: "db372943-349a-417e-af5e-588162580a5a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 09:27:42 crc kubenswrapper[4835]: I0319 09:27:42.553820 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db372943-349a-417e-af5e-588162580a5a-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 09:27:42 crc kubenswrapper[4835]: I0319 09:27:42.554075 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db372943-349a-417e-af5e-588162580a5a-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 09:27:42 crc kubenswrapper[4835]: I0319 09:27:42.554214 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8t4xf\" (UniqueName: \"kubernetes.io/projected/db372943-349a-417e-af5e-588162580a5a-kube-api-access-8t4xf\") on node \"crc\" DevicePath \"\""
Mar 19 09:27:42 crc kubenswrapper[4835]: I0319 09:27:42.710411 4835 generic.go:334] "Generic (PLEG): container finished" podID="db372943-349a-417e-af5e-588162580a5a" containerID="85c1491e64077076bf435c1d7650956199cce6e4f844ab02701fd6e0494902a9" exitCode=0
Mar 19 09:27:42 crc kubenswrapper[4835]: I0319 09:27:42.710587 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d6l6z" event={"ID":"db372943-349a-417e-af5e-588162580a5a","Type":"ContainerDied","Data":"85c1491e64077076bf435c1d7650956199cce6e4f844ab02701fd6e0494902a9"}
Mar 19 09:27:42 crc kubenswrapper[4835]: I0319 09:27:42.712038 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d6l6z" event={"ID":"db372943-349a-417e-af5e-588162580a5a","Type":"ContainerDied","Data":"fe74eeba78b0abbd2785ee43305725167126b71d1216e5ee863e1bc0f24acc3e"}
Mar 19 09:27:42 crc kubenswrapper[4835]: I0319 09:27:42.712075 4835 scope.go:117] "RemoveContainer" containerID="85c1491e64077076bf435c1d7650956199cce6e4f844ab02701fd6e0494902a9"
Mar 19 09:27:42 crc kubenswrapper[4835]: I0319 09:27:42.710707 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d6l6z"
Mar 19 09:27:42 crc kubenswrapper[4835]: I0319 09:27:42.738328 4835 scope.go:117] "RemoveContainer" containerID="45d37c159a8454244d939d7d493d482b5f1dfabd1c505f09b9dcfff458ba56bb"
Mar 19 09:27:42 crc kubenswrapper[4835]: I0319 09:27:42.767369 4835 scope.go:117] "RemoveContainer" containerID="0c5f73276b1f6430b25a4fce44a5f5d2b3abf19484db5dc9548c5e22dad82390"
Mar 19 09:27:42 crc kubenswrapper[4835]: I0319 09:27:42.768366 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d6l6z"]
Mar 19 09:27:42 crc kubenswrapper[4835]: I0319 09:27:42.778517 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d6l6z"]
Mar 19 09:27:42 crc kubenswrapper[4835]: I0319 09:27:42.804869 4835 scope.go:117] "RemoveContainer" containerID="85c1491e64077076bf435c1d7650956199cce6e4f844ab02701fd6e0494902a9"
Mar 19 09:27:42 crc kubenswrapper[4835]: E0319 09:27:42.805460 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85c1491e64077076bf435c1d7650956199cce6e4f844ab02701fd6e0494902a9\": container with ID starting with 85c1491e64077076bf435c1d7650956199cce6e4f844ab02701fd6e0494902a9 not found: ID does not exist" containerID="85c1491e64077076bf435c1d7650956199cce6e4f844ab02701fd6e0494902a9"
Mar 19 09:27:42 crc kubenswrapper[4835]: I0319 09:27:42.805513 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85c1491e64077076bf435c1d7650956199cce6e4f844ab02701fd6e0494902a9"} err="failed to get container status \"85c1491e64077076bf435c1d7650956199cce6e4f844ab02701fd6e0494902a9\": rpc error: code = NotFound desc = could not find container \"85c1491e64077076bf435c1d7650956199cce6e4f844ab02701fd6e0494902a9\": container with ID starting with 85c1491e64077076bf435c1d7650956199cce6e4f844ab02701fd6e0494902a9 not found: ID does not exist"
Mar 19 09:27:42 crc kubenswrapper[4835]: I0319 09:27:42.805542 4835 scope.go:117] "RemoveContainer" containerID="45d37c159a8454244d939d7d493d482b5f1dfabd1c505f09b9dcfff458ba56bb"
Mar 19 09:27:42 crc kubenswrapper[4835]: E0319 09:27:42.805984 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45d37c159a8454244d939d7d493d482b5f1dfabd1c505f09b9dcfff458ba56bb\": container with ID starting with 45d37c159a8454244d939d7d493d482b5f1dfabd1c505f09b9dcfff458ba56bb not found: ID does not exist" containerID="45d37c159a8454244d939d7d493d482b5f1dfabd1c505f09b9dcfff458ba56bb"
Mar 19 09:27:42 crc kubenswrapper[4835]: I0319 09:27:42.806027 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45d37c159a8454244d939d7d493d482b5f1dfabd1c505f09b9dcfff458ba56bb"} err="failed to get container status \"45d37c159a8454244d939d7d493d482b5f1dfabd1c505f09b9dcfff458ba56bb\": rpc error: code = NotFound desc = could not find container \"45d37c159a8454244d939d7d493d482b5f1dfabd1c505f09b9dcfff458ba56bb\": container with ID starting with 45d37c159a8454244d939d7d493d482b5f1dfabd1c505f09b9dcfff458ba56bb not found: ID does not exist"
Mar 19 09:27:42 crc kubenswrapper[4835]: I0319 09:27:42.806053 4835 scope.go:117] "RemoveContainer" containerID="0c5f73276b1f6430b25a4fce44a5f5d2b3abf19484db5dc9548c5e22dad82390"
Mar 19 09:27:42 crc kubenswrapper[4835]: E0319 09:27:42.806568 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c5f73276b1f6430b25a4fce44a5f5d2b3abf19484db5dc9548c5e22dad82390\": container with ID starting with 0c5f73276b1f6430b25a4fce44a5f5d2b3abf19484db5dc9548c5e22dad82390 not found: ID does not exist" containerID="0c5f73276b1f6430b25a4fce44a5f5d2b3abf19484db5dc9548c5e22dad82390"
Mar 19 09:27:42 crc 
kubenswrapper[4835]: I0319 09:27:42.806594 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c5f73276b1f6430b25a4fce44a5f5d2b3abf19484db5dc9548c5e22dad82390"} err="failed to get container status \"0c5f73276b1f6430b25a4fce44a5f5d2b3abf19484db5dc9548c5e22dad82390\": rpc error: code = NotFound desc = could not find container \"0c5f73276b1f6430b25a4fce44a5f5d2b3abf19484db5dc9548c5e22dad82390\": container with ID starting with 0c5f73276b1f6430b25a4fce44a5f5d2b3abf19484db5dc9548c5e22dad82390 not found: ID does not exist" Mar 19 09:27:44 crc kubenswrapper[4835]: I0319 09:27:44.041037 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mr45m"] Mar 19 09:27:44 crc kubenswrapper[4835]: I0319 09:27:44.042677 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mr45m" podUID="de9aaba1-3f3b-4c79-ba35-14f28fbbf657" containerName="registry-server" containerID="cri-o://d679c2f4a7177052049b575eecf0120daafa85b8d555bea3d30d6446ddb281f5" gracePeriod=2 Mar 19 09:27:44 crc kubenswrapper[4835]: I0319 09:27:44.413914 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db372943-349a-417e-af5e-588162580a5a" path="/var/lib/kubelet/pods/db372943-349a-417e-af5e-588162580a5a/volumes" Mar 19 09:27:45 crc kubenswrapper[4835]: I0319 09:27:45.736816 4835 generic.go:334] "Generic (PLEG): container finished" podID="de9aaba1-3f3b-4c79-ba35-14f28fbbf657" containerID="d679c2f4a7177052049b575eecf0120daafa85b8d555bea3d30d6446ddb281f5" exitCode=0 Mar 19 09:27:45 crc kubenswrapper[4835]: I0319 09:27:45.737816 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mr45m" event={"ID":"de9aaba1-3f3b-4c79-ba35-14f28fbbf657","Type":"ContainerDied","Data":"d679c2f4a7177052049b575eecf0120daafa85b8d555bea3d30d6446ddb281f5"} Mar 19 09:27:46 crc kubenswrapper[4835]: I0319 
09:27:46.268444 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mr45m" Mar 19 09:27:46 crc kubenswrapper[4835]: I0319 09:27:46.418110 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de9aaba1-3f3b-4c79-ba35-14f28fbbf657-utilities\") pod \"de9aaba1-3f3b-4c79-ba35-14f28fbbf657\" (UID: \"de9aaba1-3f3b-4c79-ba35-14f28fbbf657\") " Mar 19 09:27:46 crc kubenswrapper[4835]: I0319 09:27:46.418243 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78fwt\" (UniqueName: \"kubernetes.io/projected/de9aaba1-3f3b-4c79-ba35-14f28fbbf657-kube-api-access-78fwt\") pod \"de9aaba1-3f3b-4c79-ba35-14f28fbbf657\" (UID: \"de9aaba1-3f3b-4c79-ba35-14f28fbbf657\") " Mar 19 09:27:46 crc kubenswrapper[4835]: I0319 09:27:46.418288 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de9aaba1-3f3b-4c79-ba35-14f28fbbf657-catalog-content\") pod \"de9aaba1-3f3b-4c79-ba35-14f28fbbf657\" (UID: \"de9aaba1-3f3b-4c79-ba35-14f28fbbf657\") " Mar 19 09:27:46 crc kubenswrapper[4835]: I0319 09:27:46.419835 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de9aaba1-3f3b-4c79-ba35-14f28fbbf657-utilities" (OuterVolumeSpecName: "utilities") pod "de9aaba1-3f3b-4c79-ba35-14f28fbbf657" (UID: "de9aaba1-3f3b-4c79-ba35-14f28fbbf657"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:27:46 crc kubenswrapper[4835]: I0319 09:27:46.430092 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de9aaba1-3f3b-4c79-ba35-14f28fbbf657-kube-api-access-78fwt" (OuterVolumeSpecName: "kube-api-access-78fwt") pod "de9aaba1-3f3b-4c79-ba35-14f28fbbf657" (UID: "de9aaba1-3f3b-4c79-ba35-14f28fbbf657"). InnerVolumeSpecName "kube-api-access-78fwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:27:46 crc kubenswrapper[4835]: I0319 09:27:46.452314 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de9aaba1-3f3b-4c79-ba35-14f28fbbf657-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de9aaba1-3f3b-4c79-ba35-14f28fbbf657" (UID: "de9aaba1-3f3b-4c79-ba35-14f28fbbf657"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:27:46 crc kubenswrapper[4835]: I0319 09:27:46.520017 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de9aaba1-3f3b-4c79-ba35-14f28fbbf657-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:46 crc kubenswrapper[4835]: I0319 09:27:46.520083 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de9aaba1-3f3b-4c79-ba35-14f28fbbf657-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:46 crc kubenswrapper[4835]: I0319 09:27:46.520107 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78fwt\" (UniqueName: \"kubernetes.io/projected/de9aaba1-3f3b-4c79-ba35-14f28fbbf657-kube-api-access-78fwt\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:46 crc kubenswrapper[4835]: I0319 09:27:46.746702 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mr45m" 
event={"ID":"de9aaba1-3f3b-4c79-ba35-14f28fbbf657","Type":"ContainerDied","Data":"fa8adbcb4f64c9a6994d1ae6b3d335b7b24858032a4c1abeac46ecdd3bf23d15"} Mar 19 09:27:46 crc kubenswrapper[4835]: I0319 09:27:46.746799 4835 scope.go:117] "RemoveContainer" containerID="d679c2f4a7177052049b575eecf0120daafa85b8d555bea3d30d6446ddb281f5" Mar 19 09:27:46 crc kubenswrapper[4835]: I0319 09:27:46.746880 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mr45m" Mar 19 09:27:46 crc kubenswrapper[4835]: I0319 09:27:46.771949 4835 scope.go:117] "RemoveContainer" containerID="dd79cc9a6c395770af0f4dbbf672b67c9aa2eb05852ec299bbd8d6188fbaebae" Mar 19 09:27:46 crc kubenswrapper[4835]: I0319 09:27:46.802711 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mr45m"] Mar 19 09:27:46 crc kubenswrapper[4835]: I0319 09:27:46.807567 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mr45m"] Mar 19 09:27:46 crc kubenswrapper[4835]: I0319 09:27:46.809889 4835 scope.go:117] "RemoveContainer" containerID="2f86417ed996ad9036a899c2bfb75ade718562815c409d3ad3d0478a6229ddcc" Mar 19 09:27:48 crc kubenswrapper[4835]: I0319 09:27:48.408893 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de9aaba1-3f3b-4c79-ba35-14f28fbbf657" path="/var/lib/kubelet/pods/de9aaba1-3f3b-4c79-ba35-14f28fbbf657/volumes" Mar 19 09:27:49 crc kubenswrapper[4835]: I0319 09:27:49.440319 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hk7rv" Mar 19 09:27:49 crc kubenswrapper[4835]: I0319 09:27:49.991809 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-svxz5"] Mar 19 09:27:50 crc kubenswrapper[4835]: I0319 09:27:50.429215 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-65hbx" Mar 19 09:27:50 crc kubenswrapper[4835]: I0319 09:27:50.480604 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-65hbx" Mar 19 09:27:52 crc kubenswrapper[4835]: I0319 09:27:52.844604 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-65hbx"] Mar 19 09:27:52 crc kubenswrapper[4835]: I0319 09:27:52.844982 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-65hbx" podUID="43fa0a19-2075-4d87-80cb-5e6c24854e5f" containerName="registry-server" containerID="cri-o://312dde906229d6232e00141b10af71274f08c678b019de26115e928a9426168a" gracePeriod=2 Mar 19 09:27:53 crc kubenswrapper[4835]: I0319 09:27:53.345322 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-65hbx" Mar 19 09:27:53 crc kubenswrapper[4835]: I0319 09:27:53.513884 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43fa0a19-2075-4d87-80cb-5e6c24854e5f-catalog-content\") pod \"43fa0a19-2075-4d87-80cb-5e6c24854e5f\" (UID: \"43fa0a19-2075-4d87-80cb-5e6c24854e5f\") " Mar 19 09:27:53 crc kubenswrapper[4835]: I0319 09:27:53.514031 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43fa0a19-2075-4d87-80cb-5e6c24854e5f-utilities\") pod \"43fa0a19-2075-4d87-80cb-5e6c24854e5f\" (UID: \"43fa0a19-2075-4d87-80cb-5e6c24854e5f\") " Mar 19 09:27:53 crc kubenswrapper[4835]: I0319 09:27:53.514072 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5jmm\" (UniqueName: \"kubernetes.io/projected/43fa0a19-2075-4d87-80cb-5e6c24854e5f-kube-api-access-h5jmm\") pod \"43fa0a19-2075-4d87-80cb-5e6c24854e5f\" (UID: 
\"43fa0a19-2075-4d87-80cb-5e6c24854e5f\") " Mar 19 09:27:53 crc kubenswrapper[4835]: I0319 09:27:53.515096 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43fa0a19-2075-4d87-80cb-5e6c24854e5f-utilities" (OuterVolumeSpecName: "utilities") pod "43fa0a19-2075-4d87-80cb-5e6c24854e5f" (UID: "43fa0a19-2075-4d87-80cb-5e6c24854e5f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:27:53 crc kubenswrapper[4835]: I0319 09:27:53.521528 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43fa0a19-2075-4d87-80cb-5e6c24854e5f-kube-api-access-h5jmm" (OuterVolumeSpecName: "kube-api-access-h5jmm") pod "43fa0a19-2075-4d87-80cb-5e6c24854e5f" (UID: "43fa0a19-2075-4d87-80cb-5e6c24854e5f"). InnerVolumeSpecName "kube-api-access-h5jmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:27:53 crc kubenswrapper[4835]: I0319 09:27:53.615361 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43fa0a19-2075-4d87-80cb-5e6c24854e5f-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:53 crc kubenswrapper[4835]: I0319 09:27:53.615406 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5jmm\" (UniqueName: \"kubernetes.io/projected/43fa0a19-2075-4d87-80cb-5e6c24854e5f-kube-api-access-h5jmm\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:53 crc kubenswrapper[4835]: I0319 09:27:53.647809 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43fa0a19-2075-4d87-80cb-5e6c24854e5f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "43fa0a19-2075-4d87-80cb-5e6c24854e5f" (UID: "43fa0a19-2075-4d87-80cb-5e6c24854e5f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:27:53 crc kubenswrapper[4835]: I0319 09:27:53.716078 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43fa0a19-2075-4d87-80cb-5e6c24854e5f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:53 crc kubenswrapper[4835]: I0319 09:27:53.800656 4835 generic.go:334] "Generic (PLEG): container finished" podID="43fa0a19-2075-4d87-80cb-5e6c24854e5f" containerID="312dde906229d6232e00141b10af71274f08c678b019de26115e928a9426168a" exitCode=0 Mar 19 09:27:53 crc kubenswrapper[4835]: I0319 09:27:53.800705 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-65hbx" event={"ID":"43fa0a19-2075-4d87-80cb-5e6c24854e5f","Type":"ContainerDied","Data":"312dde906229d6232e00141b10af71274f08c678b019de26115e928a9426168a"} Mar 19 09:27:53 crc kubenswrapper[4835]: I0319 09:27:53.800771 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-65hbx" Mar 19 09:27:53 crc kubenswrapper[4835]: I0319 09:27:53.800791 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-65hbx" event={"ID":"43fa0a19-2075-4d87-80cb-5e6c24854e5f","Type":"ContainerDied","Data":"227137ec04f02a194e7741f3ca9b518a76545afcfe38d7bf771c0f04aff08907"} Mar 19 09:27:53 crc kubenswrapper[4835]: I0319 09:27:53.800828 4835 scope.go:117] "RemoveContainer" containerID="312dde906229d6232e00141b10af71274f08c678b019de26115e928a9426168a" Mar 19 09:27:53 crc kubenswrapper[4835]: I0319 09:27:53.816120 4835 scope.go:117] "RemoveContainer" containerID="6d7dce15690c21a0ed60cd0b6e54b4eb8d043436f9af5ec5729128d6fad9b6eb" Mar 19 09:27:53 crc kubenswrapper[4835]: I0319 09:27:53.834228 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-65hbx"] Mar 19 09:27:53 crc kubenswrapper[4835]: I0319 09:27:53.834977 4835 scope.go:117] "RemoveContainer" containerID="374849e46ffb89a75259b1aea5b268c3cb82f53375459d67e762bccba0522c12" Mar 19 09:27:53 crc kubenswrapper[4835]: I0319 09:27:53.838892 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-65hbx"] Mar 19 09:27:53 crc kubenswrapper[4835]: I0319 09:27:53.850535 4835 scope.go:117] "RemoveContainer" containerID="312dde906229d6232e00141b10af71274f08c678b019de26115e928a9426168a" Mar 19 09:27:53 crc kubenswrapper[4835]: E0319 09:27:53.851775 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"312dde906229d6232e00141b10af71274f08c678b019de26115e928a9426168a\": container with ID starting with 312dde906229d6232e00141b10af71274f08c678b019de26115e928a9426168a not found: ID does not exist" containerID="312dde906229d6232e00141b10af71274f08c678b019de26115e928a9426168a" Mar 19 09:27:53 crc kubenswrapper[4835]: I0319 09:27:53.851821 4835 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"312dde906229d6232e00141b10af71274f08c678b019de26115e928a9426168a"} err="failed to get container status \"312dde906229d6232e00141b10af71274f08c678b019de26115e928a9426168a\": rpc error: code = NotFound desc = could not find container \"312dde906229d6232e00141b10af71274f08c678b019de26115e928a9426168a\": container with ID starting with 312dde906229d6232e00141b10af71274f08c678b019de26115e928a9426168a not found: ID does not exist" Mar 19 09:27:53 crc kubenswrapper[4835]: I0319 09:27:53.851846 4835 scope.go:117] "RemoveContainer" containerID="6d7dce15690c21a0ed60cd0b6e54b4eb8d043436f9af5ec5729128d6fad9b6eb" Mar 19 09:27:53 crc kubenswrapper[4835]: E0319 09:27:53.852352 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d7dce15690c21a0ed60cd0b6e54b4eb8d043436f9af5ec5729128d6fad9b6eb\": container with ID starting with 6d7dce15690c21a0ed60cd0b6e54b4eb8d043436f9af5ec5729128d6fad9b6eb not found: ID does not exist" containerID="6d7dce15690c21a0ed60cd0b6e54b4eb8d043436f9af5ec5729128d6fad9b6eb" Mar 19 09:27:53 crc kubenswrapper[4835]: I0319 09:27:53.852394 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d7dce15690c21a0ed60cd0b6e54b4eb8d043436f9af5ec5729128d6fad9b6eb"} err="failed to get container status \"6d7dce15690c21a0ed60cd0b6e54b4eb8d043436f9af5ec5729128d6fad9b6eb\": rpc error: code = NotFound desc = could not find container \"6d7dce15690c21a0ed60cd0b6e54b4eb8d043436f9af5ec5729128d6fad9b6eb\": container with ID starting with 6d7dce15690c21a0ed60cd0b6e54b4eb8d043436f9af5ec5729128d6fad9b6eb not found: ID does not exist" Mar 19 09:27:53 crc kubenswrapper[4835]: I0319 09:27:53.852427 4835 scope.go:117] "RemoveContainer" containerID="374849e46ffb89a75259b1aea5b268c3cb82f53375459d67e762bccba0522c12" Mar 19 09:27:53 crc kubenswrapper[4835]: E0319 
09:27:53.852727 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"374849e46ffb89a75259b1aea5b268c3cb82f53375459d67e762bccba0522c12\": container with ID starting with 374849e46ffb89a75259b1aea5b268c3cb82f53375459d67e762bccba0522c12 not found: ID does not exist" containerID="374849e46ffb89a75259b1aea5b268c3cb82f53375459d67e762bccba0522c12" Mar 19 09:27:53 crc kubenswrapper[4835]: I0319 09:27:53.852790 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"374849e46ffb89a75259b1aea5b268c3cb82f53375459d67e762bccba0522c12"} err="failed to get container status \"374849e46ffb89a75259b1aea5b268c3cb82f53375459d67e762bccba0522c12\": rpc error: code = NotFound desc = could not find container \"374849e46ffb89a75259b1aea5b268c3cb82f53375459d67e762bccba0522c12\": container with ID starting with 374849e46ffb89a75259b1aea5b268c3cb82f53375459d67e762bccba0522c12 not found: ID does not exist" Mar 19 09:27:54 crc kubenswrapper[4835]: I0319 09:27:54.411885 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43fa0a19-2075-4d87-80cb-5e6c24854e5f" path="/var/lib/kubelet/pods/43fa0a19-2075-4d87-80cb-5e6c24854e5f/volumes" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.083263 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp"] Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.084099 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp" podUID="56945271-3324-4200-a0e8-36fcef33cd2a" containerName="controller-manager" containerID="cri-o://a873e42ae660f76017065aabdd0db796ab04d30dae06a9b2a994c85ec8e03750" gracePeriod=30 Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.162136 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-845749c45c-mns8w"] Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.162420 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-845749c45c-mns8w" podUID="9422c9c6-1386-4362-ba32-a791956ada58" containerName="route-controller-manager" containerID="cri-o://8a363152f09eb7f93e85c8480e4cefa5169e3e8dc93b46d53905712e572f0699" gracePeriod=30 Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.548343 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.596270 4835 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 19 09:27:57 crc kubenswrapper[4835]: E0319 09:27:57.596525 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de9aaba1-3f3b-4c79-ba35-14f28fbbf657" containerName="extract-utilities" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.596552 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="de9aaba1-3f3b-4c79-ba35-14f28fbbf657" containerName="extract-utilities" Mar 19 09:27:57 crc kubenswrapper[4835]: E0319 09:27:57.596574 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db372943-349a-417e-af5e-588162580a5a" containerName="extract-utilities" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.596588 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="db372943-349a-417e-af5e-588162580a5a" containerName="extract-utilities" Mar 19 09:27:57 crc kubenswrapper[4835]: E0319 09:27:57.596635 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43fa0a19-2075-4d87-80cb-5e6c24854e5f" containerName="extract-content" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.596650 4835 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="43fa0a19-2075-4d87-80cb-5e6c24854e5f" containerName="extract-content" Mar 19 09:27:57 crc kubenswrapper[4835]: E0319 09:27:57.596670 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de9aaba1-3f3b-4c79-ba35-14f28fbbf657" containerName="registry-server" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.596683 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="de9aaba1-3f3b-4c79-ba35-14f28fbbf657" containerName="registry-server" Mar 19 09:27:57 crc kubenswrapper[4835]: E0319 09:27:57.596705 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56945271-3324-4200-a0e8-36fcef33cd2a" containerName="controller-manager" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.596718 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="56945271-3324-4200-a0e8-36fcef33cd2a" containerName="controller-manager" Mar 19 09:27:57 crc kubenswrapper[4835]: E0319 09:27:57.596732 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db372943-349a-417e-af5e-588162580a5a" containerName="registry-server" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.596766 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="db372943-349a-417e-af5e-588162580a5a" containerName="registry-server" Mar 19 09:27:57 crc kubenswrapper[4835]: E0319 09:27:57.596785 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db372943-349a-417e-af5e-588162580a5a" containerName="extract-content" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.596798 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="db372943-349a-417e-af5e-588162580a5a" containerName="extract-content" Mar 19 09:27:57 crc kubenswrapper[4835]: E0319 09:27:57.596820 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43fa0a19-2075-4d87-80cb-5e6c24854e5f" containerName="extract-utilities" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.596832 4835 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="43fa0a19-2075-4d87-80cb-5e6c24854e5f" containerName="extract-utilities" Mar 19 09:27:57 crc kubenswrapper[4835]: E0319 09:27:57.596852 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de9aaba1-3f3b-4c79-ba35-14f28fbbf657" containerName="extract-content" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.596865 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="de9aaba1-3f3b-4c79-ba35-14f28fbbf657" containerName="extract-content" Mar 19 09:27:57 crc kubenswrapper[4835]: E0319 09:27:57.596881 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43fa0a19-2075-4d87-80cb-5e6c24854e5f" containerName="registry-server" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.596894 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="43fa0a19-2075-4d87-80cb-5e6c24854e5f" containerName="registry-server" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.597069 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="de9aaba1-3f3b-4c79-ba35-14f28fbbf657" containerName="registry-server" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.597096 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="56945271-3324-4200-a0e8-36fcef33cd2a" containerName="controller-manager" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.597115 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="db372943-349a-417e-af5e-588162580a5a" containerName="registry-server" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.597133 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="43fa0a19-2075-4d87-80cb-5e6c24854e5f" containerName="registry-server" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.597603 4835 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.597642 4835 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 19 09:27:57 crc kubenswrapper[4835]: E0319 09:27:57.597930 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.597955 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.597962 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 09:27:57 crc kubenswrapper[4835]: E0319 09:27:57.597976 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.598208 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 19 09:27:57 crc kubenswrapper[4835]: E0319 09:27:57.598255 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.598269 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 09:27:57 crc kubenswrapper[4835]: E0319 09:27:57.598286 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.598299 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 09:27:57 crc kubenswrapper[4835]: E0319 09:27:57.598322 4835 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.598336 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 09:27:57 crc kubenswrapper[4835]: E0319 09:27:57.598354 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.598367 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 19 09:27:57 crc kubenswrapper[4835]: E0319 09:27:57.598391 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.598406 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 19 09:27:57 crc kubenswrapper[4835]: E0319 09:27:57.598422 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.598434 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.598631 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.598649 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 19 09:27:57 crc 
kubenswrapper[4835]: I0319 09:27:57.598690 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd" gracePeriod=15 Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.598649 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb" gracePeriod=15 Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.598312 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be" gracePeriod=15 Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.598842 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d" gracePeriod=15 Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.598898 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438" gracePeriod=15 Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.598699 4835 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.598964 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.598986 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.599003 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.599016 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.599025 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 09:27:57 crc kubenswrapper[4835]: E0319 09:27:57.599204 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.599219 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 09:27:57 crc kubenswrapper[4835]: E0319 09:27:57.599248 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.599259 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.599381 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.602851 4835 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.642230 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-845749c45c-mns8w" Mar 19 09:27:57 crc kubenswrapper[4835]: E0319 09:27:57.660200 4835 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.129.56.116:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.664263 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/56945271-3324-4200-a0e8-36fcef33cd2a-client-ca\") pod \"56945271-3324-4200-a0e8-36fcef33cd2a\" (UID: \"56945271-3324-4200-a0e8-36fcef33cd2a\") " Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.664331 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/56945271-3324-4200-a0e8-36fcef33cd2a-proxy-ca-bundles\") pod \"56945271-3324-4200-a0e8-36fcef33cd2a\" (UID: \"56945271-3324-4200-a0e8-36fcef33cd2a\") " Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.664423 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dv4sn\" 
(UniqueName: \"kubernetes.io/projected/56945271-3324-4200-a0e8-36fcef33cd2a-kube-api-access-dv4sn\") pod \"56945271-3324-4200-a0e8-36fcef33cd2a\" (UID: \"56945271-3324-4200-a0e8-36fcef33cd2a\") " Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.664455 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56945271-3324-4200-a0e8-36fcef33cd2a-serving-cert\") pod \"56945271-3324-4200-a0e8-36fcef33cd2a\" (UID: \"56945271-3324-4200-a0e8-36fcef33cd2a\") " Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.664509 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56945271-3324-4200-a0e8-36fcef33cd2a-config\") pod \"56945271-3324-4200-a0e8-36fcef33cd2a\" (UID: \"56945271-3324-4200-a0e8-36fcef33cd2a\") " Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.665480 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56945271-3324-4200-a0e8-36fcef33cd2a-client-ca" (OuterVolumeSpecName: "client-ca") pod "56945271-3324-4200-a0e8-36fcef33cd2a" (UID: "56945271-3324-4200-a0e8-36fcef33cd2a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.665574 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56945271-3324-4200-a0e8-36fcef33cd2a-config" (OuterVolumeSpecName: "config") pod "56945271-3324-4200-a0e8-36fcef33cd2a" (UID: "56945271-3324-4200-a0e8-36fcef33cd2a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.665604 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56945271-3324-4200-a0e8-36fcef33cd2a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "56945271-3324-4200-a0e8-36fcef33cd2a" (UID: "56945271-3324-4200-a0e8-36fcef33cd2a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.672406 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56945271-3324-4200-a0e8-36fcef33cd2a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "56945271-3324-4200-a0e8-36fcef33cd2a" (UID: "56945271-3324-4200-a0e8-36fcef33cd2a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.672615 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56945271-3324-4200-a0e8-36fcef33cd2a-kube-api-access-dv4sn" (OuterVolumeSpecName: "kube-api-access-dv4sn") pod "56945271-3324-4200-a0e8-36fcef33cd2a" (UID: "56945271-3324-4200-a0e8-36fcef33cd2a"). InnerVolumeSpecName "kube-api-access-dv4sn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.766351 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9422c9c6-1386-4362-ba32-a791956ada58-client-ca\") pod \"9422c9c6-1386-4362-ba32-a791956ada58\" (UID: \"9422c9c6-1386-4362-ba32-a791956ada58\") " Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.766447 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9422c9c6-1386-4362-ba32-a791956ada58-serving-cert\") pod \"9422c9c6-1386-4362-ba32-a791956ada58\" (UID: \"9422c9c6-1386-4362-ba32-a791956ada58\") " Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.766514 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbk5w\" (UniqueName: \"kubernetes.io/projected/9422c9c6-1386-4362-ba32-a791956ada58-kube-api-access-cbk5w\") pod \"9422c9c6-1386-4362-ba32-a791956ada58\" (UID: \"9422c9c6-1386-4362-ba32-a791956ada58\") " Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.766572 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9422c9c6-1386-4362-ba32-a791956ada58-config\") pod \"9422c9c6-1386-4362-ba32-a791956ada58\" (UID: \"9422c9c6-1386-4362-ba32-a791956ada58\") " Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.766783 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.766810 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.766832 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.766848 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.766887 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.766913 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.766978 
4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.767016 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.767063 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56945271-3324-4200-a0e8-36fcef33cd2a-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.767079 4835 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/56945271-3324-4200-a0e8-36fcef33cd2a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.767092 4835 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/56945271-3324-4200-a0e8-36fcef33cd2a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.767105 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dv4sn\" (UniqueName: \"kubernetes.io/projected/56945271-3324-4200-a0e8-36fcef33cd2a-kube-api-access-dv4sn\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.767113 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/56945271-3324-4200-a0e8-36fcef33cd2a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.767352 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9422c9c6-1386-4362-ba32-a791956ada58-config" (OuterVolumeSpecName: "config") pod "9422c9c6-1386-4362-ba32-a791956ada58" (UID: "9422c9c6-1386-4362-ba32-a791956ada58"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.767387 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9422c9c6-1386-4362-ba32-a791956ada58-client-ca" (OuterVolumeSpecName: "client-ca") pod "9422c9c6-1386-4362-ba32-a791956ada58" (UID: "9422c9c6-1386-4362-ba32-a791956ada58"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.769237 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9422c9c6-1386-4362-ba32-a791956ada58-kube-api-access-cbk5w" (OuterVolumeSpecName: "kube-api-access-cbk5w") pod "9422c9c6-1386-4362-ba32-a791956ada58" (UID: "9422c9c6-1386-4362-ba32-a791956ada58"). InnerVolumeSpecName "kube-api-access-cbk5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.769826 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9422c9c6-1386-4362-ba32-a791956ada58-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9422c9c6-1386-4362-ba32-a791956ada58" (UID: "9422c9c6-1386-4362-ba32-a791956ada58"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.826820 4835 generic.go:334] "Generic (PLEG): container finished" podID="56945271-3324-4200-a0e8-36fcef33cd2a" containerID="a873e42ae660f76017065aabdd0db796ab04d30dae06a9b2a994c85ec8e03750" exitCode=0 Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.826899 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp" event={"ID":"56945271-3324-4200-a0e8-36fcef33cd2a","Type":"ContainerDied","Data":"a873e42ae660f76017065aabdd0db796ab04d30dae06a9b2a994c85ec8e03750"} Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.826908 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.826929 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp" event={"ID":"56945271-3324-4200-a0e8-36fcef33cd2a","Type":"ContainerDied","Data":"1d6e5d558b5e0e5eb83b6e0571866c68a0f9c63c910e3ec2a9e1c9256db50d15"} Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.826951 4835 scope.go:117] "RemoveContainer" containerID="a873e42ae660f76017065aabdd0db796ab04d30dae06a9b2a994c85ec8e03750" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.829687 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.830930 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.831825 4835 generic.go:334] "Generic (PLEG): container finished" 
podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb" exitCode=0 Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.831855 4835 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd" exitCode=0 Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.831867 4835 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d" exitCode=0 Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.831878 4835 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438" exitCode=2 Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.836940 4835 generic.go:334] "Generic (PLEG): container finished" podID="9422c9c6-1386-4362-ba32-a791956ada58" containerID="8a363152f09eb7f93e85c8480e4cefa5169e3e8dc93b46d53905712e572f0699" exitCode=0 Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.837022 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-845749c45c-mns8w" event={"ID":"9422c9c6-1386-4362-ba32-a791956ada58","Type":"ContainerDied","Data":"8a363152f09eb7f93e85c8480e4cefa5169e3e8dc93b46d53905712e572f0699"} Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.837059 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-845749c45c-mns8w" event={"ID":"9422c9c6-1386-4362-ba32-a791956ada58","Type":"ContainerDied","Data":"bb17254bae23e3abf0bb7ad99cc50542f763a228c07df81bd5bd2932832f270f"} Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.837130 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-845749c45c-mns8w" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.850449 4835 generic.go:334] "Generic (PLEG): container finished" podID="0f3c01a2-88d9-432d-9f82-b1f7e2184ec2" containerID="736547d7309147a69b7b4a2bed1bb13887d098f1bbd7a77222bb67dbf396ec08" exitCode=0 Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.850490 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"0f3c01a2-88d9-432d-9f82-b1f7e2184ec2","Type":"ContainerDied","Data":"736547d7309147a69b7b4a2bed1bb13887d098f1bbd7a77222bb67dbf396ec08"} Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.867875 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.867923 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.867950 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.867974 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" 
(UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.868022 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.868059 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.868089 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.868140 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.868187 4835 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/9422c9c6-1386-4362-ba32-a791956ada58-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.868200 4835 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9422c9c6-1386-4362-ba32-a791956ada58-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.868213 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbk5w\" (UniqueName: \"kubernetes.io/projected/9422c9c6-1386-4362-ba32-a791956ada58-kube-api-access-cbk5w\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.868227 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9422c9c6-1386-4362-ba32-a791956ada58-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.868272 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.868315 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.868345 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.868372 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.868399 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.868426 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.868452 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.868517 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 09:27:57 crc 
kubenswrapper[4835]: I0319 09:27:57.877051 4835 scope.go:117] "RemoveContainer" containerID="a873e42ae660f76017065aabdd0db796ab04d30dae06a9b2a994c85ec8e03750" Mar 19 09:27:57 crc kubenswrapper[4835]: E0319 09:27:57.877410 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a873e42ae660f76017065aabdd0db796ab04d30dae06a9b2a994c85ec8e03750\": container with ID starting with a873e42ae660f76017065aabdd0db796ab04d30dae06a9b2a994c85ec8e03750 not found: ID does not exist" containerID="a873e42ae660f76017065aabdd0db796ab04d30dae06a9b2a994c85ec8e03750" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.877469 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a873e42ae660f76017065aabdd0db796ab04d30dae06a9b2a994c85ec8e03750"} err="failed to get container status \"a873e42ae660f76017065aabdd0db796ab04d30dae06a9b2a994c85ec8e03750\": rpc error: code = NotFound desc = could not find container \"a873e42ae660f76017065aabdd0db796ab04d30dae06a9b2a994c85ec8e03750\": container with ID starting with a873e42ae660f76017065aabdd0db796ab04d30dae06a9b2a994c85ec8e03750 not found: ID does not exist" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.877494 4835 scope.go:117] "RemoveContainer" containerID="3e8b71841021ecb05090f294c4c911448941e8298b79721dbdee54dfdeeb63af" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.904804 4835 scope.go:117] "RemoveContainer" containerID="8a363152f09eb7f93e85c8480e4cefa5169e3e8dc93b46d53905712e572f0699" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.925680 4835 scope.go:117] "RemoveContainer" containerID="8a363152f09eb7f93e85c8480e4cefa5169e3e8dc93b46d53905712e572f0699" Mar 19 09:27:57 crc kubenswrapper[4835]: E0319 09:27:57.926980 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8a363152f09eb7f93e85c8480e4cefa5169e3e8dc93b46d53905712e572f0699\": container with ID starting with 8a363152f09eb7f93e85c8480e4cefa5169e3e8dc93b46d53905712e572f0699 not found: ID does not exist" containerID="8a363152f09eb7f93e85c8480e4cefa5169e3e8dc93b46d53905712e572f0699" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.927015 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a363152f09eb7f93e85c8480e4cefa5169e3e8dc93b46d53905712e572f0699"} err="failed to get container status \"8a363152f09eb7f93e85c8480e4cefa5169e3e8dc93b46d53905712e572f0699\": rpc error: code = NotFound desc = could not find container \"8a363152f09eb7f93e85c8480e4cefa5169e3e8dc93b46d53905712e572f0699\": container with ID starting with 8a363152f09eb7f93e85c8480e4cefa5169e3e8dc93b46d53905712e572f0699 not found: ID does not exist" Mar 19 09:27:57 crc kubenswrapper[4835]: I0319 09:27:57.961333 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 09:27:57 crc kubenswrapper[4835]: W0319 09:27:57.992348 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-d4f60f13adf7b1b8c97e334df37653e00c8bdb07cd3a1a480f5827501f0c94a4 WatchSource:0}: Error finding container d4f60f13adf7b1b8c97e334df37653e00c8bdb07cd3a1a480f5827501f0c94a4: Status 404 returned error can't find the container with id d4f60f13adf7b1b8c97e334df37653e00c8bdb07cd3a1a480f5827501f0c94a4 Mar 19 09:27:57 crc kubenswrapper[4835]: E0319 09:27:57.995076 4835 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.116:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e34000980b74d 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:27:57.994628941 +0000 UTC m=+332.843227528,LastTimestamp:2026-03-19 09:27:57.994628941 +0000 UTC m=+332.843227528,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:27:58 crc kubenswrapper[4835]: E0319 09:27:58.688909 4835 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.116:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e34000980b74d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:27:57.994628941 +0000 UTC m=+332.843227528,LastTimestamp:2026-03-19 09:27:57.994628941 +0000 UTC m=+332.843227528,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:27:58 crc kubenswrapper[4835]: I0319 09:27:58.861700 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 19 09:27:58 crc kubenswrapper[4835]: I0319 09:27:58.869726 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"6f82ee911bfa8920ef36b7dbb144d88a07f43656a65eb1e5bc0858bfc3e14558"} Mar 19 09:27:58 crc kubenswrapper[4835]: I0319 09:27:58.869859 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"d4f60f13adf7b1b8c97e334df37653e00c8bdb07cd3a1a480f5827501f0c94a4"} Mar 19 09:27:58 crc kubenswrapper[4835]: E0319 09:27:58.870940 4835 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.129.56.116:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 09:27:59 crc kubenswrapper[4835]: E0319 09:27:59.040281 4835 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.116:6443: connect: connection refused" Mar 19 09:27:59 crc kubenswrapper[4835]: E0319 09:27:59.040798 4835 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.116:6443: connect: connection refused" Mar 19 09:27:59 crc kubenswrapper[4835]: E0319 09:27:59.041271 4835 
controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.116:6443: connect: connection refused" Mar 19 09:27:59 crc kubenswrapper[4835]: E0319 09:27:59.041888 4835 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.116:6443: connect: connection refused" Mar 19 09:27:59 crc kubenswrapper[4835]: E0319 09:27:59.042269 4835 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.116:6443: connect: connection refused" Mar 19 09:27:59 crc kubenswrapper[4835]: I0319 09:27:59.042312 4835 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 19 09:27:59 crc kubenswrapper[4835]: E0319 09:27:59.042626 4835 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.116:6443: connect: connection refused" interval="200ms" Mar 19 09:27:59 crc kubenswrapper[4835]: I0319 09:27:59.123690 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 19 09:27:59 crc kubenswrapper[4835]: I0319 09:27:59.183024 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0f3c01a2-88d9-432d-9f82-b1f7e2184ec2-var-lock\") pod \"0f3c01a2-88d9-432d-9f82-b1f7e2184ec2\" (UID: \"0f3c01a2-88d9-432d-9f82-b1f7e2184ec2\") " Mar 19 09:27:59 crc kubenswrapper[4835]: I0319 09:27:59.183514 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0f3c01a2-88d9-432d-9f82-b1f7e2184ec2-kube-api-access\") pod \"0f3c01a2-88d9-432d-9f82-b1f7e2184ec2\" (UID: \"0f3c01a2-88d9-432d-9f82-b1f7e2184ec2\") " Mar 19 09:27:59 crc kubenswrapper[4835]: I0319 09:27:59.183547 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0f3c01a2-88d9-432d-9f82-b1f7e2184ec2-kubelet-dir\") pod \"0f3c01a2-88d9-432d-9f82-b1f7e2184ec2\" (UID: \"0f3c01a2-88d9-432d-9f82-b1f7e2184ec2\") " Mar 19 09:27:59 crc kubenswrapper[4835]: I0319 09:27:59.183165 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f3c01a2-88d9-432d-9f82-b1f7e2184ec2-var-lock" (OuterVolumeSpecName: "var-lock") pod "0f3c01a2-88d9-432d-9f82-b1f7e2184ec2" (UID: "0f3c01a2-88d9-432d-9f82-b1f7e2184ec2"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:27:59 crc kubenswrapper[4835]: I0319 09:27:59.183815 4835 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0f3c01a2-88d9-432d-9f82-b1f7e2184ec2-var-lock\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:59 crc kubenswrapper[4835]: I0319 09:27:59.184291 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f3c01a2-88d9-432d-9f82-b1f7e2184ec2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0f3c01a2-88d9-432d-9f82-b1f7e2184ec2" (UID: "0f3c01a2-88d9-432d-9f82-b1f7e2184ec2"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:27:59 crc kubenswrapper[4835]: I0319 09:27:59.189816 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f3c01a2-88d9-432d-9f82-b1f7e2184ec2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0f3c01a2-88d9-432d-9f82-b1f7e2184ec2" (UID: "0f3c01a2-88d9-432d-9f82-b1f7e2184ec2"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:27:59 crc kubenswrapper[4835]: E0319 09:27:59.244203 4835 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.116:6443: connect: connection refused" interval="400ms" Mar 19 09:27:59 crc kubenswrapper[4835]: I0319 09:27:59.284678 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0f3c01a2-88d9-432d-9f82-b1f7e2184ec2-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:59 crc kubenswrapper[4835]: I0319 09:27:59.284713 4835 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0f3c01a2-88d9-432d-9f82-b1f7e2184ec2-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 19 09:27:59 crc kubenswrapper[4835]: E0319 09:27:59.645086 4835 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.116:6443: connect: connection refused" interval="800ms" Mar 19 09:27:59 crc kubenswrapper[4835]: I0319 09:27:59.877071 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"0f3c01a2-88d9-432d-9f82-b1f7e2184ec2","Type":"ContainerDied","Data":"cee1adb26d80d8dcd28913d45b924770ffc093c8c04adb0d6cf13cc04b18f2c8"} Mar 19 09:27:59 crc kubenswrapper[4835]: I0319 09:27:59.877109 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cee1adb26d80d8dcd28913d45b924770ffc093c8c04adb0d6cf13cc04b18f2c8" Mar 19 09:27:59 crc kubenswrapper[4835]: I0319 09:27:59.877981 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 19 09:27:59 crc kubenswrapper[4835]: I0319 09:27:59.952528 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 19 09:27:59 crc kubenswrapper[4835]: I0319 09:27:59.953831 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 09:28:00 crc kubenswrapper[4835]: I0319 09:28:00.001209 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 19 09:28:00 crc kubenswrapper[4835]: I0319 09:28:00.001293 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 19 09:28:00 crc kubenswrapper[4835]: I0319 09:28:00.001367 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 19 09:28:00 crc kubenswrapper[4835]: I0319 09:28:00.001388 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:28:00 crc kubenswrapper[4835]: I0319 09:28:00.001453 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:28:00 crc kubenswrapper[4835]: I0319 09:28:00.001551 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:28:00 crc kubenswrapper[4835]: I0319 09:28:00.001713 4835 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 19 09:28:00 crc kubenswrapper[4835]: I0319 09:28:00.001829 4835 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 19 09:28:00 crc kubenswrapper[4835]: I0319 09:28:00.001848 4835 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 19 09:28:00 crc kubenswrapper[4835]: I0319 09:28:00.413089 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 19 09:28:00 crc kubenswrapper[4835]: E0319 09:28:00.446427 4835 controller.go:145] "Failed 
to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.116:6443: connect: connection refused" interval="1.6s" Mar 19 09:28:00 crc kubenswrapper[4835]: I0319 09:28:00.887104 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 19 09:28:00 crc kubenswrapper[4835]: I0319 09:28:00.887826 4835 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be" exitCode=0 Mar 19 09:28:00 crc kubenswrapper[4835]: I0319 09:28:00.887905 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 09:28:00 crc kubenswrapper[4835]: I0319 09:28:00.887909 4835 scope.go:117] "RemoveContainer" containerID="f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb" Mar 19 09:28:00 crc kubenswrapper[4835]: I0319 09:28:00.901827 4835 scope.go:117] "RemoveContainer" containerID="5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd" Mar 19 09:28:00 crc kubenswrapper[4835]: I0319 09:28:00.923826 4835 scope.go:117] "RemoveContainer" containerID="94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d" Mar 19 09:28:00 crc kubenswrapper[4835]: I0319 09:28:00.937004 4835 scope.go:117] "RemoveContainer" containerID="b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438" Mar 19 09:28:00 crc kubenswrapper[4835]: I0319 09:28:00.953354 4835 scope.go:117] "RemoveContainer" containerID="460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be" Mar 19 09:28:00 crc kubenswrapper[4835]: I0319 09:28:00.970073 4835 scope.go:117] "RemoveContainer" containerID="9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a" Mar 
19 09:28:00 crc kubenswrapper[4835]: I0319 09:28:00.991092 4835 scope.go:117] "RemoveContainer" containerID="f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb" Mar 19 09:28:00 crc kubenswrapper[4835]: E0319 09:28:00.991872 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\": container with ID starting with f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb not found: ID does not exist" containerID="f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb" Mar 19 09:28:00 crc kubenswrapper[4835]: I0319 09:28:00.991930 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb"} err="failed to get container status \"f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\": rpc error: code = NotFound desc = could not find container \"f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb\": container with ID starting with f2beefeb377fef1b7e3b25a6d26543480dc6fc669407412379eb44d4146fb4cb not found: ID does not exist" Mar 19 09:28:00 crc kubenswrapper[4835]: I0319 09:28:00.991966 4835 scope.go:117] "RemoveContainer" containerID="5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd" Mar 19 09:28:00 crc kubenswrapper[4835]: E0319 09:28:00.992511 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\": container with ID starting with 5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd not found: ID does not exist" containerID="5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd" Mar 19 09:28:00 crc kubenswrapper[4835]: I0319 09:28:00.992554 4835 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd"} err="failed to get container status \"5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\": rpc error: code = NotFound desc = could not find container \"5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd\": container with ID starting with 5046c4bcce033e1965ee5dfcfbfbd0464fc222683ac5da18248a3255faebbedd not found: ID does not exist" Mar 19 09:28:00 crc kubenswrapper[4835]: I0319 09:28:00.992586 4835 scope.go:117] "RemoveContainer" containerID="94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d" Mar 19 09:28:00 crc kubenswrapper[4835]: E0319 09:28:00.993975 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\": container with ID starting with 94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d not found: ID does not exist" containerID="94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d" Mar 19 09:28:00 crc kubenswrapper[4835]: I0319 09:28:00.994002 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d"} err="failed to get container status \"94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\": rpc error: code = NotFound desc = could not find container \"94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d\": container with ID starting with 94fc2567b0329d545aeeec9e32a75618c0d973debcb2a71c7d408c5899c69c3d not found: ID does not exist" Mar 19 09:28:00 crc kubenswrapper[4835]: I0319 09:28:00.994019 4835 scope.go:117] "RemoveContainer" containerID="b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438" Mar 19 09:28:00 crc kubenswrapper[4835]: E0319 09:28:00.994672 4835 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\": container with ID starting with b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438 not found: ID does not exist" containerID="b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438" Mar 19 09:28:00 crc kubenswrapper[4835]: I0319 09:28:00.994712 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438"} err="failed to get container status \"b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\": rpc error: code = NotFound desc = could not find container \"b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438\": container with ID starting with b958152b9244e23e0d53cc53a888b853c6c8f1020229010cea3e7b0a95236438 not found: ID does not exist" Mar 19 09:28:00 crc kubenswrapper[4835]: I0319 09:28:00.994770 4835 scope.go:117] "RemoveContainer" containerID="460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be" Mar 19 09:28:00 crc kubenswrapper[4835]: E0319 09:28:00.995383 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\": container with ID starting with 460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be not found: ID does not exist" containerID="460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be" Mar 19 09:28:00 crc kubenswrapper[4835]: I0319 09:28:00.995421 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be"} err="failed to get container status \"460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\": rpc error: code = NotFound desc = could not find container 
\"460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be\": container with ID starting with 460324f71c4ef8bb1256dfdbeeb6d5af982f2e6b1957f5136832a7cf7b4854be not found: ID does not exist" Mar 19 09:28:00 crc kubenswrapper[4835]: I0319 09:28:00.995440 4835 scope.go:117] "RemoveContainer" containerID="9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a" Mar 19 09:28:00 crc kubenswrapper[4835]: E0319 09:28:00.995831 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\": container with ID starting with 9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a not found: ID does not exist" containerID="9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a" Mar 19 09:28:00 crc kubenswrapper[4835]: I0319 09:28:00.995893 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a"} err="failed to get container status \"9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\": rpc error: code = NotFound desc = could not find container \"9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a\": container with ID starting with 9bbbbebb146f9382305e554c31da6d777cd04291ccbc37fef1c3d7fb8d05848a not found: ID does not exist" Mar 19 09:28:02 crc kubenswrapper[4835]: E0319 09:28:02.047382 4835 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.116:6443: connect: connection refused" interval="3.2s" Mar 19 09:28:02 crc kubenswrapper[4835]: I0319 09:28:02.644635 4835 status_manager.go:851] "Failed to get status for pod" podUID="9422c9c6-1386-4362-ba32-a791956ada58" 
pod="openshift-route-controller-manager/route-controller-manager-845749c45c-mns8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-845749c45c-mns8w\": dial tcp 38.129.56.116:6443: connect: connection refused" Mar 19 09:28:02 crc kubenswrapper[4835]: I0319 09:28:02.645304 4835 status_manager.go:851] "Failed to get status for pod" podUID="9422c9c6-1386-4362-ba32-a791956ada58" pod="openshift-route-controller-manager/route-controller-manager-845749c45c-mns8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-845749c45c-mns8w\": dial tcp 38.129.56.116:6443: connect: connection refused" Mar 19 09:28:02 crc kubenswrapper[4835]: I0319 09:28:02.645848 4835 status_manager.go:851] "Failed to get status for pod" podUID="0f3c01a2-88d9-432d-9f82-b1f7e2184ec2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.116:6443: connect: connection refused" Mar 19 09:28:02 crc kubenswrapper[4835]: I0319 09:28:02.646303 4835 status_manager.go:851] "Failed to get status for pod" podUID="56945271-3324-4200-a0e8-36fcef33cd2a" pod="openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b7f7f46b7-xt9dp\": dial tcp 38.129.56.116:6443: connect: connection refused" Mar 19 09:28:05 crc kubenswrapper[4835]: E0319 09:28:05.248987 4835 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.116:6443: connect: connection refused" interval="6.4s" Mar 19 09:28:06 crc kubenswrapper[4835]: I0319 09:28:06.406921 4835 status_manager.go:851] "Failed to get 
status for pod" podUID="9422c9c6-1386-4362-ba32-a791956ada58" pod="openshift-route-controller-manager/route-controller-manager-845749c45c-mns8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-845749c45c-mns8w\": dial tcp 38.129.56.116:6443: connect: connection refused" Mar 19 09:28:06 crc kubenswrapper[4835]: I0319 09:28:06.408216 4835 status_manager.go:851] "Failed to get status for pod" podUID="0f3c01a2-88d9-432d-9f82-b1f7e2184ec2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.116:6443: connect: connection refused" Mar 19 09:28:06 crc kubenswrapper[4835]: I0319 09:28:06.408827 4835 status_manager.go:851] "Failed to get status for pod" podUID="56945271-3324-4200-a0e8-36fcef33cd2a" pod="openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b7f7f46b7-xt9dp\": dial tcp 38.129.56.116:6443: connect: connection refused" Mar 19 09:28:08 crc kubenswrapper[4835]: E0319 09:28:08.690908 4835 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.116:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e34000980b74d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 09:27:57.994628941 +0000 UTC m=+332.843227528,LastTimestamp:2026-03-19 09:27:57.994628941 +0000 UTC m=+332.843227528,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 09:28:09 crc kubenswrapper[4835]: I0319 09:28:09.969657 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 19 09:28:09 crc kubenswrapper[4835]: I0319 09:28:09.970306 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 19 09:28:09 crc kubenswrapper[4835]: I0319 09:28:09.970372 4835 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="be66eb2db774c7aed1b45b02b72e262c1d086c0c6ed9edd85d466aa0f15f1582" exitCode=1 Mar 19 09:28:09 crc kubenswrapper[4835]: I0319 09:28:09.970406 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"be66eb2db774c7aed1b45b02b72e262c1d086c0c6ed9edd85d466aa0f15f1582"} Mar 19 09:28:09 crc kubenswrapper[4835]: I0319 09:28:09.970924 4835 scope.go:117] "RemoveContainer" containerID="be66eb2db774c7aed1b45b02b72e262c1d086c0c6ed9edd85d466aa0f15f1582" Mar 19 09:28:09 crc kubenswrapper[4835]: I0319 09:28:09.971260 4835 status_manager.go:851] "Failed to get status for pod" podUID="9422c9c6-1386-4362-ba32-a791956ada58" 
pod="openshift-route-controller-manager/route-controller-manager-845749c45c-mns8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-845749c45c-mns8w\": dial tcp 38.129.56.116:6443: connect: connection refused"
Mar 19 09:28:09 crc kubenswrapper[4835]: I0319 09:28:09.971728 4835 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.116:6443: connect: connection refused"
Mar 19 09:28:09 crc kubenswrapper[4835]: I0319 09:28:09.972061 4835 status_manager.go:851] "Failed to get status for pod" podUID="0f3c01a2-88d9-432d-9f82-b1f7e2184ec2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.116:6443: connect: connection refused"
Mar 19 09:28:09 crc kubenswrapper[4835]: I0319 09:28:09.972558 4835 status_manager.go:851] "Failed to get status for pod" podUID="56945271-3324-4200-a0e8-36fcef33cd2a" pod="openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b7f7f46b7-xt9dp\": dial tcp 38.129.56.116:6443: connect: connection refused"
Mar 19 09:28:10 crc kubenswrapper[4835]: E0319 09:28:10.297441 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:28:10Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:28:10Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:28:10Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T09:28:10Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.129.56.116:6443: connect: connection refused"
Mar 19 09:28:10 crc kubenswrapper[4835]: E0319 09:28:10.297894 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.116:6443: connect: connection refused"
Mar 19 09:28:10 crc kubenswrapper[4835]: E0319 09:28:10.298351 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.116:6443: connect: connection refused"
Mar 19 09:28:10 crc kubenswrapper[4835]: E0319 09:28:10.298925 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.116:6443: connect: connection refused"
Mar 19 09:28:10 crc kubenswrapper[4835]: E0319 09:28:10.299193 4835 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.116:6443: connect: connection refused"
Mar 19 09:28:10 crc kubenswrapper[4835]: E0319 09:28:10.299228 4835 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 19 09:28:10 crc kubenswrapper[4835]: I0319 09:28:10.401358 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 09:28:10 crc kubenswrapper[4835]: I0319 09:28:10.402302 4835 status_manager.go:851] "Failed to get status for pod" podUID="9422c9c6-1386-4362-ba32-a791956ada58" pod="openshift-route-controller-manager/route-controller-manager-845749c45c-mns8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-845749c45c-mns8w\": dial tcp 38.129.56.116:6443: connect: connection refused"
Mar 19 09:28:10 crc kubenswrapper[4835]: I0319 09:28:10.402929 4835 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.116:6443: connect: connection refused"
Mar 19 09:28:10 crc kubenswrapper[4835]: I0319 09:28:10.403907 4835 status_manager.go:851] "Failed to get status for pod" podUID="0f3c01a2-88d9-432d-9f82-b1f7e2184ec2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.116:6443: connect: connection refused"
Mar 19 09:28:10 crc kubenswrapper[4835]: I0319 09:28:10.404479 4835 status_manager.go:851] "Failed to get status for pod" podUID="56945271-3324-4200-a0e8-36fcef33cd2a" pod="openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b7f7f46b7-xt9dp\": dial tcp 38.129.56.116:6443: connect: connection refused"
Mar 19 09:28:10 crc kubenswrapper[4835]: I0319 09:28:10.427001 4835 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6b56d6f1-6518-4341-8c9d-3026798d33ca"
Mar 19 09:28:10 crc kubenswrapper[4835]: I0319 09:28:10.427038 4835 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6b56d6f1-6518-4341-8c9d-3026798d33ca"
Mar 19 09:28:10 crc kubenswrapper[4835]: E0319 09:28:10.427498 4835 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.116:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 09:28:10 crc kubenswrapper[4835]: I0319 09:28:10.428219 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 09:28:10 crc kubenswrapper[4835]: W0319 09:28:10.459770 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-2a5dd4f2fea94ed1f8f79890584103f9254a1f00c3657a5ba8864c78f07ffba1 WatchSource:0}: Error finding container 2a5dd4f2fea94ed1f8f79890584103f9254a1f00c3657a5ba8864c78f07ffba1: Status 404 returned error can't find the container with id 2a5dd4f2fea94ed1f8f79890584103f9254a1f00c3657a5ba8864c78f07ffba1
Mar 19 09:28:10 crc kubenswrapper[4835]: I0319 09:28:10.742984 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 09:28:10 crc kubenswrapper[4835]: I0319 09:28:10.743252 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 09:28:10 crc kubenswrapper[4835]: I0319 09:28:10.743278 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 09:28:10 crc kubenswrapper[4835]: I0319 09:28:10.743301 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 09:28:10 crc kubenswrapper[4835]: W0319 09:28:10.743727 4835 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin-cert": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27305": dial tcp 38.129.56.116:6443: connect: connection refused
Mar 19 09:28:10 crc kubenswrapper[4835]: E0319 09:28:10.743801 4835 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27305\": dial tcp 38.129.56.116:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:28:10 crc kubenswrapper[4835]: W0319 09:28:10.743914 4835 reflector.go:561] object-"openshift-network-diagnostics"/"kube-root-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27301": dial tcp 38.129.56.116:6443: connect: connection refused
Mar 19 09:28:10 crc kubenswrapper[4835]: E0319 09:28:10.743990 4835 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27301\": dial tcp 38.129.56.116:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:28:10 crc kubenswrapper[4835]: W0319 09:28:10.744474 4835 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27301": dial tcp 38.129.56.116:6443: connect: connection refused
Mar 19 09:28:10 crc kubenswrapper[4835]: E0319 09:28:10.744585 4835 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27301\": dial tcp 38.129.56.116:6443: connect: connection refused" logger="UnhandledError"
Mar 19 09:28:10 crc kubenswrapper[4835]: I0319 09:28:10.986351 4835 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="1869f66857965f5943dc34ff9fef67f2c3d2a0abc4815fafef771775bb9d6206" exitCode=0
Mar 19 09:28:10 crc kubenswrapper[4835]: I0319 09:28:10.986453 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"1869f66857965f5943dc34ff9fef67f2c3d2a0abc4815fafef771775bb9d6206"}
Mar 19 09:28:10 crc kubenswrapper[4835]: I0319 09:28:10.986506 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2a5dd4f2fea94ed1f8f79890584103f9254a1f00c3657a5ba8864c78f07ffba1"}
Mar 19 09:28:10 crc kubenswrapper[4835]: I0319 09:28:10.986796 4835 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6b56d6f1-6518-4341-8c9d-3026798d33ca"
Mar 19 09:28:10 crc kubenswrapper[4835]: I0319 09:28:10.986811 4835 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6b56d6f1-6518-4341-8c9d-3026798d33ca"
Mar 19 09:28:10 crc kubenswrapper[4835]: E0319 09:28:10.987273 4835 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.116:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 09:28:10 crc kubenswrapper[4835]: I0319 09:28:10.987411 4835 status_manager.go:851] "Failed to get status for pod" podUID="9422c9c6-1386-4362-ba32-a791956ada58" pod="openshift-route-controller-manager/route-controller-manager-845749c45c-mns8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-845749c45c-mns8w\": dial tcp 38.129.56.116:6443: connect: connection refused"
Mar 19 09:28:10 crc kubenswrapper[4835]: I0319 09:28:10.987847 4835 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.116:6443: connect: connection refused"
Mar 19 09:28:10 crc kubenswrapper[4835]: I0319 09:28:10.988370 4835 status_manager.go:851] "Failed to get status for pod" podUID="0f3c01a2-88d9-432d-9f82-b1f7e2184ec2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.116:6443: connect: connection refused"
Mar 19 09:28:10 crc kubenswrapper[4835]: I0319 09:28:10.989042 4835 status_manager.go:851] "Failed to get status for pod" podUID="56945271-3324-4200-a0e8-36fcef33cd2a" pod="openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b7f7f46b7-xt9dp\": dial tcp 38.129.56.116:6443: connect: connection refused"
Mar 19 09:28:10 crc kubenswrapper[4835]: I0319 09:28:10.992711 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 19 09:28:10 crc kubenswrapper[4835]: I0319 09:28:10.994873 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 19 09:28:10 crc kubenswrapper[4835]: I0319 09:28:10.994952 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f2efa3a47d25b8aa6684842dffbe85b4ff6415a97200a889a1924982a465c725"}
Mar 19 09:28:10 crc kubenswrapper[4835]: I0319 09:28:10.996076 4835 status_manager.go:851] "Failed to get status for pod" podUID="9422c9c6-1386-4362-ba32-a791956ada58" pod="openshift-route-controller-manager/route-controller-manager-845749c45c-mns8w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-845749c45c-mns8w\": dial tcp 38.129.56.116:6443: connect: connection refused"
Mar 19 09:28:10 crc kubenswrapper[4835]: I0319 09:28:10.996646 4835 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.116:6443: connect: connection refused"
Mar 19 09:28:10 crc kubenswrapper[4835]: I0319 09:28:10.997241 4835 status_manager.go:851] "Failed to get status for pod" podUID="0f3c01a2-88d9-432d-9f82-b1f7e2184ec2" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.116:6443: connect: connection refused"
Mar 19 09:28:10 crc kubenswrapper[4835]: I0319 09:28:10.997785 4835 status_manager.go:851] "Failed to get status for pod" podUID="56945271-3324-4200-a0e8-36fcef33cd2a" pod="openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5b7f7f46b7-xt9dp\": dial tcp 38.129.56.116:6443: connect: connection refused"
Mar 19 09:28:11 crc kubenswrapper[4835]: E0319 09:28:11.744185 4835 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: failed to sync secret cache: timed out waiting for the condition
Mar 19 09:28:11 crc kubenswrapper[4835]: E0319 09:28:11.744184 4835 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: failed to sync configmap cache: timed out waiting for the condition
Mar 19 09:28:11 crc kubenswrapper[4835]: E0319 09:28:11.744277 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Mar 19 09:28:11 crc kubenswrapper[4835]: E0319 09:28:11.744191 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Mar 19 09:28:11 crc kubenswrapper[4835]: E0319 09:28:11.744267 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 09:30:13.744246818 +0000 UTC m=+468.592845415 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync secret cache: timed out waiting for the condition
Mar 19 09:28:11 crc kubenswrapper[4835]: E0319 09:28:11.744392 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 09:30:13.744381532 +0000 UTC m=+468.592980139 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 09:28:12 crc kubenswrapper[4835]: I0319 09:28:12.002449 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2af5910441514532d9a5c8a0caac9c323014e1fb64cd3a9efa438a68deb94949"}
Mar 19 09:28:12 crc kubenswrapper[4835]: I0319 09:28:12.002488 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e2fd4a890539d8c67cc613bb86c317d765c040baeb6410f41c1997a043aaa3e3"}
Mar 19 09:28:12 crc kubenswrapper[4835]: I0319 09:28:12.002503 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4792473babb6440be19e8b1f4082d8af30fdfa579892ee5b81975aff24a7a839"}
Mar 19 09:28:12 crc kubenswrapper[4835]: I0319 09:28:12.002516 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8a531da69c08c490d8669f0cd8350ee4360a8d53e4bab212859d71036e953634"}
Mar 19 09:28:12 crc kubenswrapper[4835]: E0319 09:28:12.744616 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Mar 19 09:28:12 crc kubenswrapper[4835]: E0319 09:28:12.744888 4835 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: failed to sync configmap cache: timed out waiting for the condition
Mar 19 09:28:12 crc kubenswrapper[4835]: E0319 09:28:12.744950 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 09:30:14.744932213 +0000 UTC m=+469.593530800 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 09:28:12 crc kubenswrapper[4835]: E0319 09:28:12.744651 4835 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Mar 19 09:28:12 crc kubenswrapper[4835]: E0319 09:28:12.744994 4835 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: failed to sync configmap cache: timed out waiting for the condition
Mar 19 09:28:12 crc kubenswrapper[4835]: E0319 09:28:12.745072 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 09:30:14.745050637 +0000 UTC m=+469.593649244 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : failed to sync configmap cache: timed out waiting for the condition
Mar 19 09:28:13 crc kubenswrapper[4835]: I0319 09:28:13.009215 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2471e2eb1a48a87d5acd694422ec30c81e1568f6c6b355f4161d540031845467"}
Mar 19 09:28:13 crc kubenswrapper[4835]: I0319 09:28:13.009390 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 09:28:13 crc kubenswrapper[4835]: I0319 09:28:13.009478 4835 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6b56d6f1-6518-4341-8c9d-3026798d33ca"
Mar 19 09:28:13 crc kubenswrapper[4835]: I0319 09:28:13.009501 4835 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6b56d6f1-6518-4341-8c9d-3026798d33ca"
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.017869 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" podUID="d111a8cb-8053-4b32-a9d0-8325de3c057f" containerName="oauth-openshift" containerID="cri-o://f6f89675ccc43e38f5190c1534c4c3262a798fc5848a20049d240a562d766eda" gracePeriod=15
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.427533 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-svxz5"
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.428288 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.428459 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.434919 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.592391 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-trusted-ca-bundle\") pod \"d111a8cb-8053-4b32-a9d0-8325de3c057f\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") "
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.592457 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-router-certs\") pod \"d111a8cb-8053-4b32-a9d0-8325de3c057f\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") "
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.592506 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-user-idp-0-file-data\") pod \"d111a8cb-8053-4b32-a9d0-8325de3c057f\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") "
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.592538 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skj6b\" (UniqueName: \"kubernetes.io/projected/d111a8cb-8053-4b32-a9d0-8325de3c057f-kube-api-access-skj6b\") pod \"d111a8cb-8053-4b32-a9d0-8325de3c057f\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") "
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.592561 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-user-template-login\") pod \"d111a8cb-8053-4b32-a9d0-8325de3c057f\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") "
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.592585 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-user-template-error\") pod \"d111a8cb-8053-4b32-a9d0-8325de3c057f\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") "
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.592606 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-session\") pod \"d111a8cb-8053-4b32-a9d0-8325de3c057f\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") "
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.592635 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-ocp-branding-template\") pod \"d111a8cb-8053-4b32-a9d0-8325de3c057f\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") "
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.592669 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-user-template-provider-selection\") pod \"d111a8cb-8053-4b32-a9d0-8325de3c057f\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") "
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.592692 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-service-ca\") pod \"d111a8cb-8053-4b32-a9d0-8325de3c057f\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") "
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.592713 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d111a8cb-8053-4b32-a9d0-8325de3c057f-audit-policies\") pod \"d111a8cb-8053-4b32-a9d0-8325de3c057f\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") "
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.592767 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-serving-cert\") pod \"d111a8cb-8053-4b32-a9d0-8325de3c057f\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") "
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.592801 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d111a8cb-8053-4b32-a9d0-8325de3c057f-audit-dir\") pod \"d111a8cb-8053-4b32-a9d0-8325de3c057f\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") "
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.592830 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-cliconfig\") pod \"d111a8cb-8053-4b32-a9d0-8325de3c057f\" (UID: \"d111a8cb-8053-4b32-a9d0-8325de3c057f\") "
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.594575 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d111a8cb-8053-4b32-a9d0-8325de3c057f-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "d111a8cb-8053-4b32-a9d0-8325de3c057f" (UID: "d111a8cb-8053-4b32-a9d0-8325de3c057f"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.594800 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d111a8cb-8053-4b32-a9d0-8325de3c057f-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "d111a8cb-8053-4b32-a9d0-8325de3c057f" (UID: "d111a8cb-8053-4b32-a9d0-8325de3c057f"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.595238 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "d111a8cb-8053-4b32-a9d0-8325de3c057f" (UID: "d111a8cb-8053-4b32-a9d0-8325de3c057f"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.596422 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "d111a8cb-8053-4b32-a9d0-8325de3c057f" (UID: "d111a8cb-8053-4b32-a9d0-8325de3c057f"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.598515 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "d111a8cb-8053-4b32-a9d0-8325de3c057f" (UID: "d111a8cb-8053-4b32-a9d0-8325de3c057f"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.601665 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "d111a8cb-8053-4b32-a9d0-8325de3c057f" (UID: "d111a8cb-8053-4b32-a9d0-8325de3c057f"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.603239 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "d111a8cb-8053-4b32-a9d0-8325de3c057f" (UID: "d111a8cb-8053-4b32-a9d0-8325de3c057f"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.603302 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "d111a8cb-8053-4b32-a9d0-8325de3c057f" (UID: "d111a8cb-8053-4b32-a9d0-8325de3c057f"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.603876 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "d111a8cb-8053-4b32-a9d0-8325de3c057f" (UID: "d111a8cb-8053-4b32-a9d0-8325de3c057f"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.604013 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d111a8cb-8053-4b32-a9d0-8325de3c057f-kube-api-access-skj6b" (OuterVolumeSpecName: "kube-api-access-skj6b") pod "d111a8cb-8053-4b32-a9d0-8325de3c057f" (UID: "d111a8cb-8053-4b32-a9d0-8325de3c057f"). InnerVolumeSpecName "kube-api-access-skj6b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.604377 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "d111a8cb-8053-4b32-a9d0-8325de3c057f" (UID: "d111a8cb-8053-4b32-a9d0-8325de3c057f"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.604505 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "d111a8cb-8053-4b32-a9d0-8325de3c057f" (UID: "d111a8cb-8053-4b32-a9d0-8325de3c057f"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.605364 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "d111a8cb-8053-4b32-a9d0-8325de3c057f" (UID: "d111a8cb-8053-4b32-a9d0-8325de3c057f"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.606211 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "d111a8cb-8053-4b32-a9d0-8325de3c057f" (UID: "d111a8cb-8053-4b32-a9d0-8325de3c057f"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.696077 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.696107 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.696116 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.696127 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skj6b\" (UniqueName: \"kubernetes.io/projected/d111a8cb-8053-4b32-a9d0-8325de3c057f-kube-api-access-skj6b\") on node \"crc\" DevicePath \"\""
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.696136 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.696145 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.696155 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.696166 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.696176 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.696187 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName:
\"kubernetes.io/configmap/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.696199 4835 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d111a8cb-8053-4b32-a9d0-8325de3c057f-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.696210 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.696220 4835 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d111a8cb-8053-4b32-a9d0-8325de3c057f-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 19 09:28:15 crc kubenswrapper[4835]: I0319 09:28:15.696228 4835 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d111a8cb-8053-4b32-a9d0-8325de3c057f-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 19 09:28:16 crc kubenswrapper[4835]: I0319 09:28:16.031264 4835 generic.go:334] "Generic (PLEG): container finished" podID="d111a8cb-8053-4b32-a9d0-8325de3c057f" containerID="f6f89675ccc43e38f5190c1534c4c3262a798fc5848a20049d240a562d766eda" exitCode=0 Mar 19 09:28:16 crc kubenswrapper[4835]: I0319 09:28:16.031329 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" event={"ID":"d111a8cb-8053-4b32-a9d0-8325de3c057f","Type":"ContainerDied","Data":"f6f89675ccc43e38f5190c1534c4c3262a798fc5848a20049d240a562d766eda"} Mar 19 09:28:16 crc kubenswrapper[4835]: I0319 09:28:16.031338 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" Mar 19 09:28:16 crc kubenswrapper[4835]: I0319 09:28:16.031364 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-svxz5" event={"ID":"d111a8cb-8053-4b32-a9d0-8325de3c057f","Type":"ContainerDied","Data":"7bc97b3569d6412ac3275138bcc0c81b25039672811df1af1fe35da5b0bed00a"} Mar 19 09:28:16 crc kubenswrapper[4835]: I0319 09:28:16.031390 4835 scope.go:117] "RemoveContainer" containerID="f6f89675ccc43e38f5190c1534c4c3262a798fc5848a20049d240a562d766eda" Mar 19 09:28:16 crc kubenswrapper[4835]: I0319 09:28:16.062675 4835 scope.go:117] "RemoveContainer" containerID="f6f89675ccc43e38f5190c1534c4c3262a798fc5848a20049d240a562d766eda" Mar 19 09:28:16 crc kubenswrapper[4835]: E0319 09:28:16.063505 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6f89675ccc43e38f5190c1534c4c3262a798fc5848a20049d240a562d766eda\": container with ID starting with f6f89675ccc43e38f5190c1534c4c3262a798fc5848a20049d240a562d766eda not found: ID does not exist" containerID="f6f89675ccc43e38f5190c1534c4c3262a798fc5848a20049d240a562d766eda" Mar 19 09:28:16 crc kubenswrapper[4835]: I0319 09:28:16.063787 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6f89675ccc43e38f5190c1534c4c3262a798fc5848a20049d240a562d766eda"} err="failed to get container status \"f6f89675ccc43e38f5190c1534c4c3262a798fc5848a20049d240a562d766eda\": rpc error: code = NotFound desc = could not find container \"f6f89675ccc43e38f5190c1534c4c3262a798fc5848a20049d240a562d766eda\": container with ID starting with f6f89675ccc43e38f5190c1534c4c3262a798fc5848a20049d240a562d766eda not found: ID does not exist" Mar 19 09:28:17 crc kubenswrapper[4835]: I0319 09:28:17.625379 4835 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 19 09:28:17 crc kubenswrapper[4835]: I0319 09:28:17.664396 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 19 09:28:17 crc kubenswrapper[4835]: I0319 09:28:17.817635 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 19 09:28:18 crc kubenswrapper[4835]: I0319 09:28:18.019358 4835 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 09:28:18 crc kubenswrapper[4835]: I0319 09:28:18.046524 4835 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6b56d6f1-6518-4341-8c9d-3026798d33ca" Mar 19 09:28:18 crc kubenswrapper[4835]: I0319 09:28:18.046557 4835 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6b56d6f1-6518-4341-8c9d-3026798d33ca" Mar 19 09:28:18 crc kubenswrapper[4835]: I0319 09:28:18.052064 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 09:28:18 crc kubenswrapper[4835]: I0319 09:28:18.141206 4835 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="bb16b745-840d-4981-b7bd-bbcb6676cdcb" Mar 19 09:28:18 crc kubenswrapper[4835]: I0319 09:28:18.161165 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 09:28:18 crc kubenswrapper[4835]: I0319 09:28:18.387243 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 19 09:28:18 crc kubenswrapper[4835]: E0319 09:28:18.629499 4835 reflector.go:158] "Unhandled 
Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-service-ca\": Failed to watch *v1.ConfigMap: unknown (get configmaps)" logger="UnhandledError" Mar 19 09:28:18 crc kubenswrapper[4835]: I0319 09:28:18.854666 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 09:28:18 crc kubenswrapper[4835]: I0319 09:28:18.854966 4835 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 19 09:28:18 crc kubenswrapper[4835]: I0319 09:28:18.855177 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 19 09:28:19 crc kubenswrapper[4835]: I0319 09:28:19.053083 4835 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6b56d6f1-6518-4341-8c9d-3026798d33ca" Mar 19 09:28:19 crc kubenswrapper[4835]: I0319 09:28:19.053132 4835 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6b56d6f1-6518-4341-8c9d-3026798d33ca" Mar 19 09:28:19 crc kubenswrapper[4835]: I0319 09:28:19.061088 4835 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="bb16b745-840d-4981-b7bd-bbcb6676cdcb" Mar 19 09:28:26 crc kubenswrapper[4835]: E0319 09:28:26.431708 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted 
volumes=[kube-api-access-cqllr], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 09:28:26 crc kubenswrapper[4835]: E0319 09:28:26.454830 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert nginx-conf], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 09:28:26 crc kubenswrapper[4835]: E0319 09:28:26.465311 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-s2dwl], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 09:28:26 crc kubenswrapper[4835]: I0319 09:28:26.958566 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 19 09:28:27 crc kubenswrapper[4835]: I0319 09:28:27.848560 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 19 09:28:28 crc kubenswrapper[4835]: I0319 09:28:28.449833 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 19 09:28:28 crc kubenswrapper[4835]: I0319 09:28:28.540564 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 19 09:28:28 crc kubenswrapper[4835]: I0319 09:28:28.540576 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 19 09:28:28 crc kubenswrapper[4835]: I0319 09:28:28.825590 4835 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 19 09:28:28 crc kubenswrapper[4835]: I0319 09:28:28.855853 4835 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 19 09:28:28 crc kubenswrapper[4835]: I0319 09:28:28.855943 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 19 09:28:28 crc kubenswrapper[4835]: I0319 09:28:28.873434 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 19 09:28:29 crc kubenswrapper[4835]: I0319 09:28:29.019835 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 19 09:28:29 crc kubenswrapper[4835]: I0319 09:28:29.281107 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 19 09:28:29 crc kubenswrapper[4835]: I0319 09:28:29.315700 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 19 09:28:29 crc kubenswrapper[4835]: I0319 09:28:29.509399 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 19 09:28:29 crc kubenswrapper[4835]: I0319 09:28:29.592826 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" 
Mar 19 09:28:29 crc kubenswrapper[4835]: I0319 09:28:29.725905 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 19 09:28:29 crc kubenswrapper[4835]: I0319 09:28:29.740818 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 19 09:28:29 crc kubenswrapper[4835]: I0319 09:28:29.786002 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 19 09:28:29 crc kubenswrapper[4835]: I0319 09:28:29.817300 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 19 09:28:30 crc kubenswrapper[4835]: I0319 09:28:30.143640 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 19 09:28:30 crc kubenswrapper[4835]: I0319 09:28:30.257438 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 19 09:28:30 crc kubenswrapper[4835]: I0319 09:28:30.623533 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 19 09:28:30 crc kubenswrapper[4835]: I0319 09:28:30.889097 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 19 09:28:31 crc kubenswrapper[4835]: I0319 09:28:31.112191 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 19 09:28:31 crc kubenswrapper[4835]: I0319 09:28:31.462983 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 19 09:28:31 crc kubenswrapper[4835]: I0319 09:28:31.504276 4835 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 19 09:28:31 crc kubenswrapper[4835]: I0319 09:28:31.559453 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 19 09:28:31 crc kubenswrapper[4835]: I0319 09:28:31.560093 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 19 09:28:31 crc kubenswrapper[4835]: I0319 09:28:31.779681 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 19 09:28:31 crc kubenswrapper[4835]: I0319 09:28:31.780448 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 19 09:28:31 crc kubenswrapper[4835]: I0319 09:28:31.813086 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 19 09:28:31 crc kubenswrapper[4835]: I0319 09:28:31.910941 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 19 09:28:32 crc kubenswrapper[4835]: I0319 09:28:32.014354 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 19 09:28:32 crc kubenswrapper[4835]: I0319 09:28:32.021780 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 19 09:28:32 crc kubenswrapper[4835]: I0319 09:28:32.059331 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 19 09:28:32 crc kubenswrapper[4835]: I0319 09:28:32.125769 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 19 09:28:32 crc 
kubenswrapper[4835]: I0319 09:28:32.242365 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 19 09:28:32 crc kubenswrapper[4835]: I0319 09:28:32.362203 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 19 09:28:32 crc kubenswrapper[4835]: I0319 09:28:32.495325 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 19 09:28:32 crc kubenswrapper[4835]: I0319 09:28:32.523577 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 19 09:28:32 crc kubenswrapper[4835]: I0319 09:28:32.564152 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 19 09:28:32 crc kubenswrapper[4835]: I0319 09:28:32.594421 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 19 09:28:32 crc kubenswrapper[4835]: I0319 09:28:32.664138 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 19 09:28:32 crc kubenswrapper[4835]: I0319 09:28:32.744291 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 19 09:28:32 crc kubenswrapper[4835]: I0319 09:28:32.785682 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 19 09:28:32 crc kubenswrapper[4835]: I0319 09:28:32.971578 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 19 09:28:33 crc kubenswrapper[4835]: I0319 09:28:33.068450 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" 
Mar 19 09:28:33 crc kubenswrapper[4835]: I0319 09:28:33.069633 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 19 09:28:33 crc kubenswrapper[4835]: I0319 09:28:33.362065 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 19 09:28:33 crc kubenswrapper[4835]: I0319 09:28:33.430072 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 19 09:28:33 crc kubenswrapper[4835]: I0319 09:28:33.542350 4835 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 19 09:28:33 crc kubenswrapper[4835]: I0319 09:28:33.600295 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 19 09:28:33 crc kubenswrapper[4835]: I0319 09:28:33.742131 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 19 09:28:33 crc kubenswrapper[4835]: I0319 09:28:33.780202 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 19 09:28:33 crc kubenswrapper[4835]: I0319 09:28:33.802939 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 19 09:28:33 crc kubenswrapper[4835]: I0319 09:28:33.916257 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.111989 4835 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.119149 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-svxz5","openshift-controller-manager/controller-manager-5b7f7f46b7-xt9dp","openshift-route-controller-manager/route-controller-manager-845749c45c-mns8w"] Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.119244 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-controller-manager/controller-manager-5675974fc9-xqhf6"] Mar 19 09:28:34 crc kubenswrapper[4835]: E0319 09:28:34.119502 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f3c01a2-88d9-432d-9f82-b1f7e2184ec2" containerName="installer" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.119531 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f3c01a2-88d9-432d-9f82-b1f7e2184ec2" containerName="installer" Mar 19 09:28:34 crc kubenswrapper[4835]: E0319 09:28:34.119556 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d111a8cb-8053-4b32-a9d0-8325de3c057f" containerName="oauth-openshift" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.119570 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="d111a8cb-8053-4b32-a9d0-8325de3c057f" containerName="oauth-openshift" Mar 19 09:28:34 crc kubenswrapper[4835]: E0319 09:28:34.119586 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9422c9c6-1386-4362-ba32-a791956ada58" containerName="route-controller-manager" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.119599 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9422c9c6-1386-4362-ba32-a791956ada58" containerName="route-controller-manager" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.119648 4835 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6b56d6f1-6518-4341-8c9d-3026798d33ca" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.119677 4835 mirror_client.go:130] "Deleting a 
mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6b56d6f1-6518-4341-8c9d-3026798d33ca" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.119800 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="d111a8cb-8053-4b32-a9d0-8325de3c057f" containerName="oauth-openshift" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.119827 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="9422c9c6-1386-4362-ba32-a791956ada58" containerName="route-controller-manager" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.119845 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f3c01a2-88d9-432d-9f82-b1f7e2184ec2" containerName="installer" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.120443 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.122177 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.128609 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.130146 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.130205 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.130340 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.130496 4835 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"config" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.132020 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.137636 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.138658 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46-proxy-ca-bundles\") pod \"controller-manager-5675974fc9-xqhf6\" (UID: \"d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46\") " pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.138858 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46-client-ca\") pod \"controller-manager-5675974fc9-xqhf6\" (UID: \"d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46\") " pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.138925 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td2kb\" (UniqueName: \"kubernetes.io/projected/d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46-kube-api-access-td2kb\") pod \"controller-manager-5675974fc9-xqhf6\" (UID: \"d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46\") " pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.138986 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46-serving-cert\") pod \"controller-manager-5675974fc9-xqhf6\" (UID: \"d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46\") " pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.139042 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46-config\") pod \"controller-manager-5675974fc9-xqhf6\" (UID: \"d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46\") " pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.139931 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.158983 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=16.158955978 podStartE2EDuration="16.158955978s" podCreationTimestamp="2026-03-19 09:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:28:34.158067515 +0000 UTC m=+369.006666142" watchObservedRunningTime="2026-03-19 09:28:34.158955978 +0000 UTC m=+369.007554605" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.240140 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46-proxy-ca-bundles\") pod \"controller-manager-5675974fc9-xqhf6\" (UID: \"d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46\") " pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.240478 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46-client-ca\") pod \"controller-manager-5675974fc9-xqhf6\" (UID: \"d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46\") " pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.240519 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td2kb\" (UniqueName: \"kubernetes.io/projected/d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46-kube-api-access-td2kb\") pod \"controller-manager-5675974fc9-xqhf6\" (UID: \"d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46\") " pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.240587 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46-serving-cert\") pod \"controller-manager-5675974fc9-xqhf6\" (UID: \"d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46\") " pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.240632 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46-config\") pod \"controller-manager-5675974fc9-xqhf6\" (UID: \"d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46\") " pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.242329 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46-config\") pod \"controller-manager-5675974fc9-xqhf6\" (UID: \"d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46\") " pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.243292 4835 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46-proxy-ca-bundles\") pod \"controller-manager-5675974fc9-xqhf6\" (UID: \"d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46\") " pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.245133 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46-client-ca\") pod \"controller-manager-5675974fc9-xqhf6\" (UID: \"d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46\") " pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.249934 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46-serving-cert\") pod \"controller-manager-5675974fc9-xqhf6\" (UID: \"d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46\") " pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.274876 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td2kb\" (UniqueName: \"kubernetes.io/projected/d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46-kube-api-access-td2kb\") pod \"controller-manager-5675974fc9-xqhf6\" (UID: \"d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46\") " pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.278634 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.306529 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 19 09:28:34 crc 
kubenswrapper[4835]: I0319 09:28:34.379452 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.402681 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.412440 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56945271-3324-4200-a0e8-36fcef33cd2a" path="/var/lib/kubelet/pods/56945271-3324-4200-a0e8-36fcef33cd2a/volumes" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.413717 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9422c9c6-1386-4362-ba32-a791956ada58" path="/var/lib/kubelet/pods/9422c9c6-1386-4362-ba32-a791956ada58/volumes" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.414862 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d111a8cb-8053-4b32-a9d0-8325de3c057f" path="/var/lib/kubelet/pods/d111a8cb-8053-4b32-a9d0-8325de3c057f/volumes" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.448689 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.503709 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.741832 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.868311 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 19 09:28:34 crc kubenswrapper[4835]: I0319 09:28:34.912885 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 19 09:28:35 crc kubenswrapper[4835]: I0319 09:28:35.087160 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 19 09:28:35 crc kubenswrapper[4835]: I0319 09:28:35.142684 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 19 09:28:35 crc kubenswrapper[4835]: I0319 09:28:35.172315 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 19 09:28:35 crc kubenswrapper[4835]: I0319 09:28:35.241178 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 19 09:28:35 crc kubenswrapper[4835]: I0319 09:28:35.357430 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 19 09:28:35 crc kubenswrapper[4835]: I0319 09:28:35.364134 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 19 09:28:35 crc kubenswrapper[4835]: I0319 
09:28:35.436019 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 19 09:28:35 crc kubenswrapper[4835]: I0319 09:28:35.452228 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 19 09:28:35 crc kubenswrapper[4835]: I0319 09:28:35.453702 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 19 09:28:35 crc kubenswrapper[4835]: I0319 09:28:35.545805 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 19 09:28:35 crc kubenswrapper[4835]: I0319 09:28:35.546732 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 19 09:28:35 crc kubenswrapper[4835]: I0319 09:28:35.603245 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 19 09:28:35 crc kubenswrapper[4835]: I0319 09:28:35.622168 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 19 09:28:35 crc kubenswrapper[4835]: I0319 09:28:35.661370 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 19 09:28:35 crc kubenswrapper[4835]: I0319 09:28:35.721799 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 19 09:28:35 crc kubenswrapper[4835]: I0319 09:28:35.798293 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 19 09:28:35 crc kubenswrapper[4835]: I0319 09:28:35.830105 4835 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 19 09:28:35 crc kubenswrapper[4835]: I0319 09:28:35.877056 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 19 09:28:35 crc kubenswrapper[4835]: I0319 09:28:35.920594 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 19 09:28:36 crc kubenswrapper[4835]: I0319 09:28:36.084123 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 19 09:28:36 crc kubenswrapper[4835]: I0319 09:28:36.163735 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 19 09:28:36 crc kubenswrapper[4835]: I0319 09:28:36.305224 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 19 09:28:36 crc kubenswrapper[4835]: I0319 09:28:36.309998 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 19 09:28:36 crc kubenswrapper[4835]: I0319 09:28:36.352899 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 19 09:28:36 crc kubenswrapper[4835]: I0319 09:28:36.503039 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 19 09:28:36 crc kubenswrapper[4835]: I0319 09:28:36.515012 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 19 09:28:36 crc kubenswrapper[4835]: I0319 09:28:36.573669 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 19 
09:28:36 crc kubenswrapper[4835]: I0319 09:28:36.645525 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 19 09:28:36 crc kubenswrapper[4835]: I0319 09:28:36.758541 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 19 09:28:36 crc kubenswrapper[4835]: I0319 09:28:36.814255 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 19 09:28:36 crc kubenswrapper[4835]: I0319 09:28:36.831802 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 19 09:28:36 crc kubenswrapper[4835]: I0319 09:28:36.942012 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 19 09:28:36 crc kubenswrapper[4835]: I0319 09:28:36.983181 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 19 09:28:37 crc kubenswrapper[4835]: I0319 09:28:37.006815 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 19 09:28:37 crc kubenswrapper[4835]: I0319 09:28:37.327964 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 19 09:28:37 crc kubenswrapper[4835]: I0319 09:28:37.329891 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 19 09:28:37 crc kubenswrapper[4835]: I0319 09:28:37.536680 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 19 09:28:37 crc kubenswrapper[4835]: I0319 09:28:37.622176 4835 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 19 09:28:37 crc kubenswrapper[4835]: I0319 09:28:37.652328 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 19 09:28:37 crc kubenswrapper[4835]: I0319 09:28:37.818305 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 19 09:28:37 crc kubenswrapper[4835]: I0319 09:28:37.875333 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 19 09:28:38 crc kubenswrapper[4835]: I0319 09:28:38.070856 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 19 09:28:38 crc kubenswrapper[4835]: I0319 09:28:38.146841 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 19 09:28:38 crc kubenswrapper[4835]: I0319 09:28:38.153426 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 19 09:28:38 crc kubenswrapper[4835]: I0319 09:28:38.184087 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 19 09:28:38 crc kubenswrapper[4835]: I0319 09:28:38.253936 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 19 09:28:38 crc kubenswrapper[4835]: I0319 09:28:38.287234 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 19 09:28:38 crc kubenswrapper[4835]: I0319 09:28:38.315331 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 19 09:28:38 
crc kubenswrapper[4835]: I0319 09:28:38.324852 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 19 09:28:38 crc kubenswrapper[4835]: I0319 09:28:38.364376 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 19 09:28:38 crc kubenswrapper[4835]: I0319 09:28:38.370885 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 19 09:28:38 crc kubenswrapper[4835]: I0319 09:28:38.447181 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 19 09:28:38 crc kubenswrapper[4835]: I0319 09:28:38.547373 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 19 09:28:38 crc kubenswrapper[4835]: I0319 09:28:38.551487 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 19 09:28:38 crc kubenswrapper[4835]: I0319 09:28:38.553113 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 19 09:28:38 crc kubenswrapper[4835]: I0319 09:28:38.565314 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 19 09:28:38 crc kubenswrapper[4835]: I0319 09:28:38.720817 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 19 09:28:38 crc kubenswrapper[4835]: I0319 09:28:38.803991 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 19 09:28:38 crc kubenswrapper[4835]: I0319 09:28:38.841691 4835 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"trusted-ca-bundle" Mar 19 09:28:38 crc kubenswrapper[4835]: I0319 09:28:38.854783 4835 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 19 09:28:38 crc kubenswrapper[4835]: I0319 09:28:38.854868 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 19 09:28:38 crc kubenswrapper[4835]: I0319 09:28:38.854930 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 09:28:38 crc kubenswrapper[4835]: I0319 09:28:38.855658 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"f2efa3a47d25b8aa6684842dffbe85b4ff6415a97200a889a1924982a465c725"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Mar 19 09:28:38 crc kubenswrapper[4835]: I0319 09:28:38.855821 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://f2efa3a47d25b8aa6684842dffbe85b4ff6415a97200a889a1924982a465c725" gracePeriod=30 Mar 19 09:28:39 crc kubenswrapper[4835]: I0319 09:28:39.033305 4835 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 19 09:28:39 crc kubenswrapper[4835]: I0319 09:28:39.039543 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 19 09:28:39 crc kubenswrapper[4835]: I0319 09:28:39.092586 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 19 09:28:39 crc kubenswrapper[4835]: I0319 09:28:39.131323 4835 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 19 09:28:39 crc kubenswrapper[4835]: I0319 09:28:39.171810 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 19 09:28:39 crc kubenswrapper[4835]: I0319 09:28:39.221805 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 19 09:28:39 crc kubenswrapper[4835]: I0319 09:28:39.305270 4835 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 19 09:28:39 crc kubenswrapper[4835]: I0319 09:28:39.349978 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 19 09:28:39 crc kubenswrapper[4835]: I0319 09:28:39.389656 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 19 09:28:39 crc kubenswrapper[4835]: I0319 09:28:39.401339 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:28:39 crc kubenswrapper[4835]: I0319 09:28:39.447580 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 19 09:28:39 crc kubenswrapper[4835]: I0319 09:28:39.448077 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 19 09:28:39 crc kubenswrapper[4835]: I0319 09:28:39.486238 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 19 09:28:39 crc kubenswrapper[4835]: I0319 09:28:39.612521 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 19 09:28:39 crc kubenswrapper[4835]: I0319 09:28:39.681386 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 19 09:28:39 crc kubenswrapper[4835]: I0319 09:28:39.736935 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 19 09:28:39 crc kubenswrapper[4835]: I0319 09:28:39.808908 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 19 09:28:39 crc kubenswrapper[4835]: I0319 09:28:39.963625 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 19 09:28:40 crc kubenswrapper[4835]: I0319 09:28:40.022223 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 19 09:28:40 crc kubenswrapper[4835]: I0319 09:28:40.071651 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 19 09:28:40 crc kubenswrapper[4835]: I0319 09:28:40.098223 4835 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 19 09:28:40 crc kubenswrapper[4835]: I0319 09:28:40.170643 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 19 09:28:40 crc kubenswrapper[4835]: I0319 09:28:40.219598 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 19 09:28:40 crc kubenswrapper[4835]: I0319 09:28:40.258599 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 19 09:28:40 crc kubenswrapper[4835]: I0319 09:28:40.380808 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 19 09:28:40 crc kubenswrapper[4835]: I0319 09:28:40.395475 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 19 09:28:40 crc kubenswrapper[4835]: I0319 09:28:40.410475 4835 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 19 09:28:40 crc kubenswrapper[4835]: I0319 09:28:40.410786 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://6f82ee911bfa8920ef36b7dbb144d88a07f43656a65eb1e5bc0858bfc3e14558" gracePeriod=5 Mar 19 09:28:40 crc kubenswrapper[4835]: I0319 09:28:40.427482 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 19 09:28:40 crc kubenswrapper[4835]: I0319 09:28:40.455595 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 19 09:28:40 crc 
kubenswrapper[4835]: I0319 09:28:40.460120 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 19 09:28:40 crc kubenswrapper[4835]: I0319 09:28:40.554017 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 19 09:28:40 crc kubenswrapper[4835]: I0319 09:28:40.570874 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 19 09:28:40 crc kubenswrapper[4835]: I0319 09:28:40.576356 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 19 09:28:40 crc kubenswrapper[4835]: I0319 09:28:40.763653 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 19 09:28:40 crc kubenswrapper[4835]: I0319 09:28:40.795735 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 19 09:28:40 crc kubenswrapper[4835]: I0319 09:28:40.836677 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 19 09:28:40 crc kubenswrapper[4835]: I0319 09:28:40.843720 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 19 09:28:40 crc kubenswrapper[4835]: I0319 09:28:40.853167 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 19 09:28:40 crc kubenswrapper[4835]: I0319 09:28:40.917346 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 19 09:28:40 crc kubenswrapper[4835]: I0319 09:28:40.946799 4835 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 19 09:28:40 crc kubenswrapper[4835]: I0319 09:28:40.984179 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 19 09:28:41 crc kubenswrapper[4835]: I0319 09:28:41.226877 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 19 09:28:41 crc kubenswrapper[4835]: I0319 09:28:41.261327 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 19 09:28:41 crc kubenswrapper[4835]: I0319 09:28:41.296076 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 19 09:28:41 crc kubenswrapper[4835]: I0319 09:28:41.313052 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 19 09:28:41 crc kubenswrapper[4835]: I0319 09:28:41.401877 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:28:41 crc kubenswrapper[4835]: I0319 09:28:41.502763 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 19 09:28:41 crc kubenswrapper[4835]: I0319 09:28:41.528888 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 19 09:28:41 crc kubenswrapper[4835]: I0319 09:28:41.536658 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 19 09:28:41 crc kubenswrapper[4835]: I0319 09:28:41.555964 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 19 09:28:41 crc kubenswrapper[4835]: I0319 09:28:41.690762 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 19 09:28:41 crc kubenswrapper[4835]: I0319 09:28:41.703366 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 19 09:28:41 crc kubenswrapper[4835]: I0319 09:28:41.709096 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 19 09:28:41 crc kubenswrapper[4835]: I0319 09:28:41.718657 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 19 09:28:41 crc kubenswrapper[4835]: I0319 09:28:41.870173 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 19 09:28:41 crc kubenswrapper[4835]: I0319 09:28:41.875189 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" 
Mar 19 09:28:41 crc kubenswrapper[4835]: I0319 09:28:41.909851 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 19 09:28:41 crc kubenswrapper[4835]: I0319 09:28:41.932874 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 19 09:28:41 crc kubenswrapper[4835]: I0319 09:28:41.980591 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 19 09:28:42 crc kubenswrapper[4835]: I0319 09:28:42.007502 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5675974fc9-xqhf6"] Mar 19 09:28:42 crc kubenswrapper[4835]: I0319 09:28:42.027603 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 19 09:28:42 crc kubenswrapper[4835]: I0319 09:28:42.043375 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 19 09:28:42 crc kubenswrapper[4835]: I0319 09:28:42.064303 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 19 09:28:42 crc kubenswrapper[4835]: I0319 09:28:42.065538 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 19 09:28:42 crc kubenswrapper[4835]: I0319 09:28:42.099753 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 19 09:28:42 crc kubenswrapper[4835]: I0319 09:28:42.157819 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 19 09:28:42 crc kubenswrapper[4835]: I0319 09:28:42.188761 4835 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 19 09:28:42 crc kubenswrapper[4835]: I0319 09:28:42.249242 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 19 09:28:42 crc kubenswrapper[4835]: I0319 09:28:42.334796 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 19 09:28:42 crc kubenswrapper[4835]: I0319 09:28:42.338336 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 19 09:28:42 crc kubenswrapper[4835]: I0319 09:28:42.367499 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 19 09:28:42 crc kubenswrapper[4835]: I0319 09:28:42.404464 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:28:42 crc kubenswrapper[4835]: I0319 09:28:42.423491 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 19 09:28:42 crc kubenswrapper[4835]: I0319 09:28:42.442007 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 19 09:28:42 crc kubenswrapper[4835]: I0319 09:28:42.442874 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 19 09:28:42 crc kubenswrapper[4835]: I0319 09:28:42.516847 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5675974fc9-xqhf6"] Mar 19 09:28:42 crc kubenswrapper[4835]: I0319 09:28:42.517922 4835 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 19 09:28:42 crc kubenswrapper[4835]: I0319 09:28:42.519712 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 19 09:28:42 crc kubenswrapper[4835]: I0319 09:28:42.601438 4835 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 19 09:28:42 crc kubenswrapper[4835]: I0319 09:28:42.710281 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 19 09:28:42 crc kubenswrapper[4835]: I0319 09:28:42.895567 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 19 09:28:43 crc kubenswrapper[4835]: I0319 09:28:43.167285 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 19 09:28:43 crc kubenswrapper[4835]: I0319 09:28:43.253019 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" event={"ID":"d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46","Type":"ContainerStarted","Data":"9d6b5b62d82043260aad7652065797450dc874fcd88254e57a1b198017ae6817"} Mar 19 09:28:43 crc kubenswrapper[4835]: I0319 09:28:43.253071 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" event={"ID":"d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46","Type":"ContainerStarted","Data":"b40fa88e5bbbf2defb71fa79bb9f0e58f1aeea74ba8cb43249c1c6e4435e4f5f"} Mar 19 09:28:43 crc kubenswrapper[4835]: I0319 09:28:43.253342 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" Mar 19 09:28:43 crc kubenswrapper[4835]: I0319 09:28:43.258631 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" Mar 19 09:28:43 crc kubenswrapper[4835]: I0319 09:28:43.269910 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" podStartSLOduration=46.269892652 podStartE2EDuration="46.269892652s" podCreationTimestamp="2026-03-19 09:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:28:43.266900571 +0000 UTC m=+378.115499218" watchObservedRunningTime="2026-03-19 09:28:43.269892652 +0000 UTC m=+378.118491259" Mar 19 09:28:43 crc kubenswrapper[4835]: I0319 09:28:43.279429 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 19 09:28:43 crc kubenswrapper[4835]: I0319 09:28:43.357702 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 19 09:28:43 crc kubenswrapper[4835]: I0319 09:28:43.532062 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 19 09:28:43 crc kubenswrapper[4835]: I0319 09:28:43.595655 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 19 09:28:43 crc kubenswrapper[4835]: I0319 09:28:43.595885 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 19 09:28:43 crc kubenswrapper[4835]: I0319 09:28:43.784248 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 19 09:28:43 crc kubenswrapper[4835]: I0319 09:28:43.856651 4835 reflector.go:368] Caches populated for *v1.Secret from 
object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 19 09:28:43 crc kubenswrapper[4835]: I0319 09:28:43.957181 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 19 09:28:44 crc kubenswrapper[4835]: I0319 09:28:44.040585 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 19 09:28:44 crc kubenswrapper[4835]: I0319 09:28:44.047587 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 19 09:28:44 crc kubenswrapper[4835]: I0319 09:28:44.262833 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 19 09:28:44 crc kubenswrapper[4835]: I0319 09:28:44.283177 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 19 09:28:44 crc kubenswrapper[4835]: I0319 09:28:44.317265 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 19 09:28:44 crc kubenswrapper[4835]: I0319 09:28:44.457056 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 19 09:28:44 crc kubenswrapper[4835]: I0319 09:28:44.551403 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 19 09:28:44 crc kubenswrapper[4835]: I0319 09:28:44.699214 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 19 09:28:44 crc kubenswrapper[4835]: I0319 09:28:44.733402 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 19 09:28:44 crc kubenswrapper[4835]: I0319 09:28:44.811771 4835 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 19 09:28:45 crc kubenswrapper[4835]: I0319 09:28:45.026394 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 19 09:28:45 crc kubenswrapper[4835]: I0319 09:28:45.536647 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 19 09:28:45 crc kubenswrapper[4835]: I0319 09:28:45.788332 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.000617 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.000703 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.126140 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.126244 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.126324 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod 
\"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.126349 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.126363 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.126408 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.126459 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.126525 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.127124 4835 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.127165 4835 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.127183 4835 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.127238 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.135216 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.180559 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.213515 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-766799bf97-q8249"] Mar 19 09:28:46 crc kubenswrapper[4835]: E0319 09:28:46.213828 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.213846 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.213962 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.214418 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.221652 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.221710 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.222004 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.222178 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.222353 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.222690 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.223048 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.223106 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h"] Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.223068 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.223875 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.225003 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.225054 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.225215 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.225228 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.228290 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.228844 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.230506 4835 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.230563 4835 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.233524 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 09:28:46 crc 
kubenswrapper[4835]: I0319 09:28:46.233792 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.234147 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h"] Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.235294 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.235560 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.237531 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.242888 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.249369 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-766799bf97-q8249"] Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.256564 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.273861 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.273907 4835 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="6f82ee911bfa8920ef36b7dbb144d88a07f43656a65eb1e5bc0858bfc3e14558" 
exitCode=137 Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.273949 4835 scope.go:117] "RemoveContainer" containerID="6f82ee911bfa8920ef36b7dbb144d88a07f43656a65eb1e5bc0858bfc3e14558" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.274061 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.300406 4835 scope.go:117] "RemoveContainer" containerID="6f82ee911bfa8920ef36b7dbb144d88a07f43656a65eb1e5bc0858bfc3e14558" Mar 19 09:28:46 crc kubenswrapper[4835]: E0319 09:28:46.301085 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f82ee911bfa8920ef36b7dbb144d88a07f43656a65eb1e5bc0858bfc3e14558\": container with ID starting with 6f82ee911bfa8920ef36b7dbb144d88a07f43656a65eb1e5bc0858bfc3e14558 not found: ID does not exist" containerID="6f82ee911bfa8920ef36b7dbb144d88a07f43656a65eb1e5bc0858bfc3e14558" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.301159 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f82ee911bfa8920ef36b7dbb144d88a07f43656a65eb1e5bc0858bfc3e14558"} err="failed to get container status \"6f82ee911bfa8920ef36b7dbb144d88a07f43656a65eb1e5bc0858bfc3e14558\": rpc error: code = NotFound desc = could not find container \"6f82ee911bfa8920ef36b7dbb144d88a07f43656a65eb1e5bc0858bfc3e14558\": container with ID starting with 6f82ee911bfa8920ef36b7dbb144d88a07f43656a65eb1e5bc0858bfc3e14558 not found: ID does not exist" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.331690 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zr8g\" (UniqueName: \"kubernetes.io/projected/a3421bac-5e2c-494d-ba71-500a3ead9076-kube-api-access-8zr8g\") pod \"oauth-openshift-766799bf97-q8249\" (UID: 
\"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.331783 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a3421bac-5e2c-494d-ba71-500a3ead9076-v4-0-config-system-cliconfig\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.331835 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef156d19-8841-4be5-a739-bd07a7789ea3-serving-cert\") pod \"route-controller-manager-5f9b46c45c-4586h\" (UID: \"ef156d19-8841-4be5-a739-bd07a7789ea3\") " pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.331872 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a3421bac-5e2c-494d-ba71-500a3ead9076-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.331912 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3421bac-5e2c-494d-ba71-500a3ead9076-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc 
kubenswrapper[4835]: I0319 09:28:46.331946 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a3421bac-5e2c-494d-ba71-500a3ead9076-audit-policies\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.331999 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a3421bac-5e2c-494d-ba71-500a3ead9076-v4-0-config-user-template-error\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.332032 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a3421bac-5e2c-494d-ba71-500a3ead9076-audit-dir\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.332064 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef156d19-8841-4be5-a739-bd07a7789ea3-config\") pod \"route-controller-manager-5f9b46c45c-4586h\" (UID: \"ef156d19-8841-4be5-a739-bd07a7789ea3\") " pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.332097 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/a3421bac-5e2c-494d-ba71-500a3ead9076-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.332962 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef156d19-8841-4be5-a739-bd07a7789ea3-client-ca\") pod \"route-controller-manager-5f9b46c45c-4586h\" (UID: \"ef156d19-8841-4be5-a739-bd07a7789ea3\") " pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.332998 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a3421bac-5e2c-494d-ba71-500a3ead9076-v4-0-config-system-service-ca\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.333174 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a3421bac-5e2c-494d-ba71-500a3ead9076-v4-0-config-system-router-certs\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.333264 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a3421bac-5e2c-494d-ba71-500a3ead9076-v4-0-config-system-serving-cert\") pod \"oauth-openshift-766799bf97-q8249\" (UID: 
\"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.333292 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a3421bac-5e2c-494d-ba71-500a3ead9076-v4-0-config-user-template-login\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.333371 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp4wk\" (UniqueName: \"kubernetes.io/projected/ef156d19-8841-4be5-a739-bd07a7789ea3-kube-api-access-tp4wk\") pod \"route-controller-manager-5f9b46c45c-4586h\" (UID: \"ef156d19-8841-4be5-a739-bd07a7789ea3\") " pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.333432 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a3421bac-5e2c-494d-ba71-500a3ead9076-v4-0-config-system-session\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.333475 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a3421bac-5e2c-494d-ba71-500a3ead9076-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: 
I0319 09:28:46.410124 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.435243 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a3421bac-5e2c-494d-ba71-500a3ead9076-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.435328 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zr8g\" (UniqueName: \"kubernetes.io/projected/a3421bac-5e2c-494d-ba71-500a3ead9076-kube-api-access-8zr8g\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.435373 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a3421bac-5e2c-494d-ba71-500a3ead9076-v4-0-config-system-cliconfig\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.435411 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef156d19-8841-4be5-a739-bd07a7789ea3-serving-cert\") pod \"route-controller-manager-5f9b46c45c-4586h\" (UID: \"ef156d19-8841-4be5-a739-bd07a7789ea3\") " pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 
09:28:46.435445 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a3421bac-5e2c-494d-ba71-500a3ead9076-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.435490 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3421bac-5e2c-494d-ba71-500a3ead9076-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.435526 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a3421bac-5e2c-494d-ba71-500a3ead9076-audit-policies\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.435579 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a3421bac-5e2c-494d-ba71-500a3ead9076-v4-0-config-user-template-error\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.435610 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a3421bac-5e2c-494d-ba71-500a3ead9076-audit-dir\") pod 
\"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.435641 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef156d19-8841-4be5-a739-bd07a7789ea3-config\") pod \"route-controller-manager-5f9b46c45c-4586h\" (UID: \"ef156d19-8841-4be5-a739-bd07a7789ea3\") " pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.435678 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a3421bac-5e2c-494d-ba71-500a3ead9076-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.435713 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef156d19-8841-4be5-a739-bd07a7789ea3-client-ca\") pod \"route-controller-manager-5f9b46c45c-4586h\" (UID: \"ef156d19-8841-4be5-a739-bd07a7789ea3\") " pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.435790 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a3421bac-5e2c-494d-ba71-500a3ead9076-v4-0-config-system-service-ca\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.435826 4835 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a3421bac-5e2c-494d-ba71-500a3ead9076-v4-0-config-system-router-certs\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.435882 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a3421bac-5e2c-494d-ba71-500a3ead9076-v4-0-config-system-serving-cert\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.435912 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a3421bac-5e2c-494d-ba71-500a3ead9076-v4-0-config-user-template-login\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.435970 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp4wk\" (UniqueName: \"kubernetes.io/projected/ef156d19-8841-4be5-a739-bd07a7789ea3-kube-api-access-tp4wk\") pod \"route-controller-manager-5f9b46c45c-4586h\" (UID: \"ef156d19-8841-4be5-a739-bd07a7789ea3\") " pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.436021 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a3421bac-5e2c-494d-ba71-500a3ead9076-v4-0-config-system-session\") pod 
\"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.436090 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a3421bac-5e2c-494d-ba71-500a3ead9076-audit-dir\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.437194 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a3421bac-5e2c-494d-ba71-500a3ead9076-v4-0-config-system-service-ca\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.437226 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a3421bac-5e2c-494d-ba71-500a3ead9076-v4-0-config-system-cliconfig\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.437809 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef156d19-8841-4be5-a739-bd07a7789ea3-client-ca\") pod \"route-controller-manager-5f9b46c45c-4586h\" (UID: \"ef156d19-8841-4be5-a739-bd07a7789ea3\") " pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.438115 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" 
(UniqueName: \"kubernetes.io/configmap/a3421bac-5e2c-494d-ba71-500a3ead9076-audit-policies\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.438290 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3421bac-5e2c-494d-ba71-500a3ead9076-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.439865 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef156d19-8841-4be5-a739-bd07a7789ea3-config\") pod \"route-controller-manager-5f9b46c45c-4586h\" (UID: \"ef156d19-8841-4be5-a739-bd07a7789ea3\") " pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.441564 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef156d19-8841-4be5-a739-bd07a7789ea3-serving-cert\") pod \"route-controller-manager-5f9b46c45c-4586h\" (UID: \"ef156d19-8841-4be5-a739-bd07a7789ea3\") " pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.441710 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a3421bac-5e2c-494d-ba71-500a3ead9076-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc 
kubenswrapper[4835]: I0319 09:28:46.443475 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a3421bac-5e2c-494d-ba71-500a3ead9076-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.444575 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a3421bac-5e2c-494d-ba71-500a3ead9076-v4-0-config-system-session\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.444581 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a3421bac-5e2c-494d-ba71-500a3ead9076-v4-0-config-user-template-error\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.445090 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a3421bac-5e2c-494d-ba71-500a3ead9076-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.445849 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/a3421bac-5e2c-494d-ba71-500a3ead9076-v4-0-config-user-template-login\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.446507 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a3421bac-5e2c-494d-ba71-500a3ead9076-v4-0-config-system-router-certs\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.450919 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a3421bac-5e2c-494d-ba71-500a3ead9076-v4-0-config-system-serving-cert\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.460478 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp4wk\" (UniqueName: \"kubernetes.io/projected/ef156d19-8841-4be5-a739-bd07a7789ea3-kube-api-access-tp4wk\") pod \"route-controller-manager-5f9b46c45c-4586h\" (UID: \"ef156d19-8841-4be5-a739-bd07a7789ea3\") " pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.460653 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zr8g\" (UniqueName: \"kubernetes.io/projected/a3421bac-5e2c-494d-ba71-500a3ead9076-kube-api-access-8zr8g\") pod \"oauth-openshift-766799bf97-q8249\" (UID: \"a3421bac-5e2c-494d-ba71-500a3ead9076\") " pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 
crc kubenswrapper[4835]: I0319 09:28:46.551085 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.558294 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h" Mar 19 09:28:46 crc kubenswrapper[4835]: I0319 09:28:46.850408 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-766799bf97-q8249"] Mar 19 09:28:46 crc kubenswrapper[4835]: W0319 09:28:46.853568 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3421bac_5e2c_494d_ba71_500a3ead9076.slice/crio-baabf0cb2eaaad29e12297ced2af4c86a96d350ae4b89f46cbee5612265eead4 WatchSource:0}: Error finding container baabf0cb2eaaad29e12297ced2af4c86a96d350ae4b89f46cbee5612265eead4: Status 404 returned error can't find the container with id baabf0cb2eaaad29e12297ced2af4c86a96d350ae4b89f46cbee5612265eead4 Mar 19 09:28:47 crc kubenswrapper[4835]: I0319 09:28:47.089543 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h"] Mar 19 09:28:47 crc kubenswrapper[4835]: W0319 09:28:47.102318 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef156d19_8841_4be5_a739_bd07a7789ea3.slice/crio-def7a46695cfa41fc654834552ec1af6c199a9f57cd1cf34a8185a15c331f6ce WatchSource:0}: Error finding container def7a46695cfa41fc654834552ec1af6c199a9f57cd1cf34a8185a15c331f6ce: Status 404 returned error can't find the container with id def7a46695cfa41fc654834552ec1af6c199a9f57cd1cf34a8185a15c331f6ce Mar 19 09:28:47 crc kubenswrapper[4835]: I0319 09:28:47.281939 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h" event={"ID":"ef156d19-8841-4be5-a739-bd07a7789ea3","Type":"ContainerStarted","Data":"d45f6fbc2f56800ba2896675356be0ba819dfd329f545ce7adcf9208ce4bacd8"} Mar 19 09:28:47 crc kubenswrapper[4835]: I0319 09:28:47.281979 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h" event={"ID":"ef156d19-8841-4be5-a739-bd07a7789ea3","Type":"ContainerStarted","Data":"def7a46695cfa41fc654834552ec1af6c199a9f57cd1cf34a8185a15c331f6ce"} Mar 19 09:28:47 crc kubenswrapper[4835]: I0319 09:28:47.282149 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h" Mar 19 09:28:47 crc kubenswrapper[4835]: I0319 09:28:47.283433 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-766799bf97-q8249" event={"ID":"a3421bac-5e2c-494d-ba71-500a3ead9076","Type":"ContainerStarted","Data":"4835d12e1d2bb983d920ee435b8065501df1842902efa2c1dcccd8b0e3ab8339"} Mar 19 09:28:47 crc kubenswrapper[4835]: I0319 09:28:47.283456 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-766799bf97-q8249" event={"ID":"a3421bac-5e2c-494d-ba71-500a3ead9076","Type":"ContainerStarted","Data":"baabf0cb2eaaad29e12297ced2af4c86a96d350ae4b89f46cbee5612265eead4"} Mar 19 09:28:47 crc kubenswrapper[4835]: I0319 09:28:47.283669 4835 patch_prober.go:28] interesting pod/route-controller-manager-5f9b46c45c-4586h container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" start-of-body= Mar 19 09:28:47 crc kubenswrapper[4835]: I0319 09:28:47.283699 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:47 crc kubenswrapper[4835]: I0319 09:28:47.283723 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h" podUID="ef156d19-8841-4be5-a739-bd07a7789ea3" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" Mar 19 09:28:47 crc kubenswrapper[4835]: I0319 09:28:47.300281 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h" podStartSLOduration=50.300264725 podStartE2EDuration="50.300264725s" podCreationTimestamp="2026-03-19 09:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:28:47.297314146 +0000 UTC m=+382.145912733" watchObservedRunningTime="2026-03-19 09:28:47.300264725 +0000 UTC m=+382.148863312" Mar 19 09:28:47 crc kubenswrapper[4835]: I0319 09:28:47.320333 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-766799bf97-q8249" podStartSLOduration=57.320315604 podStartE2EDuration="57.320315604s" podCreationTimestamp="2026-03-19 09:27:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:28:47.320129369 +0000 UTC m=+382.168727966" watchObservedRunningTime="2026-03-19 09:28:47.320315604 +0000 UTC m=+382.168914191" Mar 19 09:28:47 crc kubenswrapper[4835]: I0319 09:28:47.394024 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 09:28:48 crc kubenswrapper[4835]: I0319 09:28:48.296066 4835 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h" Mar 19 09:29:08 crc kubenswrapper[4835]: I0319 09:29:08.435088 4835 generic.go:334] "Generic (PLEG): container finished" podID="1e9d6532-5a6a-4c10-90ac-92bcef610d29" containerID="d53d0eeddc52c78fdc2e3585cc37312c0b570515a96bbd84d50495caff3c437a" exitCode=0 Mar 19 09:29:08 crc kubenswrapper[4835]: I0319 09:29:08.435190 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7sdz9" event={"ID":"1e9d6532-5a6a-4c10-90ac-92bcef610d29","Type":"ContainerDied","Data":"d53d0eeddc52c78fdc2e3585cc37312c0b570515a96bbd84d50495caff3c437a"} Mar 19 09:29:08 crc kubenswrapper[4835]: I0319 09:29:08.436197 4835 scope.go:117] "RemoveContainer" containerID="d53d0eeddc52c78fdc2e3585cc37312c0b570515a96bbd84d50495caff3c437a" Mar 19 09:29:09 crc kubenswrapper[4835]: I0319 09:29:09.458405 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7sdz9" event={"ID":"1e9d6532-5a6a-4c10-90ac-92bcef610d29","Type":"ContainerStarted","Data":"2c8f44409d758bedf2389b6c9bacb49f55c928aac5d13053d6c5d114fb87b6d0"} Mar 19 09:29:09 crc kubenswrapper[4835]: I0319 09:29:09.459458 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-7sdz9" Mar 19 09:29:09 crc kubenswrapper[4835]: I0319 09:29:09.462128 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 19 09:29:09 crc kubenswrapper[4835]: I0319 09:29:09.464441 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-7sdz9" Mar 19 09:29:09 crc kubenswrapper[4835]: I0319 09:29:09.470355 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 19 09:29:09 crc kubenswrapper[4835]: I0319 09:29:09.471499 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 19 09:29:09 crc kubenswrapper[4835]: I0319 09:29:09.471585 4835 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="f2efa3a47d25b8aa6684842dffbe85b4ff6415a97200a889a1924982a465c725" exitCode=137 Mar 19 09:29:09 crc kubenswrapper[4835]: I0319 09:29:09.471636 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"f2efa3a47d25b8aa6684842dffbe85b4ff6415a97200a889a1924982a465c725"} Mar 19 09:29:09 crc kubenswrapper[4835]: I0319 09:29:09.471720 4835 scope.go:117] "RemoveContainer" containerID="be66eb2db774c7aed1b45b02b72e262c1d086c0c6ed9edd85d466aa0f15f1582" Mar 19 09:29:10 crc kubenswrapper[4835]: I0319 09:29:10.480788 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 19 09:29:10 crc kubenswrapper[4835]: I0319 09:29:10.483259 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 19 09:29:10 crc kubenswrapper[4835]: I0319 09:29:10.483435 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0f7413a7189402da1d2a10dab1258250b092e33e94eaab8e47ebb2ba1c9c9eb8"} Mar 19 09:29:18 crc kubenswrapper[4835]: I0319 09:29:18.160574 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 09:29:18 crc kubenswrapper[4835]: I0319 09:29:18.854858 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 09:29:18 crc kubenswrapper[4835]: I0319 09:29:18.869807 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 09:29:19 crc kubenswrapper[4835]: I0319 09:29:19.663820 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 09:29:26 crc kubenswrapper[4835]: I0319 09:29:26.120048 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565208-9qhx9"] Mar 19 09:29:26 crc kubenswrapper[4835]: I0319 09:29:26.123083 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565208-9qhx9" Mar 19 09:29:26 crc kubenswrapper[4835]: I0319 09:29:26.125775 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 09:29:26 crc kubenswrapper[4835]: I0319 09:29:26.125849 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 09:29:26 crc kubenswrapper[4835]: I0319 09:29:26.126077 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 09:29:26 crc kubenswrapper[4835]: I0319 09:29:26.136174 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565208-9qhx9"] Mar 19 09:29:26 crc kubenswrapper[4835]: I0319 09:29:26.275839 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s7hr\" (UniqueName: \"kubernetes.io/projected/83197613-5a22-46ec-8a50-6d6c3228b296-kube-api-access-5s7hr\") pod \"auto-csr-approver-29565208-9qhx9\" (UID: \"83197613-5a22-46ec-8a50-6d6c3228b296\") " pod="openshift-infra/auto-csr-approver-29565208-9qhx9" Mar 19 09:29:26 crc kubenswrapper[4835]: I0319 09:29:26.377426 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s7hr\" (UniqueName: \"kubernetes.io/projected/83197613-5a22-46ec-8a50-6d6c3228b296-kube-api-access-5s7hr\") pod \"auto-csr-approver-29565208-9qhx9\" (UID: \"83197613-5a22-46ec-8a50-6d6c3228b296\") " pod="openshift-infra/auto-csr-approver-29565208-9qhx9" Mar 19 09:29:26 crc kubenswrapper[4835]: I0319 09:29:26.397325 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 09:29:26 crc kubenswrapper[4835]: I0319 09:29:26.410556 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 
09:29:26 crc kubenswrapper[4835]: I0319 09:29:26.435577 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s7hr\" (UniqueName: \"kubernetes.io/projected/83197613-5a22-46ec-8a50-6d6c3228b296-kube-api-access-5s7hr\") pod \"auto-csr-approver-29565208-9qhx9\" (UID: \"83197613-5a22-46ec-8a50-6d6c3228b296\") " pod="openshift-infra/auto-csr-approver-29565208-9qhx9" Mar 19 09:29:26 crc kubenswrapper[4835]: I0319 09:29:26.451896 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 09:29:26 crc kubenswrapper[4835]: I0319 09:29:26.458068 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565208-9qhx9" Mar 19 09:29:26 crc kubenswrapper[4835]: I0319 09:29:26.854202 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565208-9qhx9"] Mar 19 09:29:27 crc kubenswrapper[4835]: I0319 09:29:27.713695 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565208-9qhx9" event={"ID":"83197613-5a22-46ec-8a50-6d6c3228b296","Type":"ContainerStarted","Data":"f88f6767a92c6544da4dac2bc2a41c2a5aecb01ecadd6682ae25ffb48eb1b42e"} Mar 19 09:29:28 crc kubenswrapper[4835]: I0319 09:29:28.720708 4835 generic.go:334] "Generic (PLEG): container finished" podID="83197613-5a22-46ec-8a50-6d6c3228b296" containerID="d46be6ef8ac3fa0e2aeeff1fb29f9aeb82b6ffed9557e4acb6ce29be721d314f" exitCode=0 Mar 19 09:29:28 crc kubenswrapper[4835]: I0319 09:29:28.720796 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565208-9qhx9" event={"ID":"83197613-5a22-46ec-8a50-6d6c3228b296","Type":"ContainerDied","Data":"d46be6ef8ac3fa0e2aeeff1fb29f9aeb82b6ffed9557e4acb6ce29be721d314f"} Mar 19 09:29:30 crc kubenswrapper[4835]: I0319 09:29:30.047179 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565208-9qhx9" Mar 19 09:29:30 crc kubenswrapper[4835]: I0319 09:29:30.128368 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s7hr\" (UniqueName: \"kubernetes.io/projected/83197613-5a22-46ec-8a50-6d6c3228b296-kube-api-access-5s7hr\") pod \"83197613-5a22-46ec-8a50-6d6c3228b296\" (UID: \"83197613-5a22-46ec-8a50-6d6c3228b296\") " Mar 19 09:29:30 crc kubenswrapper[4835]: I0319 09:29:30.137004 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83197613-5a22-46ec-8a50-6d6c3228b296-kube-api-access-5s7hr" (OuterVolumeSpecName: "kube-api-access-5s7hr") pod "83197613-5a22-46ec-8a50-6d6c3228b296" (UID: "83197613-5a22-46ec-8a50-6d6c3228b296"). InnerVolumeSpecName "kube-api-access-5s7hr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:29:30 crc kubenswrapper[4835]: I0319 09:29:30.229329 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s7hr\" (UniqueName: \"kubernetes.io/projected/83197613-5a22-46ec-8a50-6d6c3228b296-kube-api-access-5s7hr\") on node \"crc\" DevicePath \"\"" Mar 19 09:29:30 crc kubenswrapper[4835]: I0319 09:29:30.732320 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565208-9qhx9" event={"ID":"83197613-5a22-46ec-8a50-6d6c3228b296","Type":"ContainerDied","Data":"f88f6767a92c6544da4dac2bc2a41c2a5aecb01ecadd6682ae25ffb48eb1b42e"} Mar 19 09:29:30 crc kubenswrapper[4835]: I0319 09:29:30.732412 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f88f6767a92c6544da4dac2bc2a41c2a5aecb01ecadd6682ae25ffb48eb1b42e" Mar 19 09:29:30 crc kubenswrapper[4835]: I0319 09:29:30.732360 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565208-9qhx9" Mar 19 09:29:36 crc kubenswrapper[4835]: I0319 09:29:36.422314 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 09:29:36 crc kubenswrapper[4835]: I0319 09:29:36.422977 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 09:30:00 crc kubenswrapper[4835]: I0319 09:30:00.140248 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565210-v9lr4"] Mar 19 09:30:00 crc kubenswrapper[4835]: E0319 09:30:00.141262 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83197613-5a22-46ec-8a50-6d6c3228b296" containerName="oc" Mar 19 09:30:00 crc kubenswrapper[4835]: I0319 09:30:00.141283 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="83197613-5a22-46ec-8a50-6d6c3228b296" containerName="oc" Mar 19 09:30:00 crc kubenswrapper[4835]: I0319 09:30:00.141436 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="83197613-5a22-46ec-8a50-6d6c3228b296" containerName="oc" Mar 19 09:30:00 crc kubenswrapper[4835]: I0319 09:30:00.142036 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565210-v9lr4" Mar 19 09:30:00 crc kubenswrapper[4835]: I0319 09:30:00.145195 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 09:30:00 crc kubenswrapper[4835]: I0319 09:30:00.147318 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565210-crm8n"] Mar 19 09:30:00 crc kubenswrapper[4835]: I0319 09:30:00.148493 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565210-crm8n" Mar 19 09:30:00 crc kubenswrapper[4835]: I0319 09:30:00.155037 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 09:30:00 crc kubenswrapper[4835]: I0319 09:30:00.156295 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 09:30:00 crc kubenswrapper[4835]: I0319 09:30:00.156353 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 09:30:00 crc kubenswrapper[4835]: I0319 09:30:00.156368 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 09:30:00 crc kubenswrapper[4835]: I0319 09:30:00.159543 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565210-v9lr4"] Mar 19 09:30:00 crc kubenswrapper[4835]: I0319 09:30:00.162626 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565210-crm8n"] Mar 19 09:30:00 crc kubenswrapper[4835]: I0319 09:30:00.237681 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtphn\" (UniqueName: 
\"kubernetes.io/projected/80bbfdb2-82b5-40be-a7df-551fb2d03119-kube-api-access-qtphn\") pod \"auto-csr-approver-29565210-crm8n\" (UID: \"80bbfdb2-82b5-40be-a7df-551fb2d03119\") " pod="openshift-infra/auto-csr-approver-29565210-crm8n" Mar 19 09:30:00 crc kubenswrapper[4835]: I0319 09:30:00.238121 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b4572ae2-65b7-406a-9113-9ac2447a00e5-config-volume\") pod \"collect-profiles-29565210-v9lr4\" (UID: \"b4572ae2-65b7-406a-9113-9ac2447a00e5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565210-v9lr4" Mar 19 09:30:00 crc kubenswrapper[4835]: I0319 09:30:00.339583 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b4572ae2-65b7-406a-9113-9ac2447a00e5-secret-volume\") pod \"collect-profiles-29565210-v9lr4\" (UID: \"b4572ae2-65b7-406a-9113-9ac2447a00e5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565210-v9lr4" Mar 19 09:30:00 crc kubenswrapper[4835]: I0319 09:30:00.339892 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b4572ae2-65b7-406a-9113-9ac2447a00e5-config-volume\") pod \"collect-profiles-29565210-v9lr4\" (UID: \"b4572ae2-65b7-406a-9113-9ac2447a00e5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565210-v9lr4" Mar 19 09:30:00 crc kubenswrapper[4835]: I0319 09:30:00.341050 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtphn\" (UniqueName: \"kubernetes.io/projected/80bbfdb2-82b5-40be-a7df-551fb2d03119-kube-api-access-qtphn\") pod \"auto-csr-approver-29565210-crm8n\" (UID: \"80bbfdb2-82b5-40be-a7df-551fb2d03119\") " pod="openshift-infra/auto-csr-approver-29565210-crm8n" Mar 19 09:30:00 crc kubenswrapper[4835]: I0319 
09:30:00.341164 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7j9h\" (UniqueName: \"kubernetes.io/projected/b4572ae2-65b7-406a-9113-9ac2447a00e5-kube-api-access-r7j9h\") pod \"collect-profiles-29565210-v9lr4\" (UID: \"b4572ae2-65b7-406a-9113-9ac2447a00e5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565210-v9lr4" Mar 19 09:30:00 crc kubenswrapper[4835]: I0319 09:30:00.341798 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b4572ae2-65b7-406a-9113-9ac2447a00e5-config-volume\") pod \"collect-profiles-29565210-v9lr4\" (UID: \"b4572ae2-65b7-406a-9113-9ac2447a00e5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565210-v9lr4" Mar 19 09:30:00 crc kubenswrapper[4835]: I0319 09:30:00.363969 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtphn\" (UniqueName: \"kubernetes.io/projected/80bbfdb2-82b5-40be-a7df-551fb2d03119-kube-api-access-qtphn\") pod \"auto-csr-approver-29565210-crm8n\" (UID: \"80bbfdb2-82b5-40be-a7df-551fb2d03119\") " pod="openshift-infra/auto-csr-approver-29565210-crm8n" Mar 19 09:30:00 crc kubenswrapper[4835]: I0319 09:30:00.442584 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b4572ae2-65b7-406a-9113-9ac2447a00e5-secret-volume\") pod \"collect-profiles-29565210-v9lr4\" (UID: \"b4572ae2-65b7-406a-9113-9ac2447a00e5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565210-v9lr4" Mar 19 09:30:00 crc kubenswrapper[4835]: I0319 09:30:00.442682 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7j9h\" (UniqueName: \"kubernetes.io/projected/b4572ae2-65b7-406a-9113-9ac2447a00e5-kube-api-access-r7j9h\") pod \"collect-profiles-29565210-v9lr4\" (UID: 
\"b4572ae2-65b7-406a-9113-9ac2447a00e5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565210-v9lr4" Mar 19 09:30:00 crc kubenswrapper[4835]: I0319 09:30:00.449345 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b4572ae2-65b7-406a-9113-9ac2447a00e5-secret-volume\") pod \"collect-profiles-29565210-v9lr4\" (UID: \"b4572ae2-65b7-406a-9113-9ac2447a00e5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565210-v9lr4" Mar 19 09:30:00 crc kubenswrapper[4835]: I0319 09:30:00.463344 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7j9h\" (UniqueName: \"kubernetes.io/projected/b4572ae2-65b7-406a-9113-9ac2447a00e5-kube-api-access-r7j9h\") pod \"collect-profiles-29565210-v9lr4\" (UID: \"b4572ae2-65b7-406a-9113-9ac2447a00e5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565210-v9lr4" Mar 19 09:30:00 crc kubenswrapper[4835]: I0319 09:30:00.472381 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565210-v9lr4" Mar 19 09:30:00 crc kubenswrapper[4835]: I0319 09:30:00.491596 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565210-crm8n" Mar 19 09:30:00 crc kubenswrapper[4835]: I0319 09:30:00.759279 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565210-v9lr4"] Mar 19 09:30:00 crc kubenswrapper[4835]: I0319 09:30:00.784673 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565210-crm8n"] Mar 19 09:30:00 crc kubenswrapper[4835]: W0319 09:30:00.788411 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80bbfdb2_82b5_40be_a7df_551fb2d03119.slice/crio-aadbe2cf0e472c3e8ef43b5fd71760252f323918a44078fb79abe4cff1c3b0eb WatchSource:0}: Error finding container aadbe2cf0e472c3e8ef43b5fd71760252f323918a44078fb79abe4cff1c3b0eb: Status 404 returned error can't find the container with id aadbe2cf0e472c3e8ef43b5fd71760252f323918a44078fb79abe4cff1c3b0eb Mar 19 09:30:00 crc kubenswrapper[4835]: I0319 09:30:00.939805 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565210-crm8n" event={"ID":"80bbfdb2-82b5-40be-a7df-551fb2d03119","Type":"ContainerStarted","Data":"aadbe2cf0e472c3e8ef43b5fd71760252f323918a44078fb79abe4cff1c3b0eb"} Mar 19 09:30:00 crc kubenswrapper[4835]: I0319 09:30:00.941798 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565210-v9lr4" event={"ID":"b4572ae2-65b7-406a-9113-9ac2447a00e5","Type":"ContainerStarted","Data":"2cf8203a278fd2fddce79a3dd6f22a01a7a8e460a75e0649b883480f9b12088f"} Mar 19 09:30:00 crc kubenswrapper[4835]: I0319 09:30:00.941839 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565210-v9lr4" 
event={"ID":"b4572ae2-65b7-406a-9113-9ac2447a00e5","Type":"ContainerStarted","Data":"f3cfee24131b69365caddae6f83edf0e2ee51211c99fe052548728e88c015b03"} Mar 19 09:30:00 crc kubenswrapper[4835]: I0319 09:30:00.960170 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29565210-v9lr4" podStartSLOduration=0.960131605 podStartE2EDuration="960.131605ms" podCreationTimestamp="2026-03-19 09:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:30:00.956475213 +0000 UTC m=+455.805073800" watchObservedRunningTime="2026-03-19 09:30:00.960131605 +0000 UTC m=+455.808730212" Mar 19 09:30:01 crc kubenswrapper[4835]: I0319 09:30:01.952126 4835 generic.go:334] "Generic (PLEG): container finished" podID="b4572ae2-65b7-406a-9113-9ac2447a00e5" containerID="2cf8203a278fd2fddce79a3dd6f22a01a7a8e460a75e0649b883480f9b12088f" exitCode=0 Mar 19 09:30:01 crc kubenswrapper[4835]: I0319 09:30:01.952237 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565210-v9lr4" event={"ID":"b4572ae2-65b7-406a-9113-9ac2447a00e5","Type":"ContainerDied","Data":"2cf8203a278fd2fddce79a3dd6f22a01a7a8e460a75e0649b883480f9b12088f"} Mar 19 09:30:02 crc kubenswrapper[4835]: I0319 09:30:02.963513 4835 generic.go:334] "Generic (PLEG): container finished" podID="80bbfdb2-82b5-40be-a7df-551fb2d03119" containerID="30d47a6b8406ac16d60bcc10c7e20efa6771c67100f3c8357c5a9a2794ccdcba" exitCode=0 Mar 19 09:30:02 crc kubenswrapper[4835]: I0319 09:30:02.963586 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565210-crm8n" event={"ID":"80bbfdb2-82b5-40be-a7df-551fb2d03119","Type":"ContainerDied","Data":"30d47a6b8406ac16d60bcc10c7e20efa6771c67100f3c8357c5a9a2794ccdcba"} Mar 19 09:30:03 crc kubenswrapper[4835]: I0319 09:30:03.209678 4835 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565210-v9lr4" Mar 19 09:30:03 crc kubenswrapper[4835]: I0319 09:30:03.280690 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b4572ae2-65b7-406a-9113-9ac2447a00e5-config-volume\") pod \"b4572ae2-65b7-406a-9113-9ac2447a00e5\" (UID: \"b4572ae2-65b7-406a-9113-9ac2447a00e5\") " Mar 19 09:30:03 crc kubenswrapper[4835]: I0319 09:30:03.280777 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7j9h\" (UniqueName: \"kubernetes.io/projected/b4572ae2-65b7-406a-9113-9ac2447a00e5-kube-api-access-r7j9h\") pod \"b4572ae2-65b7-406a-9113-9ac2447a00e5\" (UID: \"b4572ae2-65b7-406a-9113-9ac2447a00e5\") " Mar 19 09:30:03 crc kubenswrapper[4835]: I0319 09:30:03.280829 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b4572ae2-65b7-406a-9113-9ac2447a00e5-secret-volume\") pod \"b4572ae2-65b7-406a-9113-9ac2447a00e5\" (UID: \"b4572ae2-65b7-406a-9113-9ac2447a00e5\") " Mar 19 09:30:03 crc kubenswrapper[4835]: I0319 09:30:03.281937 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4572ae2-65b7-406a-9113-9ac2447a00e5-config-volume" (OuterVolumeSpecName: "config-volume") pod "b4572ae2-65b7-406a-9113-9ac2447a00e5" (UID: "b4572ae2-65b7-406a-9113-9ac2447a00e5"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:30:03 crc kubenswrapper[4835]: I0319 09:30:03.285642 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4572ae2-65b7-406a-9113-9ac2447a00e5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b4572ae2-65b7-406a-9113-9ac2447a00e5" (UID: "b4572ae2-65b7-406a-9113-9ac2447a00e5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:30:03 crc kubenswrapper[4835]: I0319 09:30:03.285795 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4572ae2-65b7-406a-9113-9ac2447a00e5-kube-api-access-r7j9h" (OuterVolumeSpecName: "kube-api-access-r7j9h") pod "b4572ae2-65b7-406a-9113-9ac2447a00e5" (UID: "b4572ae2-65b7-406a-9113-9ac2447a00e5"). InnerVolumeSpecName "kube-api-access-r7j9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:30:03 crc kubenswrapper[4835]: I0319 09:30:03.381763 4835 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b4572ae2-65b7-406a-9113-9ac2447a00e5-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 09:30:03 crc kubenswrapper[4835]: I0319 09:30:03.381815 4835 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b4572ae2-65b7-406a-9113-9ac2447a00e5-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 09:30:03 crc kubenswrapper[4835]: I0319 09:30:03.381836 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7j9h\" (UniqueName: \"kubernetes.io/projected/b4572ae2-65b7-406a-9113-9ac2447a00e5-kube-api-access-r7j9h\") on node \"crc\" DevicePath \"\"" Mar 19 09:30:03 crc kubenswrapper[4835]: I0319 09:30:03.971646 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565210-v9lr4" Mar 19 09:30:03 crc kubenswrapper[4835]: I0319 09:30:03.971630 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565210-v9lr4" event={"ID":"b4572ae2-65b7-406a-9113-9ac2447a00e5","Type":"ContainerDied","Data":"f3cfee24131b69365caddae6f83edf0e2ee51211c99fe052548728e88c015b03"} Mar 19 09:30:03 crc kubenswrapper[4835]: I0319 09:30:03.971784 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3cfee24131b69365caddae6f83edf0e2ee51211c99fe052548728e88c015b03" Mar 19 09:30:04 crc kubenswrapper[4835]: I0319 09:30:04.324442 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565210-crm8n" Mar 19 09:30:04 crc kubenswrapper[4835]: I0319 09:30:04.494542 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtphn\" (UniqueName: \"kubernetes.io/projected/80bbfdb2-82b5-40be-a7df-551fb2d03119-kube-api-access-qtphn\") pod \"80bbfdb2-82b5-40be-a7df-551fb2d03119\" (UID: \"80bbfdb2-82b5-40be-a7df-551fb2d03119\") " Mar 19 09:30:04 crc kubenswrapper[4835]: I0319 09:30:04.499699 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80bbfdb2-82b5-40be-a7df-551fb2d03119-kube-api-access-qtphn" (OuterVolumeSpecName: "kube-api-access-qtphn") pod "80bbfdb2-82b5-40be-a7df-551fb2d03119" (UID: "80bbfdb2-82b5-40be-a7df-551fb2d03119"). InnerVolumeSpecName "kube-api-access-qtphn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:30:04 crc kubenswrapper[4835]: I0319 09:30:04.596183 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtphn\" (UniqueName: \"kubernetes.io/projected/80bbfdb2-82b5-40be-a7df-551fb2d03119-kube-api-access-qtphn\") on node \"crc\" DevicePath \"\"" Mar 19 09:30:04 crc kubenswrapper[4835]: I0319 09:30:04.980801 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565210-crm8n" event={"ID":"80bbfdb2-82b5-40be-a7df-551fb2d03119","Type":"ContainerDied","Data":"aadbe2cf0e472c3e8ef43b5fd71760252f323918a44078fb79abe4cff1c3b0eb"} Mar 19 09:30:04 crc kubenswrapper[4835]: I0319 09:30:04.980883 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aadbe2cf0e472c3e8ef43b5fd71760252f323918a44078fb79abe4cff1c3b0eb" Mar 19 09:30:04 crc kubenswrapper[4835]: I0319 09:30:04.980834 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565210-crm8n" Mar 19 09:30:05 crc kubenswrapper[4835]: I0319 09:30:05.387752 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565204-mwv2v"] Mar 19 09:30:05 crc kubenswrapper[4835]: I0319 09:30:05.395778 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565204-mwv2v"] Mar 19 09:30:06 crc kubenswrapper[4835]: I0319 09:30:06.414982 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="930d85cd-ba00-4c27-b728-dbdeaab91ca5" path="/var/lib/kubelet/pods/930d85cd-ba00-4c27-b728-dbdeaab91ca5/volumes" Mar 19 09:30:06 crc kubenswrapper[4835]: I0319 09:30:06.422802 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 19 09:30:06 crc kubenswrapper[4835]: I0319 09:30:06.422883 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 09:30:13 crc kubenswrapper[4835]: I0319 09:30:13.836666 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:30:13 crc kubenswrapper[4835]: I0319 09:30:13.837330 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:30:13 crc kubenswrapper[4835]: I0319 09:30:13.838466 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:30:13 crc kubenswrapper[4835]: I0319 09:30:13.846054 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:30:14 crc kubenswrapper[4835]: I0319 09:30:14.102791 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 09:30:14 crc kubenswrapper[4835]: I0319 09:30:14.748101 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:30:14 crc kubenswrapper[4835]: I0319 09:30:14.748634 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:30:14 crc kubenswrapper[4835]: I0319 09:30:14.751789 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:30:14 crc kubenswrapper[4835]: I0319 09:30:14.752324 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:30:14 crc kubenswrapper[4835]: I0319 09:30:14.803177 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:30:14 crc kubenswrapper[4835]: I0319 09:30:14.805199 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 09:30:15 crc kubenswrapper[4835]: I0319 09:30:15.046522 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"be8fc35eb85fb9ef792b59318e4ef724f2a96a6629cc933712d29a90158b4324"} Mar 19 09:30:15 crc kubenswrapper[4835]: I0319 09:30:15.046933 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"22a740d185572c809b9da058e283fbb0ffa56b5ff0fda3cba7056cd9823a41fc"} Mar 19 09:30:15 crc kubenswrapper[4835]: W0319 09:30:15.225312 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-331a97993d6c7061ce2a946e46ca7aefd5db7bfb85dfadf3ec7b48c68e2c723e WatchSource:0}: Error finding container 331a97993d6c7061ce2a946e46ca7aefd5db7bfb85dfadf3ec7b48c68e2c723e: Status 404 returned error can't find the container with id 331a97993d6c7061ce2a946e46ca7aefd5db7bfb85dfadf3ec7b48c68e2c723e Mar 19 09:30:16 crc kubenswrapper[4835]: I0319 09:30:16.066958 4835 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"b0c3ee965c1d381f943d07800a9b54c2b91668ef811d7486222bde03dfa69b2a"} Mar 19 09:30:16 crc kubenswrapper[4835]: I0319 09:30:16.067354 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"331a97993d6c7061ce2a946e46ca7aefd5db7bfb85dfadf3ec7b48c68e2c723e"} Mar 19 09:30:16 crc kubenswrapper[4835]: I0319 09:30:16.067589 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:30:16 crc kubenswrapper[4835]: I0319 09:30:16.070095 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"a12550fe319cfe811a1115df32d31e63e6af3084d1f02c5c3cceafcd6e8b953c"} Mar 19 09:30:16 crc kubenswrapper[4835]: I0319 09:30:16.070126 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"8a37edb5f41b21fb05c42ce4ba989726aa4f889f9dd11086bca025f6f5fbe3d5"} Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.001972 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zgq9n"] Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.002944 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zgq9n" podUID="ae7f5216-917f-4f78-925b-3d53dc945bdd" containerName="registry-server" containerID="cri-o://5d9b5f5c796c6c757e20615c61bbc48fc9a2dcf67e4b1aca01e090a734bf855d" gracePeriod=30 
Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.016819 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ds4dj"] Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.017299 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ds4dj" podUID="f1a86135-77ed-4bdf-874d-0b141bff59bb" containerName="registry-server" containerID="cri-o://bceb0df4f0ff5c3cd394a6fb9021c0878cf95f910d355655976181076d286d94" gracePeriod=30 Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.032538 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7sdz9"] Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.033573 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-7sdz9" podUID="1e9d6532-5a6a-4c10-90ac-92bcef610d29" containerName="marketplace-operator" containerID="cri-o://2c8f44409d758bedf2389b6c9bacb49f55c928aac5d13053d6c5d114fb87b6d0" gracePeriod=30 Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.046826 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hk7rv"] Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.047150 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hk7rv" podUID="2d5a1f24-78ad-4c43-bf96-47e2e23c1996" containerName="registry-server" containerID="cri-o://1b0e1bc835693f869857e64ed1b8955657719d3ddf054fa52d8455358e58fae7" gracePeriod=30 Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.053079 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-65z7x"] Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.053316 4835 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-65z7x" podUID="35072c46-2bb8-4be2-9409-8780c5fa5717" containerName="registry-server" containerID="cri-o://eb23950c6fcf76c218d03032a78defbd91a6d9752c3ae63fcbf56c68af79a439" gracePeriod=30 Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.084906 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4bzw9"] Mar 19 09:30:20 crc kubenswrapper[4835]: E0319 09:30:20.085213 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4572ae2-65b7-406a-9113-9ac2447a00e5" containerName="collect-profiles" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.085228 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4572ae2-65b7-406a-9113-9ac2447a00e5" containerName="collect-profiles" Mar 19 09:30:20 crc kubenswrapper[4835]: E0319 09:30:20.085242 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80bbfdb2-82b5-40be-a7df-551fb2d03119" containerName="oc" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.085251 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="80bbfdb2-82b5-40be-a7df-551fb2d03119" containerName="oc" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.085378 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4572ae2-65b7-406a-9113-9ac2447a00e5" containerName="collect-profiles" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.085403 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="80bbfdb2-82b5-40be-a7df-551fb2d03119" containerName="oc" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.085883 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4bzw9" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.091358 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4bzw9"] Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.120688 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfvx6\" (UniqueName: \"kubernetes.io/projected/dd4d4255-8149-4b0e-b3f2-dc0951a043a5-kube-api-access-zfvx6\") pod \"marketplace-operator-79b997595-4bzw9\" (UID: \"dd4d4255-8149-4b0e-b3f2-dc0951a043a5\") " pod="openshift-marketplace/marketplace-operator-79b997595-4bzw9" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.120787 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dd4d4255-8149-4b0e-b3f2-dc0951a043a5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4bzw9\" (UID: \"dd4d4255-8149-4b0e-b3f2-dc0951a043a5\") " pod="openshift-marketplace/marketplace-operator-79b997595-4bzw9" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.120816 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dd4d4255-8149-4b0e-b3f2-dc0951a043a5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4bzw9\" (UID: \"dd4d4255-8149-4b0e-b3f2-dc0951a043a5\") " pod="openshift-marketplace/marketplace-operator-79b997595-4bzw9" Mar 19 09:30:20 crc kubenswrapper[4835]: E0319 09:30:20.182471 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eb23950c6fcf76c218d03032a78defbd91a6d9752c3ae63fcbf56c68af79a439 is running failed: container process not found" 
containerID="eb23950c6fcf76c218d03032a78defbd91a6d9752c3ae63fcbf56c68af79a439" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 09:30:20 crc kubenswrapper[4835]: E0319 09:30:20.182986 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eb23950c6fcf76c218d03032a78defbd91a6d9752c3ae63fcbf56c68af79a439 is running failed: container process not found" containerID="eb23950c6fcf76c218d03032a78defbd91a6d9752c3ae63fcbf56c68af79a439" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 09:30:20 crc kubenswrapper[4835]: E0319 09:30:20.183415 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eb23950c6fcf76c218d03032a78defbd91a6d9752c3ae63fcbf56c68af79a439 is running failed: container process not found" containerID="eb23950c6fcf76c218d03032a78defbd91a6d9752c3ae63fcbf56c68af79a439" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 09:30:20 crc kubenswrapper[4835]: E0319 09:30:20.183540 4835 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of eb23950c6fcf76c218d03032a78defbd91a6d9752c3ae63fcbf56c68af79a439 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-65z7x" podUID="35072c46-2bb8-4be2-9409-8780c5fa5717" containerName="registry-server" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.221870 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfvx6\" (UniqueName: \"kubernetes.io/projected/dd4d4255-8149-4b0e-b3f2-dc0951a043a5-kube-api-access-zfvx6\") pod \"marketplace-operator-79b997595-4bzw9\" (UID: \"dd4d4255-8149-4b0e-b3f2-dc0951a043a5\") " pod="openshift-marketplace/marketplace-operator-79b997595-4bzw9" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.221933 4835 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dd4d4255-8149-4b0e-b3f2-dc0951a043a5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4bzw9\" (UID: \"dd4d4255-8149-4b0e-b3f2-dc0951a043a5\") " pod="openshift-marketplace/marketplace-operator-79b997595-4bzw9" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.221969 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dd4d4255-8149-4b0e-b3f2-dc0951a043a5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4bzw9\" (UID: \"dd4d4255-8149-4b0e-b3f2-dc0951a043a5\") " pod="openshift-marketplace/marketplace-operator-79b997595-4bzw9" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.223363 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dd4d4255-8149-4b0e-b3f2-dc0951a043a5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4bzw9\" (UID: \"dd4d4255-8149-4b0e-b3f2-dc0951a043a5\") " pod="openshift-marketplace/marketplace-operator-79b997595-4bzw9" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.234469 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dd4d4255-8149-4b0e-b3f2-dc0951a043a5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4bzw9\" (UID: \"dd4d4255-8149-4b0e-b3f2-dc0951a043a5\") " pod="openshift-marketplace/marketplace-operator-79b997595-4bzw9" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.237773 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfvx6\" (UniqueName: \"kubernetes.io/projected/dd4d4255-8149-4b0e-b3f2-dc0951a043a5-kube-api-access-zfvx6\") pod \"marketplace-operator-79b997595-4bzw9\" (UID: 
\"dd4d4255-8149-4b0e-b3f2-dc0951a043a5\") " pod="openshift-marketplace/marketplace-operator-79b997595-4bzw9" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.501243 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4bzw9" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.505117 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zgq9n" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.512524 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hk7rv" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.517419 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ds4dj" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.526206 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae7f5216-917f-4f78-925b-3d53dc945bdd-catalog-content\") pod \"ae7f5216-917f-4f78-925b-3d53dc945bdd\" (UID: \"ae7f5216-917f-4f78-925b-3d53dc945bdd\") " Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.526303 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmpts\" (UniqueName: \"kubernetes.io/projected/ae7f5216-917f-4f78-925b-3d53dc945bdd-kube-api-access-wmpts\") pod \"ae7f5216-917f-4f78-925b-3d53dc945bdd\" (UID: \"ae7f5216-917f-4f78-925b-3d53dc945bdd\") " Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.526341 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae7f5216-917f-4f78-925b-3d53dc945bdd-utilities\") pod \"ae7f5216-917f-4f78-925b-3d53dc945bdd\" (UID: \"ae7f5216-917f-4f78-925b-3d53dc945bdd\") " Mar 19 09:30:20 crc 
kubenswrapper[4835]: I0319 09:30:20.528605 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae7f5216-917f-4f78-925b-3d53dc945bdd-utilities" (OuterVolumeSpecName: "utilities") pod "ae7f5216-917f-4f78-925b-3d53dc945bdd" (UID: "ae7f5216-917f-4f78-925b-3d53dc945bdd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.532038 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae7f5216-917f-4f78-925b-3d53dc945bdd-kube-api-access-wmpts" (OuterVolumeSpecName: "kube-api-access-wmpts") pod "ae7f5216-917f-4f78-925b-3d53dc945bdd" (UID: "ae7f5216-917f-4f78-925b-3d53dc945bdd"). InnerVolumeSpecName "kube-api-access-wmpts". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.532623 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7sdz9" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.539388 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-65z7x" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.627625 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qt5lg\" (UniqueName: \"kubernetes.io/projected/2d5a1f24-78ad-4c43-bf96-47e2e23c1996-kube-api-access-qt5lg\") pod \"2d5a1f24-78ad-4c43-bf96-47e2e23c1996\" (UID: \"2d5a1f24-78ad-4c43-bf96-47e2e23c1996\") " Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.627672 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35072c46-2bb8-4be2-9409-8780c5fa5717-catalog-content\") pod \"35072c46-2bb8-4be2-9409-8780c5fa5717\" (UID: \"35072c46-2bb8-4be2-9409-8780c5fa5717\") " Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.627707 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1a86135-77ed-4bdf-874d-0b141bff59bb-catalog-content\") pod \"f1a86135-77ed-4bdf-874d-0b141bff59bb\" (UID: \"f1a86135-77ed-4bdf-874d-0b141bff59bb\") " Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.627732 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35072c46-2bb8-4be2-9409-8780c5fa5717-utilities\") pod \"35072c46-2bb8-4be2-9409-8780c5fa5717\" (UID: \"35072c46-2bb8-4be2-9409-8780c5fa5717\") " Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.627790 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e9d6532-5a6a-4c10-90ac-92bcef610d29-marketplace-trusted-ca\") pod \"1e9d6532-5a6a-4c10-90ac-92bcef610d29\" (UID: \"1e9d6532-5a6a-4c10-90ac-92bcef610d29\") " Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.627810 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-jmbh7\" (UniqueName: \"kubernetes.io/projected/1e9d6532-5a6a-4c10-90ac-92bcef610d29-kube-api-access-jmbh7\") pod \"1e9d6532-5a6a-4c10-90ac-92bcef610d29\" (UID: \"1e9d6532-5a6a-4c10-90ac-92bcef610d29\") " Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.627837 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d5a1f24-78ad-4c43-bf96-47e2e23c1996-catalog-content\") pod \"2d5a1f24-78ad-4c43-bf96-47e2e23c1996\" (UID: \"2d5a1f24-78ad-4c43-bf96-47e2e23c1996\") " Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.627853 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26jz8\" (UniqueName: \"kubernetes.io/projected/f1a86135-77ed-4bdf-874d-0b141bff59bb-kube-api-access-26jz8\") pod \"f1a86135-77ed-4bdf-874d-0b141bff59bb\" (UID: \"f1a86135-77ed-4bdf-874d-0b141bff59bb\") " Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.627877 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1a86135-77ed-4bdf-874d-0b141bff59bb-utilities\") pod \"f1a86135-77ed-4bdf-874d-0b141bff59bb\" (UID: \"f1a86135-77ed-4bdf-874d-0b141bff59bb\") " Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.627923 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d5a1f24-78ad-4c43-bf96-47e2e23c1996-utilities\") pod \"2d5a1f24-78ad-4c43-bf96-47e2e23c1996\" (UID: \"2d5a1f24-78ad-4c43-bf96-47e2e23c1996\") " Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.627941 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e9d6532-5a6a-4c10-90ac-92bcef610d29-marketplace-operator-metrics\") pod \"1e9d6532-5a6a-4c10-90ac-92bcef610d29\" (UID: 
\"1e9d6532-5a6a-4c10-90ac-92bcef610d29\") " Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.627955 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gmkc\" (UniqueName: \"kubernetes.io/projected/35072c46-2bb8-4be2-9409-8780c5fa5717-kube-api-access-5gmkc\") pod \"35072c46-2bb8-4be2-9409-8780c5fa5717\" (UID: \"35072c46-2bb8-4be2-9409-8780c5fa5717\") " Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.628130 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmpts\" (UniqueName: \"kubernetes.io/projected/ae7f5216-917f-4f78-925b-3d53dc945bdd-kube-api-access-wmpts\") on node \"crc\" DevicePath \"\"" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.628140 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae7f5216-917f-4f78-925b-3d53dc945bdd-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.629347 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35072c46-2bb8-4be2-9409-8780c5fa5717-utilities" (OuterVolumeSpecName: "utilities") pod "35072c46-2bb8-4be2-9409-8780c5fa5717" (UID: "35072c46-2bb8-4be2-9409-8780c5fa5717"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.629633 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d5a1f24-78ad-4c43-bf96-47e2e23c1996-utilities" (OuterVolumeSpecName: "utilities") pod "2d5a1f24-78ad-4c43-bf96-47e2e23c1996" (UID: "2d5a1f24-78ad-4c43-bf96-47e2e23c1996"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.629669 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e9d6532-5a6a-4c10-90ac-92bcef610d29-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "1e9d6532-5a6a-4c10-90ac-92bcef610d29" (UID: "1e9d6532-5a6a-4c10-90ac-92bcef610d29"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.629853 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1a86135-77ed-4bdf-874d-0b141bff59bb-utilities" (OuterVolumeSpecName: "utilities") pod "f1a86135-77ed-4bdf-874d-0b141bff59bb" (UID: "f1a86135-77ed-4bdf-874d-0b141bff59bb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.632386 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d5a1f24-78ad-4c43-bf96-47e2e23c1996-kube-api-access-qt5lg" (OuterVolumeSpecName: "kube-api-access-qt5lg") pod "2d5a1f24-78ad-4c43-bf96-47e2e23c1996" (UID: "2d5a1f24-78ad-4c43-bf96-47e2e23c1996"). InnerVolumeSpecName "kube-api-access-qt5lg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.632552 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae7f5216-917f-4f78-925b-3d53dc945bdd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ae7f5216-917f-4f78-925b-3d53dc945bdd" (UID: "ae7f5216-917f-4f78-925b-3d53dc945bdd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.635558 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e9d6532-5a6a-4c10-90ac-92bcef610d29-kube-api-access-jmbh7" (OuterVolumeSpecName: "kube-api-access-jmbh7") pod "1e9d6532-5a6a-4c10-90ac-92bcef610d29" (UID: "1e9d6532-5a6a-4c10-90ac-92bcef610d29"). InnerVolumeSpecName "kube-api-access-jmbh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.645598 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1a86135-77ed-4bdf-874d-0b141bff59bb-kube-api-access-26jz8" (OuterVolumeSpecName: "kube-api-access-26jz8") pod "f1a86135-77ed-4bdf-874d-0b141bff59bb" (UID: "f1a86135-77ed-4bdf-874d-0b141bff59bb"). InnerVolumeSpecName "kube-api-access-26jz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.645832 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e9d6532-5a6a-4c10-90ac-92bcef610d29-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "1e9d6532-5a6a-4c10-90ac-92bcef610d29" (UID: "1e9d6532-5a6a-4c10-90ac-92bcef610d29"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.649051 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35072c46-2bb8-4be2-9409-8780c5fa5717-kube-api-access-5gmkc" (OuterVolumeSpecName: "kube-api-access-5gmkc") pod "35072c46-2bb8-4be2-9409-8780c5fa5717" (UID: "35072c46-2bb8-4be2-9409-8780c5fa5717"). InnerVolumeSpecName "kube-api-access-5gmkc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.656622 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d5a1f24-78ad-4c43-bf96-47e2e23c1996-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2d5a1f24-78ad-4c43-bf96-47e2e23c1996" (UID: "2d5a1f24-78ad-4c43-bf96-47e2e23c1996"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.703393 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1a86135-77ed-4bdf-874d-0b141bff59bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f1a86135-77ed-4bdf-874d-0b141bff59bb" (UID: "f1a86135-77ed-4bdf-874d-0b141bff59bb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.729077 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1a86135-77ed-4bdf-874d-0b141bff59bb-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.729100 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d5a1f24-78ad-4c43-bf96-47e2e23c1996-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.729111 4835 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e9d6532-5a6a-4c10-90ac-92bcef610d29-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.729122 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gmkc\" (UniqueName: 
\"kubernetes.io/projected/35072c46-2bb8-4be2-9409-8780c5fa5717-kube-api-access-5gmkc\") on node \"crc\" DevicePath \"\"" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.729133 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qt5lg\" (UniqueName: \"kubernetes.io/projected/2d5a1f24-78ad-4c43-bf96-47e2e23c1996-kube-api-access-qt5lg\") on node \"crc\" DevicePath \"\"" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.729141 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae7f5216-917f-4f78-925b-3d53dc945bdd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.729149 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1a86135-77ed-4bdf-874d-0b141bff59bb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.729158 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35072c46-2bb8-4be2-9409-8780c5fa5717-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.729168 4835 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e9d6532-5a6a-4c10-90ac-92bcef610d29-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.729178 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmbh7\" (UniqueName: \"kubernetes.io/projected/1e9d6532-5a6a-4c10-90ac-92bcef610d29-kube-api-access-jmbh7\") on node \"crc\" DevicePath \"\"" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.729186 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26jz8\" (UniqueName: 
\"kubernetes.io/projected/f1a86135-77ed-4bdf-874d-0b141bff59bb-kube-api-access-26jz8\") on node \"crc\" DevicePath \"\"" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.729194 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d5a1f24-78ad-4c43-bf96-47e2e23c1996-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.801608 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35072c46-2bb8-4be2-9409-8780c5fa5717-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "35072c46-2bb8-4be2-9409-8780c5fa5717" (UID: "35072c46-2bb8-4be2-9409-8780c5fa5717"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.830526 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35072c46-2bb8-4be2-9409-8780c5fa5717-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 09:30:20 crc kubenswrapper[4835]: I0319 09:30:20.930024 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4bzw9"] Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.102473 4835 generic.go:334] "Generic (PLEG): container finished" podID="1e9d6532-5a6a-4c10-90ac-92bcef610d29" containerID="2c8f44409d758bedf2389b6c9bacb49f55c928aac5d13053d6c5d114fb87b6d0" exitCode=0 Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.102548 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7sdz9" event={"ID":"1e9d6532-5a6a-4c10-90ac-92bcef610d29","Type":"ContainerDied","Data":"2c8f44409d758bedf2389b6c9bacb49f55c928aac5d13053d6c5d114fb87b6d0"} Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.102860 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-7sdz9" event={"ID":"1e9d6532-5a6a-4c10-90ac-92bcef610d29","Type":"ContainerDied","Data":"338c991fbd786e08f8ca13ec05f2f3669d3445184d487199af83b366b79660c3"} Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.102889 4835 scope.go:117] "RemoveContainer" containerID="2c8f44409d758bedf2389b6c9bacb49f55c928aac5d13053d6c5d114fb87b6d0" Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.102590 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7sdz9" Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.108137 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4bzw9" event={"ID":"dd4d4255-8149-4b0e-b3f2-dc0951a043a5","Type":"ContainerStarted","Data":"e14965fe103b6b5367faa035bf9e70accc869275771525030bb50caff305ced2"} Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.108186 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4bzw9" event={"ID":"dd4d4255-8149-4b0e-b3f2-dc0951a043a5","Type":"ContainerStarted","Data":"66ef3d463ab9734749f0da3bfdc7b8217fb28f64619777734d258df1435aa25a"} Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.108557 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-4bzw9" Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.109648 4835 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4bzw9 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.71:8080/healthz\": dial tcp 10.217.0.71:8080: connect: connection refused" start-of-body= Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.109678 4835 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/marketplace-operator-79b997595-4bzw9" podUID="dd4d4255-8149-4b0e-b3f2-dc0951a043a5" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.71:8080/healthz\": dial tcp 10.217.0.71:8080: connect: connection refused" Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.113309 4835 generic.go:334] "Generic (PLEG): container finished" podID="35072c46-2bb8-4be2-9409-8780c5fa5717" containerID="eb23950c6fcf76c218d03032a78defbd91a6d9752c3ae63fcbf56c68af79a439" exitCode=0 Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.113369 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-65z7x" event={"ID":"35072c46-2bb8-4be2-9409-8780c5fa5717","Type":"ContainerDied","Data":"eb23950c6fcf76c218d03032a78defbd91a6d9752c3ae63fcbf56c68af79a439"} Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.113387 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-65z7x" event={"ID":"35072c46-2bb8-4be2-9409-8780c5fa5717","Type":"ContainerDied","Data":"127dd30f2ec3344f70f2c283dd505f0c065a47ff386fe0849f7a19580dadf529"} Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.113457 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-65z7x" Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.124574 4835 scope.go:117] "RemoveContainer" containerID="d53d0eeddc52c78fdc2e3585cc37312c0b570515a96bbd84d50495caff3c437a" Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.125525 4835 generic.go:334] "Generic (PLEG): container finished" podID="ae7f5216-917f-4f78-925b-3d53dc945bdd" containerID="5d9b5f5c796c6c757e20615c61bbc48fc9a2dcf67e4b1aca01e090a734bf855d" exitCode=0 Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.125594 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgq9n" event={"ID":"ae7f5216-917f-4f78-925b-3d53dc945bdd","Type":"ContainerDied","Data":"5d9b5f5c796c6c757e20615c61bbc48fc9a2dcf67e4b1aca01e090a734bf855d"} Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.125630 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgq9n" event={"ID":"ae7f5216-917f-4f78-925b-3d53dc945bdd","Type":"ContainerDied","Data":"93cf176b7be290f4be4cc3d398c7a1e06281fd5637a121c6e052025a999f497a"} Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.125719 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zgq9n" Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.127362 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-4bzw9" podStartSLOduration=1.127348599 podStartE2EDuration="1.127348599s" podCreationTimestamp="2026-03-19 09:30:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:30:21.125019511 +0000 UTC m=+475.973618108" watchObservedRunningTime="2026-03-19 09:30:21.127348599 +0000 UTC m=+475.975947186" Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.141596 4835 generic.go:334] "Generic (PLEG): container finished" podID="f1a86135-77ed-4bdf-874d-0b141bff59bb" containerID="bceb0df4f0ff5c3cd394a6fb9021c0878cf95f910d355655976181076d286d94" exitCode=0 Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.141748 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ds4dj" Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.142467 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ds4dj" event={"ID":"f1a86135-77ed-4bdf-874d-0b141bff59bb","Type":"ContainerDied","Data":"bceb0df4f0ff5c3cd394a6fb9021c0878cf95f910d355655976181076d286d94"} Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.142524 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ds4dj" event={"ID":"f1a86135-77ed-4bdf-874d-0b141bff59bb","Type":"ContainerDied","Data":"5093aa0ef2691d4d4e59acaf51178b54ec7ada7a910e3482902627d95f0ece6a"} Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.144847 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7sdz9"] Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.147558 4835 generic.go:334] "Generic (PLEG): container finished" podID="2d5a1f24-78ad-4c43-bf96-47e2e23c1996" containerID="1b0e1bc835693f869857e64ed1b8955657719d3ddf054fa52d8455358e58fae7" exitCode=0 Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.147581 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hk7rv" event={"ID":"2d5a1f24-78ad-4c43-bf96-47e2e23c1996","Type":"ContainerDied","Data":"1b0e1bc835693f869857e64ed1b8955657719d3ddf054fa52d8455358e58fae7"} Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.147605 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hk7rv" event={"ID":"2d5a1f24-78ad-4c43-bf96-47e2e23c1996","Type":"ContainerDied","Data":"60e2f9a16777ca3a25bd9e83840baf6f3ea9cd278c10639b1e1252ea7e65d5a8"} Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.147631 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hk7rv" Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.148518 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7sdz9"] Mar 19 09:30:21 crc kubenswrapper[4835]: E0319 09:30:21.176337 4835 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e9d6532_5a6a_4c10_90ac_92bcef610d29.slice\": RecentStats: unable to find data in memory cache]" Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.181376 4835 scope.go:117] "RemoveContainer" containerID="2c8f44409d758bedf2389b6c9bacb49f55c928aac5d13053d6c5d114fb87b6d0" Mar 19 09:30:21 crc kubenswrapper[4835]: E0319 09:30:21.182053 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c8f44409d758bedf2389b6c9bacb49f55c928aac5d13053d6c5d114fb87b6d0\": container with ID starting with 2c8f44409d758bedf2389b6c9bacb49f55c928aac5d13053d6c5d114fb87b6d0 not found: ID does not exist" containerID="2c8f44409d758bedf2389b6c9bacb49f55c928aac5d13053d6c5d114fb87b6d0" Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.182096 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c8f44409d758bedf2389b6c9bacb49f55c928aac5d13053d6c5d114fb87b6d0"} err="failed to get container status \"2c8f44409d758bedf2389b6c9bacb49f55c928aac5d13053d6c5d114fb87b6d0\": rpc error: code = NotFound desc = could not find container \"2c8f44409d758bedf2389b6c9bacb49f55c928aac5d13053d6c5d114fb87b6d0\": container with ID starting with 2c8f44409d758bedf2389b6c9bacb49f55c928aac5d13053d6c5d114fb87b6d0 not found: ID does not exist" Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.182120 4835 scope.go:117] "RemoveContainer" 
containerID="d53d0eeddc52c78fdc2e3585cc37312c0b570515a96bbd84d50495caff3c437a" Mar 19 09:30:21 crc kubenswrapper[4835]: E0319 09:30:21.182690 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d53d0eeddc52c78fdc2e3585cc37312c0b570515a96bbd84d50495caff3c437a\": container with ID starting with d53d0eeddc52c78fdc2e3585cc37312c0b570515a96bbd84d50495caff3c437a not found: ID does not exist" containerID="d53d0eeddc52c78fdc2e3585cc37312c0b570515a96bbd84d50495caff3c437a" Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.182731 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d53d0eeddc52c78fdc2e3585cc37312c0b570515a96bbd84d50495caff3c437a"} err="failed to get container status \"d53d0eeddc52c78fdc2e3585cc37312c0b570515a96bbd84d50495caff3c437a\": rpc error: code = NotFound desc = could not find container \"d53d0eeddc52c78fdc2e3585cc37312c0b570515a96bbd84d50495caff3c437a\": container with ID starting with d53d0eeddc52c78fdc2e3585cc37312c0b570515a96bbd84d50495caff3c437a not found: ID does not exist" Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.182775 4835 scope.go:117] "RemoveContainer" containerID="eb23950c6fcf76c218d03032a78defbd91a6d9752c3ae63fcbf56c68af79a439" Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.213007 4835 scope.go:117] "RemoveContainer" containerID="165342f6be685fa7b91594bd871da387963a7ecf6bdbc50a11f14fdf0a8496b9" Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.215212 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-65z7x"] Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.226279 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-65z7x"] Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.232857 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-ds4dj"] Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.235832 4835 scope.go:117] "RemoveContainer" containerID="87ed7e16cbe4fa3c192a1a418ea3f4e8843c794a19940be32e7cf5297acc090b" Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.238235 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ds4dj"] Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.247945 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hk7rv"] Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.254287 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hk7rv"] Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.257635 4835 scope.go:117] "RemoveContainer" containerID="eb23950c6fcf76c218d03032a78defbd91a6d9752c3ae63fcbf56c68af79a439" Mar 19 09:30:21 crc kubenswrapper[4835]: E0319 09:30:21.258117 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb23950c6fcf76c218d03032a78defbd91a6d9752c3ae63fcbf56c68af79a439\": container with ID starting with eb23950c6fcf76c218d03032a78defbd91a6d9752c3ae63fcbf56c68af79a439 not found: ID does not exist" containerID="eb23950c6fcf76c218d03032a78defbd91a6d9752c3ae63fcbf56c68af79a439" Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.258157 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb23950c6fcf76c218d03032a78defbd91a6d9752c3ae63fcbf56c68af79a439"} err="failed to get container status \"eb23950c6fcf76c218d03032a78defbd91a6d9752c3ae63fcbf56c68af79a439\": rpc error: code = NotFound desc = could not find container \"eb23950c6fcf76c218d03032a78defbd91a6d9752c3ae63fcbf56c68af79a439\": container with ID starting with eb23950c6fcf76c218d03032a78defbd91a6d9752c3ae63fcbf56c68af79a439 not found: ID does not exist" Mar 19 
09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.258186 4835 scope.go:117] "RemoveContainer" containerID="165342f6be685fa7b91594bd871da387963a7ecf6bdbc50a11f14fdf0a8496b9"
Mar 19 09:30:21 crc kubenswrapper[4835]: E0319 09:30:21.258664 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"165342f6be685fa7b91594bd871da387963a7ecf6bdbc50a11f14fdf0a8496b9\": container with ID starting with 165342f6be685fa7b91594bd871da387963a7ecf6bdbc50a11f14fdf0a8496b9 not found: ID does not exist" containerID="165342f6be685fa7b91594bd871da387963a7ecf6bdbc50a11f14fdf0a8496b9"
Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.258708 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"165342f6be685fa7b91594bd871da387963a7ecf6bdbc50a11f14fdf0a8496b9"} err="failed to get container status \"165342f6be685fa7b91594bd871da387963a7ecf6bdbc50a11f14fdf0a8496b9\": rpc error: code = NotFound desc = could not find container \"165342f6be685fa7b91594bd871da387963a7ecf6bdbc50a11f14fdf0a8496b9\": container with ID starting with 165342f6be685fa7b91594bd871da387963a7ecf6bdbc50a11f14fdf0a8496b9 not found: ID does not exist"
Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.258772 4835 scope.go:117] "RemoveContainer" containerID="87ed7e16cbe4fa3c192a1a418ea3f4e8843c794a19940be32e7cf5297acc090b"
Mar 19 09:30:21 crc kubenswrapper[4835]: E0319 09:30:21.259090 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87ed7e16cbe4fa3c192a1a418ea3f4e8843c794a19940be32e7cf5297acc090b\": container with ID starting with 87ed7e16cbe4fa3c192a1a418ea3f4e8843c794a19940be32e7cf5297acc090b not found: ID does not exist" containerID="87ed7e16cbe4fa3c192a1a418ea3f4e8843c794a19940be32e7cf5297acc090b"
Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.259127 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87ed7e16cbe4fa3c192a1a418ea3f4e8843c794a19940be32e7cf5297acc090b"} err="failed to get container status \"87ed7e16cbe4fa3c192a1a418ea3f4e8843c794a19940be32e7cf5297acc090b\": rpc error: code = NotFound desc = could not find container \"87ed7e16cbe4fa3c192a1a418ea3f4e8843c794a19940be32e7cf5297acc090b\": container with ID starting with 87ed7e16cbe4fa3c192a1a418ea3f4e8843c794a19940be32e7cf5297acc090b not found: ID does not exist"
Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.259149 4835 scope.go:117] "RemoveContainer" containerID="5d9b5f5c796c6c757e20615c61bbc48fc9a2dcf67e4b1aca01e090a734bf855d"
Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.259937 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zgq9n"]
Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.269179 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zgq9n"]
Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.292104 4835 scope.go:117] "RemoveContainer" containerID="0389523a75ed446c16f9b3cd234caf928c4d2c1689a71dcd0e02adc13589029e"
Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.306591 4835 scope.go:117] "RemoveContainer" containerID="da325e9c3ddb1c3e7d70dd909c01aafd3df1986d74e9ca5c1005e705df45b63f"
Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.325224 4835 scope.go:117] "RemoveContainer" containerID="5d9b5f5c796c6c757e20615c61bbc48fc9a2dcf67e4b1aca01e090a734bf855d"
Mar 19 09:30:21 crc kubenswrapper[4835]: E0319 09:30:21.326574 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d9b5f5c796c6c757e20615c61bbc48fc9a2dcf67e4b1aca01e090a734bf855d\": container with ID starting with 5d9b5f5c796c6c757e20615c61bbc48fc9a2dcf67e4b1aca01e090a734bf855d not found: ID does not exist" containerID="5d9b5f5c796c6c757e20615c61bbc48fc9a2dcf67e4b1aca01e090a734bf855d"
Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.326608 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d9b5f5c796c6c757e20615c61bbc48fc9a2dcf67e4b1aca01e090a734bf855d"} err="failed to get container status \"5d9b5f5c796c6c757e20615c61bbc48fc9a2dcf67e4b1aca01e090a734bf855d\": rpc error: code = NotFound desc = could not find container \"5d9b5f5c796c6c757e20615c61bbc48fc9a2dcf67e4b1aca01e090a734bf855d\": container with ID starting with 5d9b5f5c796c6c757e20615c61bbc48fc9a2dcf67e4b1aca01e090a734bf855d not found: ID does not exist"
Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.326638 4835 scope.go:117] "RemoveContainer" containerID="0389523a75ed446c16f9b3cd234caf928c4d2c1689a71dcd0e02adc13589029e"
Mar 19 09:30:21 crc kubenswrapper[4835]: E0319 09:30:21.327069 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0389523a75ed446c16f9b3cd234caf928c4d2c1689a71dcd0e02adc13589029e\": container with ID starting with 0389523a75ed446c16f9b3cd234caf928c4d2c1689a71dcd0e02adc13589029e not found: ID does not exist" containerID="0389523a75ed446c16f9b3cd234caf928c4d2c1689a71dcd0e02adc13589029e"
Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.327099 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0389523a75ed446c16f9b3cd234caf928c4d2c1689a71dcd0e02adc13589029e"} err="failed to get container status \"0389523a75ed446c16f9b3cd234caf928c4d2c1689a71dcd0e02adc13589029e\": rpc error: code = NotFound desc = could not find container \"0389523a75ed446c16f9b3cd234caf928c4d2c1689a71dcd0e02adc13589029e\": container with ID starting with 0389523a75ed446c16f9b3cd234caf928c4d2c1689a71dcd0e02adc13589029e not found: ID does not exist"
Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.327118 4835 scope.go:117] "RemoveContainer" containerID="da325e9c3ddb1c3e7d70dd909c01aafd3df1986d74e9ca5c1005e705df45b63f"
Mar 19 09:30:21 crc kubenswrapper[4835]: E0319 09:30:21.327771 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da325e9c3ddb1c3e7d70dd909c01aafd3df1986d74e9ca5c1005e705df45b63f\": container with ID starting with da325e9c3ddb1c3e7d70dd909c01aafd3df1986d74e9ca5c1005e705df45b63f not found: ID does not exist" containerID="da325e9c3ddb1c3e7d70dd909c01aafd3df1986d74e9ca5c1005e705df45b63f"
Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.327798 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da325e9c3ddb1c3e7d70dd909c01aafd3df1986d74e9ca5c1005e705df45b63f"} err="failed to get container status \"da325e9c3ddb1c3e7d70dd909c01aafd3df1986d74e9ca5c1005e705df45b63f\": rpc error: code = NotFound desc = could not find container \"da325e9c3ddb1c3e7d70dd909c01aafd3df1986d74e9ca5c1005e705df45b63f\": container with ID starting with da325e9c3ddb1c3e7d70dd909c01aafd3df1986d74e9ca5c1005e705df45b63f not found: ID does not exist"
Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.327815 4835 scope.go:117] "RemoveContainer" containerID="bceb0df4f0ff5c3cd394a6fb9021c0878cf95f910d355655976181076d286d94"
Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.342187 4835 scope.go:117] "RemoveContainer" containerID="31603231dbf4226cced0fd1deed512b572edab30fdf4beb649d4cf055e1a0790"
Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.356915 4835 scope.go:117] "RemoveContainer" containerID="8155e80695967bdf797fb61c232c6326be152d9698a173cfc5bf4c4398d1ef1f"
Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.371907 4835 scope.go:117] "RemoveContainer" containerID="bceb0df4f0ff5c3cd394a6fb9021c0878cf95f910d355655976181076d286d94"
Mar 19 09:30:21 crc kubenswrapper[4835]: E0319 09:30:21.372404 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bceb0df4f0ff5c3cd394a6fb9021c0878cf95f910d355655976181076d286d94\": container with ID starting with bceb0df4f0ff5c3cd394a6fb9021c0878cf95f910d355655976181076d286d94 not found: ID does not exist" containerID="bceb0df4f0ff5c3cd394a6fb9021c0878cf95f910d355655976181076d286d94"
Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.372435 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bceb0df4f0ff5c3cd394a6fb9021c0878cf95f910d355655976181076d286d94"} err="failed to get container status \"bceb0df4f0ff5c3cd394a6fb9021c0878cf95f910d355655976181076d286d94\": rpc error: code = NotFound desc = could not find container \"bceb0df4f0ff5c3cd394a6fb9021c0878cf95f910d355655976181076d286d94\": container with ID starting with bceb0df4f0ff5c3cd394a6fb9021c0878cf95f910d355655976181076d286d94 not found: ID does not exist"
Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.372459 4835 scope.go:117] "RemoveContainer" containerID="31603231dbf4226cced0fd1deed512b572edab30fdf4beb649d4cf055e1a0790"
Mar 19 09:30:21 crc kubenswrapper[4835]: E0319 09:30:21.372837 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31603231dbf4226cced0fd1deed512b572edab30fdf4beb649d4cf055e1a0790\": container with ID starting with 31603231dbf4226cced0fd1deed512b572edab30fdf4beb649d4cf055e1a0790 not found: ID does not exist" containerID="31603231dbf4226cced0fd1deed512b572edab30fdf4beb649d4cf055e1a0790"
Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.372858 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31603231dbf4226cced0fd1deed512b572edab30fdf4beb649d4cf055e1a0790"} err="failed to get container status \"31603231dbf4226cced0fd1deed512b572edab30fdf4beb649d4cf055e1a0790\": rpc error: code = NotFound desc = could not find container \"31603231dbf4226cced0fd1deed512b572edab30fdf4beb649d4cf055e1a0790\": container with ID starting with 31603231dbf4226cced0fd1deed512b572edab30fdf4beb649d4cf055e1a0790 not found: ID does not exist"
Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.372871 4835 scope.go:117] "RemoveContainer" containerID="8155e80695967bdf797fb61c232c6326be152d9698a173cfc5bf4c4398d1ef1f"
Mar 19 09:30:21 crc kubenswrapper[4835]: E0319 09:30:21.373071 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8155e80695967bdf797fb61c232c6326be152d9698a173cfc5bf4c4398d1ef1f\": container with ID starting with 8155e80695967bdf797fb61c232c6326be152d9698a173cfc5bf4c4398d1ef1f not found: ID does not exist" containerID="8155e80695967bdf797fb61c232c6326be152d9698a173cfc5bf4c4398d1ef1f"
Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.373093 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8155e80695967bdf797fb61c232c6326be152d9698a173cfc5bf4c4398d1ef1f"} err="failed to get container status \"8155e80695967bdf797fb61c232c6326be152d9698a173cfc5bf4c4398d1ef1f\": rpc error: code = NotFound desc = could not find container \"8155e80695967bdf797fb61c232c6326be152d9698a173cfc5bf4c4398d1ef1f\": container with ID starting with 8155e80695967bdf797fb61c232c6326be152d9698a173cfc5bf4c4398d1ef1f not found: ID does not exist"
Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.373106 4835 scope.go:117] "RemoveContainer" containerID="1b0e1bc835693f869857e64ed1b8955657719d3ddf054fa52d8455358e58fae7"
Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.393611 4835 scope.go:117] "RemoveContainer" containerID="78d55ab0edafef315835a2b90118a5819e968992ac6dd6fc70d7385a14f480fd"
Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.414639 4835 scope.go:117] "RemoveContainer" containerID="5caf1a3cc7de0dfe65c12dbf06257a17b66567fe8a4327b5c66283101ef15b74"
Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.429595 4835 scope.go:117] "RemoveContainer" containerID="1b0e1bc835693f869857e64ed1b8955657719d3ddf054fa52d8455358e58fae7"
Mar 19 09:30:21 crc kubenswrapper[4835]: E0319 09:30:21.430049 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b0e1bc835693f869857e64ed1b8955657719d3ddf054fa52d8455358e58fae7\": container with ID starting with 1b0e1bc835693f869857e64ed1b8955657719d3ddf054fa52d8455358e58fae7 not found: ID does not exist" containerID="1b0e1bc835693f869857e64ed1b8955657719d3ddf054fa52d8455358e58fae7"
Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.430111 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b0e1bc835693f869857e64ed1b8955657719d3ddf054fa52d8455358e58fae7"} err="failed to get container status \"1b0e1bc835693f869857e64ed1b8955657719d3ddf054fa52d8455358e58fae7\": rpc error: code = NotFound desc = could not find container \"1b0e1bc835693f869857e64ed1b8955657719d3ddf054fa52d8455358e58fae7\": container with ID starting with 1b0e1bc835693f869857e64ed1b8955657719d3ddf054fa52d8455358e58fae7 not found: ID does not exist"
Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.430188 4835 scope.go:117] "RemoveContainer" containerID="78d55ab0edafef315835a2b90118a5819e968992ac6dd6fc70d7385a14f480fd"
Mar 19 09:30:21 crc kubenswrapper[4835]: E0319 09:30:21.430561 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78d55ab0edafef315835a2b90118a5819e968992ac6dd6fc70d7385a14f480fd\": container with ID starting with 78d55ab0edafef315835a2b90118a5819e968992ac6dd6fc70d7385a14f480fd not found: ID does not exist" containerID="78d55ab0edafef315835a2b90118a5819e968992ac6dd6fc70d7385a14f480fd"
Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.430610 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78d55ab0edafef315835a2b90118a5819e968992ac6dd6fc70d7385a14f480fd"} err="failed to get container status \"78d55ab0edafef315835a2b90118a5819e968992ac6dd6fc70d7385a14f480fd\": rpc error: code = NotFound desc = could not find container \"78d55ab0edafef315835a2b90118a5819e968992ac6dd6fc70d7385a14f480fd\": container with ID starting with 78d55ab0edafef315835a2b90118a5819e968992ac6dd6fc70d7385a14f480fd not found: ID does not exist"
Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.430630 4835 scope.go:117] "RemoveContainer" containerID="5caf1a3cc7de0dfe65c12dbf06257a17b66567fe8a4327b5c66283101ef15b74"
Mar 19 09:30:21 crc kubenswrapper[4835]: E0319 09:30:21.430892 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5caf1a3cc7de0dfe65c12dbf06257a17b66567fe8a4327b5c66283101ef15b74\": container with ID starting with 5caf1a3cc7de0dfe65c12dbf06257a17b66567fe8a4327b5c66283101ef15b74 not found: ID does not exist" containerID="5caf1a3cc7de0dfe65c12dbf06257a17b66567fe8a4327b5c66283101ef15b74"
Mar 19 09:30:21 crc kubenswrapper[4835]: I0319 09:30:21.430944 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5caf1a3cc7de0dfe65c12dbf06257a17b66567fe8a4327b5c66283101ef15b74"} err="failed to get container status \"5caf1a3cc7de0dfe65c12dbf06257a17b66567fe8a4327b5c66283101ef15b74\": rpc error: code = NotFound desc = could not find container \"5caf1a3cc7de0dfe65c12dbf06257a17b66567fe8a4327b5c66283101ef15b74\": container with ID starting with 5caf1a3cc7de0dfe65c12dbf06257a17b66567fe8a4327b5c66283101ef15b74 not found: ID does not exist"
Mar 19 09:30:22 crc kubenswrapper[4835]: I0319 09:30:22.161458 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-4bzw9"
Mar 19 09:30:22 crc kubenswrapper[4835]: I0319 09:30:22.416454 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e9d6532-5a6a-4c10-90ac-92bcef610d29" path="/var/lib/kubelet/pods/1e9d6532-5a6a-4c10-90ac-92bcef610d29/volumes"
Mar 19 09:30:22 crc kubenswrapper[4835]: I0319 09:30:22.417008 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d5a1f24-78ad-4c43-bf96-47e2e23c1996" path="/var/lib/kubelet/pods/2d5a1f24-78ad-4c43-bf96-47e2e23c1996/volumes"
Mar 19 09:30:22 crc kubenswrapper[4835]: I0319 09:30:22.417679 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35072c46-2bb8-4be2-9409-8780c5fa5717" path="/var/lib/kubelet/pods/35072c46-2bb8-4be2-9409-8780c5fa5717/volumes"
Mar 19 09:30:22 crc kubenswrapper[4835]: I0319 09:30:22.419038 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae7f5216-917f-4f78-925b-3d53dc945bdd" path="/var/lib/kubelet/pods/ae7f5216-917f-4f78-925b-3d53dc945bdd/volumes"
Mar 19 09:30:22 crc kubenswrapper[4835]: I0319 09:30:22.419876 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1a86135-77ed-4bdf-874d-0b141bff59bb" path="/var/lib/kubelet/pods/f1a86135-77ed-4bdf-874d-0b141bff59bb/volumes"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.237927 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l2vsx"]
Mar 19 09:30:23 crc kubenswrapper[4835]: E0319 09:30:23.239062 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35072c46-2bb8-4be2-9409-8780c5fa5717" containerName="registry-server"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.239096 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="35072c46-2bb8-4be2-9409-8780c5fa5717" containerName="registry-server"
Mar 19 09:30:23 crc kubenswrapper[4835]: E0319 09:30:23.239117 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a86135-77ed-4bdf-874d-0b141bff59bb" containerName="registry-server"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.239130 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a86135-77ed-4bdf-874d-0b141bff59bb" containerName="registry-server"
Mar 19 09:30:23 crc kubenswrapper[4835]: E0319 09:30:23.239207 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d5a1f24-78ad-4c43-bf96-47e2e23c1996" containerName="extract-utilities"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.239239 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d5a1f24-78ad-4c43-bf96-47e2e23c1996" containerName="extract-utilities"
Mar 19 09:30:23 crc kubenswrapper[4835]: E0319 09:30:23.239259 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a86135-77ed-4bdf-874d-0b141bff59bb" containerName="extract-utilities"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.239265 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a86135-77ed-4bdf-874d-0b141bff59bb" containerName="extract-utilities"
Mar 19 09:30:23 crc kubenswrapper[4835]: E0319 09:30:23.239273 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae7f5216-917f-4f78-925b-3d53dc945bdd" containerName="registry-server"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.239286 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae7f5216-917f-4f78-925b-3d53dc945bdd" containerName="registry-server"
Mar 19 09:30:23 crc kubenswrapper[4835]: E0319 09:30:23.239305 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d5a1f24-78ad-4c43-bf96-47e2e23c1996" containerName="extract-content"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.239315 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d5a1f24-78ad-4c43-bf96-47e2e23c1996" containerName="extract-content"
Mar 19 09:30:23 crc kubenswrapper[4835]: E0319 09:30:23.239328 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae7f5216-917f-4f78-925b-3d53dc945bdd" containerName="extract-content"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.239334 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae7f5216-917f-4f78-925b-3d53dc945bdd" containerName="extract-content"
Mar 19 09:30:23 crc kubenswrapper[4835]: E0319 09:30:23.239349 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d5a1f24-78ad-4c43-bf96-47e2e23c1996" containerName="registry-server"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.239359 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d5a1f24-78ad-4c43-bf96-47e2e23c1996" containerName="registry-server"
Mar 19 09:30:23 crc kubenswrapper[4835]: E0319 09:30:23.239373 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e9d6532-5a6a-4c10-90ac-92bcef610d29" containerName="marketplace-operator"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.239379 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e9d6532-5a6a-4c10-90ac-92bcef610d29" containerName="marketplace-operator"
Mar 19 09:30:23 crc kubenswrapper[4835]: E0319 09:30:23.239396 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a86135-77ed-4bdf-874d-0b141bff59bb" containerName="extract-content"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.239401 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a86135-77ed-4bdf-874d-0b141bff59bb" containerName="extract-content"
Mar 19 09:30:23 crc kubenswrapper[4835]: E0319 09:30:23.239414 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35072c46-2bb8-4be2-9409-8780c5fa5717" containerName="extract-content"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.239419 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="35072c46-2bb8-4be2-9409-8780c5fa5717" containerName="extract-content"
Mar 19 09:30:23 crc kubenswrapper[4835]: E0319 09:30:23.239426 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae7f5216-917f-4f78-925b-3d53dc945bdd" containerName="extract-utilities"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.239432 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae7f5216-917f-4f78-925b-3d53dc945bdd" containerName="extract-utilities"
Mar 19 09:30:23 crc kubenswrapper[4835]: E0319 09:30:23.239445 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35072c46-2bb8-4be2-9409-8780c5fa5717" containerName="extract-utilities"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.239451 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="35072c46-2bb8-4be2-9409-8780c5fa5717" containerName="extract-utilities"
Mar 19 09:30:23 crc kubenswrapper[4835]: E0319 09:30:23.239457 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e9d6532-5a6a-4c10-90ac-92bcef610d29" containerName="marketplace-operator"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.239463 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e9d6532-5a6a-4c10-90ac-92bcef610d29" containerName="marketplace-operator"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.239761 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="35072c46-2bb8-4be2-9409-8780c5fa5717" containerName="registry-server"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.239782 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e9d6532-5a6a-4c10-90ac-92bcef610d29" containerName="marketplace-operator"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.239794 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e9d6532-5a6a-4c10-90ac-92bcef610d29" containerName="marketplace-operator"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.239815 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d5a1f24-78ad-4c43-bf96-47e2e23c1996" containerName="registry-server"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.239823 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae7f5216-917f-4f78-925b-3d53dc945bdd" containerName="registry-server"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.239833 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1a86135-77ed-4bdf-874d-0b141bff59bb" containerName="registry-server"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.241144 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l2vsx"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.243019 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.245617 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l2vsx"]
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.290782 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh2hj\" (UniqueName: \"kubernetes.io/projected/5d586c5b-694e-4ac9-aa09-0d973cdad7e0-kube-api-access-zh2hj\") pod \"redhat-marketplace-l2vsx\" (UID: \"5d586c5b-694e-4ac9-aa09-0d973cdad7e0\") " pod="openshift-marketplace/redhat-marketplace-l2vsx"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.296080 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d586c5b-694e-4ac9-aa09-0d973cdad7e0-utilities\") pod \"redhat-marketplace-l2vsx\" (UID: \"5d586c5b-694e-4ac9-aa09-0d973cdad7e0\") " pod="openshift-marketplace/redhat-marketplace-l2vsx"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.296130 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d586c5b-694e-4ac9-aa09-0d973cdad7e0-catalog-content\") pod \"redhat-marketplace-l2vsx\" (UID: \"5d586c5b-694e-4ac9-aa09-0d973cdad7e0\") " pod="openshift-marketplace/redhat-marketplace-l2vsx"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.397364 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d586c5b-694e-4ac9-aa09-0d973cdad7e0-utilities\") pod \"redhat-marketplace-l2vsx\" (UID: \"5d586c5b-694e-4ac9-aa09-0d973cdad7e0\") " pod="openshift-marketplace/redhat-marketplace-l2vsx"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.397417 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d586c5b-694e-4ac9-aa09-0d973cdad7e0-catalog-content\") pod \"redhat-marketplace-l2vsx\" (UID: \"5d586c5b-694e-4ac9-aa09-0d973cdad7e0\") " pod="openshift-marketplace/redhat-marketplace-l2vsx"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.397452 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh2hj\" (UniqueName: \"kubernetes.io/projected/5d586c5b-694e-4ac9-aa09-0d973cdad7e0-kube-api-access-zh2hj\") pod \"redhat-marketplace-l2vsx\" (UID: \"5d586c5b-694e-4ac9-aa09-0d973cdad7e0\") " pod="openshift-marketplace/redhat-marketplace-l2vsx"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.398008 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d586c5b-694e-4ac9-aa09-0d973cdad7e0-utilities\") pod \"redhat-marketplace-l2vsx\" (UID: \"5d586c5b-694e-4ac9-aa09-0d973cdad7e0\") " pod="openshift-marketplace/redhat-marketplace-l2vsx"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.398094 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d586c5b-694e-4ac9-aa09-0d973cdad7e0-catalog-content\") pod \"redhat-marketplace-l2vsx\" (UID: \"5d586c5b-694e-4ac9-aa09-0d973cdad7e0\") " pod="openshift-marketplace/redhat-marketplace-l2vsx"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.415026 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ppf8q"]
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.416130 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ppf8q"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.422016 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.430155 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh2hj\" (UniqueName: \"kubernetes.io/projected/5d586c5b-694e-4ac9-aa09-0d973cdad7e0-kube-api-access-zh2hj\") pod \"redhat-marketplace-l2vsx\" (UID: \"5d586c5b-694e-4ac9-aa09-0d973cdad7e0\") " pod="openshift-marketplace/redhat-marketplace-l2vsx"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.431879 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ppf8q"]
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.498979 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55cc73b7-e015-4881-81d5-075426bff93c-utilities\") pod \"redhat-operators-ppf8q\" (UID: \"55cc73b7-e015-4881-81d5-075426bff93c\") " pod="openshift-marketplace/redhat-operators-ppf8q"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.499086 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55cc73b7-e015-4881-81d5-075426bff93c-catalog-content\") pod \"redhat-operators-ppf8q\" (UID: \"55cc73b7-e015-4881-81d5-075426bff93c\") " pod="openshift-marketplace/redhat-operators-ppf8q"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.499128 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-966dh\" (UniqueName: \"kubernetes.io/projected/55cc73b7-e015-4881-81d5-075426bff93c-kube-api-access-966dh\") pod \"redhat-operators-ppf8q\" (UID: \"55cc73b7-e015-4881-81d5-075426bff93c\") " pod="openshift-marketplace/redhat-operators-ppf8q"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.600017 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55cc73b7-e015-4881-81d5-075426bff93c-catalog-content\") pod \"redhat-operators-ppf8q\" (UID: \"55cc73b7-e015-4881-81d5-075426bff93c\") " pod="openshift-marketplace/redhat-operators-ppf8q"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.600071 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-966dh\" (UniqueName: \"kubernetes.io/projected/55cc73b7-e015-4881-81d5-075426bff93c-kube-api-access-966dh\") pod \"redhat-operators-ppf8q\" (UID: \"55cc73b7-e015-4881-81d5-075426bff93c\") " pod="openshift-marketplace/redhat-operators-ppf8q"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.600123 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55cc73b7-e015-4881-81d5-075426bff93c-utilities\") pod \"redhat-operators-ppf8q\" (UID: \"55cc73b7-e015-4881-81d5-075426bff93c\") " pod="openshift-marketplace/redhat-operators-ppf8q"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.600695 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55cc73b7-e015-4881-81d5-075426bff93c-utilities\") pod \"redhat-operators-ppf8q\" (UID: \"55cc73b7-e015-4881-81d5-075426bff93c\") " pod="openshift-marketplace/redhat-operators-ppf8q"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.600700 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55cc73b7-e015-4881-81d5-075426bff93c-catalog-content\") pod \"redhat-operators-ppf8q\" (UID: \"55cc73b7-e015-4881-81d5-075426bff93c\") " pod="openshift-marketplace/redhat-operators-ppf8q"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.612928 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l2vsx"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.616286 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-966dh\" (UniqueName: \"kubernetes.io/projected/55cc73b7-e015-4881-81d5-075426bff93c-kube-api-access-966dh\") pod \"redhat-operators-ppf8q\" (UID: \"55cc73b7-e015-4881-81d5-075426bff93c\") " pod="openshift-marketplace/redhat-operators-ppf8q"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.755440 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ppf8q"
Mar 19 09:30:23 crc kubenswrapper[4835]: I0319 09:30:23.949594 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ppf8q"]
Mar 19 09:30:23 crc kubenswrapper[4835]: W0319 09:30:23.960891 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55cc73b7_e015_4881_81d5_075426bff93c.slice/crio-399ca121cf93a0c958e61f834604e7ec6d1c4989940d2fdb7d934c632bb0e879 WatchSource:0}: Error finding container 399ca121cf93a0c958e61f834604e7ec6d1c4989940d2fdb7d934c632bb0e879: Status 404 returned error can't find the container with id 399ca121cf93a0c958e61f834604e7ec6d1c4989940d2fdb7d934c632bb0e879
Mar 19 09:30:24 crc kubenswrapper[4835]: I0319 09:30:24.037787 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l2vsx"]
Mar 19 09:30:24 crc kubenswrapper[4835]: W0319 09:30:24.051624 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d586c5b_694e_4ac9_aa09_0d973cdad7e0.slice/crio-3fedaca89a1d416d33dc21789116e3555eff48357b94b3bff4a680e5b62f89ba WatchSource:0}: Error finding container 3fedaca89a1d416d33dc21789116e3555eff48357b94b3bff4a680e5b62f89ba: Status 404 returned error can't find the container with id 3fedaca89a1d416d33dc21789116e3555eff48357b94b3bff4a680e5b62f89ba
Mar 19 09:30:24 crc kubenswrapper[4835]: I0319 09:30:24.173183 4835 generic.go:334] "Generic (PLEG): container finished" podID="55cc73b7-e015-4881-81d5-075426bff93c" containerID="1290a5f0fc30aa2150895371660b1417fdf1c53bd9293697880523528bb37224" exitCode=0
Mar 19 09:30:24 crc kubenswrapper[4835]: I0319 09:30:24.173345 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppf8q" event={"ID":"55cc73b7-e015-4881-81d5-075426bff93c","Type":"ContainerDied","Data":"1290a5f0fc30aa2150895371660b1417fdf1c53bd9293697880523528bb37224"}
Mar 19 09:30:24 crc kubenswrapper[4835]: I0319 09:30:24.173592 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppf8q" event={"ID":"55cc73b7-e015-4881-81d5-075426bff93c","Type":"ContainerStarted","Data":"399ca121cf93a0c958e61f834604e7ec6d1c4989940d2fdb7d934c632bb0e879"}
Mar 19 09:30:24 crc kubenswrapper[4835]: I0319 09:30:24.179714 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2vsx" event={"ID":"5d586c5b-694e-4ac9-aa09-0d973cdad7e0","Type":"ContainerStarted","Data":"3fedaca89a1d416d33dc21789116e3555eff48357b94b3bff4a680e5b62f89ba"}
Mar 19 09:30:25 crc kubenswrapper[4835]: I0319 09:30:25.188715 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppf8q" event={"ID":"55cc73b7-e015-4881-81d5-075426bff93c","Type":"ContainerStarted","Data":"7bf5d95bdcbed49c39fa38bd773725bcc4ae8485ea078ff4a1ef3716ff003de7"}
Mar 19 09:30:25 crc kubenswrapper[4835]: I0319 09:30:25.191940 4835 generic.go:334] "Generic (PLEG): container finished" podID="5d586c5b-694e-4ac9-aa09-0d973cdad7e0" containerID="4979b96bb13071630f5dc8448a2471b76f8e1ce8d53c685b6ed3967494eee878" exitCode=0
Mar 19 09:30:25 crc kubenswrapper[4835]: I0319 09:30:25.191987 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2vsx" event={"ID":"5d586c5b-694e-4ac9-aa09-0d973cdad7e0","Type":"ContainerDied","Data":"4979b96bb13071630f5dc8448a2471b76f8e1ce8d53c685b6ed3967494eee878"}
Mar 19 09:30:25 crc kubenswrapper[4835]: I0319 09:30:25.621064 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rnqsm"]
Mar 19 09:30:25 crc kubenswrapper[4835]: I0319 09:30:25.622705 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rnqsm"
Mar 19 09:30:25 crc kubenswrapper[4835]: I0319 09:30:25.629472 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 19 09:30:25 crc kubenswrapper[4835]: I0319 09:30:25.642302 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rnqsm"]
Mar 19 09:30:25 crc kubenswrapper[4835]: I0319 09:30:25.724827 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82216f27-30b5-4c41-a5d1-a0523d8c2aca-catalog-content\") pod \"community-operators-rnqsm\" (UID: \"82216f27-30b5-4c41-a5d1-a0523d8c2aca\") " pod="openshift-marketplace/community-operators-rnqsm"
Mar 19 09:30:25 crc kubenswrapper[4835]: I0319 09:30:25.724905 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g9xg\" (UniqueName: \"kubernetes.io/projected/82216f27-30b5-4c41-a5d1-a0523d8c2aca-kube-api-access-6g9xg\") pod \"community-operators-rnqsm\" (UID: \"82216f27-30b5-4c41-a5d1-a0523d8c2aca\") " pod="openshift-marketplace/community-operators-rnqsm"
Mar 19 09:30:25 crc kubenswrapper[4835]: I0319 09:30:25.724945 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82216f27-30b5-4c41-a5d1-a0523d8c2aca-utilities\") pod \"community-operators-rnqsm\" (UID: \"82216f27-30b5-4c41-a5d1-a0523d8c2aca\") " pod="openshift-marketplace/community-operators-rnqsm"
Mar 19 09:30:25 crc kubenswrapper[4835]: I0319 09:30:25.817440 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bmh95"]
Mar 19 09:30:25 crc kubenswrapper[4835]: I0319 09:30:25.818512 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bmh95"
Mar 19 09:30:25 crc kubenswrapper[4835]: I0319 09:30:25.820949 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 19 09:30:25 crc kubenswrapper[4835]: I0319 09:30:25.824296 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bmh95"]
Mar 19 09:30:25 crc kubenswrapper[4835]: I0319 09:30:25.831436 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82216f27-30b5-4c41-a5d1-a0523d8c2aca-catalog-content\") pod \"community-operators-rnqsm\" (UID: \"82216f27-30b5-4c41-a5d1-a0523d8c2aca\") " pod="openshift-marketplace/community-operators-rnqsm"
Mar 19 09:30:25 crc kubenswrapper[4835]: I0319 09:30:25.831495 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g9xg\" (UniqueName: \"kubernetes.io/projected/82216f27-30b5-4c41-a5d1-a0523d8c2aca-kube-api-access-6g9xg\") pod \"community-operators-rnqsm\" (UID: \"82216f27-30b5-4c41-a5d1-a0523d8c2aca\") " pod="openshift-marketplace/community-operators-rnqsm"
Mar 19 09:30:25 crc kubenswrapper[4835]: I0319 09:30:25.831544 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82216f27-30b5-4c41-a5d1-a0523d8c2aca-utilities\") pod \"community-operators-rnqsm\" (UID: \"82216f27-30b5-4c41-a5d1-a0523d8c2aca\") " pod="openshift-marketplace/community-operators-rnqsm"
Mar 19 09:30:25 crc kubenswrapper[4835]: I0319 09:30:25.832206 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82216f27-30b5-4c41-a5d1-a0523d8c2aca-catalog-content\") pod \"community-operators-rnqsm\" (UID: \"82216f27-30b5-4c41-a5d1-a0523d8c2aca\") " pod="openshift-marketplace/community-operators-rnqsm"
Mar 19 09:30:25 crc kubenswrapper[4835]: I0319 09:30:25.832222 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82216f27-30b5-4c41-a5d1-a0523d8c2aca-utilities\") pod \"community-operators-rnqsm\" (UID: \"82216f27-30b5-4c41-a5d1-a0523d8c2aca\") " pod="openshift-marketplace/community-operators-rnqsm"
Mar 19 09:30:25 crc kubenswrapper[4835]: I0319 09:30:25.862658 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g9xg\" (UniqueName: \"kubernetes.io/projected/82216f27-30b5-4c41-a5d1-a0523d8c2aca-kube-api-access-6g9xg\") pod \"community-operators-rnqsm\" (UID: \"82216f27-30b5-4c41-a5d1-a0523d8c2aca\") " pod="openshift-marketplace/community-operators-rnqsm"
Mar 19 09:30:25 crc kubenswrapper[4835]: I0319 09:30:25.932401 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b78cdf3-88ba-4ab2-9966-492863d9206c-catalog-content\") pod \"certified-operators-bmh95\" (UID: \"8b78cdf3-88ba-4ab2-9966-492863d9206c\") "
pod="openshift-marketplace/certified-operators-bmh95" Mar 19 09:30:25 crc kubenswrapper[4835]: I0319 09:30:25.932449 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42sm2\" (UniqueName: \"kubernetes.io/projected/8b78cdf3-88ba-4ab2-9966-492863d9206c-kube-api-access-42sm2\") pod \"certified-operators-bmh95\" (UID: \"8b78cdf3-88ba-4ab2-9966-492863d9206c\") " pod="openshift-marketplace/certified-operators-bmh95" Mar 19 09:30:25 crc kubenswrapper[4835]: I0319 09:30:25.932469 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b78cdf3-88ba-4ab2-9966-492863d9206c-utilities\") pod \"certified-operators-bmh95\" (UID: \"8b78cdf3-88ba-4ab2-9966-492863d9206c\") " pod="openshift-marketplace/certified-operators-bmh95" Mar 19 09:30:26 crc kubenswrapper[4835]: I0319 09:30:26.033445 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42sm2\" (UniqueName: \"kubernetes.io/projected/8b78cdf3-88ba-4ab2-9966-492863d9206c-kube-api-access-42sm2\") pod \"certified-operators-bmh95\" (UID: \"8b78cdf3-88ba-4ab2-9966-492863d9206c\") " pod="openshift-marketplace/certified-operators-bmh95" Mar 19 09:30:26 crc kubenswrapper[4835]: I0319 09:30:26.033495 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b78cdf3-88ba-4ab2-9966-492863d9206c-utilities\") pod \"certified-operators-bmh95\" (UID: \"8b78cdf3-88ba-4ab2-9966-492863d9206c\") " pod="openshift-marketplace/certified-operators-bmh95" Mar 19 09:30:26 crc kubenswrapper[4835]: I0319 09:30:26.033563 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b78cdf3-88ba-4ab2-9966-492863d9206c-catalog-content\") pod \"certified-operators-bmh95\" (UID: 
\"8b78cdf3-88ba-4ab2-9966-492863d9206c\") " pod="openshift-marketplace/certified-operators-bmh95" Mar 19 09:30:26 crc kubenswrapper[4835]: I0319 09:30:26.033997 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b78cdf3-88ba-4ab2-9966-492863d9206c-catalog-content\") pod \"certified-operators-bmh95\" (UID: \"8b78cdf3-88ba-4ab2-9966-492863d9206c\") " pod="openshift-marketplace/certified-operators-bmh95" Mar 19 09:30:26 crc kubenswrapper[4835]: I0319 09:30:26.034225 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b78cdf3-88ba-4ab2-9966-492863d9206c-utilities\") pod \"certified-operators-bmh95\" (UID: \"8b78cdf3-88ba-4ab2-9966-492863d9206c\") " pod="openshift-marketplace/certified-operators-bmh95" Mar 19 09:30:26 crc kubenswrapper[4835]: I0319 09:30:26.056518 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rnqsm" Mar 19 09:30:26 crc kubenswrapper[4835]: I0319 09:30:26.067927 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42sm2\" (UniqueName: \"kubernetes.io/projected/8b78cdf3-88ba-4ab2-9966-492863d9206c-kube-api-access-42sm2\") pod \"certified-operators-bmh95\" (UID: \"8b78cdf3-88ba-4ab2-9966-492863d9206c\") " pod="openshift-marketplace/certified-operators-bmh95" Mar 19 09:30:26 crc kubenswrapper[4835]: I0319 09:30:26.144907 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bmh95" Mar 19 09:30:26 crc kubenswrapper[4835]: I0319 09:30:26.197593 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2vsx" event={"ID":"5d586c5b-694e-4ac9-aa09-0d973cdad7e0","Type":"ContainerStarted","Data":"63b54f132e64a5fd475570bbd026f867b12172051885fe116b5aab93bb430afd"} Mar 19 09:30:26 crc kubenswrapper[4835]: I0319 09:30:26.209485 4835 generic.go:334] "Generic (PLEG): container finished" podID="55cc73b7-e015-4881-81d5-075426bff93c" containerID="7bf5d95bdcbed49c39fa38bd773725bcc4ae8485ea078ff4a1ef3716ff003de7" exitCode=0 Mar 19 09:30:26 crc kubenswrapper[4835]: I0319 09:30:26.209533 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppf8q" event={"ID":"55cc73b7-e015-4881-81d5-075426bff93c","Type":"ContainerDied","Data":"7bf5d95bdcbed49c39fa38bd773725bcc4ae8485ea078ff4a1ef3716ff003de7"} Mar 19 09:30:26 crc kubenswrapper[4835]: I0319 09:30:26.348311 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bmh95"] Mar 19 09:30:26 crc kubenswrapper[4835]: W0319 09:30:26.355166 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b78cdf3_88ba_4ab2_9966_492863d9206c.slice/crio-b0392660999d09994405aabdd32fbdf6830a858589c0ae6d8bfe4be5d442e920 WatchSource:0}: Error finding container b0392660999d09994405aabdd32fbdf6830a858589c0ae6d8bfe4be5d442e920: Status 404 returned error can't find the container with id b0392660999d09994405aabdd32fbdf6830a858589c0ae6d8bfe4be5d442e920 Mar 19 09:30:26 crc kubenswrapper[4835]: I0319 09:30:26.489442 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rnqsm"] Mar 19 09:30:26 crc kubenswrapper[4835]: W0319 09:30:26.551209 4835 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82216f27_30b5_4c41_a5d1_a0523d8c2aca.slice/crio-19f63dcfc145b659c0b098fb09f56e395a69cb54dfb1f084c8deeca4b8a3cd83 WatchSource:0}: Error finding container 19f63dcfc145b659c0b098fb09f56e395a69cb54dfb1f084c8deeca4b8a3cd83: Status 404 returned error can't find the container with id 19f63dcfc145b659c0b098fb09f56e395a69cb54dfb1f084c8deeca4b8a3cd83 Mar 19 09:30:27 crc kubenswrapper[4835]: I0319 09:30:27.218298 4835 generic.go:334] "Generic (PLEG): container finished" podID="5d586c5b-694e-4ac9-aa09-0d973cdad7e0" containerID="63b54f132e64a5fd475570bbd026f867b12172051885fe116b5aab93bb430afd" exitCode=0 Mar 19 09:30:27 crc kubenswrapper[4835]: I0319 09:30:27.218396 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2vsx" event={"ID":"5d586c5b-694e-4ac9-aa09-0d973cdad7e0","Type":"ContainerDied","Data":"63b54f132e64a5fd475570bbd026f867b12172051885fe116b5aab93bb430afd"} Mar 19 09:30:27 crc kubenswrapper[4835]: I0319 09:30:27.220925 4835 generic.go:334] "Generic (PLEG): container finished" podID="8b78cdf3-88ba-4ab2-9966-492863d9206c" containerID="d216a2bafb40bda1f22a08a3b6d7e36d8701547e0866a7ebd8c6af9df00c2a8d" exitCode=0 Mar 19 09:30:27 crc kubenswrapper[4835]: I0319 09:30:27.220967 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmh95" event={"ID":"8b78cdf3-88ba-4ab2-9966-492863d9206c","Type":"ContainerDied","Data":"d216a2bafb40bda1f22a08a3b6d7e36d8701547e0866a7ebd8c6af9df00c2a8d"} Mar 19 09:30:27 crc kubenswrapper[4835]: I0319 09:30:27.221000 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmh95" event={"ID":"8b78cdf3-88ba-4ab2-9966-492863d9206c","Type":"ContainerStarted","Data":"b0392660999d09994405aabdd32fbdf6830a858589c0ae6d8bfe4be5d442e920"} Mar 19 09:30:27 crc kubenswrapper[4835]: I0319 09:30:27.223921 4835 generic.go:334] "Generic (PLEG): container 
finished" podID="82216f27-30b5-4c41-a5d1-a0523d8c2aca" containerID="cf982179eca343c4c0a5c80253cdb2b7f5708ea0d3ab41f847bddd9062dc6869" exitCode=0 Mar 19 09:30:27 crc kubenswrapper[4835]: I0319 09:30:27.224005 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rnqsm" event={"ID":"82216f27-30b5-4c41-a5d1-a0523d8c2aca","Type":"ContainerDied","Data":"cf982179eca343c4c0a5c80253cdb2b7f5708ea0d3ab41f847bddd9062dc6869"} Mar 19 09:30:27 crc kubenswrapper[4835]: I0319 09:30:27.224035 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rnqsm" event={"ID":"82216f27-30b5-4c41-a5d1-a0523d8c2aca","Type":"ContainerStarted","Data":"19f63dcfc145b659c0b098fb09f56e395a69cb54dfb1f084c8deeca4b8a3cd83"} Mar 19 09:30:27 crc kubenswrapper[4835]: I0319 09:30:27.232455 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppf8q" event={"ID":"55cc73b7-e015-4881-81d5-075426bff93c","Type":"ContainerStarted","Data":"989c26fc9120435ba010b6032af3a5230268c89e4501e0eb8a69f986220c5539"} Mar 19 09:30:27 crc kubenswrapper[4835]: I0319 09:30:27.301558 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ppf8q" podStartSLOduration=1.824929451 podStartE2EDuration="4.30154255s" podCreationTimestamp="2026-03-19 09:30:23 +0000 UTC" firstStartedPulling="2026-03-19 09:30:24.175755923 +0000 UTC m=+479.024354510" lastFinishedPulling="2026-03-19 09:30:26.652369002 +0000 UTC m=+481.500967609" observedRunningTime="2026-03-19 09:30:27.298778671 +0000 UTC m=+482.147377258" watchObservedRunningTime="2026-03-19 09:30:27.30154255 +0000 UTC m=+482.150141137" Mar 19 09:30:28 crc kubenswrapper[4835]: I0319 09:30:28.238633 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2vsx" 
event={"ID":"5d586c5b-694e-4ac9-aa09-0d973cdad7e0","Type":"ContainerStarted","Data":"1e040f261e4ac81fc3f8d977a239f87bc08614949d71bf77a60e889bb22a6975"} Mar 19 09:30:28 crc kubenswrapper[4835]: I0319 09:30:28.240632 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmh95" event={"ID":"8b78cdf3-88ba-4ab2-9966-492863d9206c","Type":"ContainerStarted","Data":"ba44251209d7b48ef426bd53b8fdb73e27eabef1e01f3db57b29665622198919"} Mar 19 09:30:28 crc kubenswrapper[4835]: I0319 09:30:28.242591 4835 generic.go:334] "Generic (PLEG): container finished" podID="82216f27-30b5-4c41-a5d1-a0523d8c2aca" containerID="3c5bd2e1ba833732a14c2e84582c5ef446ae8e3acc86efa499abafe1785b2305" exitCode=0 Mar 19 09:30:28 crc kubenswrapper[4835]: I0319 09:30:28.243349 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rnqsm" event={"ID":"82216f27-30b5-4c41-a5d1-a0523d8c2aca","Type":"ContainerDied","Data":"3c5bd2e1ba833732a14c2e84582c5ef446ae8e3acc86efa499abafe1785b2305"} Mar 19 09:30:28 crc kubenswrapper[4835]: I0319 09:30:28.256999 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l2vsx" podStartSLOduration=2.8464771300000002 podStartE2EDuration="5.256981678s" podCreationTimestamp="2026-03-19 09:30:23 +0000 UTC" firstStartedPulling="2026-03-19 09:30:25.193481337 +0000 UTC m=+480.042079964" lastFinishedPulling="2026-03-19 09:30:27.603985895 +0000 UTC m=+482.452584512" observedRunningTime="2026-03-19 09:30:28.256166777 +0000 UTC m=+483.104765374" watchObservedRunningTime="2026-03-19 09:30:28.256981678 +0000 UTC m=+483.105580265" Mar 19 09:30:29 crc kubenswrapper[4835]: I0319 09:30:29.250330 4835 generic.go:334] "Generic (PLEG): container finished" podID="8b78cdf3-88ba-4ab2-9966-492863d9206c" containerID="ba44251209d7b48ef426bd53b8fdb73e27eabef1e01f3db57b29665622198919" exitCode=0 Mar 19 09:30:29 crc kubenswrapper[4835]: I0319 
09:30:29.250451 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmh95" event={"ID":"8b78cdf3-88ba-4ab2-9966-492863d9206c","Type":"ContainerDied","Data":"ba44251209d7b48ef426bd53b8fdb73e27eabef1e01f3db57b29665622198919"} Mar 19 09:30:29 crc kubenswrapper[4835]: I0319 09:30:29.254859 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rnqsm" event={"ID":"82216f27-30b5-4c41-a5d1-a0523d8c2aca","Type":"ContainerStarted","Data":"0886f7dc4ea2254dbebd6736657514c46c982a4d425a2531307ed7dbc1f58583"} Mar 19 09:30:29 crc kubenswrapper[4835]: I0319 09:30:29.299282 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rnqsm" podStartSLOduration=2.87165474 podStartE2EDuration="4.299263735s" podCreationTimestamp="2026-03-19 09:30:25 +0000 UTC" firstStartedPulling="2026-03-19 09:30:27.229450918 +0000 UTC m=+482.078049505" lastFinishedPulling="2026-03-19 09:30:28.657059873 +0000 UTC m=+483.505658500" observedRunningTime="2026-03-19 09:30:29.298300372 +0000 UTC m=+484.146898999" watchObservedRunningTime="2026-03-19 09:30:29.299263735 +0000 UTC m=+484.147862322" Mar 19 09:30:30 crc kubenswrapper[4835]: I0319 09:30:30.261858 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmh95" event={"ID":"8b78cdf3-88ba-4ab2-9966-492863d9206c","Type":"ContainerStarted","Data":"c2019edc4e22b6a962539a0a6e1c736cef59878b6ba66be09c70a0d6d132b3cd"} Mar 19 09:30:30 crc kubenswrapper[4835]: I0319 09:30:30.281944 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bmh95" podStartSLOduration=2.841337375 podStartE2EDuration="5.281926104s" podCreationTimestamp="2026-03-19 09:30:25 +0000 UTC" firstStartedPulling="2026-03-19 09:30:27.223313215 +0000 UTC m=+482.071911852" lastFinishedPulling="2026-03-19 09:30:29.663901984 +0000 
UTC m=+484.512500581" observedRunningTime="2026-03-19 09:30:30.279491713 +0000 UTC m=+485.128090310" watchObservedRunningTime="2026-03-19 09:30:30.281926104 +0000 UTC m=+485.130524691" Mar 19 09:30:33 crc kubenswrapper[4835]: I0319 09:30:33.614037 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l2vsx" Mar 19 09:30:33 crc kubenswrapper[4835]: I0319 09:30:33.614297 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l2vsx" Mar 19 09:30:33 crc kubenswrapper[4835]: I0319 09:30:33.672417 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l2vsx" Mar 19 09:30:33 crc kubenswrapper[4835]: I0319 09:30:33.756063 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ppf8q" Mar 19 09:30:33 crc kubenswrapper[4835]: I0319 09:30:33.756840 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ppf8q" Mar 19 09:30:34 crc kubenswrapper[4835]: I0319 09:30:34.361542 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l2vsx" Mar 19 09:30:34 crc kubenswrapper[4835]: I0319 09:30:34.801882 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ppf8q" podUID="55cc73b7-e015-4881-81d5-075426bff93c" containerName="registry-server" probeResult="failure" output=< Mar 19 09:30:34 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 09:30:34 crc kubenswrapper[4835]: > Mar 19 09:30:36 crc kubenswrapper[4835]: I0319 09:30:36.056671 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rnqsm" Mar 19 09:30:36 crc kubenswrapper[4835]: I0319 09:30:36.056786 4835 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rnqsm" Mar 19 09:30:36 crc kubenswrapper[4835]: I0319 09:30:36.130576 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rnqsm" Mar 19 09:30:36 crc kubenswrapper[4835]: I0319 09:30:36.146408 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bmh95" Mar 19 09:30:36 crc kubenswrapper[4835]: I0319 09:30:36.146442 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bmh95" Mar 19 09:30:36 crc kubenswrapper[4835]: I0319 09:30:36.216573 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bmh95" Mar 19 09:30:36 crc kubenswrapper[4835]: I0319 09:30:36.366968 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rnqsm" Mar 19 09:30:36 crc kubenswrapper[4835]: I0319 09:30:36.376410 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bmh95" Mar 19 09:30:36 crc kubenswrapper[4835]: I0319 09:30:36.426451 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 09:30:36 crc kubenswrapper[4835]: I0319 09:30:36.426843 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Mar 19 09:30:36 crc kubenswrapper[4835]: I0319 09:30:36.427072 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" Mar 19 09:30:36 crc kubenswrapper[4835]: I0319 09:30:36.432165 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b3588ff04f6b0991e2aff452983f8dc381373f64e85dce65f122043f0fbe4294"} pod="openshift-machine-config-operator/machine-config-daemon-bk84k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 09:30:36 crc kubenswrapper[4835]: I0319 09:30:36.432270 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" containerID="cri-o://b3588ff04f6b0991e2aff452983f8dc381373f64e85dce65f122043f0fbe4294" gracePeriod=600 Mar 19 09:30:37 crc kubenswrapper[4835]: I0319 09:30:37.306682 4835 generic.go:334] "Generic (PLEG): container finished" podID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerID="b3588ff04f6b0991e2aff452983f8dc381373f64e85dce65f122043f0fbe4294" exitCode=0 Mar 19 09:30:37 crc kubenswrapper[4835]: I0319 09:30:37.306767 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerDied","Data":"b3588ff04f6b0991e2aff452983f8dc381373f64e85dce65f122043f0fbe4294"} Mar 19 09:30:37 crc kubenswrapper[4835]: I0319 09:30:37.307012 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerStarted","Data":"77af185870c05156a519ecf05e5136747068885eeb6e51d17ba26ef920e74b4d"} Mar 19 09:30:37 
crc kubenswrapper[4835]: I0319 09:30:37.307035 4835 scope.go:117] "RemoveContainer" containerID="685b0c65a4ff075e3bd2e96a9e1ef7e5a73ff1532a133b8cba8d5b03818133c3" Mar 19 09:30:43 crc kubenswrapper[4835]: I0319 09:30:43.829003 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ppf8q" Mar 19 09:30:43 crc kubenswrapper[4835]: I0319 09:30:43.906441 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ppf8q" Mar 19 09:30:54 crc kubenswrapper[4835]: I0319 09:30:54.809979 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 09:31:14 crc kubenswrapper[4835]: I0319 09:31:14.469060 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-wgqtr"] Mar 19 09:31:14 crc kubenswrapper[4835]: I0319 09:31:14.477912 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-wgqtr" Mar 19 09:31:14 crc kubenswrapper[4835]: I0319 09:31:14.481047 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-wgqtr"] Mar 19 09:31:14 crc kubenswrapper[4835]: I0319 09:31:14.481921 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 19 09:31:14 crc kubenswrapper[4835]: I0319 09:31:14.482145 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Mar 19 09:31:14 crc kubenswrapper[4835]: I0319 09:31:14.482404 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Mar 19 09:31:14 crc kubenswrapper[4835]: I0319 09:31:14.484177 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-dockercfg-wwt9l" Mar 19 09:31:14 crc kubenswrapper[4835]: I0319 09:31:14.484268 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 19 09:31:14 crc kubenswrapper[4835]: I0319 09:31:14.545128 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7zmb\" (UniqueName: \"kubernetes.io/projected/79943387-08dd-44f4-9f94-3d6f38fdc2a9-kube-api-access-p7zmb\") pod \"cluster-monitoring-operator-6d5b84845-wgqtr\" (UID: \"79943387-08dd-44f4-9f94-3d6f38fdc2a9\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-wgqtr" Mar 19 09:31:14 crc kubenswrapper[4835]: I0319 09:31:14.545193 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/79943387-08dd-44f4-9f94-3d6f38fdc2a9-cluster-monitoring-operator-tls\") pod 
\"cluster-monitoring-operator-6d5b84845-wgqtr\" (UID: \"79943387-08dd-44f4-9f94-3d6f38fdc2a9\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-wgqtr" Mar 19 09:31:14 crc kubenswrapper[4835]: I0319 09:31:14.545272 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/79943387-08dd-44f4-9f94-3d6f38fdc2a9-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-wgqtr\" (UID: \"79943387-08dd-44f4-9f94-3d6f38fdc2a9\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-wgqtr" Mar 19 09:31:14 crc kubenswrapper[4835]: I0319 09:31:14.646798 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/79943387-08dd-44f4-9f94-3d6f38fdc2a9-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-wgqtr\" (UID: \"79943387-08dd-44f4-9f94-3d6f38fdc2a9\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-wgqtr" Mar 19 09:31:14 crc kubenswrapper[4835]: I0319 09:31:14.646952 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7zmb\" (UniqueName: \"kubernetes.io/projected/79943387-08dd-44f4-9f94-3d6f38fdc2a9-kube-api-access-p7zmb\") pod \"cluster-monitoring-operator-6d5b84845-wgqtr\" (UID: \"79943387-08dd-44f4-9f94-3d6f38fdc2a9\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-wgqtr" Mar 19 09:31:14 crc kubenswrapper[4835]: I0319 09:31:14.647021 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/79943387-08dd-44f4-9f94-3d6f38fdc2a9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-wgqtr\" (UID: \"79943387-08dd-44f4-9f94-3d6f38fdc2a9\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-wgqtr" Mar 19 09:31:14 crc 
kubenswrapper[4835]: I0319 09:31:14.648635 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/79943387-08dd-44f4-9f94-3d6f38fdc2a9-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-wgqtr\" (UID: \"79943387-08dd-44f4-9f94-3d6f38fdc2a9\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-wgqtr" Mar 19 09:31:14 crc kubenswrapper[4835]: I0319 09:31:14.662209 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/79943387-08dd-44f4-9f94-3d6f38fdc2a9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-wgqtr\" (UID: \"79943387-08dd-44f4-9f94-3d6f38fdc2a9\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-wgqtr" Mar 19 09:31:14 crc kubenswrapper[4835]: I0319 09:31:14.671399 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7zmb\" (UniqueName: \"kubernetes.io/projected/79943387-08dd-44f4-9f94-3d6f38fdc2a9-kube-api-access-p7zmb\") pod \"cluster-monitoring-operator-6d5b84845-wgqtr\" (UID: \"79943387-08dd-44f4-9f94-3d6f38fdc2a9\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-wgqtr" Mar 19 09:31:14 crc kubenswrapper[4835]: I0319 09:31:14.805699 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-wgqtr" Mar 19 09:31:15 crc kubenswrapper[4835]: I0319 09:31:15.022616 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-wgqtr"] Mar 19 09:31:15 crc kubenswrapper[4835]: I0319 09:31:15.534689 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-wgqtr" event={"ID":"79943387-08dd-44f4-9f94-3d6f38fdc2a9","Type":"ContainerStarted","Data":"0baf427a230161d091c3df90d84fb57e65632b8cc53a02088802c8cabdca78a7"} Mar 19 09:31:17 crc kubenswrapper[4835]: I0319 09:31:17.550172 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-wgqtr" event={"ID":"79943387-08dd-44f4-9f94-3d6f38fdc2a9","Type":"ContainerStarted","Data":"6eccd997f14ca240aa1274749264ef0c7c6d76e284fdd9cbf872ad730ae9d67d"} Mar 19 09:31:17 crc kubenswrapper[4835]: I0319 09:31:17.574851 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-wgqtr" podStartSLOduration=1.264726944 podStartE2EDuration="3.574819707s" podCreationTimestamp="2026-03-19 09:31:14 +0000 UTC" firstStartedPulling="2026-03-19 09:31:15.037145128 +0000 UTC m=+529.885743715" lastFinishedPulling="2026-03-19 09:31:17.347237901 +0000 UTC m=+532.195836478" observedRunningTime="2026-03-19 09:31:17.570008705 +0000 UTC m=+532.418607332" watchObservedRunningTime="2026-03-19 09:31:17.574819707 +0000 UTC m=+532.423418324" Mar 19 09:31:17 crc kubenswrapper[4835]: I0319 09:31:17.908599 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7mrmh"] Mar 19 09:31:17 crc kubenswrapper[4835]: I0319 09:31:17.910227 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7mrmh" Mar 19 09:31:17 crc kubenswrapper[4835]: I0319 09:31:17.925108 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7mrmh"] Mar 19 09:31:17 crc kubenswrapper[4835]: I0319 09:31:17.997427 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7mrmh\" (UID: \"c5e8ffa0-9ad4-4092-8b48-44edf2cb9514\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mrmh" Mar 19 09:31:17 crc kubenswrapper[4835]: I0319 09:31:17.997477 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c5e8ffa0-9ad4-4092-8b48-44edf2cb9514-registry-certificates\") pod \"image-registry-66df7c8f76-7mrmh\" (UID: \"c5e8ffa0-9ad4-4092-8b48-44edf2cb9514\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mrmh" Mar 19 09:31:17 crc kubenswrapper[4835]: I0319 09:31:17.997494 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c5e8ffa0-9ad4-4092-8b48-44edf2cb9514-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7mrmh\" (UID: \"c5e8ffa0-9ad4-4092-8b48-44edf2cb9514\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mrmh" Mar 19 09:31:17 crc kubenswrapper[4835]: I0319 09:31:17.997537 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c5e8ffa0-9ad4-4092-8b48-44edf2cb9514-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7mrmh\" (UID: \"c5e8ffa0-9ad4-4092-8b48-44edf2cb9514\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-7mrmh" Mar 19 09:31:17 crc kubenswrapper[4835]: I0319 09:31:17.997558 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5e8ffa0-9ad4-4092-8b48-44edf2cb9514-bound-sa-token\") pod \"image-registry-66df7c8f76-7mrmh\" (UID: \"c5e8ffa0-9ad4-4092-8b48-44edf2cb9514\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mrmh" Mar 19 09:31:17 crc kubenswrapper[4835]: I0319 09:31:17.997583 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n65cl\" (UniqueName: \"kubernetes.io/projected/c5e8ffa0-9ad4-4092-8b48-44edf2cb9514-kube-api-access-n65cl\") pod \"image-registry-66df7c8f76-7mrmh\" (UID: \"c5e8ffa0-9ad4-4092-8b48-44edf2cb9514\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mrmh" Mar 19 09:31:17 crc kubenswrapper[4835]: I0319 09:31:17.997616 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5e8ffa0-9ad4-4092-8b48-44edf2cb9514-trusted-ca\") pod \"image-registry-66df7c8f76-7mrmh\" (UID: \"c5e8ffa0-9ad4-4092-8b48-44edf2cb9514\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mrmh" Mar 19 09:31:17 crc kubenswrapper[4835]: I0319 09:31:17.997636 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c5e8ffa0-9ad4-4092-8b48-44edf2cb9514-registry-tls\") pod \"image-registry-66df7c8f76-7mrmh\" (UID: \"c5e8ffa0-9ad4-4092-8b48-44edf2cb9514\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mrmh" Mar 19 09:31:18 crc kubenswrapper[4835]: I0319 09:31:18.022418 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7mrmh\" (UID: \"c5e8ffa0-9ad4-4092-8b48-44edf2cb9514\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mrmh" Mar 19 09:31:18 crc kubenswrapper[4835]: I0319 09:31:18.026331 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-qcvsc"] Mar 19 09:31:18 crc kubenswrapper[4835]: I0319 09:31:18.026954 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-qcvsc" Mar 19 09:31:18 crc kubenswrapper[4835]: I0319 09:31:18.030092 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-cqs2f" Mar 19 09:31:18 crc kubenswrapper[4835]: I0319 09:31:18.030314 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Mar 19 09:31:18 crc kubenswrapper[4835]: I0319 09:31:18.038797 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-qcvsc"] Mar 19 09:31:18 crc kubenswrapper[4835]: I0319 09:31:18.099262 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/83538569-9d57-4fdb-83b2-03dbd62bad4d-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-qcvsc\" (UID: \"83538569-9d57-4fdb-83b2-03dbd62bad4d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-qcvsc" Mar 19 09:31:18 crc kubenswrapper[4835]: I0319 09:31:18.099316 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5e8ffa0-9ad4-4092-8b48-44edf2cb9514-trusted-ca\") pod 
\"image-registry-66df7c8f76-7mrmh\" (UID: \"c5e8ffa0-9ad4-4092-8b48-44edf2cb9514\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mrmh" Mar 19 09:31:18 crc kubenswrapper[4835]: I0319 09:31:18.099339 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c5e8ffa0-9ad4-4092-8b48-44edf2cb9514-registry-tls\") pod \"image-registry-66df7c8f76-7mrmh\" (UID: \"c5e8ffa0-9ad4-4092-8b48-44edf2cb9514\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mrmh" Mar 19 09:31:18 crc kubenswrapper[4835]: I0319 09:31:18.099426 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c5e8ffa0-9ad4-4092-8b48-44edf2cb9514-registry-certificates\") pod \"image-registry-66df7c8f76-7mrmh\" (UID: \"c5e8ffa0-9ad4-4092-8b48-44edf2cb9514\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mrmh" Mar 19 09:31:18 crc kubenswrapper[4835]: I0319 09:31:18.099465 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c5e8ffa0-9ad4-4092-8b48-44edf2cb9514-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7mrmh\" (UID: \"c5e8ffa0-9ad4-4092-8b48-44edf2cb9514\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mrmh" Mar 19 09:31:18 crc kubenswrapper[4835]: I0319 09:31:18.099525 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c5e8ffa0-9ad4-4092-8b48-44edf2cb9514-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7mrmh\" (UID: \"c5e8ffa0-9ad4-4092-8b48-44edf2cb9514\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mrmh" Mar 19 09:31:18 crc kubenswrapper[4835]: I0319 09:31:18.099552 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/c5e8ffa0-9ad4-4092-8b48-44edf2cb9514-bound-sa-token\") pod \"image-registry-66df7c8f76-7mrmh\" (UID: \"c5e8ffa0-9ad4-4092-8b48-44edf2cb9514\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mrmh" Mar 19 09:31:18 crc kubenswrapper[4835]: I0319 09:31:18.099582 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n65cl\" (UniqueName: \"kubernetes.io/projected/c5e8ffa0-9ad4-4092-8b48-44edf2cb9514-kube-api-access-n65cl\") pod \"image-registry-66df7c8f76-7mrmh\" (UID: \"c5e8ffa0-9ad4-4092-8b48-44edf2cb9514\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mrmh" Mar 19 09:31:18 crc kubenswrapper[4835]: I0319 09:31:18.100052 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c5e8ffa0-9ad4-4092-8b48-44edf2cb9514-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7mrmh\" (UID: \"c5e8ffa0-9ad4-4092-8b48-44edf2cb9514\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mrmh" Mar 19 09:31:18 crc kubenswrapper[4835]: I0319 09:31:18.100633 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c5e8ffa0-9ad4-4092-8b48-44edf2cb9514-registry-certificates\") pod \"image-registry-66df7c8f76-7mrmh\" (UID: \"c5e8ffa0-9ad4-4092-8b48-44edf2cb9514\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mrmh" Mar 19 09:31:18 crc kubenswrapper[4835]: I0319 09:31:18.100985 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5e8ffa0-9ad4-4092-8b48-44edf2cb9514-trusted-ca\") pod \"image-registry-66df7c8f76-7mrmh\" (UID: \"c5e8ffa0-9ad4-4092-8b48-44edf2cb9514\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mrmh" Mar 19 09:31:18 crc kubenswrapper[4835]: I0319 09:31:18.113567 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c5e8ffa0-9ad4-4092-8b48-44edf2cb9514-registry-tls\") pod \"image-registry-66df7c8f76-7mrmh\" (UID: \"c5e8ffa0-9ad4-4092-8b48-44edf2cb9514\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mrmh" Mar 19 09:31:18 crc kubenswrapper[4835]: I0319 09:31:18.113615 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c5e8ffa0-9ad4-4092-8b48-44edf2cb9514-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7mrmh\" (UID: \"c5e8ffa0-9ad4-4092-8b48-44edf2cb9514\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mrmh" Mar 19 09:31:18 crc kubenswrapper[4835]: I0319 09:31:18.116929 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5e8ffa0-9ad4-4092-8b48-44edf2cb9514-bound-sa-token\") pod \"image-registry-66df7c8f76-7mrmh\" (UID: \"c5e8ffa0-9ad4-4092-8b48-44edf2cb9514\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mrmh" Mar 19 09:31:18 crc kubenswrapper[4835]: I0319 09:31:18.117280 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n65cl\" (UniqueName: \"kubernetes.io/projected/c5e8ffa0-9ad4-4092-8b48-44edf2cb9514-kube-api-access-n65cl\") pod \"image-registry-66df7c8f76-7mrmh\" (UID: \"c5e8ffa0-9ad4-4092-8b48-44edf2cb9514\") " pod="openshift-image-registry/image-registry-66df7c8f76-7mrmh" Mar 19 09:31:18 crc kubenswrapper[4835]: I0319 09:31:18.200458 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/83538569-9d57-4fdb-83b2-03dbd62bad4d-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-qcvsc\" (UID: \"83538569-9d57-4fdb-83b2-03dbd62bad4d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-qcvsc" Mar 19 
09:31:18 crc kubenswrapper[4835]: I0319 09:31:18.204521 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/83538569-9d57-4fdb-83b2-03dbd62bad4d-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-qcvsc\" (UID: \"83538569-9d57-4fdb-83b2-03dbd62bad4d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-qcvsc" Mar 19 09:31:18 crc kubenswrapper[4835]: I0319 09:31:18.224282 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7mrmh" Mar 19 09:31:18 crc kubenswrapper[4835]: I0319 09:31:18.342271 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-qcvsc" Mar 19 09:31:18 crc kubenswrapper[4835]: I0319 09:31:18.411680 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7mrmh"] Mar 19 09:31:18 crc kubenswrapper[4835]: W0319 09:31:18.429928 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5e8ffa0_9ad4_4092_8b48_44edf2cb9514.slice/crio-99f760a2933065241970064ddc851c858ea74bfa2e776fcab36a87730495d77c WatchSource:0}: Error finding container 99f760a2933065241970064ddc851c858ea74bfa2e776fcab36a87730495d77c: Status 404 returned error can't find the container with id 99f760a2933065241970064ddc851c858ea74bfa2e776fcab36a87730495d77c Mar 19 09:31:18 crc kubenswrapper[4835]: I0319 09:31:18.556000 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7mrmh" event={"ID":"c5e8ffa0-9ad4-4092-8b48-44edf2cb9514","Type":"ContainerStarted","Data":"f02ec0b27fe257e429c001d0d55111e60a064d1aff3c042a6a661a98a2c83327"} Mar 19 09:31:18 crc kubenswrapper[4835]: I0319 09:31:18.556035 4835 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7mrmh" event={"ID":"c5e8ffa0-9ad4-4092-8b48-44edf2cb9514","Type":"ContainerStarted","Data":"99f760a2933065241970064ddc851c858ea74bfa2e776fcab36a87730495d77c"} Mar 19 09:31:19 crc kubenswrapper[4835]: I0319 09:31:19.139869 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-7mrmh" podStartSLOduration=2.139848311 podStartE2EDuration="2.139848311s" podCreationTimestamp="2026-03-19 09:31:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:31:18.575061521 +0000 UTC m=+533.423660108" watchObservedRunningTime="2026-03-19 09:31:19.139848311 +0000 UTC m=+533.988446898" Mar 19 09:31:19 crc kubenswrapper[4835]: I0319 09:31:19.142230 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-qcvsc"] Mar 19 09:31:19 crc kubenswrapper[4835]: I0319 09:31:19.564443 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-qcvsc" event={"ID":"83538569-9d57-4fdb-83b2-03dbd62bad4d","Type":"ContainerStarted","Data":"e0ed5d692f213edf831fa531f5c6ae5ddfa87a30a4c8aef9192632d60ebe292b"} Mar 19 09:31:19 crc kubenswrapper[4835]: I0319 09:31:19.564901 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-7mrmh" Mar 19 09:31:21 crc kubenswrapper[4835]: I0319 09:31:21.579989 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-qcvsc" event={"ID":"83538569-9d57-4fdb-83b2-03dbd62bad4d","Type":"ContainerStarted","Data":"7e03f0a09c00c5c8d9a293fa8c6b0792bbec0d6bb5ee0df9df55edb3ec928ad3"} Mar 19 09:31:21 crc kubenswrapper[4835]: I0319 09:31:21.580444 4835 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-qcvsc" Mar 19 09:31:21 crc kubenswrapper[4835]: I0319 09:31:21.589436 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-qcvsc" Mar 19 09:31:21 crc kubenswrapper[4835]: I0319 09:31:21.609863 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-qcvsc" podStartSLOduration=2.001245097 podStartE2EDuration="3.609836691s" podCreationTimestamp="2026-03-19 09:31:18 +0000 UTC" firstStartedPulling="2026-03-19 09:31:19.146975857 +0000 UTC m=+533.995574444" lastFinishedPulling="2026-03-19 09:31:20.755567421 +0000 UTC m=+535.604166038" observedRunningTime="2026-03-19 09:31:21.605926601 +0000 UTC m=+536.454525228" watchObservedRunningTime="2026-03-19 09:31:21.609836691 +0000 UTC m=+536.458435318" Mar 19 09:31:22 crc kubenswrapper[4835]: I0319 09:31:22.084882 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-pk795"] Mar 19 09:31:22 crc kubenswrapper[4835]: I0319 09:31:22.086436 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-pk795" Mar 19 09:31:22 crc kubenswrapper[4835]: I0319 09:31:22.091344 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-rzrs4" Mar 19 09:31:22 crc kubenswrapper[4835]: I0319 09:31:22.091690 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Mar 19 09:31:22 crc kubenswrapper[4835]: I0319 09:31:22.094954 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Mar 19 09:31:22 crc kubenswrapper[4835]: I0319 09:31:22.095044 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Mar 19 09:31:22 crc kubenswrapper[4835]: I0319 09:31:22.111811 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-pk795"] Mar 19 09:31:22 crc kubenswrapper[4835]: I0319 09:31:22.153629 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tznm\" (UniqueName: \"kubernetes.io/projected/577ba078-ecd6-48f2-9021-9d6efff41017-kube-api-access-4tznm\") pod \"prometheus-operator-db54df47d-pk795\" (UID: \"577ba078-ecd6-48f2-9021-9d6efff41017\") " pod="openshift-monitoring/prometheus-operator-db54df47d-pk795" Mar 19 09:31:22 crc kubenswrapper[4835]: I0319 09:31:22.153695 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/577ba078-ecd6-48f2-9021-9d6efff41017-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-pk795\" (UID: \"577ba078-ecd6-48f2-9021-9d6efff41017\") " pod="openshift-monitoring/prometheus-operator-db54df47d-pk795" Mar 19 09:31:22 crc kubenswrapper[4835]: I0319 09:31:22.153848 4835 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/577ba078-ecd6-48f2-9021-9d6efff41017-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-pk795\" (UID: \"577ba078-ecd6-48f2-9021-9d6efff41017\") " pod="openshift-monitoring/prometheus-operator-db54df47d-pk795" Mar 19 09:31:22 crc kubenswrapper[4835]: I0319 09:31:22.153961 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/577ba078-ecd6-48f2-9021-9d6efff41017-metrics-client-ca\") pod \"prometheus-operator-db54df47d-pk795\" (UID: \"577ba078-ecd6-48f2-9021-9d6efff41017\") " pod="openshift-monitoring/prometheus-operator-db54df47d-pk795" Mar 19 09:31:22 crc kubenswrapper[4835]: I0319 09:31:22.254966 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/577ba078-ecd6-48f2-9021-9d6efff41017-metrics-client-ca\") pod \"prometheus-operator-db54df47d-pk795\" (UID: \"577ba078-ecd6-48f2-9021-9d6efff41017\") " pod="openshift-monitoring/prometheus-operator-db54df47d-pk795" Mar 19 09:31:22 crc kubenswrapper[4835]: I0319 09:31:22.255086 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tznm\" (UniqueName: \"kubernetes.io/projected/577ba078-ecd6-48f2-9021-9d6efff41017-kube-api-access-4tznm\") pod \"prometheus-operator-db54df47d-pk795\" (UID: \"577ba078-ecd6-48f2-9021-9d6efff41017\") " pod="openshift-monitoring/prometheus-operator-db54df47d-pk795" Mar 19 09:31:22 crc kubenswrapper[4835]: I0319 09:31:22.255114 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/577ba078-ecd6-48f2-9021-9d6efff41017-prometheus-operator-tls\") pod 
\"prometheus-operator-db54df47d-pk795\" (UID: \"577ba078-ecd6-48f2-9021-9d6efff41017\") " pod="openshift-monitoring/prometheus-operator-db54df47d-pk795" Mar 19 09:31:22 crc kubenswrapper[4835]: I0319 09:31:22.255140 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/577ba078-ecd6-48f2-9021-9d6efff41017-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-pk795\" (UID: \"577ba078-ecd6-48f2-9021-9d6efff41017\") " pod="openshift-monitoring/prometheus-operator-db54df47d-pk795" Mar 19 09:31:22 crc kubenswrapper[4835]: I0319 09:31:22.255994 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/577ba078-ecd6-48f2-9021-9d6efff41017-metrics-client-ca\") pod \"prometheus-operator-db54df47d-pk795\" (UID: \"577ba078-ecd6-48f2-9021-9d6efff41017\") " pod="openshift-monitoring/prometheus-operator-db54df47d-pk795" Mar 19 09:31:22 crc kubenswrapper[4835]: I0319 09:31:22.260209 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/577ba078-ecd6-48f2-9021-9d6efff41017-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-pk795\" (UID: \"577ba078-ecd6-48f2-9021-9d6efff41017\") " pod="openshift-monitoring/prometheus-operator-db54df47d-pk795" Mar 19 09:31:22 crc kubenswrapper[4835]: I0319 09:31:22.260209 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/577ba078-ecd6-48f2-9021-9d6efff41017-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-pk795\" (UID: \"577ba078-ecd6-48f2-9021-9d6efff41017\") " pod="openshift-monitoring/prometheus-operator-db54df47d-pk795" Mar 19 09:31:22 crc kubenswrapper[4835]: I0319 09:31:22.274650 4835 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tznm\" (UniqueName: \"kubernetes.io/projected/577ba078-ecd6-48f2-9021-9d6efff41017-kube-api-access-4tznm\") pod \"prometheus-operator-db54df47d-pk795\" (UID: \"577ba078-ecd6-48f2-9021-9d6efff41017\") " pod="openshift-monitoring/prometheus-operator-db54df47d-pk795" Mar 19 09:31:22 crc kubenswrapper[4835]: I0319 09:31:22.409816 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-pk795" Mar 19 09:31:22 crc kubenswrapper[4835]: I0319 09:31:22.607856 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-pk795"] Mar 19 09:31:23 crc kubenswrapper[4835]: I0319 09:31:23.596635 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-pk795" event={"ID":"577ba078-ecd6-48f2-9021-9d6efff41017","Type":"ContainerStarted","Data":"53109f203d4b19437f264531867e3195d9c5c6038ca409f7c025894a1da3fc3e"} Mar 19 09:31:25 crc kubenswrapper[4835]: I0319 09:31:25.610672 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-pk795" event={"ID":"577ba078-ecd6-48f2-9021-9d6efff41017","Type":"ContainerStarted","Data":"a3f337156ff84621a6af4fc3d008a4889df98fdc89037cab21799f6e3b00629b"} Mar 19 09:31:25 crc kubenswrapper[4835]: I0319 09:31:25.611055 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-pk795" event={"ID":"577ba078-ecd6-48f2-9021-9d6efff41017","Type":"ContainerStarted","Data":"cf85356de487cabd530eb971d17daa69fb44fab460d313bcc4b7e368dfd2e2fa"} Mar 19 09:31:25 crc kubenswrapper[4835]: I0319 09:31:25.638697 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-db54df47d-pk795" podStartSLOduration=1.380854239 
podStartE2EDuration="3.638680732s" podCreationTimestamp="2026-03-19 09:31:22 +0000 UTC" firstStartedPulling="2026-03-19 09:31:22.615875269 +0000 UTC m=+537.464473856" lastFinishedPulling="2026-03-19 09:31:24.873701762 +0000 UTC m=+539.722300349" observedRunningTime="2026-03-19 09:31:25.633921511 +0000 UTC m=+540.482520098" watchObservedRunningTime="2026-03-19 09:31:25.638680732 +0000 UTC m=+540.487279319" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.413430 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-jh6lb"] Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.415846 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-jh6lb" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.418144 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-zmpsk" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.419193 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.419337 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.424267 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-xj4pb"] Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.425495 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xj4pb" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.427442 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.427614 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.427670 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-qhm76" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.429416 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.434472 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-7tcl7"] Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.436640 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-7tcl7" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.438906 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.439349 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.443824 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-x8rd2" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.458634 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-xj4pb"] Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.510431 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-jh6lb"] Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.527324 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/789ba6b6-298f-401b-869d-1b3e914d471c-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-jh6lb\" (UID: \"789ba6b6-298f-401b-869d-1b3e914d471c\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-jh6lb" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.527425 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/88874678-630f-48a5-82b2-1f5a0b0bbc15-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-xj4pb\" (UID: \"88874678-630f-48a5-82b2-1f5a0b0bbc15\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xj4pb" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.527464 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/789ba6b6-298f-401b-869d-1b3e914d471c-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-jh6lb\" (UID: \"789ba6b6-298f-401b-869d-1b3e914d471c\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-jh6lb" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.527491 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/88874678-630f-48a5-82b2-1f5a0b0bbc15-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-xj4pb\" (UID: \"88874678-630f-48a5-82b2-1f5a0b0bbc15\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xj4pb" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.527533 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/789ba6b6-298f-401b-869d-1b3e914d471c-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-jh6lb\" (UID: \"789ba6b6-298f-401b-869d-1b3e914d471c\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-jh6lb" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.527574 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/88874678-630f-48a5-82b2-1f5a0b0bbc15-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-xj4pb\" (UID: \"88874678-630f-48a5-82b2-1f5a0b0bbc15\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xj4pb" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.527608 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/88874678-630f-48a5-82b2-1f5a0b0bbc15-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-xj4pb\" (UID: \"88874678-630f-48a5-82b2-1f5a0b0bbc15\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xj4pb" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.527630 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mxcr\" (UniqueName: \"kubernetes.io/projected/789ba6b6-298f-401b-869d-1b3e914d471c-kube-api-access-9mxcr\") pod \"openshift-state-metrics-566fddb674-jh6lb\" (UID: \"789ba6b6-298f-401b-869d-1b3e914d471c\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-jh6lb" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.527670 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/88874678-630f-48a5-82b2-1f5a0b0bbc15-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-xj4pb\" (UID: \"88874678-630f-48a5-82b2-1f5a0b0bbc15\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xj4pb" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.527710 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9p9k\" (UniqueName: \"kubernetes.io/projected/88874678-630f-48a5-82b2-1f5a0b0bbc15-kube-api-access-k9p9k\") pod \"kube-state-metrics-777cb5bd5d-xj4pb\" (UID: \"88874678-630f-48a5-82b2-1f5a0b0bbc15\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xj4pb" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.629440 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/88874678-630f-48a5-82b2-1f5a0b0bbc15-kube-state-metrics-tls\") pod 
\"kube-state-metrics-777cb5bd5d-xj4pb\" (UID: \"88874678-630f-48a5-82b2-1f5a0b0bbc15\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xj4pb" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.629682 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/789ba6b6-298f-401b-869d-1b3e914d471c-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-jh6lb\" (UID: \"789ba6b6-298f-401b-869d-1b3e914d471c\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-jh6lb" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.629803 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/88874678-630f-48a5-82b2-1f5a0b0bbc15-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-xj4pb\" (UID: \"88874678-630f-48a5-82b2-1f5a0b0bbc15\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xj4pb" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.629897 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7b29a5b9-32ec-450d-80c4-25f8a2e50eff-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7tcl7\" (UID: \"7b29a5b9-32ec-450d-80c4-25f8a2e50eff\") " pod="openshift-monitoring/node-exporter-7tcl7" Mar 19 09:31:27 crc kubenswrapper[4835]: E0319 09:31:27.629618 4835 secret.go:188] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Mar 19 09:31:27 crc kubenswrapper[4835]: E0319 09:31:27.630056 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88874678-630f-48a5-82b2-1f5a0b0bbc15-kube-state-metrics-tls podName:88874678-630f-48a5-82b2-1f5a0b0bbc15 nodeName:}" failed. 
No retries permitted until 2026-03-19 09:31:28.130033569 +0000 UTC m=+542.978632256 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/88874678-630f-48a5-82b2-1f5a0b0bbc15-kube-state-metrics-tls") pod "kube-state-metrics-777cb5bd5d-xj4pb" (UID: "88874678-630f-48a5-82b2-1f5a0b0bbc15") : secret "kube-state-metrics-tls" not found Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.629981 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/789ba6b6-298f-401b-869d-1b3e914d471c-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-jh6lb\" (UID: \"789ba6b6-298f-401b-869d-1b3e914d471c\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-jh6lb" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.630178 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7b29a5b9-32ec-450d-80c4-25f8a2e50eff-node-exporter-wtmp\") pod \"node-exporter-7tcl7\" (UID: \"7b29a5b9-32ec-450d-80c4-25f8a2e50eff\") " pod="openshift-monitoring/node-exporter-7tcl7" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.630268 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/88874678-630f-48a5-82b2-1f5a0b0bbc15-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-xj4pb\" (UID: \"88874678-630f-48a5-82b2-1f5a0b0bbc15\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xj4pb" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.630307 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/7b29a5b9-32ec-450d-80c4-25f8a2e50eff-sys\") pod \"node-exporter-7tcl7\" (UID: \"7b29a5b9-32ec-450d-80c4-25f8a2e50eff\") " pod="openshift-monitoring/node-exporter-7tcl7" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.630327 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7b29a5b9-32ec-450d-80c4-25f8a2e50eff-metrics-client-ca\") pod \"node-exporter-7tcl7\" (UID: \"7b29a5b9-32ec-450d-80c4-25f8a2e50eff\") " pod="openshift-monitoring/node-exporter-7tcl7" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.630369 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/88874678-630f-48a5-82b2-1f5a0b0bbc15-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-xj4pb\" (UID: \"88874678-630f-48a5-82b2-1f5a0b0bbc15\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xj4pb" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.630386 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mxcr\" (UniqueName: \"kubernetes.io/projected/789ba6b6-298f-401b-869d-1b3e914d471c-kube-api-access-9mxcr\") pod \"openshift-state-metrics-566fddb674-jh6lb\" (UID: \"789ba6b6-298f-401b-869d-1b3e914d471c\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-jh6lb" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.630440 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/88874678-630f-48a5-82b2-1f5a0b0bbc15-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-xj4pb\" (UID: \"88874678-630f-48a5-82b2-1f5a0b0bbc15\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xj4pb" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.630494 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-k9p9k\" (UniqueName: \"kubernetes.io/projected/88874678-630f-48a5-82b2-1f5a0b0bbc15-kube-api-access-k9p9k\") pod \"kube-state-metrics-777cb5bd5d-xj4pb\" (UID: \"88874678-630f-48a5-82b2-1f5a0b0bbc15\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xj4pb" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.630520 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/789ba6b6-298f-401b-869d-1b3e914d471c-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-jh6lb\" (UID: \"789ba6b6-298f-401b-869d-1b3e914d471c\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-jh6lb" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.630544 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7b29a5b9-32ec-450d-80c4-25f8a2e50eff-node-exporter-tls\") pod \"node-exporter-7tcl7\" (UID: \"7b29a5b9-32ec-450d-80c4-25f8a2e50eff\") " pod="openshift-monitoring/node-exporter-7tcl7" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.630562 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7b29a5b9-32ec-450d-80c4-25f8a2e50eff-node-exporter-textfile\") pod \"node-exporter-7tcl7\" (UID: \"7b29a5b9-32ec-450d-80c4-25f8a2e50eff\") " pod="openshift-monitoring/node-exporter-7tcl7" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.630589 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqhwb\" (UniqueName: \"kubernetes.io/projected/7b29a5b9-32ec-450d-80c4-25f8a2e50eff-kube-api-access-cqhwb\") pod \"node-exporter-7tcl7\" (UID: \"7b29a5b9-32ec-450d-80c4-25f8a2e50eff\") " pod="openshift-monitoring/node-exporter-7tcl7" 
Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.630616 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7b29a5b9-32ec-450d-80c4-25f8a2e50eff-root\") pod \"node-exporter-7tcl7\" (UID: \"7b29a5b9-32ec-450d-80c4-25f8a2e50eff\") " pod="openshift-monitoring/node-exporter-7tcl7" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.631201 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/88874678-630f-48a5-82b2-1f5a0b0bbc15-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-xj4pb\" (UID: \"88874678-630f-48a5-82b2-1f5a0b0bbc15\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xj4pb" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.631205 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/88874678-630f-48a5-82b2-1f5a0b0bbc15-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-xj4pb\" (UID: \"88874678-630f-48a5-82b2-1f5a0b0bbc15\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xj4pb" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.631420 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/88874678-630f-48a5-82b2-1f5a0b0bbc15-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-xj4pb\" (UID: \"88874678-630f-48a5-82b2-1f5a0b0bbc15\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xj4pb" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.632159 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/789ba6b6-298f-401b-869d-1b3e914d471c-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-jh6lb\" 
(UID: \"789ba6b6-298f-401b-869d-1b3e914d471c\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-jh6lb" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.636472 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/88874678-630f-48a5-82b2-1f5a0b0bbc15-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-xj4pb\" (UID: \"88874678-630f-48a5-82b2-1f5a0b0bbc15\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xj4pb" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.636510 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/789ba6b6-298f-401b-869d-1b3e914d471c-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-jh6lb\" (UID: \"789ba6b6-298f-401b-869d-1b3e914d471c\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-jh6lb" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.637227 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/789ba6b6-298f-401b-869d-1b3e914d471c-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-jh6lb\" (UID: \"789ba6b6-298f-401b-869d-1b3e914d471c\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-jh6lb" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.655221 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mxcr\" (UniqueName: \"kubernetes.io/projected/789ba6b6-298f-401b-869d-1b3e914d471c-kube-api-access-9mxcr\") pod \"openshift-state-metrics-566fddb674-jh6lb\" (UID: \"789ba6b6-298f-401b-869d-1b3e914d471c\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-jh6lb" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.666243 4835 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9p9k\" (UniqueName: \"kubernetes.io/projected/88874678-630f-48a5-82b2-1f5a0b0bbc15-kube-api-access-k9p9k\") pod \"kube-state-metrics-777cb5bd5d-xj4pb\" (UID: \"88874678-630f-48a5-82b2-1f5a0b0bbc15\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xj4pb" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.732254 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7b29a5b9-32ec-450d-80c4-25f8a2e50eff-node-exporter-tls\") pod \"node-exporter-7tcl7\" (UID: \"7b29a5b9-32ec-450d-80c4-25f8a2e50eff\") " pod="openshift-monitoring/node-exporter-7tcl7" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.732292 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7b29a5b9-32ec-450d-80c4-25f8a2e50eff-node-exporter-textfile\") pod \"node-exporter-7tcl7\" (UID: \"7b29a5b9-32ec-450d-80c4-25f8a2e50eff\") " pod="openshift-monitoring/node-exporter-7tcl7" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.732311 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqhwb\" (UniqueName: \"kubernetes.io/projected/7b29a5b9-32ec-450d-80c4-25f8a2e50eff-kube-api-access-cqhwb\") pod \"node-exporter-7tcl7\" (UID: \"7b29a5b9-32ec-450d-80c4-25f8a2e50eff\") " pod="openshift-monitoring/node-exporter-7tcl7" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.732336 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7b29a5b9-32ec-450d-80c4-25f8a2e50eff-root\") pod \"node-exporter-7tcl7\" (UID: \"7b29a5b9-32ec-450d-80c4-25f8a2e50eff\") " pod="openshift-monitoring/node-exporter-7tcl7" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.732410 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7b29a5b9-32ec-450d-80c4-25f8a2e50eff-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7tcl7\" (UID: \"7b29a5b9-32ec-450d-80c4-25f8a2e50eff\") " pod="openshift-monitoring/node-exporter-7tcl7" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.732430 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7b29a5b9-32ec-450d-80c4-25f8a2e50eff-node-exporter-wtmp\") pod \"node-exporter-7tcl7\" (UID: \"7b29a5b9-32ec-450d-80c4-25f8a2e50eff\") " pod="openshift-monitoring/node-exporter-7tcl7" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.732458 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7b29a5b9-32ec-450d-80c4-25f8a2e50eff-sys\") pod \"node-exporter-7tcl7\" (UID: \"7b29a5b9-32ec-450d-80c4-25f8a2e50eff\") " pod="openshift-monitoring/node-exporter-7tcl7" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.732474 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7b29a5b9-32ec-450d-80c4-25f8a2e50eff-metrics-client-ca\") pod \"node-exporter-7tcl7\" (UID: \"7b29a5b9-32ec-450d-80c4-25f8a2e50eff\") " pod="openshift-monitoring/node-exporter-7tcl7" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.732864 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7b29a5b9-32ec-450d-80c4-25f8a2e50eff-root\") pod \"node-exporter-7tcl7\" (UID: \"7b29a5b9-32ec-450d-80c4-25f8a2e50eff\") " pod="openshift-monitoring/node-exporter-7tcl7" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.733159 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7b29a5b9-32ec-450d-80c4-25f8a2e50eff-metrics-client-ca\") pod \"node-exporter-7tcl7\" (UID: \"7b29a5b9-32ec-450d-80c4-25f8a2e50eff\") " pod="openshift-monitoring/node-exporter-7tcl7" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.733185 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7b29a5b9-32ec-450d-80c4-25f8a2e50eff-node-exporter-textfile\") pod \"node-exporter-7tcl7\" (UID: \"7b29a5b9-32ec-450d-80c4-25f8a2e50eff\") " pod="openshift-monitoring/node-exporter-7tcl7" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.733548 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7b29a5b9-32ec-450d-80c4-25f8a2e50eff-node-exporter-wtmp\") pod \"node-exporter-7tcl7\" (UID: \"7b29a5b9-32ec-450d-80c4-25f8a2e50eff\") " pod="openshift-monitoring/node-exporter-7tcl7" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.733579 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7b29a5b9-32ec-450d-80c4-25f8a2e50eff-sys\") pod \"node-exporter-7tcl7\" (UID: \"7b29a5b9-32ec-450d-80c4-25f8a2e50eff\") " pod="openshift-monitoring/node-exporter-7tcl7" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.735383 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7b29a5b9-32ec-450d-80c4-25f8a2e50eff-node-exporter-tls\") pod \"node-exporter-7tcl7\" (UID: \"7b29a5b9-32ec-450d-80c4-25f8a2e50eff\") " pod="openshift-monitoring/node-exporter-7tcl7" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.735722 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7b29a5b9-32ec-450d-80c4-25f8a2e50eff-node-exporter-kube-rbac-proxy-config\") 
pod \"node-exporter-7tcl7\" (UID: \"7b29a5b9-32ec-450d-80c4-25f8a2e50eff\") " pod="openshift-monitoring/node-exporter-7tcl7" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.739870 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-jh6lb" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.747814 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqhwb\" (UniqueName: \"kubernetes.io/projected/7b29a5b9-32ec-450d-80c4-25f8a2e50eff-kube-api-access-cqhwb\") pod \"node-exporter-7tcl7\" (UID: \"7b29a5b9-32ec-450d-80c4-25f8a2e50eff\") " pod="openshift-monitoring/node-exporter-7tcl7" Mar 19 09:31:27 crc kubenswrapper[4835]: I0319 09:31:27.776367 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-7tcl7" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.138781 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/88874678-630f-48a5-82b2-1f5a0b0bbc15-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-xj4pb\" (UID: \"88874678-630f-48a5-82b2-1f5a0b0bbc15\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xj4pb" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.144505 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/88874678-630f-48a5-82b2-1f5a0b0bbc15-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-xj4pb\" (UID: \"88874678-630f-48a5-82b2-1f5a0b0bbc15\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xj4pb" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.215809 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-jh6lb"] Mar 19 09:31:28 crc kubenswrapper[4835]: W0319 
09:31:28.222640 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod789ba6b6_298f_401b_869d_1b3e914d471c.slice/crio-a595539ceaae015673e29d68abd681c630387972ca4d655c46f39f5417be7304 WatchSource:0}: Error finding container a595539ceaae015673e29d68abd681c630387972ca4d655c46f39f5417be7304: Status 404 returned error can't find the container with id a595539ceaae015673e29d68abd681c630387972ca4d655c46f39f5417be7304 Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.359759 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xj4pb" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.542912 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.545346 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.549554 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.549627 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.549853 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-8jm2g" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.549926 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.550063 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Mar 19 09:31:28 crc 
kubenswrapper[4835]: I0319 09:31:28.550069 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.550191 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.552966 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.558350 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.569606 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.624182 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-xj4pb"] Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.629945 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-jh6lb" event={"ID":"789ba6b6-298f-401b-869d-1b3e914d471c","Type":"ContainerStarted","Data":"897371f5f73c1a6e83ad18a2e68b827c82aa6709f0cbf130308701d6924442c3"} Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.629991 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-jh6lb" event={"ID":"789ba6b6-298f-401b-869d-1b3e914d471c","Type":"ContainerStarted","Data":"8f423cebab5fa3e9611b14644842593481a2a22381e4ddd8fe5b2d4dd15880ac"} Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.630005 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-jh6lb" 
event={"ID":"789ba6b6-298f-401b-869d-1b3e914d471c","Type":"ContainerStarted","Data":"a595539ceaae015673e29d68abd681c630387972ca4d655c46f39f5417be7304"} Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.644624 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kscvm\" (UniqueName: \"kubernetes.io/projected/68d4bc50-49ba-4060-88ef-01257302101f-kube-api-access-kscvm\") pod \"alertmanager-main-0\" (UID: \"68d4bc50-49ba-4060-88ef-01257302101f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.644674 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/68d4bc50-49ba-4060-88ef-01257302101f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"68d4bc50-49ba-4060-88ef-01257302101f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.644705 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/68d4bc50-49ba-4060-88ef-01257302101f-config-volume\") pod \"alertmanager-main-0\" (UID: \"68d4bc50-49ba-4060-88ef-01257302101f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: W0319 09:31:28.644940 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88874678_630f_48a5_82b2_1f5a0b0bbc15.slice/crio-1f378c391a316136443772f119a86a89e210f12e1fdd5269206ddd09dac5f324 WatchSource:0}: Error finding container 1f378c391a316136443772f119a86a89e210f12e1fdd5269206ddd09dac5f324: Status 404 returned error can't find the container with id 1f378c391a316136443772f119a86a89e210f12e1fdd5269206ddd09dac5f324 Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 
09:31:28.645304 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/68d4bc50-49ba-4060-88ef-01257302101f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"68d4bc50-49ba-4060-88ef-01257302101f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.645473 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/68d4bc50-49ba-4060-88ef-01257302101f-config-out\") pod \"alertmanager-main-0\" (UID: \"68d4bc50-49ba-4060-88ef-01257302101f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.645511 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/68d4bc50-49ba-4060-88ef-01257302101f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"68d4bc50-49ba-4060-88ef-01257302101f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.645534 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68d4bc50-49ba-4060-88ef-01257302101f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"68d4bc50-49ba-4060-88ef-01257302101f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.645552 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/68d4bc50-49ba-4060-88ef-01257302101f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"68d4bc50-49ba-4060-88ef-01257302101f\") " 
pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.645604 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/68d4bc50-49ba-4060-88ef-01257302101f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"68d4bc50-49ba-4060-88ef-01257302101f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.645629 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/68d4bc50-49ba-4060-88ef-01257302101f-web-config\") pod \"alertmanager-main-0\" (UID: \"68d4bc50-49ba-4060-88ef-01257302101f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.645654 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/68d4bc50-49ba-4060-88ef-01257302101f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"68d4bc50-49ba-4060-88ef-01257302101f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.645675 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/68d4bc50-49ba-4060-88ef-01257302101f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"68d4bc50-49ba-4060-88ef-01257302101f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.656191 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7tcl7" 
event={"ID":"7b29a5b9-32ec-450d-80c4-25f8a2e50eff","Type":"ContainerStarted","Data":"366563cc4c52f350518e685c9fbdb5ada3f5eb427907bafbe916f6ea72dbda24"} Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.747466 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/68d4bc50-49ba-4060-88ef-01257302101f-config-out\") pod \"alertmanager-main-0\" (UID: \"68d4bc50-49ba-4060-88ef-01257302101f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.747537 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/68d4bc50-49ba-4060-88ef-01257302101f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"68d4bc50-49ba-4060-88ef-01257302101f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.747562 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68d4bc50-49ba-4060-88ef-01257302101f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"68d4bc50-49ba-4060-88ef-01257302101f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.747582 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/68d4bc50-49ba-4060-88ef-01257302101f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"68d4bc50-49ba-4060-88ef-01257302101f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.747614 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/68d4bc50-49ba-4060-88ef-01257302101f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"68d4bc50-49ba-4060-88ef-01257302101f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.747633 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/68d4bc50-49ba-4060-88ef-01257302101f-web-config\") pod \"alertmanager-main-0\" (UID: \"68d4bc50-49ba-4060-88ef-01257302101f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.747650 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/68d4bc50-49ba-4060-88ef-01257302101f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"68d4bc50-49ba-4060-88ef-01257302101f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.747669 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/68d4bc50-49ba-4060-88ef-01257302101f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"68d4bc50-49ba-4060-88ef-01257302101f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.747708 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kscvm\" (UniqueName: \"kubernetes.io/projected/68d4bc50-49ba-4060-88ef-01257302101f-kube-api-access-kscvm\") pod \"alertmanager-main-0\" (UID: \"68d4bc50-49ba-4060-88ef-01257302101f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.747734 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" 
(UniqueName: \"kubernetes.io/secret/68d4bc50-49ba-4060-88ef-01257302101f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"68d4bc50-49ba-4060-88ef-01257302101f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.747773 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/68d4bc50-49ba-4060-88ef-01257302101f-config-volume\") pod \"alertmanager-main-0\" (UID: \"68d4bc50-49ba-4060-88ef-01257302101f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.747805 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/68d4bc50-49ba-4060-88ef-01257302101f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"68d4bc50-49ba-4060-88ef-01257302101f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.748327 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/68d4bc50-49ba-4060-88ef-01257302101f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"68d4bc50-49ba-4060-88ef-01257302101f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.749166 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/68d4bc50-49ba-4060-88ef-01257302101f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"68d4bc50-49ba-4060-88ef-01257302101f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.749357 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/68d4bc50-49ba-4060-88ef-01257302101f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"68d4bc50-49ba-4060-88ef-01257302101f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.752150 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/68d4bc50-49ba-4060-88ef-01257302101f-config-volume\") pod \"alertmanager-main-0\" (UID: \"68d4bc50-49ba-4060-88ef-01257302101f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.752946 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/68d4bc50-49ba-4060-88ef-01257302101f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"68d4bc50-49ba-4060-88ef-01257302101f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.756688 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/68d4bc50-49ba-4060-88ef-01257302101f-web-config\") pod \"alertmanager-main-0\" (UID: \"68d4bc50-49ba-4060-88ef-01257302101f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.756968 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/68d4bc50-49ba-4060-88ef-01257302101f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"68d4bc50-49ba-4060-88ef-01257302101f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.757447 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/68d4bc50-49ba-4060-88ef-01257302101f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"68d4bc50-49ba-4060-88ef-01257302101f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.762894 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kscvm\" (UniqueName: \"kubernetes.io/projected/68d4bc50-49ba-4060-88ef-01257302101f-kube-api-access-kscvm\") pod \"alertmanager-main-0\" (UID: \"68d4bc50-49ba-4060-88ef-01257302101f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.769117 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/68d4bc50-49ba-4060-88ef-01257302101f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"68d4bc50-49ba-4060-88ef-01257302101f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.769920 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/68d4bc50-49ba-4060-88ef-01257302101f-config-out\") pod \"alertmanager-main-0\" (UID: \"68d4bc50-49ba-4060-88ef-01257302101f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.775975 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/68d4bc50-49ba-4060-88ef-01257302101f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"68d4bc50-49ba-4060-88ef-01257302101f\") " pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:28 crc kubenswrapper[4835]: I0319 09:31:28.869083 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.235234 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 19 09:31:29 crc kubenswrapper[4835]: W0319 09:31:29.237865 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68d4bc50_49ba_4060_88ef_01257302101f.slice/crio-542d40a9ed84c196a8de93bed35598f1ee0f4b49cdfb55a9f791ae02efe8a20c WatchSource:0}: Error finding container 542d40a9ed84c196a8de93bed35598f1ee0f4b49cdfb55a9f791ae02efe8a20c: Status 404 returned error can't find the container with id 542d40a9ed84c196a8de93bed35598f1ee0f4b49cdfb55a9f791ae02efe8a20c Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.397657 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5b688b744b-2d86m"] Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.399183 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.401493 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.401666 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.401790 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.401990 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.402113 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-71kq2rblmqlds" Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.402385 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-pm4tf" Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.402575 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.412347 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5b688b744b-2d86m"] Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.460074 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/c02bf820-ae15-4787-964c-de647e0e6ebd-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5b688b744b-2d86m\" (UID: \"c02bf820-ae15-4787-964c-de647e0e6ebd\") " 
pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.460153 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlddt\" (UniqueName: \"kubernetes.io/projected/c02bf820-ae15-4787-964c-de647e0e6ebd-kube-api-access-zlddt\") pod \"thanos-querier-5b688b744b-2d86m\" (UID: \"c02bf820-ae15-4787-964c-de647e0e6ebd\") " pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.460252 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c02bf820-ae15-4787-964c-de647e0e6ebd-metrics-client-ca\") pod \"thanos-querier-5b688b744b-2d86m\" (UID: \"c02bf820-ae15-4787-964c-de647e0e6ebd\") " pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.460312 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c02bf820-ae15-4787-964c-de647e0e6ebd-secret-grpc-tls\") pod \"thanos-querier-5b688b744b-2d86m\" (UID: \"c02bf820-ae15-4787-964c-de647e0e6ebd\") " pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.460341 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c02bf820-ae15-4787-964c-de647e0e6ebd-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5b688b744b-2d86m\" (UID: \"c02bf820-ae15-4787-964c-de647e0e6ebd\") " pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.460398 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/c02bf820-ae15-4787-964c-de647e0e6ebd-secret-thanos-querier-tls\") pod \"thanos-querier-5b688b744b-2d86m\" (UID: \"c02bf820-ae15-4787-964c-de647e0e6ebd\") " pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.460415 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/c02bf820-ae15-4787-964c-de647e0e6ebd-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5b688b744b-2d86m\" (UID: \"c02bf820-ae15-4787-964c-de647e0e6ebd\") " pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.460540 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c02bf820-ae15-4787-964c-de647e0e6ebd-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5b688b744b-2d86m\" (UID: \"c02bf820-ae15-4787-964c-de647e0e6ebd\") " pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.562102 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/c02bf820-ae15-4787-964c-de647e0e6ebd-secret-thanos-querier-tls\") pod \"thanos-querier-5b688b744b-2d86m\" (UID: \"c02bf820-ae15-4787-964c-de647e0e6ebd\") " pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.562157 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/c02bf820-ae15-4787-964c-de647e0e6ebd-secret-thanos-querier-kube-rbac-proxy-metrics\") pod 
\"thanos-querier-5b688b744b-2d86m\" (UID: \"c02bf820-ae15-4787-964c-de647e0e6ebd\") " pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.562222 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c02bf820-ae15-4787-964c-de647e0e6ebd-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5b688b744b-2d86m\" (UID: \"c02bf820-ae15-4787-964c-de647e0e6ebd\") " pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.562256 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/c02bf820-ae15-4787-964c-de647e0e6ebd-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5b688b744b-2d86m\" (UID: \"c02bf820-ae15-4787-964c-de647e0e6ebd\") " pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.562300 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlddt\" (UniqueName: \"kubernetes.io/projected/c02bf820-ae15-4787-964c-de647e0e6ebd-kube-api-access-zlddt\") pod \"thanos-querier-5b688b744b-2d86m\" (UID: \"c02bf820-ae15-4787-964c-de647e0e6ebd\") " pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.562323 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c02bf820-ae15-4787-964c-de647e0e6ebd-metrics-client-ca\") pod \"thanos-querier-5b688b744b-2d86m\" (UID: \"c02bf820-ae15-4787-964c-de647e0e6ebd\") " pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.562348 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c02bf820-ae15-4787-964c-de647e0e6ebd-secret-grpc-tls\") pod \"thanos-querier-5b688b744b-2d86m\" (UID: \"c02bf820-ae15-4787-964c-de647e0e6ebd\") " pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.562382 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c02bf820-ae15-4787-964c-de647e0e6ebd-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5b688b744b-2d86m\" (UID: \"c02bf820-ae15-4787-964c-de647e0e6ebd\") " pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.565268 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c02bf820-ae15-4787-964c-de647e0e6ebd-metrics-client-ca\") pod \"thanos-querier-5b688b744b-2d86m\" (UID: \"c02bf820-ae15-4787-964c-de647e0e6ebd\") " pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.567782 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/c02bf820-ae15-4787-964c-de647e0e6ebd-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5b688b744b-2d86m\" (UID: \"c02bf820-ae15-4787-964c-de647e0e6ebd\") " pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.567802 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c02bf820-ae15-4787-964c-de647e0e6ebd-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5b688b744b-2d86m\" (UID: 
\"c02bf820-ae15-4787-964c-de647e0e6ebd\") " pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.568008 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/c02bf820-ae15-4787-964c-de647e0e6ebd-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5b688b744b-2d86m\" (UID: \"c02bf820-ae15-4787-964c-de647e0e6ebd\") " pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.568109 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c02bf820-ae15-4787-964c-de647e0e6ebd-secret-grpc-tls\") pod \"thanos-querier-5b688b744b-2d86m\" (UID: \"c02bf820-ae15-4787-964c-de647e0e6ebd\") " pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.568873 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c02bf820-ae15-4787-964c-de647e0e6ebd-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5b688b744b-2d86m\" (UID: \"c02bf820-ae15-4787-964c-de647e0e6ebd\") " pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.569261 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/c02bf820-ae15-4787-964c-de647e0e6ebd-secret-thanos-querier-tls\") pod \"thanos-querier-5b688b744b-2d86m\" (UID: \"c02bf820-ae15-4787-964c-de647e0e6ebd\") " pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.579715 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlddt\" (UniqueName: 
\"kubernetes.io/projected/c02bf820-ae15-4787-964c-de647e0e6ebd-kube-api-access-zlddt\") pod \"thanos-querier-5b688b744b-2d86m\" (UID: \"c02bf820-ae15-4787-964c-de647e0e6ebd\") " pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.665386 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xj4pb" event={"ID":"88874678-630f-48a5-82b2-1f5a0b0bbc15","Type":"ContainerStarted","Data":"1f378c391a316136443772f119a86a89e210f12e1fdd5269206ddd09dac5f324"} Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.666849 4835 generic.go:334] "Generic (PLEG): container finished" podID="7b29a5b9-32ec-450d-80c4-25f8a2e50eff" containerID="daf34cbf7de1b8124a59914940ee67d9008f67a4cc9992e08f3b82954b8bf284" exitCode=0 Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.666904 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7tcl7" event={"ID":"7b29a5b9-32ec-450d-80c4-25f8a2e50eff","Type":"ContainerDied","Data":"daf34cbf7de1b8124a59914940ee67d9008f67a4cc9992e08f3b82954b8bf284"} Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.668761 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"68d4bc50-49ba-4060-88ef-01257302101f","Type":"ContainerStarted","Data":"542d40a9ed84c196a8de93bed35598f1ee0f4b49cdfb55a9f791ae02efe8a20c"} Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.721238 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" Mar 19 09:31:29 crc kubenswrapper[4835]: I0319 09:31:29.978358 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5b688b744b-2d86m"] Mar 19 09:31:30 crc kubenswrapper[4835]: I0319 09:31:30.684929 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-jh6lb" event={"ID":"789ba6b6-298f-401b-869d-1b3e914d471c","Type":"ContainerStarted","Data":"e2b47eff26357308133228670260e80303d7abb22f2bcc928b2b8d1acd34bb93"} Mar 19 09:31:30 crc kubenswrapper[4835]: I0319 09:31:30.687266 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" event={"ID":"c02bf820-ae15-4787-964c-de647e0e6ebd","Type":"ContainerStarted","Data":"a609b7fbd3ec9c45d775981750ea49220eae67b989338385a2c4b6e39fa574ed"} Mar 19 09:31:30 crc kubenswrapper[4835]: I0319 09:31:30.690792 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7tcl7" event={"ID":"7b29a5b9-32ec-450d-80c4-25f8a2e50eff","Type":"ContainerStarted","Data":"f1be294957ce483dba6758411bfdaf13e989260a8dc898c655015f52c37cacc1"} Mar 19 09:31:30 crc kubenswrapper[4835]: I0319 09:31:30.690832 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7tcl7" event={"ID":"7b29a5b9-32ec-450d-80c4-25f8a2e50eff","Type":"ContainerStarted","Data":"1b315d394bcb48dbe6cdff8b61716eb8e0f08764992f68af08453513ffe6007d"} Mar 19 09:31:30 crc kubenswrapper[4835]: I0319 09:31:30.706311 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-566fddb674-jh6lb" podStartSLOduration=2.409092754 podStartE2EDuration="3.706292051s" podCreationTimestamp="2026-03-19 09:31:27 +0000 UTC" firstStartedPulling="2026-03-19 09:31:28.509516444 +0000 UTC m=+543.358115021" lastFinishedPulling="2026-03-19 
09:31:29.806715731 +0000 UTC m=+544.655314318" observedRunningTime="2026-03-19 09:31:30.703843594 +0000 UTC m=+545.552442201" watchObservedRunningTime="2026-03-19 09:31:30.706292051 +0000 UTC m=+545.554890638" Mar 19 09:31:30 crc kubenswrapper[4835]: I0319 09:31:30.738984 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-7tcl7" podStartSLOduration=2.481084623 podStartE2EDuration="3.738959049s" podCreationTimestamp="2026-03-19 09:31:27 +0000 UTC" firstStartedPulling="2026-03-19 09:31:27.799503417 +0000 UTC m=+542.648102014" lastFinishedPulling="2026-03-19 09:31:29.057377813 +0000 UTC m=+543.905976440" observedRunningTime="2026-03-19 09:31:30.728580018 +0000 UTC m=+545.577178605" watchObservedRunningTime="2026-03-19 09:31:30.738959049 +0000 UTC m=+545.587557636" Mar 19 09:31:31 crc kubenswrapper[4835]: I0319 09:31:31.696923 4835 generic.go:334] "Generic (PLEG): container finished" podID="68d4bc50-49ba-4060-88ef-01257302101f" containerID="9a836b23860246d388a6afebe5cfe74cd83b1ccef5db506279c0c16fcc7c22e8" exitCode=0 Mar 19 09:31:31 crc kubenswrapper[4835]: I0319 09:31:31.697025 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"68d4bc50-49ba-4060-88ef-01257302101f","Type":"ContainerDied","Data":"9a836b23860246d388a6afebe5cfe74cd83b1ccef5db506279c0c16fcc7c22e8"} Mar 19 09:31:31 crc kubenswrapper[4835]: I0319 09:31:31.701370 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xj4pb" event={"ID":"88874678-630f-48a5-82b2-1f5a0b0bbc15","Type":"ContainerStarted","Data":"3bac7885bcb55d49239a824bd18fe6a4c6d1d70bebaaf57c00084933f307e33f"} Mar 19 09:31:31 crc kubenswrapper[4835]: I0319 09:31:31.701419 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xj4pb" 
event={"ID":"88874678-630f-48a5-82b2-1f5a0b0bbc15","Type":"ContainerStarted","Data":"eee694326689e575ff4df4625ec5e4f04f1f7ae6225d899f2cf488a0a5a81aac"} Mar 19 09:31:31 crc kubenswrapper[4835]: I0319 09:31:31.701430 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xj4pb" event={"ID":"88874678-630f-48a5-82b2-1f5a0b0bbc15","Type":"ContainerStarted","Data":"a0b1747c4bf92324ed81e4fb3c21725898ce6e7e4a47d46dd427fc62022bea00"} Mar 19 09:31:31 crc kubenswrapper[4835]: I0319 09:31:31.748716 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-xj4pb" podStartSLOduration=2.853406043 podStartE2EDuration="4.748699082s" podCreationTimestamp="2026-03-19 09:31:27 +0000 UTC" firstStartedPulling="2026-03-19 09:31:28.648606798 +0000 UTC m=+543.497205385" lastFinishedPulling="2026-03-19 09:31:30.543899837 +0000 UTC m=+545.392498424" observedRunningTime="2026-03-19 09:31:31.74772234 +0000 UTC m=+546.596320937" watchObservedRunningTime="2026-03-19 09:31:31.748699082 +0000 UTC m=+546.597297669" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.246696 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-57596b75ff-6879j"] Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.249821 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-57596b75ff-6879j" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.262212 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57596b75ff-6879j"] Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.306889 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eecbe809-4eb3-40d6-abe0-71a3c338b453-console-serving-cert\") pod \"console-57596b75ff-6879j\" (UID: \"eecbe809-4eb3-40d6-abe0-71a3c338b453\") " pod="openshift-console/console-57596b75ff-6879j" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.306926 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eecbe809-4eb3-40d6-abe0-71a3c338b453-service-ca\") pod \"console-57596b75ff-6879j\" (UID: \"eecbe809-4eb3-40d6-abe0-71a3c338b453\") " pod="openshift-console/console-57596b75ff-6879j" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.306961 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eecbe809-4eb3-40d6-abe0-71a3c338b453-console-oauth-config\") pod \"console-57596b75ff-6879j\" (UID: \"eecbe809-4eb3-40d6-abe0-71a3c338b453\") " pod="openshift-console/console-57596b75ff-6879j" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.306999 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2lkn\" (UniqueName: \"kubernetes.io/projected/eecbe809-4eb3-40d6-abe0-71a3c338b453-kube-api-access-n2lkn\") pod \"console-57596b75ff-6879j\" (UID: \"eecbe809-4eb3-40d6-abe0-71a3c338b453\") " pod="openshift-console/console-57596b75ff-6879j" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.307021 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eecbe809-4eb3-40d6-abe0-71a3c338b453-console-config\") pod \"console-57596b75ff-6879j\" (UID: \"eecbe809-4eb3-40d6-abe0-71a3c338b453\") " pod="openshift-console/console-57596b75ff-6879j" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.307043 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eecbe809-4eb3-40d6-abe0-71a3c338b453-oauth-serving-cert\") pod \"console-57596b75ff-6879j\" (UID: \"eecbe809-4eb3-40d6-abe0-71a3c338b453\") " pod="openshift-console/console-57596b75ff-6879j" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.307107 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eecbe809-4eb3-40d6-abe0-71a3c338b453-trusted-ca-bundle\") pod \"console-57596b75ff-6879j\" (UID: \"eecbe809-4eb3-40d6-abe0-71a3c338b453\") " pod="openshift-console/console-57596b75ff-6879j" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.407811 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eecbe809-4eb3-40d6-abe0-71a3c338b453-oauth-serving-cert\") pod \"console-57596b75ff-6879j\" (UID: \"eecbe809-4eb3-40d6-abe0-71a3c338b453\") " pod="openshift-console/console-57596b75ff-6879j" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.407861 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eecbe809-4eb3-40d6-abe0-71a3c338b453-trusted-ca-bundle\") pod \"console-57596b75ff-6879j\" (UID: \"eecbe809-4eb3-40d6-abe0-71a3c338b453\") " pod="openshift-console/console-57596b75ff-6879j" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.407905 
4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eecbe809-4eb3-40d6-abe0-71a3c338b453-console-serving-cert\") pod \"console-57596b75ff-6879j\" (UID: \"eecbe809-4eb3-40d6-abe0-71a3c338b453\") " pod="openshift-console/console-57596b75ff-6879j" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.407926 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eecbe809-4eb3-40d6-abe0-71a3c338b453-service-ca\") pod \"console-57596b75ff-6879j\" (UID: \"eecbe809-4eb3-40d6-abe0-71a3c338b453\") " pod="openshift-console/console-57596b75ff-6879j" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.407961 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eecbe809-4eb3-40d6-abe0-71a3c338b453-console-oauth-config\") pod \"console-57596b75ff-6879j\" (UID: \"eecbe809-4eb3-40d6-abe0-71a3c338b453\") " pod="openshift-console/console-57596b75ff-6879j" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.407987 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2lkn\" (UniqueName: \"kubernetes.io/projected/eecbe809-4eb3-40d6-abe0-71a3c338b453-kube-api-access-n2lkn\") pod \"console-57596b75ff-6879j\" (UID: \"eecbe809-4eb3-40d6-abe0-71a3c338b453\") " pod="openshift-console/console-57596b75ff-6879j" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.408008 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eecbe809-4eb3-40d6-abe0-71a3c338b453-console-config\") pod \"console-57596b75ff-6879j\" (UID: \"eecbe809-4eb3-40d6-abe0-71a3c338b453\") " pod="openshift-console/console-57596b75ff-6879j" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.408815 4835 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eecbe809-4eb3-40d6-abe0-71a3c338b453-oauth-serving-cert\") pod \"console-57596b75ff-6879j\" (UID: \"eecbe809-4eb3-40d6-abe0-71a3c338b453\") " pod="openshift-console/console-57596b75ff-6879j" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.409028 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eecbe809-4eb3-40d6-abe0-71a3c338b453-trusted-ca-bundle\") pod \"console-57596b75ff-6879j\" (UID: \"eecbe809-4eb3-40d6-abe0-71a3c338b453\") " pod="openshift-console/console-57596b75ff-6879j" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.409239 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eecbe809-4eb3-40d6-abe0-71a3c338b453-console-config\") pod \"console-57596b75ff-6879j\" (UID: \"eecbe809-4eb3-40d6-abe0-71a3c338b453\") " pod="openshift-console/console-57596b75ff-6879j" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.409249 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eecbe809-4eb3-40d6-abe0-71a3c338b453-service-ca\") pod \"console-57596b75ff-6879j\" (UID: \"eecbe809-4eb3-40d6-abe0-71a3c338b453\") " pod="openshift-console/console-57596b75ff-6879j" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.415372 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eecbe809-4eb3-40d6-abe0-71a3c338b453-console-oauth-config\") pod \"console-57596b75ff-6879j\" (UID: \"eecbe809-4eb3-40d6-abe0-71a3c338b453\") " pod="openshift-console/console-57596b75ff-6879j" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.419119 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eecbe809-4eb3-40d6-abe0-71a3c338b453-console-serving-cert\") pod \"console-57596b75ff-6879j\" (UID: \"eecbe809-4eb3-40d6-abe0-71a3c338b453\") " pod="openshift-console/console-57596b75ff-6879j" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.425421 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2lkn\" (UniqueName: \"kubernetes.io/projected/eecbe809-4eb3-40d6-abe0-71a3c338b453-kube-api-access-n2lkn\") pod \"console-57596b75ff-6879j\" (UID: \"eecbe809-4eb3-40d6-abe0-71a3c338b453\") " pod="openshift-console/console-57596b75ff-6879j" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.563671 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-57596b75ff-6879j" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.753116 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" event={"ID":"c02bf820-ae15-4787-964c-de647e0e6ebd","Type":"ContainerStarted","Data":"671d068c412794d802af618334b4061fca5227982d415da4c3343ac83e82f509"} Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.759549 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-8646b978bb-zprxl"] Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.767818 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.770083 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-5qftv" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.772797 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-8646b978bb-zprxl"] Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.773323 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-91rri2rv4dgo4" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.773649 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.773938 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.774010 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.774043 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.914832 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzm6h\" (UniqueName: \"kubernetes.io/projected/da878929-ea5e-40f0-8eaf-7f6b6e86f62c-kube-api-access-jzm6h\") pod \"metrics-server-8646b978bb-zprxl\" (UID: \"da878929-ea5e-40f0-8eaf-7f6b6e86f62c\") " pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.915011 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: 
\"kubernetes.io/empty-dir/da878929-ea5e-40f0-8eaf-7f6b6e86f62c-audit-log\") pod \"metrics-server-8646b978bb-zprxl\" (UID: \"da878929-ea5e-40f0-8eaf-7f6b6e86f62c\") " pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.915174 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/da878929-ea5e-40f0-8eaf-7f6b6e86f62c-secret-metrics-server-tls\") pod \"metrics-server-8646b978bb-zprxl\" (UID: \"da878929-ea5e-40f0-8eaf-7f6b6e86f62c\") " pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.915258 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da878929-ea5e-40f0-8eaf-7f6b6e86f62c-client-ca-bundle\") pod \"metrics-server-8646b978bb-zprxl\" (UID: \"da878929-ea5e-40f0-8eaf-7f6b6e86f62c\") " pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.915345 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/da878929-ea5e-40f0-8eaf-7f6b6e86f62c-secret-metrics-client-certs\") pod \"metrics-server-8646b978bb-zprxl\" (UID: \"da878929-ea5e-40f0-8eaf-7f6b6e86f62c\") " pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.915376 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/da878929-ea5e-40f0-8eaf-7f6b6e86f62c-metrics-server-audit-profiles\") pod \"metrics-server-8646b978bb-zprxl\" (UID: \"da878929-ea5e-40f0-8eaf-7f6b6e86f62c\") " pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" 
Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.915406 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da878929-ea5e-40f0-8eaf-7f6b6e86f62c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-8646b978bb-zprxl\" (UID: \"da878929-ea5e-40f0-8eaf-7f6b6e86f62c\") " pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.956554 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57596b75ff-6879j"] Mar 19 09:31:32 crc kubenswrapper[4835]: W0319 09:31:32.966019 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeecbe809_4eb3_40d6_abe0_71a3c338b453.slice/crio-58b9b4634f8b7c9d3b4ada90fd099bee5b64e488edc214afec585ae2cc99ba39 WatchSource:0}: Error finding container 58b9b4634f8b7c9d3b4ada90fd099bee5b64e488edc214afec585ae2cc99ba39: Status 404 returned error can't find the container with id 58b9b4634f8b7c9d3b4ada90fd099bee5b64e488edc214afec585ae2cc99ba39 Mar 19 09:31:32 crc kubenswrapper[4835]: I0319 09:31:32.994425 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.016678 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/da878929-ea5e-40f0-8eaf-7f6b6e86f62c-secret-metrics-server-tls\") pod \"metrics-server-8646b978bb-zprxl\" (UID: \"da878929-ea5e-40f0-8eaf-7f6b6e86f62c\") " pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.017211 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/da878929-ea5e-40f0-8eaf-7f6b6e86f62c-client-ca-bundle\") pod \"metrics-server-8646b978bb-zprxl\" (UID: \"da878929-ea5e-40f0-8eaf-7f6b6e86f62c\") " pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.017254 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/da878929-ea5e-40f0-8eaf-7f6b6e86f62c-secret-metrics-client-certs\") pod \"metrics-server-8646b978bb-zprxl\" (UID: \"da878929-ea5e-40f0-8eaf-7f6b6e86f62c\") " pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.017277 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/da878929-ea5e-40f0-8eaf-7f6b6e86f62c-metrics-server-audit-profiles\") pod \"metrics-server-8646b978bb-zprxl\" (UID: \"da878929-ea5e-40f0-8eaf-7f6b6e86f62c\") " pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.017295 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da878929-ea5e-40f0-8eaf-7f6b6e86f62c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-8646b978bb-zprxl\" (UID: \"da878929-ea5e-40f0-8eaf-7f6b6e86f62c\") " pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.017324 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzm6h\" (UniqueName: \"kubernetes.io/projected/da878929-ea5e-40f0-8eaf-7f6b6e86f62c-kube-api-access-jzm6h\") pod \"metrics-server-8646b978bb-zprxl\" (UID: \"da878929-ea5e-40f0-8eaf-7f6b6e86f62c\") " pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" Mar 19 09:31:33 crc 
kubenswrapper[4835]: I0319 09:31:33.017342 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/da878929-ea5e-40f0-8eaf-7f6b6e86f62c-audit-log\") pod \"metrics-server-8646b978bb-zprxl\" (UID: \"da878929-ea5e-40f0-8eaf-7f6b6e86f62c\") " pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.017880 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/da878929-ea5e-40f0-8eaf-7f6b6e86f62c-audit-log\") pod \"metrics-server-8646b978bb-zprxl\" (UID: \"da878929-ea5e-40f0-8eaf-7f6b6e86f62c\") " pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.019225 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da878929-ea5e-40f0-8eaf-7f6b6e86f62c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-8646b978bb-zprxl\" (UID: \"da878929-ea5e-40f0-8eaf-7f6b6e86f62c\") " pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.020080 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/da878929-ea5e-40f0-8eaf-7f6b6e86f62c-metrics-server-audit-profiles\") pod \"metrics-server-8646b978bb-zprxl\" (UID: \"da878929-ea5e-40f0-8eaf-7f6b6e86f62c\") " pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.024003 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da878929-ea5e-40f0-8eaf-7f6b6e86f62c-client-ca-bundle\") pod \"metrics-server-8646b978bb-zprxl\" (UID: \"da878929-ea5e-40f0-8eaf-7f6b6e86f62c\") " 
pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.024920 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/da878929-ea5e-40f0-8eaf-7f6b6e86f62c-secret-metrics-client-certs\") pod \"metrics-server-8646b978bb-zprxl\" (UID: \"da878929-ea5e-40f0-8eaf-7f6b6e86f62c\") " pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.025099 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/da878929-ea5e-40f0-8eaf-7f6b6e86f62c-secret-metrics-server-tls\") pod \"metrics-server-8646b978bb-zprxl\" (UID: \"da878929-ea5e-40f0-8eaf-7f6b6e86f62c\") " pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.038810 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzm6h\" (UniqueName: \"kubernetes.io/projected/da878929-ea5e-40f0-8eaf-7f6b6e86f62c-kube-api-access-jzm6h\") pod \"metrics-server-8646b978bb-zprxl\" (UID: \"da878929-ea5e-40f0-8eaf-7f6b6e86f62c\") " pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.085425 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.226959 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7b6f7975cf-xd4n6"] Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.229524 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7b6f7975cf-xd4n6" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.231979 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-6tstp" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.232060 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.236384 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7b6f7975cf-xd4n6"] Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.285021 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-8646b978bb-zprxl"] Mar 19 09:31:33 crc kubenswrapper[4835]: W0319 09:31:33.294226 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda878929_ea5e_40f0_8eaf_7f6b6e86f62c.slice/crio-b5cc1e0be9393c0b6c8f953e98e0ba9dfab1ddeee34676360d7311a5f87f13ea WatchSource:0}: Error finding container b5cc1e0be9393c0b6c8f953e98e0ba9dfab1ddeee34676360d7311a5f87f13ea: Status 404 returned error can't find the container with id b5cc1e0be9393c0b6c8f953e98e0ba9dfab1ddeee34676360d7311a5f87f13ea Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.328124 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5573de0d-e8de-4c32-b778-1cf95556c219-monitoring-plugin-cert\") pod \"monitoring-plugin-7b6f7975cf-xd4n6\" (UID: \"5573de0d-e8de-4c32-b778-1cf95556c219\") " pod="openshift-monitoring/monitoring-plugin-7b6f7975cf-xd4n6" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.429811 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5573de0d-e8de-4c32-b778-1cf95556c219-monitoring-plugin-cert\") pod \"monitoring-plugin-7b6f7975cf-xd4n6\" (UID: \"5573de0d-e8de-4c32-b778-1cf95556c219\") " pod="openshift-monitoring/monitoring-plugin-7b6f7975cf-xd4n6" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.434991 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5573de0d-e8de-4c32-b778-1cf95556c219-monitoring-plugin-cert\") pod \"monitoring-plugin-7b6f7975cf-xd4n6\" (UID: \"5573de0d-e8de-4c32-b778-1cf95556c219\") " pod="openshift-monitoring/monitoring-plugin-7b6f7975cf-xd4n6" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.547444 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7b6f7975cf-xd4n6" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.706638 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.709140 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.711056 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.711482 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.711607 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-d4r6c4db3ab7u" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.711680 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.711848 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.711972 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.715154 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.716029 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.716403 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.716524 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.716973 4835 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-t95wx" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.721849 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.723621 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.724095 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.760073 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57596b75ff-6879j" event={"ID":"eecbe809-4eb3-40d6-abe0-71a3c338b453","Type":"ContainerStarted","Data":"76d2c9b901091164b4592f7bdfaf488a23cbbbb7c155f2fd20354cd40b174920"} Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.760110 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57596b75ff-6879j" event={"ID":"eecbe809-4eb3-40d6-abe0-71a3c338b453","Type":"ContainerStarted","Data":"58b9b4634f8b7c9d3b4ada90fd099bee5b64e488edc214afec585ae2cc99ba39"} Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.772500 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" event={"ID":"c02bf820-ae15-4787-964c-de647e0e6ebd","Type":"ContainerStarted","Data":"a7c8cda2a05a37565194e71a518fa1e3ecff5a52e82338cddc21c11c287d01bb"} Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.772547 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" event={"ID":"c02bf820-ae15-4787-964c-de647e0e6ebd","Type":"ContainerStarted","Data":"3377368c0e0632e3bc866e28e6a081ef7967b9c9066abb5f0c56c6a2fc2503e8"} Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.773530 4835 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" event={"ID":"da878929-ea5e-40f0-8eaf-7f6b6e86f62c","Type":"ContainerStarted","Data":"b5cc1e0be9393c0b6c8f953e98e0ba9dfab1ddeee34676360d7311a5f87f13ea"} Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.784484 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-57596b75ff-6879j" podStartSLOduration=1.784467478 podStartE2EDuration="1.784467478s" podCreationTimestamp="2026-03-19 09:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:31:33.782194815 +0000 UTC m=+548.630793412" watchObservedRunningTime="2026-03-19 09:31:33.784467478 +0000 UTC m=+548.633066065" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.836877 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/30460e8f-6c1c-4bdb-a1d7-18427b62896b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.836926 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/30460e8f-6c1c-4bdb-a1d7-18427b62896b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.836953 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77slz\" (UniqueName: \"kubernetes.io/projected/30460e8f-6c1c-4bdb-a1d7-18427b62896b-kube-api-access-77slz\") pod 
\"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.836973 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/30460e8f-6c1c-4bdb-a1d7-18427b62896b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.837004 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/30460e8f-6c1c-4bdb-a1d7-18427b62896b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.837025 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/30460e8f-6c1c-4bdb-a1d7-18427b62896b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.837042 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/30460e8f-6c1c-4bdb-a1d7-18427b62896b-config-out\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.837089 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/30460e8f-6c1c-4bdb-a1d7-18427b62896b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.837118 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30460e8f-6c1c-4bdb-a1d7-18427b62896b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.837161 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/30460e8f-6c1c-4bdb-a1d7-18427b62896b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.837187 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30460e8f-6c1c-4bdb-a1d7-18427b62896b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.837230 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/30460e8f-6c1c-4bdb-a1d7-18427b62896b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.837251 4835 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/30460e8f-6c1c-4bdb-a1d7-18427b62896b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.837283 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/30460e8f-6c1c-4bdb-a1d7-18427b62896b-web-config\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.837309 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/30460e8f-6c1c-4bdb-a1d7-18427b62896b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.837329 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/30460e8f-6c1c-4bdb-a1d7-18427b62896b-config\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.837357 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/30460e8f-6c1c-4bdb-a1d7-18427b62896b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.837389 4835 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/30460e8f-6c1c-4bdb-a1d7-18427b62896b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.940345 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/30460e8f-6c1c-4bdb-a1d7-18427b62896b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.940402 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/30460e8f-6c1c-4bdb-a1d7-18427b62896b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.940454 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/30460e8f-6c1c-4bdb-a1d7-18427b62896b-web-config\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.940477 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/30460e8f-6c1c-4bdb-a1d7-18427b62896b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.940498 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/secret/30460e8f-6c1c-4bdb-a1d7-18427b62896b-config\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.940525 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/30460e8f-6c1c-4bdb-a1d7-18427b62896b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.940573 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/30460e8f-6c1c-4bdb-a1d7-18427b62896b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.940601 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/30460e8f-6c1c-4bdb-a1d7-18427b62896b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.940623 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/30460e8f-6c1c-4bdb-a1d7-18427b62896b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.940660 4835 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-77slz\" (UniqueName: \"kubernetes.io/projected/30460e8f-6c1c-4bdb-a1d7-18427b62896b-kube-api-access-77slz\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.940680 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/30460e8f-6c1c-4bdb-a1d7-18427b62896b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.940711 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/30460e8f-6c1c-4bdb-a1d7-18427b62896b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.940734 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/30460e8f-6c1c-4bdb-a1d7-18427b62896b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.940776 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/30460e8f-6c1c-4bdb-a1d7-18427b62896b-config-out\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.940808 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/30460e8f-6c1c-4bdb-a1d7-18427b62896b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.940847 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30460e8f-6c1c-4bdb-a1d7-18427b62896b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.940928 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/30460e8f-6c1c-4bdb-a1d7-18427b62896b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.940953 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30460e8f-6c1c-4bdb-a1d7-18427b62896b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.943140 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/30460e8f-6c1c-4bdb-a1d7-18427b62896b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.946811 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" 
(UniqueName: \"kubernetes.io/configmap/30460e8f-6c1c-4bdb-a1d7-18427b62896b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.947521 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30460e8f-6c1c-4bdb-a1d7-18427b62896b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.949198 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30460e8f-6c1c-4bdb-a1d7-18427b62896b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.949254 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/30460e8f-6c1c-4bdb-a1d7-18427b62896b-config-out\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.949649 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/30460e8f-6c1c-4bdb-a1d7-18427b62896b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.949797 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/30460e8f-6c1c-4bdb-a1d7-18427b62896b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.950642 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/30460e8f-6c1c-4bdb-a1d7-18427b62896b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.950829 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/30460e8f-6c1c-4bdb-a1d7-18427b62896b-web-config\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.951245 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/30460e8f-6c1c-4bdb-a1d7-18427b62896b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.952356 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/30460e8f-6c1c-4bdb-a1d7-18427b62896b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.952833 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/30460e8f-6c1c-4bdb-a1d7-18427b62896b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.954139 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/30460e8f-6c1c-4bdb-a1d7-18427b62896b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.954293 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/30460e8f-6c1c-4bdb-a1d7-18427b62896b-config\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.955644 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/30460e8f-6c1c-4bdb-a1d7-18427b62896b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.956136 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/30460e8f-6c1c-4bdb-a1d7-18427b62896b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.956475 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/30460e8f-6c1c-4bdb-a1d7-18427b62896b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: 
\"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:33 crc kubenswrapper[4835]: I0319 09:31:33.958075 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77slz\" (UniqueName: \"kubernetes.io/projected/30460e8f-6c1c-4bdb-a1d7-18427b62896b-kube-api-access-77slz\") pod \"prometheus-k8s-0\" (UID: \"30460e8f-6c1c-4bdb-a1d7-18427b62896b\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:34 crc kubenswrapper[4835]: I0319 09:31:34.031227 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:34 crc kubenswrapper[4835]: I0319 09:31:34.425555 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 19 09:31:34 crc kubenswrapper[4835]: W0319 09:31:34.443786 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30460e8f_6c1c_4bdb_a1d7_18427b62896b.slice/crio-fed191c76775d2c1818fdeb520b4e2a2cf24321c5aa7e3ced6ae8c573b29055f WatchSource:0}: Error finding container fed191c76775d2c1818fdeb520b4e2a2cf24321c5aa7e3ced6ae8c573b29055f: Status 404 returned error can't find the container with id fed191c76775d2c1818fdeb520b4e2a2cf24321c5aa7e3ced6ae8c573b29055f Mar 19 09:31:34 crc kubenswrapper[4835]: I0319 09:31:34.565279 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7b6f7975cf-xd4n6"] Mar 19 09:31:34 crc kubenswrapper[4835]: I0319 09:31:34.780659 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7b6f7975cf-xd4n6" event={"ID":"5573de0d-e8de-4c32-b778-1cf95556c219","Type":"ContainerStarted","Data":"8cbbfe902fd4081e55d0592dffeab225066f1e938bdb10812d5288972792f425"} Mar 19 09:31:34 crc kubenswrapper[4835]: I0319 09:31:34.785990 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"68d4bc50-49ba-4060-88ef-01257302101f","Type":"ContainerStarted","Data":"772f67818846a673527e8bf8ac177e440710591e178acd34376f2ff476914ee3"} Mar 19 09:31:34 crc kubenswrapper[4835]: I0319 09:31:34.786047 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"68d4bc50-49ba-4060-88ef-01257302101f","Type":"ContainerStarted","Data":"2e06f8fa27138c3c45b393059876faab32827b20aa08684327dbd5c7624097ab"} Mar 19 09:31:34 crc kubenswrapper[4835]: I0319 09:31:34.788398 4835 generic.go:334] "Generic (PLEG): container finished" podID="30460e8f-6c1c-4bdb-a1d7-18427b62896b" containerID="1c67cc07125d0f6d1e3e428ae2ed798795367888a92b9e3e9e5e9cf456023363" exitCode=0 Mar 19 09:31:34 crc kubenswrapper[4835]: I0319 09:31:34.788478 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"30460e8f-6c1c-4bdb-a1d7-18427b62896b","Type":"ContainerDied","Data":"1c67cc07125d0f6d1e3e428ae2ed798795367888a92b9e3e9e5e9cf456023363"} Mar 19 09:31:34 crc kubenswrapper[4835]: I0319 09:31:34.788505 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"30460e8f-6c1c-4bdb-a1d7-18427b62896b","Type":"ContainerStarted","Data":"fed191c76775d2c1818fdeb520b4e2a2cf24321c5aa7e3ced6ae8c573b29055f"} Mar 19 09:31:35 crc kubenswrapper[4835]: I0319 09:31:35.797506 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" event={"ID":"c02bf820-ae15-4787-964c-de647e0e6ebd","Type":"ContainerStarted","Data":"b76f19beb02d41a5ef5ed9c3d04bc74f740a7872eb5fbdb4bde59fe03e6f3733"} Mar 19 09:31:35 crc kubenswrapper[4835]: I0319 09:31:35.797811 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" 
event={"ID":"c02bf820-ae15-4787-964c-de647e0e6ebd","Type":"ContainerStarted","Data":"43ac24c10e273b4cb721b69429d5ff8a8455ba8389ada5d4bf63e3ea6e3af908"} Mar 19 09:31:35 crc kubenswrapper[4835]: I0319 09:31:35.797832 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" Mar 19 09:31:35 crc kubenswrapper[4835]: I0319 09:31:35.797847 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" event={"ID":"c02bf820-ae15-4787-964c-de647e0e6ebd","Type":"ContainerStarted","Data":"417d52792b4e184492117ee951bca2e153e246564a5649bab6106ec6ed8b6a71"} Mar 19 09:31:35 crc kubenswrapper[4835]: I0319 09:31:35.799911 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" event={"ID":"da878929-ea5e-40f0-8eaf-7f6b6e86f62c","Type":"ContainerStarted","Data":"4bde12fa97e46cd393be8e222a95cd64b25ba816714628be50c94aa41aae9643"} Mar 19 09:31:35 crc kubenswrapper[4835]: I0319 09:31:35.804338 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"68d4bc50-49ba-4060-88ef-01257302101f","Type":"ContainerStarted","Data":"27cd63acee62fa83c566e37ce3bf737e5dcfe3a95a559049d510369ee3235e6b"} Mar 19 09:31:35 crc kubenswrapper[4835]: I0319 09:31:35.804382 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"68d4bc50-49ba-4060-88ef-01257302101f","Type":"ContainerStarted","Data":"167812a088a82aaced2843bfb0c6260a57996938f5f723844b3d4fd41602c4ff"} Mar 19 09:31:35 crc kubenswrapper[4835]: I0319 09:31:35.804393 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"68d4bc50-49ba-4060-88ef-01257302101f","Type":"ContainerStarted","Data":"71b999177f59717e44a8485fc352db5ada3c820360e8e78d7dfff2b83eb0fd17"} Mar 19 09:31:35 crc kubenswrapper[4835]: I0319 
09:31:35.804403 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"68d4bc50-49ba-4060-88ef-01257302101f","Type":"ContainerStarted","Data":"dca3ef0a543ff44abfefbb887181cf10c2d3a28e53e53b07f5e24ba791fa98a1"} Mar 19 09:31:35 crc kubenswrapper[4835]: I0319 09:31:35.818222 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" podStartSLOduration=2.042238898 podStartE2EDuration="6.818201947s" podCreationTimestamp="2026-03-19 09:31:29 +0000 UTC" firstStartedPulling="2026-03-19 09:31:29.996107481 +0000 UTC m=+544.844706068" lastFinishedPulling="2026-03-19 09:31:34.77207053 +0000 UTC m=+549.620669117" observedRunningTime="2026-03-19 09:31:35.817235245 +0000 UTC m=+550.665833842" watchObservedRunningTime="2026-03-19 09:31:35.818201947 +0000 UTC m=+550.666800544" Mar 19 09:31:35 crc kubenswrapper[4835]: I0319 09:31:35.839714 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" podStartSLOduration=1.66288891 podStartE2EDuration="3.839695635s" podCreationTimestamp="2026-03-19 09:31:32 +0000 UTC" firstStartedPulling="2026-03-19 09:31:33.296815345 +0000 UTC m=+548.145413932" lastFinishedPulling="2026-03-19 09:31:35.47362207 +0000 UTC m=+550.322220657" observedRunningTime="2026-03-19 09:31:35.831344792 +0000 UTC m=+550.679943389" watchObservedRunningTime="2026-03-19 09:31:35.839695635 +0000 UTC m=+550.688294222" Mar 19 09:31:35 crc kubenswrapper[4835]: I0319 09:31:35.860002 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.883109749 podStartE2EDuration="7.859982495s" podCreationTimestamp="2026-03-19 09:31:28 +0000 UTC" firstStartedPulling="2026-03-19 09:31:29.24064957 +0000 UTC m=+544.089248157" lastFinishedPulling="2026-03-19 09:31:34.217522316 +0000 UTC m=+549.066120903" 
observedRunningTime="2026-03-19 09:31:35.857825915 +0000 UTC m=+550.706424522" watchObservedRunningTime="2026-03-19 09:31:35.859982495 +0000 UTC m=+550.708581092" Mar 19 09:31:37 crc kubenswrapper[4835]: I0319 09:31:37.819307 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7b6f7975cf-xd4n6" event={"ID":"5573de0d-e8de-4c32-b778-1cf95556c219","Type":"ContainerStarted","Data":"e9d523e82f5fcc16a85097ba9569ef6169314c8f813893acbf645a1db88e59df"} Mar 19 09:31:37 crc kubenswrapper[4835]: I0319 09:31:37.819854 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-7b6f7975cf-xd4n6" Mar 19 09:31:37 crc kubenswrapper[4835]: I0319 09:31:37.826650 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7b6f7975cf-xd4n6" Mar 19 09:31:37 crc kubenswrapper[4835]: I0319 09:31:37.847117 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7b6f7975cf-xd4n6" podStartSLOduration=2.913478265 podStartE2EDuration="4.847094623s" podCreationTimestamp="2026-03-19 09:31:33 +0000 UTC" firstStartedPulling="2026-03-19 09:31:34.755624268 +0000 UTC m=+549.604222855" lastFinishedPulling="2026-03-19 09:31:36.689240596 +0000 UTC m=+551.537839213" observedRunningTime="2026-03-19 09:31:37.836686291 +0000 UTC m=+552.685284908" watchObservedRunningTime="2026-03-19 09:31:37.847094623 +0000 UTC m=+552.695693210" Mar 19 09:31:38 crc kubenswrapper[4835]: I0319 09:31:38.234309 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-7mrmh" Mar 19 09:31:38 crc kubenswrapper[4835]: I0319 09:31:38.298436 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-46rgq"] Mar 19 09:31:38 crc kubenswrapper[4835]: I0319 09:31:38.826150 4835 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"30460e8f-6c1c-4bdb-a1d7-18427b62896b","Type":"ContainerStarted","Data":"e70d91ec6ac52dd78a7a948bb3b2216629d92850550dcd22710d484d9de624af"} Mar 19 09:31:38 crc kubenswrapper[4835]: I0319 09:31:38.826223 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"30460e8f-6c1c-4bdb-a1d7-18427b62896b","Type":"ContainerStarted","Data":"9f844845a0072d06bfdfcf88ab30d7ca479c7d751cdaa5059774a70192c3f967"} Mar 19 09:31:38 crc kubenswrapper[4835]: I0319 09:31:38.826240 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"30460e8f-6c1c-4bdb-a1d7-18427b62896b","Type":"ContainerStarted","Data":"85f23f10d6748494db2522f538184e736c11ae0a8dc7c1f7ac34668b8887bfe4"} Mar 19 09:31:38 crc kubenswrapper[4835]: I0319 09:31:38.826253 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"30460e8f-6c1c-4bdb-a1d7-18427b62896b","Type":"ContainerStarted","Data":"5d5597328182182184b85176901ee18e434a6deacb805d79ed16b2a8d2a35952"} Mar 19 09:31:39 crc kubenswrapper[4835]: I0319 09:31:39.736000 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" Mar 19 09:31:39 crc kubenswrapper[4835]: I0319 09:31:39.854031 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"30460e8f-6c1c-4bdb-a1d7-18427b62896b","Type":"ContainerStarted","Data":"bd64300d3e8cd281af6d4627269a48f18e8231341b59adc00685b331f154f3d1"} Mar 19 09:31:39 crc kubenswrapper[4835]: I0319 09:31:39.854088 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"30460e8f-6c1c-4bdb-a1d7-18427b62896b","Type":"ContainerStarted","Data":"e539660b35238dce2f8351fea0c70bb2fe0842006ce9ffb06094af169850648d"} Mar 
19 09:31:39 crc kubenswrapper[4835]: I0319 09:31:39.900498 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.512281303 podStartE2EDuration="6.900474236s" podCreationTimestamp="2026-03-19 09:31:33 +0000 UTC" firstStartedPulling="2026-03-19 09:31:34.791664623 +0000 UTC m=+549.640263210" lastFinishedPulling="2026-03-19 09:31:38.179857546 +0000 UTC m=+553.028456143" observedRunningTime="2026-03-19 09:31:39.891294974 +0000 UTC m=+554.739893601" watchObservedRunningTime="2026-03-19 09:31:39.900474236 +0000 UTC m=+554.749072833" Mar 19 09:31:42 crc kubenswrapper[4835]: I0319 09:31:42.564573 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-57596b75ff-6879j" Mar 19 09:31:42 crc kubenswrapper[4835]: I0319 09:31:42.565025 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-57596b75ff-6879j" Mar 19 09:31:42 crc kubenswrapper[4835]: I0319 09:31:42.571984 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-57596b75ff-6879j" Mar 19 09:31:42 crc kubenswrapper[4835]: I0319 09:31:42.879532 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-57596b75ff-6879j" Mar 19 09:31:42 crc kubenswrapper[4835]: I0319 09:31:42.937617 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-m6s8l"] Mar 19 09:31:44 crc kubenswrapper[4835]: I0319 09:31:44.031444 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:31:53 crc kubenswrapper[4835]: I0319 09:31:53.086425 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" Mar 19 09:31:53 crc kubenswrapper[4835]: I0319 09:31:53.087371 4835 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" Mar 19 09:32:00 crc kubenswrapper[4835]: I0319 09:32:00.137266 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565212-xrbm4"] Mar 19 09:32:00 crc kubenswrapper[4835]: I0319 09:32:00.138676 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565212-xrbm4"] Mar 19 09:32:00 crc kubenswrapper[4835]: I0319 09:32:00.138820 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565212-xrbm4" Mar 19 09:32:00 crc kubenswrapper[4835]: I0319 09:32:00.140408 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 09:32:00 crc kubenswrapper[4835]: I0319 09:32:00.140702 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 09:32:00 crc kubenswrapper[4835]: I0319 09:32:00.142225 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 09:32:00 crc kubenswrapper[4835]: I0319 09:32:00.240379 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9kl2\" (UniqueName: \"kubernetes.io/projected/545bb00f-039b-40bc-923f-496674c4b221-kube-api-access-m9kl2\") pod \"auto-csr-approver-29565212-xrbm4\" (UID: \"545bb00f-039b-40bc-923f-496674c4b221\") " pod="openshift-infra/auto-csr-approver-29565212-xrbm4" Mar 19 09:32:00 crc kubenswrapper[4835]: I0319 09:32:00.342290 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9kl2\" (UniqueName: \"kubernetes.io/projected/545bb00f-039b-40bc-923f-496674c4b221-kube-api-access-m9kl2\") pod \"auto-csr-approver-29565212-xrbm4\" (UID: \"545bb00f-039b-40bc-923f-496674c4b221\") " 
pod="openshift-infra/auto-csr-approver-29565212-xrbm4" Mar 19 09:32:00 crc kubenswrapper[4835]: I0319 09:32:00.375175 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9kl2\" (UniqueName: \"kubernetes.io/projected/545bb00f-039b-40bc-923f-496674c4b221-kube-api-access-m9kl2\") pod \"auto-csr-approver-29565212-xrbm4\" (UID: \"545bb00f-039b-40bc-923f-496674c4b221\") " pod="openshift-infra/auto-csr-approver-29565212-xrbm4" Mar 19 09:32:00 crc kubenswrapper[4835]: I0319 09:32:00.466615 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565212-xrbm4" Mar 19 09:32:00 crc kubenswrapper[4835]: I0319 09:32:00.742132 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565212-xrbm4"] Mar 19 09:32:01 crc kubenswrapper[4835]: I0319 09:32:01.014830 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565212-xrbm4" event={"ID":"545bb00f-039b-40bc-923f-496674c4b221","Type":"ContainerStarted","Data":"09d0d91d995099215d5d4b3b40b5d08135a82a5a72c47a5461c45b265f64444f"} Mar 19 09:32:02 crc kubenswrapper[4835]: E0319 09:32:02.482104 4835 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod545bb00f_039b_40bc_923f_496674c4b221.slice/crio-conmon-7d5d6c63cc1fe44b111a2a6b05400858e8dbb7d8e396e04fd0e5d1b4d01c1877.scope\": RecentStats: unable to find data in memory cache]" Mar 19 09:32:03 crc kubenswrapper[4835]: I0319 09:32:03.034915 4835 generic.go:334] "Generic (PLEG): container finished" podID="545bb00f-039b-40bc-923f-496674c4b221" containerID="7d5d6c63cc1fe44b111a2a6b05400858e8dbb7d8e396e04fd0e5d1b4d01c1877" exitCode=0 Mar 19 09:32:03 crc kubenswrapper[4835]: I0319 09:32:03.034975 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29565212-xrbm4" event={"ID":"545bb00f-039b-40bc-923f-496674c4b221","Type":"ContainerDied","Data":"7d5d6c63cc1fe44b111a2a6b05400858e8dbb7d8e396e04fd0e5d1b4d01c1877"} Mar 19 09:32:03 crc kubenswrapper[4835]: I0319 09:32:03.338992 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" podUID="68957ece-d303-4cd7-9e45-cc960a83a7b0" containerName="registry" containerID="cri-o://c91f34d7ba02037f3446f7dfd3fff27ad58d15499dde9c494068d6a0eda481a1" gracePeriod=30 Mar 19 09:32:03 crc kubenswrapper[4835]: I0319 09:32:03.805839 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.005299 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/68957ece-d303-4cd7-9e45-cc960a83a7b0-registry-certificates\") pod \"68957ece-d303-4cd7-9e45-cc960a83a7b0\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.006156 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"68957ece-d303-4cd7-9e45-cc960a83a7b0\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.006243 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/68957ece-d303-4cd7-9e45-cc960a83a7b0-ca-trust-extracted\") pod \"68957ece-d303-4cd7-9e45-cc960a83a7b0\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.006333 4835 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68957ece-d303-4cd7-9e45-cc960a83a7b0-bound-sa-token\") pod \"68957ece-d303-4cd7-9e45-cc960a83a7b0\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.006398 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68957ece-d303-4cd7-9e45-cc960a83a7b0-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "68957ece-d303-4cd7-9e45-cc960a83a7b0" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.006436 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68957ece-d303-4cd7-9e45-cc960a83a7b0-trusted-ca\") pod \"68957ece-d303-4cd7-9e45-cc960a83a7b0\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.006547 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/68957ece-d303-4cd7-9e45-cc960a83a7b0-installation-pull-secrets\") pod \"68957ece-d303-4cd7-9e45-cc960a83a7b0\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.006629 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/68957ece-d303-4cd7-9e45-cc960a83a7b0-registry-tls\") pod \"68957ece-d303-4cd7-9e45-cc960a83a7b0\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.006684 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64tvc\" (UniqueName: 
\"kubernetes.io/projected/68957ece-d303-4cd7-9e45-cc960a83a7b0-kube-api-access-64tvc\") pod \"68957ece-d303-4cd7-9e45-cc960a83a7b0\" (UID: \"68957ece-d303-4cd7-9e45-cc960a83a7b0\") " Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.007116 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68957ece-d303-4cd7-9e45-cc960a83a7b0-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "68957ece-d303-4cd7-9e45-cc960a83a7b0" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.007324 4835 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/68957ece-d303-4cd7-9e45-cc960a83a7b0-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.007371 4835 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68957ece-d303-4cd7-9e45-cc960a83a7b0-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.013418 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68957ece-d303-4cd7-9e45-cc960a83a7b0-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "68957ece-d303-4cd7-9e45-cc960a83a7b0" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.014310 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68957ece-d303-4cd7-9e45-cc960a83a7b0-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "68957ece-d303-4cd7-9e45-cc960a83a7b0" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0"). 
InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.020788 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "68957ece-d303-4cd7-9e45-cc960a83a7b0" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.024380 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68957ece-d303-4cd7-9e45-cc960a83a7b0-kube-api-access-64tvc" (OuterVolumeSpecName: "kube-api-access-64tvc") pod "68957ece-d303-4cd7-9e45-cc960a83a7b0" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0"). InnerVolumeSpecName "kube-api-access-64tvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.024828 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68957ece-d303-4cd7-9e45-cc960a83a7b0-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "68957ece-d303-4cd7-9e45-cc960a83a7b0" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.042689 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68957ece-d303-4cd7-9e45-cc960a83a7b0-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "68957ece-d303-4cd7-9e45-cc960a83a7b0" (UID: "68957ece-d303-4cd7-9e45-cc960a83a7b0"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.046906 4835 generic.go:334] "Generic (PLEG): container finished" podID="68957ece-d303-4cd7-9e45-cc960a83a7b0" containerID="c91f34d7ba02037f3446f7dfd3fff27ad58d15499dde9c494068d6a0eda481a1" exitCode=0 Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.047001 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" event={"ID":"68957ece-d303-4cd7-9e45-cc960a83a7b0","Type":"ContainerDied","Data":"c91f34d7ba02037f3446f7dfd3fff27ad58d15499dde9c494068d6a0eda481a1"} Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.047024 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.047100 4835 scope.go:117] "RemoveContainer" containerID="c91f34d7ba02037f3446f7dfd3fff27ad58d15499dde9c494068d6a0eda481a1" Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.047081 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-46rgq" event={"ID":"68957ece-d303-4cd7-9e45-cc960a83a7b0","Type":"ContainerDied","Data":"2b72c853a4cb9cfa6b41621158b5f31b75aeb44a97c003f74c4cd687ce35e8b5"} Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.109863 4835 scope.go:117] "RemoveContainer" containerID="c91f34d7ba02037f3446f7dfd3fff27ad58d15499dde9c494068d6a0eda481a1" Mar 19 09:32:04 crc kubenswrapper[4835]: E0319 09:32:04.110584 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c91f34d7ba02037f3446f7dfd3fff27ad58d15499dde9c494068d6a0eda481a1\": container with ID starting with c91f34d7ba02037f3446f7dfd3fff27ad58d15499dde9c494068d6a0eda481a1 not found: ID does not exist" 
containerID="c91f34d7ba02037f3446f7dfd3fff27ad58d15499dde9c494068d6a0eda481a1" Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.110647 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c91f34d7ba02037f3446f7dfd3fff27ad58d15499dde9c494068d6a0eda481a1"} err="failed to get container status \"c91f34d7ba02037f3446f7dfd3fff27ad58d15499dde9c494068d6a0eda481a1\": rpc error: code = NotFound desc = could not find container \"c91f34d7ba02037f3446f7dfd3fff27ad58d15499dde9c494068d6a0eda481a1\": container with ID starting with c91f34d7ba02037f3446f7dfd3fff27ad58d15499dde9c494068d6a0eda481a1 not found: ID does not exist" Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.110903 4835 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/68957ece-d303-4cd7-9e45-cc960a83a7b0-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.110943 4835 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68957ece-d303-4cd7-9e45-cc960a83a7b0-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.110960 4835 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/68957ece-d303-4cd7-9e45-cc960a83a7b0-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.110974 4835 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/68957ece-d303-4cd7-9e45-cc960a83a7b0-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.110988 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64tvc\" (UniqueName: 
\"kubernetes.io/projected/68957ece-d303-4cd7-9e45-cc960a83a7b0-kube-api-access-64tvc\") on node \"crc\" DevicePath \"\"" Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.115927 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-46rgq"] Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.125518 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-46rgq"] Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.365886 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565212-xrbm4" Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.413065 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68957ece-d303-4cd7-9e45-cc960a83a7b0" path="/var/lib/kubelet/pods/68957ece-d303-4cd7-9e45-cc960a83a7b0/volumes" Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.413301 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9kl2\" (UniqueName: \"kubernetes.io/projected/545bb00f-039b-40bc-923f-496674c4b221-kube-api-access-m9kl2\") pod \"545bb00f-039b-40bc-923f-496674c4b221\" (UID: \"545bb00f-039b-40bc-923f-496674c4b221\") " Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.417086 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/545bb00f-039b-40bc-923f-496674c4b221-kube-api-access-m9kl2" (OuterVolumeSpecName: "kube-api-access-m9kl2") pod "545bb00f-039b-40bc-923f-496674c4b221" (UID: "545bb00f-039b-40bc-923f-496674c4b221"). InnerVolumeSpecName "kube-api-access-m9kl2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:32:04 crc kubenswrapper[4835]: I0319 09:32:04.515646 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9kl2\" (UniqueName: \"kubernetes.io/projected/545bb00f-039b-40bc-923f-496674c4b221-kube-api-access-m9kl2\") on node \"crc\" DevicePath \"\"" Mar 19 09:32:05 crc kubenswrapper[4835]: I0319 09:32:05.057332 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565212-xrbm4" event={"ID":"545bb00f-039b-40bc-923f-496674c4b221","Type":"ContainerDied","Data":"09d0d91d995099215d5d4b3b40b5d08135a82a5a72c47a5461c45b265f64444f"} Mar 19 09:32:05 crc kubenswrapper[4835]: I0319 09:32:05.060648 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09d0d91d995099215d5d4b3b40b5d08135a82a5a72c47a5461c45b265f64444f" Mar 19 09:32:05 crc kubenswrapper[4835]: I0319 09:32:05.057342 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565212-xrbm4" Mar 19 09:32:05 crc kubenswrapper[4835]: I0319 09:32:05.450987 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565206-wgvsr"] Mar 19 09:32:05 crc kubenswrapper[4835]: I0319 09:32:05.457151 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565206-wgvsr"] Mar 19 09:32:06 crc kubenswrapper[4835]: I0319 09:32:06.414567 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c232f12f-f539-42a1-9eec-965d59127ca0" path="/var/lib/kubelet/pods/c232f12f-f539-42a1-9eec-965d59127ca0/volumes" Mar 19 09:32:07 crc kubenswrapper[4835]: I0319 09:32:07.994857 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-m6s8l" podUID="c32cf7e2-4523-41b3-b8a8-7564cca67f4a" containerName="console" 
containerID="cri-o://db9ce92eda68a698fdbda1c2f7a43623e4d0b29073bc429ae428d987a0d2c014" gracePeriod=15 Mar 19 09:32:08 crc kubenswrapper[4835]: I0319 09:32:08.357790 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-m6s8l_c32cf7e2-4523-41b3-b8a8-7564cca67f4a/console/0.log" Mar 19 09:32:08 crc kubenswrapper[4835]: I0319 09:32:08.358298 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-m6s8l" Mar 19 09:32:08 crc kubenswrapper[4835]: I0319 09:32:08.478944 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-trusted-ca-bundle\") pod \"c32cf7e2-4523-41b3-b8a8-7564cca67f4a\" (UID: \"c32cf7e2-4523-41b3-b8a8-7564cca67f4a\") " Mar 19 09:32:08 crc kubenswrapper[4835]: I0319 09:32:08.478984 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-oauth-serving-cert\") pod \"c32cf7e2-4523-41b3-b8a8-7564cca67f4a\" (UID: \"c32cf7e2-4523-41b3-b8a8-7564cca67f4a\") " Mar 19 09:32:08 crc kubenswrapper[4835]: I0319 09:32:08.479021 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-service-ca\") pod \"c32cf7e2-4523-41b3-b8a8-7564cca67f4a\" (UID: \"c32cf7e2-4523-41b3-b8a8-7564cca67f4a\") " Mar 19 09:32:08 crc kubenswrapper[4835]: I0319 09:32:08.479041 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kn79n\" (UniqueName: \"kubernetes.io/projected/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-kube-api-access-kn79n\") pod \"c32cf7e2-4523-41b3-b8a8-7564cca67f4a\" (UID: \"c32cf7e2-4523-41b3-b8a8-7564cca67f4a\") " Mar 19 09:32:08 crc kubenswrapper[4835]: I0319 
09:32:08.479067 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-console-oauth-config\") pod \"c32cf7e2-4523-41b3-b8a8-7564cca67f4a\" (UID: \"c32cf7e2-4523-41b3-b8a8-7564cca67f4a\") " Mar 19 09:32:08 crc kubenswrapper[4835]: I0319 09:32:08.479917 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c32cf7e2-4523-41b3-b8a8-7564cca67f4a" (UID: "c32cf7e2-4523-41b3-b8a8-7564cca67f4a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:32:08 crc kubenswrapper[4835]: I0319 09:32:08.479959 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c32cf7e2-4523-41b3-b8a8-7564cca67f4a" (UID: "c32cf7e2-4523-41b3-b8a8-7564cca67f4a"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:32:08 crc kubenswrapper[4835]: I0319 09:32:08.480061 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-console-config\") pod \"c32cf7e2-4523-41b3-b8a8-7564cca67f4a\" (UID: \"c32cf7e2-4523-41b3-b8a8-7564cca67f4a\") " Mar 19 09:32:08 crc kubenswrapper[4835]: I0319 09:32:08.480094 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-console-serving-cert\") pod \"c32cf7e2-4523-41b3-b8a8-7564cca67f4a\" (UID: \"c32cf7e2-4523-41b3-b8a8-7564cca67f4a\") " Mar 19 09:32:08 crc kubenswrapper[4835]: I0319 09:32:08.480246 4835 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:32:08 crc kubenswrapper[4835]: I0319 09:32:08.480256 4835 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:32:08 crc kubenswrapper[4835]: I0319 09:32:08.480268 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-service-ca" (OuterVolumeSpecName: "service-ca") pod "c32cf7e2-4523-41b3-b8a8-7564cca67f4a" (UID: "c32cf7e2-4523-41b3-b8a8-7564cca67f4a"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:32:08 crc kubenswrapper[4835]: I0319 09:32:08.480476 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-console-config" (OuterVolumeSpecName: "console-config") pod "c32cf7e2-4523-41b3-b8a8-7564cca67f4a" (UID: "c32cf7e2-4523-41b3-b8a8-7564cca67f4a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:32:08 crc kubenswrapper[4835]: I0319 09:32:08.485932 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c32cf7e2-4523-41b3-b8a8-7564cca67f4a" (UID: "c32cf7e2-4523-41b3-b8a8-7564cca67f4a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:32:08 crc kubenswrapper[4835]: I0319 09:32:08.486113 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-kube-api-access-kn79n" (OuterVolumeSpecName: "kube-api-access-kn79n") pod "c32cf7e2-4523-41b3-b8a8-7564cca67f4a" (UID: "c32cf7e2-4523-41b3-b8a8-7564cca67f4a"). InnerVolumeSpecName "kube-api-access-kn79n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:32:08 crc kubenswrapper[4835]: I0319 09:32:08.486356 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c32cf7e2-4523-41b3-b8a8-7564cca67f4a" (UID: "c32cf7e2-4523-41b3-b8a8-7564cca67f4a"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:32:08 crc kubenswrapper[4835]: I0319 09:32:08.581037 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kn79n\" (UniqueName: \"kubernetes.io/projected/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-kube-api-access-kn79n\") on node \"crc\" DevicePath \"\"" Mar 19 09:32:08 crc kubenswrapper[4835]: I0319 09:32:08.581075 4835 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:32:08 crc kubenswrapper[4835]: I0319 09:32:08.581091 4835 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-console-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:32:08 crc kubenswrapper[4835]: I0319 09:32:08.581107 4835 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:32:08 crc kubenswrapper[4835]: I0319 09:32:08.581125 4835 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c32cf7e2-4523-41b3-b8a8-7564cca67f4a-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 09:32:09 crc kubenswrapper[4835]: I0319 09:32:09.087892 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-m6s8l_c32cf7e2-4523-41b3-b8a8-7564cca67f4a/console/0.log" Mar 19 09:32:09 crc kubenswrapper[4835]: I0319 09:32:09.088224 4835 generic.go:334] "Generic (PLEG): container finished" podID="c32cf7e2-4523-41b3-b8a8-7564cca67f4a" containerID="db9ce92eda68a698fdbda1c2f7a43623e4d0b29073bc429ae428d987a0d2c014" exitCode=2 Mar 19 09:32:09 crc kubenswrapper[4835]: I0319 09:32:09.088255 4835 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-console/console-f9d7485db-m6s8l" event={"ID":"c32cf7e2-4523-41b3-b8a8-7564cca67f4a","Type":"ContainerDied","Data":"db9ce92eda68a698fdbda1c2f7a43623e4d0b29073bc429ae428d987a0d2c014"} Mar 19 09:32:09 crc kubenswrapper[4835]: I0319 09:32:09.088282 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-m6s8l" event={"ID":"c32cf7e2-4523-41b3-b8a8-7564cca67f4a","Type":"ContainerDied","Data":"7f826b21797f767ea24461f5577dab5a1f8e75b977689b0d8e294d4c8cc575e9"} Mar 19 09:32:09 crc kubenswrapper[4835]: I0319 09:32:09.088287 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-m6s8l" Mar 19 09:32:09 crc kubenswrapper[4835]: I0319 09:32:09.088300 4835 scope.go:117] "RemoveContainer" containerID="db9ce92eda68a698fdbda1c2f7a43623e4d0b29073bc429ae428d987a0d2c014" Mar 19 09:32:10 crc kubenswrapper[4835]: I0319 09:32:10.525850 4835 scope.go:117] "RemoveContainer" containerID="db9ce92eda68a698fdbda1c2f7a43623e4d0b29073bc429ae428d987a0d2c014" Mar 19 09:32:10 crc kubenswrapper[4835]: E0319 09:32:10.526686 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db9ce92eda68a698fdbda1c2f7a43623e4d0b29073bc429ae428d987a0d2c014\": container with ID starting with db9ce92eda68a698fdbda1c2f7a43623e4d0b29073bc429ae428d987a0d2c014 not found: ID does not exist" containerID="db9ce92eda68a698fdbda1c2f7a43623e4d0b29073bc429ae428d987a0d2c014" Mar 19 09:32:10 crc kubenswrapper[4835]: I0319 09:32:10.526718 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db9ce92eda68a698fdbda1c2f7a43623e4d0b29073bc429ae428d987a0d2c014"} err="failed to get container status \"db9ce92eda68a698fdbda1c2f7a43623e4d0b29073bc429ae428d987a0d2c014\": rpc error: code = NotFound desc = could not find container 
\"db9ce92eda68a698fdbda1c2f7a43623e4d0b29073bc429ae428d987a0d2c014\": container with ID starting with db9ce92eda68a698fdbda1c2f7a43623e4d0b29073bc429ae428d987a0d2c014 not found: ID does not exist" Mar 19 09:32:10 crc kubenswrapper[4835]: I0319 09:32:10.585066 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-m6s8l"] Mar 19 09:32:10 crc kubenswrapper[4835]: I0319 09:32:10.597498 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-m6s8l"] Mar 19 09:32:12 crc kubenswrapper[4835]: I0319 09:32:12.408430 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c32cf7e2-4523-41b3-b8a8-7564cca67f4a" path="/var/lib/kubelet/pods/c32cf7e2-4523-41b3-b8a8-7564cca67f4a/volumes" Mar 19 09:32:13 crc kubenswrapper[4835]: I0319 09:32:13.092568 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" Mar 19 09:32:13 crc kubenswrapper[4835]: I0319 09:32:13.097383 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" Mar 19 09:32:34 crc kubenswrapper[4835]: I0319 09:32:34.032029 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:32:34 crc kubenswrapper[4835]: I0319 09:32:34.076885 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:32:34 crc kubenswrapper[4835]: I0319 09:32:34.299421 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 19 09:32:45 crc kubenswrapper[4835]: I0319 09:32:45.720588 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7d449f8d68-6snn9"] Mar 19 09:32:45 crc kubenswrapper[4835]: E0319 09:32:45.721655 4835 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="c32cf7e2-4523-41b3-b8a8-7564cca67f4a" containerName="console" Mar 19 09:32:45 crc kubenswrapper[4835]: I0319 09:32:45.721668 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c32cf7e2-4523-41b3-b8a8-7564cca67f4a" containerName="console" Mar 19 09:32:45 crc kubenswrapper[4835]: E0319 09:32:45.721677 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68957ece-d303-4cd7-9e45-cc960a83a7b0" containerName="registry" Mar 19 09:32:45 crc kubenswrapper[4835]: I0319 09:32:45.721900 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="68957ece-d303-4cd7-9e45-cc960a83a7b0" containerName="registry" Mar 19 09:32:45 crc kubenswrapper[4835]: E0319 09:32:45.721914 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="545bb00f-039b-40bc-923f-496674c4b221" containerName="oc" Mar 19 09:32:45 crc kubenswrapper[4835]: I0319 09:32:45.721920 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="545bb00f-039b-40bc-923f-496674c4b221" containerName="oc" Mar 19 09:32:45 crc kubenswrapper[4835]: I0319 09:32:45.722048 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="c32cf7e2-4523-41b3-b8a8-7564cca67f4a" containerName="console" Mar 19 09:32:45 crc kubenswrapper[4835]: I0319 09:32:45.722056 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="545bb00f-039b-40bc-923f-496674c4b221" containerName="oc" Mar 19 09:32:45 crc kubenswrapper[4835]: I0319 09:32:45.722070 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="68957ece-d303-4cd7-9e45-cc960a83a7b0" containerName="registry" Mar 19 09:32:45 crc kubenswrapper[4835]: I0319 09:32:45.722480 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7d449f8d68-6snn9" Mar 19 09:32:45 crc kubenswrapper[4835]: I0319 09:32:45.737140 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d449f8d68-6snn9"] Mar 19 09:32:45 crc kubenswrapper[4835]: I0319 09:32:45.856278 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c12895ad-7a70-45e4-90ac-100694d94ca5-oauth-serving-cert\") pod \"console-7d449f8d68-6snn9\" (UID: \"c12895ad-7a70-45e4-90ac-100694d94ca5\") " pod="openshift-console/console-7d449f8d68-6snn9" Mar 19 09:32:45 crc kubenswrapper[4835]: I0319 09:32:45.856323 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c12895ad-7a70-45e4-90ac-100694d94ca5-console-serving-cert\") pod \"console-7d449f8d68-6snn9\" (UID: \"c12895ad-7a70-45e4-90ac-100694d94ca5\") " pod="openshift-console/console-7d449f8d68-6snn9" Mar 19 09:32:45 crc kubenswrapper[4835]: I0319 09:32:45.856361 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c12895ad-7a70-45e4-90ac-100694d94ca5-trusted-ca-bundle\") pod \"console-7d449f8d68-6snn9\" (UID: \"c12895ad-7a70-45e4-90ac-100694d94ca5\") " pod="openshift-console/console-7d449f8d68-6snn9" Mar 19 09:32:45 crc kubenswrapper[4835]: I0319 09:32:45.856381 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c12895ad-7a70-45e4-90ac-100694d94ca5-console-config\") pod \"console-7d449f8d68-6snn9\" (UID: \"c12895ad-7a70-45e4-90ac-100694d94ca5\") " pod="openshift-console/console-7d449f8d68-6snn9" Mar 19 09:32:45 crc kubenswrapper[4835]: I0319 09:32:45.856549 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqkkx\" (UniqueName: \"kubernetes.io/projected/c12895ad-7a70-45e4-90ac-100694d94ca5-kube-api-access-sqkkx\") pod \"console-7d449f8d68-6snn9\" (UID: \"c12895ad-7a70-45e4-90ac-100694d94ca5\") " pod="openshift-console/console-7d449f8d68-6snn9" Mar 19 09:32:45 crc kubenswrapper[4835]: I0319 09:32:45.856613 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c12895ad-7a70-45e4-90ac-100694d94ca5-console-oauth-config\") pod \"console-7d449f8d68-6snn9\" (UID: \"c12895ad-7a70-45e4-90ac-100694d94ca5\") " pod="openshift-console/console-7d449f8d68-6snn9" Mar 19 09:32:45 crc kubenswrapper[4835]: I0319 09:32:45.856679 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c12895ad-7a70-45e4-90ac-100694d94ca5-service-ca\") pod \"console-7d449f8d68-6snn9\" (UID: \"c12895ad-7a70-45e4-90ac-100694d94ca5\") " pod="openshift-console/console-7d449f8d68-6snn9" Mar 19 09:32:45 crc kubenswrapper[4835]: I0319 09:32:45.957460 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c12895ad-7a70-45e4-90ac-100694d94ca5-trusted-ca-bundle\") pod \"console-7d449f8d68-6snn9\" (UID: \"c12895ad-7a70-45e4-90ac-100694d94ca5\") " pod="openshift-console/console-7d449f8d68-6snn9" Mar 19 09:32:45 crc kubenswrapper[4835]: I0319 09:32:45.957546 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c12895ad-7a70-45e4-90ac-100694d94ca5-console-config\") pod \"console-7d449f8d68-6snn9\" (UID: \"c12895ad-7a70-45e4-90ac-100694d94ca5\") " pod="openshift-console/console-7d449f8d68-6snn9" Mar 19 09:32:45 crc kubenswrapper[4835]: I0319 09:32:45.957604 4835 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqkkx\" (UniqueName: \"kubernetes.io/projected/c12895ad-7a70-45e4-90ac-100694d94ca5-kube-api-access-sqkkx\") pod \"console-7d449f8d68-6snn9\" (UID: \"c12895ad-7a70-45e4-90ac-100694d94ca5\") " pod="openshift-console/console-7d449f8d68-6snn9" Mar 19 09:32:45 crc kubenswrapper[4835]: I0319 09:32:45.957644 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c12895ad-7a70-45e4-90ac-100694d94ca5-console-oauth-config\") pod \"console-7d449f8d68-6snn9\" (UID: \"c12895ad-7a70-45e4-90ac-100694d94ca5\") " pod="openshift-console/console-7d449f8d68-6snn9" Mar 19 09:32:45 crc kubenswrapper[4835]: I0319 09:32:45.957701 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c12895ad-7a70-45e4-90ac-100694d94ca5-service-ca\") pod \"console-7d449f8d68-6snn9\" (UID: \"c12895ad-7a70-45e4-90ac-100694d94ca5\") " pod="openshift-console/console-7d449f8d68-6snn9" Mar 19 09:32:45 crc kubenswrapper[4835]: I0319 09:32:45.957844 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c12895ad-7a70-45e4-90ac-100694d94ca5-oauth-serving-cert\") pod \"console-7d449f8d68-6snn9\" (UID: \"c12895ad-7a70-45e4-90ac-100694d94ca5\") " pod="openshift-console/console-7d449f8d68-6snn9" Mar 19 09:32:45 crc kubenswrapper[4835]: I0319 09:32:45.957902 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c12895ad-7a70-45e4-90ac-100694d94ca5-console-serving-cert\") pod \"console-7d449f8d68-6snn9\" (UID: \"c12895ad-7a70-45e4-90ac-100694d94ca5\") " pod="openshift-console/console-7d449f8d68-6snn9" Mar 19 09:32:45 crc kubenswrapper[4835]: I0319 09:32:45.958559 4835 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c12895ad-7a70-45e4-90ac-100694d94ca5-console-config\") pod \"console-7d449f8d68-6snn9\" (UID: \"c12895ad-7a70-45e4-90ac-100694d94ca5\") " pod="openshift-console/console-7d449f8d68-6snn9" Mar 19 09:32:45 crc kubenswrapper[4835]: I0319 09:32:45.958689 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c12895ad-7a70-45e4-90ac-100694d94ca5-oauth-serving-cert\") pod \"console-7d449f8d68-6snn9\" (UID: \"c12895ad-7a70-45e4-90ac-100694d94ca5\") " pod="openshift-console/console-7d449f8d68-6snn9" Mar 19 09:32:45 crc kubenswrapper[4835]: I0319 09:32:45.959108 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c12895ad-7a70-45e4-90ac-100694d94ca5-trusted-ca-bundle\") pod \"console-7d449f8d68-6snn9\" (UID: \"c12895ad-7a70-45e4-90ac-100694d94ca5\") " pod="openshift-console/console-7d449f8d68-6snn9" Mar 19 09:32:45 crc kubenswrapper[4835]: I0319 09:32:45.959370 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c12895ad-7a70-45e4-90ac-100694d94ca5-service-ca\") pod \"console-7d449f8d68-6snn9\" (UID: \"c12895ad-7a70-45e4-90ac-100694d94ca5\") " pod="openshift-console/console-7d449f8d68-6snn9" Mar 19 09:32:45 crc kubenswrapper[4835]: I0319 09:32:45.963590 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c12895ad-7a70-45e4-90ac-100694d94ca5-console-serving-cert\") pod \"console-7d449f8d68-6snn9\" (UID: \"c12895ad-7a70-45e4-90ac-100694d94ca5\") " pod="openshift-console/console-7d449f8d68-6snn9" Mar 19 09:32:45 crc kubenswrapper[4835]: I0319 09:32:45.966756 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c12895ad-7a70-45e4-90ac-100694d94ca5-console-oauth-config\") pod \"console-7d449f8d68-6snn9\" (UID: \"c12895ad-7a70-45e4-90ac-100694d94ca5\") " pod="openshift-console/console-7d449f8d68-6snn9" Mar 19 09:32:45 crc kubenswrapper[4835]: I0319 09:32:45.975241 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqkkx\" (UniqueName: \"kubernetes.io/projected/c12895ad-7a70-45e4-90ac-100694d94ca5-kube-api-access-sqkkx\") pod \"console-7d449f8d68-6snn9\" (UID: \"c12895ad-7a70-45e4-90ac-100694d94ca5\") " pod="openshift-console/console-7d449f8d68-6snn9" Mar 19 09:32:46 crc kubenswrapper[4835]: I0319 09:32:46.052099 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d449f8d68-6snn9" Mar 19 09:32:46 crc kubenswrapper[4835]: I0319 09:32:46.243372 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d449f8d68-6snn9"] Mar 19 09:32:46 crc kubenswrapper[4835]: I0319 09:32:46.354948 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d449f8d68-6snn9" event={"ID":"c12895ad-7a70-45e4-90ac-100694d94ca5","Type":"ContainerStarted","Data":"bfd9ffda07031c82e4818102ba2c3fefa02e1621f3309346fc6825640d3e4521"} Mar 19 09:32:47 crc kubenswrapper[4835]: I0319 09:32:47.364812 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d449f8d68-6snn9" event={"ID":"c12895ad-7a70-45e4-90ac-100694d94ca5","Type":"ContainerStarted","Data":"5b5d79c0b8fa080fcd1af690f6da33fa9dfc0b14e3a9dfe7b4777bf9a321c43c"} Mar 19 09:32:47 crc kubenswrapper[4835]: I0319 09:32:47.403639 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7d449f8d68-6snn9" podStartSLOduration=2.403615709 podStartE2EDuration="2.403615709s" podCreationTimestamp="2026-03-19 09:32:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:32:47.394789032 +0000 UTC m=+622.243387659" watchObservedRunningTime="2026-03-19 09:32:47.403615709 +0000 UTC m=+622.252214306" Mar 19 09:32:56 crc kubenswrapper[4835]: I0319 09:32:56.052935 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7d449f8d68-6snn9" Mar 19 09:32:56 crc kubenswrapper[4835]: I0319 09:32:56.053642 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7d449f8d68-6snn9" Mar 19 09:32:56 crc kubenswrapper[4835]: I0319 09:32:56.068802 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7d449f8d68-6snn9" Mar 19 09:32:56 crc kubenswrapper[4835]: I0319 09:32:56.435250 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7d449f8d68-6snn9" Mar 19 09:32:56 crc kubenswrapper[4835]: I0319 09:32:56.499308 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-57596b75ff-6879j"] Mar 19 09:33:06 crc kubenswrapper[4835]: I0319 09:33:06.431222 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 09:33:06 crc kubenswrapper[4835]: I0319 09:33:06.431923 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 09:33:21 crc kubenswrapper[4835]: I0319 09:33:21.548528 4835 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-console/console-57596b75ff-6879j" podUID="eecbe809-4eb3-40d6-abe0-71a3c338b453" containerName="console" containerID="cri-o://76d2c9b901091164b4592f7bdfaf488a23cbbbb7c155f2fd20354cd40b174920" gracePeriod=15 Mar 19 09:33:22 crc kubenswrapper[4835]: I0319 09:33:22.015563 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-57596b75ff-6879j_eecbe809-4eb3-40d6-abe0-71a3c338b453/console/0.log" Mar 19 09:33:22 crc kubenswrapper[4835]: I0319 09:33:22.015977 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-57596b75ff-6879j" Mar 19 09:33:22 crc kubenswrapper[4835]: I0319 09:33:22.120689 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eecbe809-4eb3-40d6-abe0-71a3c338b453-console-oauth-config\") pod \"eecbe809-4eb3-40d6-abe0-71a3c338b453\" (UID: \"eecbe809-4eb3-40d6-abe0-71a3c338b453\") " Mar 19 09:33:22 crc kubenswrapper[4835]: I0319 09:33:22.120926 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eecbe809-4eb3-40d6-abe0-71a3c338b453-oauth-serving-cert\") pod \"eecbe809-4eb3-40d6-abe0-71a3c338b453\" (UID: \"eecbe809-4eb3-40d6-abe0-71a3c338b453\") " Mar 19 09:33:22 crc kubenswrapper[4835]: I0319 09:33:22.121041 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eecbe809-4eb3-40d6-abe0-71a3c338b453-trusted-ca-bundle\") pod \"eecbe809-4eb3-40d6-abe0-71a3c338b453\" (UID: \"eecbe809-4eb3-40d6-abe0-71a3c338b453\") " Mar 19 09:33:22 crc kubenswrapper[4835]: I0319 09:33:22.121095 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eecbe809-4eb3-40d6-abe0-71a3c338b453-service-ca\") pod 
\"eecbe809-4eb3-40d6-abe0-71a3c338b453\" (UID: \"eecbe809-4eb3-40d6-abe0-71a3c338b453\") " Mar 19 09:33:22 crc kubenswrapper[4835]: I0319 09:33:22.121139 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eecbe809-4eb3-40d6-abe0-71a3c338b453-console-serving-cert\") pod \"eecbe809-4eb3-40d6-abe0-71a3c338b453\" (UID: \"eecbe809-4eb3-40d6-abe0-71a3c338b453\") " Mar 19 09:33:22 crc kubenswrapper[4835]: I0319 09:33:22.121179 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eecbe809-4eb3-40d6-abe0-71a3c338b453-console-config\") pod \"eecbe809-4eb3-40d6-abe0-71a3c338b453\" (UID: \"eecbe809-4eb3-40d6-abe0-71a3c338b453\") " Mar 19 09:33:22 crc kubenswrapper[4835]: I0319 09:33:22.121266 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2lkn\" (UniqueName: \"kubernetes.io/projected/eecbe809-4eb3-40d6-abe0-71a3c338b453-kube-api-access-n2lkn\") pod \"eecbe809-4eb3-40d6-abe0-71a3c338b453\" (UID: \"eecbe809-4eb3-40d6-abe0-71a3c338b453\") " Mar 19 09:33:22 crc kubenswrapper[4835]: I0319 09:33:22.122210 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eecbe809-4eb3-40d6-abe0-71a3c338b453-console-config" (OuterVolumeSpecName: "console-config") pod "eecbe809-4eb3-40d6-abe0-71a3c338b453" (UID: "eecbe809-4eb3-40d6-abe0-71a3c338b453"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:33:22 crc kubenswrapper[4835]: I0319 09:33:22.122211 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eecbe809-4eb3-40d6-abe0-71a3c338b453-service-ca" (OuterVolumeSpecName: "service-ca") pod "eecbe809-4eb3-40d6-abe0-71a3c338b453" (UID: "eecbe809-4eb3-40d6-abe0-71a3c338b453"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:33:22 crc kubenswrapper[4835]: I0319 09:33:22.122272 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eecbe809-4eb3-40d6-abe0-71a3c338b453-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "eecbe809-4eb3-40d6-abe0-71a3c338b453" (UID: "eecbe809-4eb3-40d6-abe0-71a3c338b453"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:33:22 crc kubenswrapper[4835]: I0319 09:33:22.122422 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eecbe809-4eb3-40d6-abe0-71a3c338b453-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "eecbe809-4eb3-40d6-abe0-71a3c338b453" (UID: "eecbe809-4eb3-40d6-abe0-71a3c338b453"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:33:22 crc kubenswrapper[4835]: I0319 09:33:22.127634 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eecbe809-4eb3-40d6-abe0-71a3c338b453-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "eecbe809-4eb3-40d6-abe0-71a3c338b453" (UID: "eecbe809-4eb3-40d6-abe0-71a3c338b453"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:33:22 crc kubenswrapper[4835]: I0319 09:33:22.127715 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eecbe809-4eb3-40d6-abe0-71a3c338b453-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "eecbe809-4eb3-40d6-abe0-71a3c338b453" (UID: "eecbe809-4eb3-40d6-abe0-71a3c338b453"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:33:22 crc kubenswrapper[4835]: I0319 09:33:22.128985 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eecbe809-4eb3-40d6-abe0-71a3c338b453-kube-api-access-n2lkn" (OuterVolumeSpecName: "kube-api-access-n2lkn") pod "eecbe809-4eb3-40d6-abe0-71a3c338b453" (UID: "eecbe809-4eb3-40d6-abe0-71a3c338b453"). InnerVolumeSpecName "kube-api-access-n2lkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:33:22 crc kubenswrapper[4835]: I0319 09:33:22.223545 4835 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eecbe809-4eb3-40d6-abe0-71a3c338b453-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:33:22 crc kubenswrapper[4835]: I0319 09:33:22.223598 4835 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eecbe809-4eb3-40d6-abe0-71a3c338b453-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 09:33:22 crc kubenswrapper[4835]: I0319 09:33:22.223617 4835 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eecbe809-4eb3-40d6-abe0-71a3c338b453-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:33:22 crc kubenswrapper[4835]: I0319 09:33:22.223636 4835 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eecbe809-4eb3-40d6-abe0-71a3c338b453-console-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:33:22 crc kubenswrapper[4835]: I0319 09:33:22.223651 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2lkn\" (UniqueName: \"kubernetes.io/projected/eecbe809-4eb3-40d6-abe0-71a3c338b453-kube-api-access-n2lkn\") on node \"crc\" DevicePath \"\"" Mar 19 09:33:22 crc kubenswrapper[4835]: I0319 09:33:22.223669 4835 reconciler_common.go:293] "Volume detached for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eecbe809-4eb3-40d6-abe0-71a3c338b453-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:33:22 crc kubenswrapper[4835]: I0319 09:33:22.223684 4835 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eecbe809-4eb3-40d6-abe0-71a3c338b453-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:33:22 crc kubenswrapper[4835]: I0319 09:33:22.624951 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-57596b75ff-6879j_eecbe809-4eb3-40d6-abe0-71a3c338b453/console/0.log" Mar 19 09:33:22 crc kubenswrapper[4835]: I0319 09:33:22.625038 4835 generic.go:334] "Generic (PLEG): container finished" podID="eecbe809-4eb3-40d6-abe0-71a3c338b453" containerID="76d2c9b901091164b4592f7bdfaf488a23cbbbb7c155f2fd20354cd40b174920" exitCode=2 Mar 19 09:33:22 crc kubenswrapper[4835]: I0319 09:33:22.625081 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57596b75ff-6879j" event={"ID":"eecbe809-4eb3-40d6-abe0-71a3c338b453","Type":"ContainerDied","Data":"76d2c9b901091164b4592f7bdfaf488a23cbbbb7c155f2fd20354cd40b174920"} Mar 19 09:33:22 crc kubenswrapper[4835]: I0319 09:33:22.625133 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57596b75ff-6879j" event={"ID":"eecbe809-4eb3-40d6-abe0-71a3c338b453","Type":"ContainerDied","Data":"58b9b4634f8b7c9d3b4ada90fd099bee5b64e488edc214afec585ae2cc99ba39"} Mar 19 09:33:22 crc kubenswrapper[4835]: I0319 09:33:22.625141 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-57596b75ff-6879j" Mar 19 09:33:22 crc kubenswrapper[4835]: I0319 09:33:22.625174 4835 scope.go:117] "RemoveContainer" containerID="76d2c9b901091164b4592f7bdfaf488a23cbbbb7c155f2fd20354cd40b174920" Mar 19 09:33:22 crc kubenswrapper[4835]: I0319 09:33:22.652175 4835 scope.go:117] "RemoveContainer" containerID="76d2c9b901091164b4592f7bdfaf488a23cbbbb7c155f2fd20354cd40b174920" Mar 19 09:33:22 crc kubenswrapper[4835]: E0319 09:33:22.652803 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76d2c9b901091164b4592f7bdfaf488a23cbbbb7c155f2fd20354cd40b174920\": container with ID starting with 76d2c9b901091164b4592f7bdfaf488a23cbbbb7c155f2fd20354cd40b174920 not found: ID does not exist" containerID="76d2c9b901091164b4592f7bdfaf488a23cbbbb7c155f2fd20354cd40b174920" Mar 19 09:33:22 crc kubenswrapper[4835]: I0319 09:33:22.652873 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76d2c9b901091164b4592f7bdfaf488a23cbbbb7c155f2fd20354cd40b174920"} err="failed to get container status \"76d2c9b901091164b4592f7bdfaf488a23cbbbb7c155f2fd20354cd40b174920\": rpc error: code = NotFound desc = could not find container \"76d2c9b901091164b4592f7bdfaf488a23cbbbb7c155f2fd20354cd40b174920\": container with ID starting with 76d2c9b901091164b4592f7bdfaf488a23cbbbb7c155f2fd20354cd40b174920 not found: ID does not exist" Mar 19 09:33:22 crc kubenswrapper[4835]: I0319 09:33:22.655126 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-57596b75ff-6879j"] Mar 19 09:33:22 crc kubenswrapper[4835]: I0319 09:33:22.663593 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-57596b75ff-6879j"] Mar 19 09:33:24 crc kubenswrapper[4835]: I0319 09:33:24.417192 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="eecbe809-4eb3-40d6-abe0-71a3c338b453" path="/var/lib/kubelet/pods/eecbe809-4eb3-40d6-abe0-71a3c338b453/volumes" Mar 19 09:33:26 crc kubenswrapper[4835]: I0319 09:33:26.807230 4835 scope.go:117] "RemoveContainer" containerID="78cef7c0223fab439cfa8e464d11640af709c74e290ea254dac785acc760ef48" Mar 19 09:33:26 crc kubenswrapper[4835]: I0319 09:33:26.855891 4835 scope.go:117] "RemoveContainer" containerID="e43b96a28373ef47a4d5307c6c19b78b1956bf0cdd3fb28db4161a622dd8fe28" Mar 19 09:33:36 crc kubenswrapper[4835]: I0319 09:33:36.422033 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 09:33:36 crc kubenswrapper[4835]: I0319 09:33:36.422826 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 09:34:00 crc kubenswrapper[4835]: I0319 09:34:00.129101 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565214-zs45x"] Mar 19 09:34:00 crc kubenswrapper[4835]: E0319 09:34:00.130955 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eecbe809-4eb3-40d6-abe0-71a3c338b453" containerName="console" Mar 19 09:34:00 crc kubenswrapper[4835]: I0319 09:34:00.131068 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="eecbe809-4eb3-40d6-abe0-71a3c338b453" containerName="console" Mar 19 09:34:00 crc kubenswrapper[4835]: I0319 09:34:00.131281 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="eecbe809-4eb3-40d6-abe0-71a3c338b453" containerName="console" Mar 19 09:34:00 crc 
kubenswrapper[4835]: I0319 09:34:00.131860 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565214-zs45x" Mar 19 09:34:00 crc kubenswrapper[4835]: I0319 09:34:00.133865 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 09:34:00 crc kubenswrapper[4835]: I0319 09:34:00.134653 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 09:34:00 crc kubenswrapper[4835]: I0319 09:34:00.134817 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 09:34:00 crc kubenswrapper[4835]: I0319 09:34:00.142079 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565214-zs45x"] Mar 19 09:34:00 crc kubenswrapper[4835]: I0319 09:34:00.166634 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz6rm\" (UniqueName: \"kubernetes.io/projected/15e011a6-0f30-4160-8740-591cd740d2c5-kube-api-access-gz6rm\") pod \"auto-csr-approver-29565214-zs45x\" (UID: \"15e011a6-0f30-4160-8740-591cd740d2c5\") " pod="openshift-infra/auto-csr-approver-29565214-zs45x" Mar 19 09:34:00 crc kubenswrapper[4835]: I0319 09:34:00.267707 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gz6rm\" (UniqueName: \"kubernetes.io/projected/15e011a6-0f30-4160-8740-591cd740d2c5-kube-api-access-gz6rm\") pod \"auto-csr-approver-29565214-zs45x\" (UID: \"15e011a6-0f30-4160-8740-591cd740d2c5\") " pod="openshift-infra/auto-csr-approver-29565214-zs45x" Mar 19 09:34:00 crc kubenswrapper[4835]: I0319 09:34:00.285222 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz6rm\" (UniqueName: \"kubernetes.io/projected/15e011a6-0f30-4160-8740-591cd740d2c5-kube-api-access-gz6rm\") pod 
\"auto-csr-approver-29565214-zs45x\" (UID: \"15e011a6-0f30-4160-8740-591cd740d2c5\") " pod="openshift-infra/auto-csr-approver-29565214-zs45x" Mar 19 09:34:00 crc kubenswrapper[4835]: I0319 09:34:00.461902 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565214-zs45x" Mar 19 09:34:00 crc kubenswrapper[4835]: I0319 09:34:00.658934 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565214-zs45x"] Mar 19 09:34:00 crc kubenswrapper[4835]: I0319 09:34:00.913153 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565214-zs45x" event={"ID":"15e011a6-0f30-4160-8740-591cd740d2c5","Type":"ContainerStarted","Data":"fc827b7d352b3e18e06adb0e1398aa91df379f6f0370baffb24017e836690408"} Mar 19 09:34:01 crc kubenswrapper[4835]: I0319 09:34:01.919929 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565214-zs45x" event={"ID":"15e011a6-0f30-4160-8740-591cd740d2c5","Type":"ContainerStarted","Data":"b91ff655aba92781f9f919767c85bc6c4185fb285487606b19627243eb45cbf4"} Mar 19 09:34:01 crc kubenswrapper[4835]: I0319 09:34:01.938773 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565214-zs45x" podStartSLOduration=1.011924131 podStartE2EDuration="1.938755239s" podCreationTimestamp="2026-03-19 09:34:00 +0000 UTC" firstStartedPulling="2026-03-19 09:34:00.681071561 +0000 UTC m=+695.529670158" lastFinishedPulling="2026-03-19 09:34:01.607902659 +0000 UTC m=+696.456501266" observedRunningTime="2026-03-19 09:34:01.935376549 +0000 UTC m=+696.783975136" watchObservedRunningTime="2026-03-19 09:34:01.938755239 +0000 UTC m=+696.787353826" Mar 19 09:34:02 crc kubenswrapper[4835]: I0319 09:34:02.931562 4835 generic.go:334] "Generic (PLEG): container finished" podID="15e011a6-0f30-4160-8740-591cd740d2c5" 
containerID="b91ff655aba92781f9f919767c85bc6c4185fb285487606b19627243eb45cbf4" exitCode=0 Mar 19 09:34:02 crc kubenswrapper[4835]: I0319 09:34:02.931904 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565214-zs45x" event={"ID":"15e011a6-0f30-4160-8740-591cd740d2c5","Type":"ContainerDied","Data":"b91ff655aba92781f9f919767c85bc6c4185fb285487606b19627243eb45cbf4"} Mar 19 09:34:04 crc kubenswrapper[4835]: I0319 09:34:04.188587 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565214-zs45x" Mar 19 09:34:04 crc kubenswrapper[4835]: I0319 09:34:04.248855 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gz6rm\" (UniqueName: \"kubernetes.io/projected/15e011a6-0f30-4160-8740-591cd740d2c5-kube-api-access-gz6rm\") pod \"15e011a6-0f30-4160-8740-591cd740d2c5\" (UID: \"15e011a6-0f30-4160-8740-591cd740d2c5\") " Mar 19 09:34:04 crc kubenswrapper[4835]: I0319 09:34:04.256266 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15e011a6-0f30-4160-8740-591cd740d2c5-kube-api-access-gz6rm" (OuterVolumeSpecName: "kube-api-access-gz6rm") pod "15e011a6-0f30-4160-8740-591cd740d2c5" (UID: "15e011a6-0f30-4160-8740-591cd740d2c5"). InnerVolumeSpecName "kube-api-access-gz6rm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:34:04 crc kubenswrapper[4835]: I0319 09:34:04.350249 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gz6rm\" (UniqueName: \"kubernetes.io/projected/15e011a6-0f30-4160-8740-591cd740d2c5-kube-api-access-gz6rm\") on node \"crc\" DevicePath \"\"" Mar 19 09:34:04 crc kubenswrapper[4835]: I0319 09:34:04.951612 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565214-zs45x" event={"ID":"15e011a6-0f30-4160-8740-591cd740d2c5","Type":"ContainerDied","Data":"fc827b7d352b3e18e06adb0e1398aa91df379f6f0370baffb24017e836690408"} Mar 19 09:34:04 crc kubenswrapper[4835]: I0319 09:34:04.951674 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc827b7d352b3e18e06adb0e1398aa91df379f6f0370baffb24017e836690408" Mar 19 09:34:04 crc kubenswrapper[4835]: I0319 09:34:04.951768 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565214-zs45x" Mar 19 09:34:05 crc kubenswrapper[4835]: I0319 09:34:05.007776 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565208-9qhx9"] Mar 19 09:34:05 crc kubenswrapper[4835]: I0319 09:34:05.011989 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565208-9qhx9"] Mar 19 09:34:06 crc kubenswrapper[4835]: I0319 09:34:06.414825 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83197613-5a22-46ec-8a50-6d6c3228b296" path="/var/lib/kubelet/pods/83197613-5a22-46ec-8a50-6d6c3228b296/volumes" Mar 19 09:34:06 crc kubenswrapper[4835]: I0319 09:34:06.422278 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 19 09:34:06 crc kubenswrapper[4835]: I0319 09:34:06.422349 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 09:34:06 crc kubenswrapper[4835]: I0319 09:34:06.422398 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" Mar 19 09:34:06 crc kubenswrapper[4835]: I0319 09:34:06.423102 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"77af185870c05156a519ecf05e5136747068885eeb6e51d17ba26ef920e74b4d"} pod="openshift-machine-config-operator/machine-config-daemon-bk84k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 09:34:06 crc kubenswrapper[4835]: I0319 09:34:06.423203 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" containerID="cri-o://77af185870c05156a519ecf05e5136747068885eeb6e51d17ba26ef920e74b4d" gracePeriod=600 Mar 19 09:34:06 crc kubenswrapper[4835]: I0319 09:34:06.966695 4835 generic.go:334] "Generic (PLEG): container finished" podID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerID="77af185870c05156a519ecf05e5136747068885eeb6e51d17ba26ef920e74b4d" exitCode=0 Mar 19 09:34:06 crc kubenswrapper[4835]: I0319 09:34:06.966789 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" 
event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerDied","Data":"77af185870c05156a519ecf05e5136747068885eeb6e51d17ba26ef920e74b4d"} Mar 19 09:34:06 crc kubenswrapper[4835]: I0319 09:34:06.967260 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerStarted","Data":"8048ba6999fcc9a0f2b426a9033f119cd447f58778521ba0373e7a0bc81270c5"} Mar 19 09:34:06 crc kubenswrapper[4835]: I0319 09:34:06.967288 4835 scope.go:117] "RemoveContainer" containerID="b3588ff04f6b0991e2aff452983f8dc381373f64e85dce65f122043f0fbe4294" Mar 19 09:35:58 crc kubenswrapper[4835]: I0319 09:35:58.190686 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq"] Mar 19 09:35:58 crc kubenswrapper[4835]: E0319 09:35:58.191522 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15e011a6-0f30-4160-8740-591cd740d2c5" containerName="oc" Mar 19 09:35:58 crc kubenswrapper[4835]: I0319 09:35:58.191538 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="15e011a6-0f30-4160-8740-591cd740d2c5" containerName="oc" Mar 19 09:35:58 crc kubenswrapper[4835]: I0319 09:35:58.191694 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="15e011a6-0f30-4160-8740-591cd740d2c5" containerName="oc" Mar 19 09:35:58 crc kubenswrapper[4835]: I0319 09:35:58.192792 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq" Mar 19 09:35:58 crc kubenswrapper[4835]: I0319 09:35:58.196237 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 19 09:35:58 crc kubenswrapper[4835]: I0319 09:35:58.204479 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq"] Mar 19 09:35:58 crc kubenswrapper[4835]: I0319 09:35:58.391453 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88674731-ebd8-49fa-b947-3143524a738c-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq\" (UID: \"88674731-ebd8-49fa-b947-3143524a738c\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq" Mar 19 09:35:58 crc kubenswrapper[4835]: I0319 09:35:58.391508 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fq55\" (UniqueName: \"kubernetes.io/projected/88674731-ebd8-49fa-b947-3143524a738c-kube-api-access-7fq55\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq\" (UID: \"88674731-ebd8-49fa-b947-3143524a738c\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq" Mar 19 09:35:58 crc kubenswrapper[4835]: I0319 09:35:58.391527 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88674731-ebd8-49fa-b947-3143524a738c-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq\" (UID: \"88674731-ebd8-49fa-b947-3143524a738c\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq" Mar 19 09:35:58 crc kubenswrapper[4835]: 
I0319 09:35:58.492691 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88674731-ebd8-49fa-b947-3143524a738c-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq\" (UID: \"88674731-ebd8-49fa-b947-3143524a738c\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq" Mar 19 09:35:58 crc kubenswrapper[4835]: I0319 09:35:58.492798 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fq55\" (UniqueName: \"kubernetes.io/projected/88674731-ebd8-49fa-b947-3143524a738c-kube-api-access-7fq55\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq\" (UID: \"88674731-ebd8-49fa-b947-3143524a738c\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq" Mar 19 09:35:58 crc kubenswrapper[4835]: I0319 09:35:58.492838 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88674731-ebd8-49fa-b947-3143524a738c-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq\" (UID: \"88674731-ebd8-49fa-b947-3143524a738c\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq" Mar 19 09:35:58 crc kubenswrapper[4835]: I0319 09:35:58.493308 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88674731-ebd8-49fa-b947-3143524a738c-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq\" (UID: \"88674731-ebd8-49fa-b947-3143524a738c\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq" Mar 19 09:35:58 crc kubenswrapper[4835]: I0319 09:35:58.493493 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/88674731-ebd8-49fa-b947-3143524a738c-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq\" (UID: \"88674731-ebd8-49fa-b947-3143524a738c\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq" Mar 19 09:35:58 crc kubenswrapper[4835]: I0319 09:35:58.510998 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fq55\" (UniqueName: \"kubernetes.io/projected/88674731-ebd8-49fa-b947-3143524a738c-kube-api-access-7fq55\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq\" (UID: \"88674731-ebd8-49fa-b947-3143524a738c\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq" Mar 19 09:35:58 crc kubenswrapper[4835]: I0319 09:35:58.517015 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq" Mar 19 09:35:58 crc kubenswrapper[4835]: I0319 09:35:58.739545 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq"] Mar 19 09:35:59 crc kubenswrapper[4835]: I0319 09:35:59.748390 4835 generic.go:334] "Generic (PLEG): container finished" podID="88674731-ebd8-49fa-b947-3143524a738c" containerID="1f41c33ec98de3b1d9224d065b9eb69c297b87e4de48baa311f7a86432dd5656" exitCode=0 Mar 19 09:35:59 crc kubenswrapper[4835]: I0319 09:35:59.748515 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq" event={"ID":"88674731-ebd8-49fa-b947-3143524a738c","Type":"ContainerDied","Data":"1f41c33ec98de3b1d9224d065b9eb69c297b87e4de48baa311f7a86432dd5656"} Mar 19 09:35:59 crc kubenswrapper[4835]: I0319 09:35:59.748807 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq" event={"ID":"88674731-ebd8-49fa-b947-3143524a738c","Type":"ContainerStarted","Data":"976f94e09945cd0e6de6ddcf17c99efd06abeab529b386dc6cd3056a994305db"} Mar 19 09:36:00 crc kubenswrapper[4835]: I0319 09:36:00.133338 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565216-kbj6t"] Mar 19 09:36:00 crc kubenswrapper[4835]: I0319 09:36:00.135064 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565216-kbj6t" Mar 19 09:36:00 crc kubenswrapper[4835]: I0319 09:36:00.137371 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 09:36:00 crc kubenswrapper[4835]: I0319 09:36:00.137685 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 09:36:00 crc kubenswrapper[4835]: I0319 09:36:00.137942 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 09:36:00 crc kubenswrapper[4835]: I0319 09:36:00.139221 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565216-kbj6t"] Mar 19 09:36:00 crc kubenswrapper[4835]: I0319 09:36:00.318497 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjzdf\" (UniqueName: \"kubernetes.io/projected/b0fee259-324e-4077-a06f-2186e9f3d832-kube-api-access-sjzdf\") pod \"auto-csr-approver-29565216-kbj6t\" (UID: \"b0fee259-324e-4077-a06f-2186e9f3d832\") " pod="openshift-infra/auto-csr-approver-29565216-kbj6t" Mar 19 09:36:00 crc kubenswrapper[4835]: I0319 09:36:00.419826 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjzdf\" (UniqueName: 
\"kubernetes.io/projected/b0fee259-324e-4077-a06f-2186e9f3d832-kube-api-access-sjzdf\") pod \"auto-csr-approver-29565216-kbj6t\" (UID: \"b0fee259-324e-4077-a06f-2186e9f3d832\") " pod="openshift-infra/auto-csr-approver-29565216-kbj6t" Mar 19 09:36:00 crc kubenswrapper[4835]: I0319 09:36:00.445367 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjzdf\" (UniqueName: \"kubernetes.io/projected/b0fee259-324e-4077-a06f-2186e9f3d832-kube-api-access-sjzdf\") pod \"auto-csr-approver-29565216-kbj6t\" (UID: \"b0fee259-324e-4077-a06f-2186e9f3d832\") " pod="openshift-infra/auto-csr-approver-29565216-kbj6t" Mar 19 09:36:00 crc kubenswrapper[4835]: I0319 09:36:00.459839 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565216-kbj6t" Mar 19 09:36:00 crc kubenswrapper[4835]: I0319 09:36:00.630254 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565216-kbj6t"] Mar 19 09:36:00 crc kubenswrapper[4835]: W0319 09:36:00.638531 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0fee259_324e_4077_a06f_2186e9f3d832.slice/crio-ff75ad26e445a6bcb6cfe4db13359ee3a55711beef2331387c1ee9f6e8a8223a WatchSource:0}: Error finding container ff75ad26e445a6bcb6cfe4db13359ee3a55711beef2331387c1ee9f6e8a8223a: Status 404 returned error can't find the container with id ff75ad26e445a6bcb6cfe4db13359ee3a55711beef2331387c1ee9f6e8a8223a Mar 19 09:36:00 crc kubenswrapper[4835]: I0319 09:36:00.754852 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565216-kbj6t" event={"ID":"b0fee259-324e-4077-a06f-2186e9f3d832","Type":"ContainerStarted","Data":"ff75ad26e445a6bcb6cfe4db13359ee3a55711beef2331387c1ee9f6e8a8223a"} Mar 19 09:36:02 crc kubenswrapper[4835]: I0319 09:36:02.772263 4835 generic.go:334] "Generic (PLEG): container finished" 
podID="b0fee259-324e-4077-a06f-2186e9f3d832" containerID="dff147af2c4d281fdd971dd1c755fe6a25fc191dac207d6980b0aafc1c7c7eec" exitCode=0 Mar 19 09:36:02 crc kubenswrapper[4835]: I0319 09:36:02.772805 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565216-kbj6t" event={"ID":"b0fee259-324e-4077-a06f-2186e9f3d832","Type":"ContainerDied","Data":"dff147af2c4d281fdd971dd1c755fe6a25fc191dac207d6980b0aafc1c7c7eec"} Mar 19 09:36:02 crc kubenswrapper[4835]: I0319 09:36:02.776310 4835 generic.go:334] "Generic (PLEG): container finished" podID="88674731-ebd8-49fa-b947-3143524a738c" containerID="b4ccbe83ec8c7513d90f4487d3191d7661f4262ec624a97e80fedb88b7ea3192" exitCode=0 Mar 19 09:36:02 crc kubenswrapper[4835]: I0319 09:36:02.776353 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq" event={"ID":"88674731-ebd8-49fa-b947-3143524a738c","Type":"ContainerDied","Data":"b4ccbe83ec8c7513d90f4487d3191d7661f4262ec624a97e80fedb88b7ea3192"} Mar 19 09:36:03 crc kubenswrapper[4835]: I0319 09:36:03.787489 4835 generic.go:334] "Generic (PLEG): container finished" podID="88674731-ebd8-49fa-b947-3143524a738c" containerID="6326819c77fb61c0b02b17321fd6898f58b61dc5bd7544a48ce872f7ef3efda9" exitCode=0 Mar 19 09:36:03 crc kubenswrapper[4835]: I0319 09:36:03.787544 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq" event={"ID":"88674731-ebd8-49fa-b947-3143524a738c","Type":"ContainerDied","Data":"6326819c77fb61c0b02b17321fd6898f58b61dc5bd7544a48ce872f7ef3efda9"} Mar 19 09:36:04 crc kubenswrapper[4835]: I0319 09:36:04.120360 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565216-kbj6t" Mar 19 09:36:04 crc kubenswrapper[4835]: I0319 09:36:04.178230 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjzdf\" (UniqueName: \"kubernetes.io/projected/b0fee259-324e-4077-a06f-2186e9f3d832-kube-api-access-sjzdf\") pod \"b0fee259-324e-4077-a06f-2186e9f3d832\" (UID: \"b0fee259-324e-4077-a06f-2186e9f3d832\") " Mar 19 09:36:04 crc kubenswrapper[4835]: I0319 09:36:04.184317 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0fee259-324e-4077-a06f-2186e9f3d832-kube-api-access-sjzdf" (OuterVolumeSpecName: "kube-api-access-sjzdf") pod "b0fee259-324e-4077-a06f-2186e9f3d832" (UID: "b0fee259-324e-4077-a06f-2186e9f3d832"). InnerVolumeSpecName "kube-api-access-sjzdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:36:04 crc kubenswrapper[4835]: I0319 09:36:04.280971 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjzdf\" (UniqueName: \"kubernetes.io/projected/b0fee259-324e-4077-a06f-2186e9f3d832-kube-api-access-sjzdf\") on node \"crc\" DevicePath \"\"" Mar 19 09:36:04 crc kubenswrapper[4835]: I0319 09:36:04.805620 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565216-kbj6t" Mar 19 09:36:04 crc kubenswrapper[4835]: I0319 09:36:04.805614 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565216-kbj6t" event={"ID":"b0fee259-324e-4077-a06f-2186e9f3d832","Type":"ContainerDied","Data":"ff75ad26e445a6bcb6cfe4db13359ee3a55711beef2331387c1ee9f6e8a8223a"} Mar 19 09:36:04 crc kubenswrapper[4835]: I0319 09:36:04.805803 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff75ad26e445a6bcb6cfe4db13359ee3a55711beef2331387c1ee9f6e8a8223a" Mar 19 09:36:05 crc kubenswrapper[4835]: I0319 09:36:05.067018 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq" Mar 19 09:36:05 crc kubenswrapper[4835]: I0319 09:36:05.093544 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fq55\" (UniqueName: \"kubernetes.io/projected/88674731-ebd8-49fa-b947-3143524a738c-kube-api-access-7fq55\") pod \"88674731-ebd8-49fa-b947-3143524a738c\" (UID: \"88674731-ebd8-49fa-b947-3143524a738c\") " Mar 19 09:36:05 crc kubenswrapper[4835]: I0319 09:36:05.093664 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88674731-ebd8-49fa-b947-3143524a738c-util\") pod \"88674731-ebd8-49fa-b947-3143524a738c\" (UID: \"88674731-ebd8-49fa-b947-3143524a738c\") " Mar 19 09:36:05 crc kubenswrapper[4835]: I0319 09:36:05.093694 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88674731-ebd8-49fa-b947-3143524a738c-bundle\") pod \"88674731-ebd8-49fa-b947-3143524a738c\" (UID: \"88674731-ebd8-49fa-b947-3143524a738c\") " Mar 19 09:36:05 crc kubenswrapper[4835]: I0319 09:36:05.096323 4835 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88674731-ebd8-49fa-b947-3143524a738c-bundle" (OuterVolumeSpecName: "bundle") pod "88674731-ebd8-49fa-b947-3143524a738c" (UID: "88674731-ebd8-49fa-b947-3143524a738c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:36:05 crc kubenswrapper[4835]: I0319 09:36:05.099082 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88674731-ebd8-49fa-b947-3143524a738c-kube-api-access-7fq55" (OuterVolumeSpecName: "kube-api-access-7fq55") pod "88674731-ebd8-49fa-b947-3143524a738c" (UID: "88674731-ebd8-49fa-b947-3143524a738c"). InnerVolumeSpecName "kube-api-access-7fq55". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:36:05 crc kubenswrapper[4835]: I0319 09:36:05.105351 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88674731-ebd8-49fa-b947-3143524a738c-util" (OuterVolumeSpecName: "util") pod "88674731-ebd8-49fa-b947-3143524a738c" (UID: "88674731-ebd8-49fa-b947-3143524a738c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:36:05 crc kubenswrapper[4835]: I0319 09:36:05.178113 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565210-crm8n"] Mar 19 09:36:05 crc kubenswrapper[4835]: I0319 09:36:05.182762 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565210-crm8n"] Mar 19 09:36:05 crc kubenswrapper[4835]: I0319 09:36:05.195995 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fq55\" (UniqueName: \"kubernetes.io/projected/88674731-ebd8-49fa-b947-3143524a738c-kube-api-access-7fq55\") on node \"crc\" DevicePath \"\"" Mar 19 09:36:05 crc kubenswrapper[4835]: I0319 09:36:05.196026 4835 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88674731-ebd8-49fa-b947-3143524a738c-util\") on node \"crc\" DevicePath \"\"" Mar 19 09:36:05 crc kubenswrapper[4835]: I0319 09:36:05.196038 4835 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88674731-ebd8-49fa-b947-3143524a738c-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:36:05 crc kubenswrapper[4835]: I0319 09:36:05.816998 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq" event={"ID":"88674731-ebd8-49fa-b947-3143524a738c","Type":"ContainerDied","Data":"976f94e09945cd0e6de6ddcf17c99efd06abeab529b386dc6cd3056a994305db"} Mar 19 09:36:05 crc kubenswrapper[4835]: I0319 09:36:05.817041 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="976f94e09945cd0e6de6ddcf17c99efd06abeab529b386dc6cd3056a994305db" Mar 19 09:36:05 crc kubenswrapper[4835]: I0319 09:36:05.817101 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq" Mar 19 09:36:06 crc kubenswrapper[4835]: I0319 09:36:06.415551 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80bbfdb2-82b5-40be-a7df-551fb2d03119" path="/var/lib/kubelet/pods/80bbfdb2-82b5-40be-a7df-551fb2d03119/volumes" Mar 19 09:36:06 crc kubenswrapper[4835]: I0319 09:36:06.422820 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 09:36:06 crc kubenswrapper[4835]: I0319 09:36:06.423131 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 09:36:09 crc kubenswrapper[4835]: I0319 09:36:09.230177 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qk6hn"] Mar 19 09:36:09 crc kubenswrapper[4835]: I0319 09:36:09.230964 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="ovn-controller" containerID="cri-o://c8db88b60673fb888e49ff1dcb9416a59ad4524a2a763f1be911053ceca827fd" gracePeriod=30 Mar 19 09:36:09 crc kubenswrapper[4835]: I0319 09:36:09.231012 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="sbdb" containerID="cri-o://fcb093569be5a825491f56bf1b08d66fe7fed5c8e244231dd7f79005cbe0c852" 
gracePeriod=30 Mar 19 09:36:09 crc kubenswrapper[4835]: I0319 09:36:09.231054 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://8df0c2ec899dfefd835123bae49121907eb0ef122be26a92587f324e9cd8ada7" gracePeriod=30 Mar 19 09:36:09 crc kubenswrapper[4835]: I0319 09:36:09.231096 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="kube-rbac-proxy-node" containerID="cri-o://796b2dcca72564cdb668d41bb2a3de94b9a051b6d3cc6cff1abf790a290f2183" gracePeriod=30 Mar 19 09:36:09 crc kubenswrapper[4835]: I0319 09:36:09.231144 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="ovn-acl-logging" containerID="cri-o://63e7dc755b5798310eac36921fcd00df32b34ba4b6088b84297f5966f6a1bb2c" gracePeriod=30 Mar 19 09:36:09 crc kubenswrapper[4835]: I0319 09:36:09.231229 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="nbdb" containerID="cri-o://c9700b6f756c49fd4c80814e7ec8b3968b3a1a4a060af3366cfbcde9afe056d0" gracePeriod=30 Mar 19 09:36:09 crc kubenswrapper[4835]: I0319 09:36:09.231271 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="northd" containerID="cri-o://0f876fbdfacfa3d71e2cb5481c3cc94d66376d60260a5ec90e6700979a76b0a0" gracePeriod=30 Mar 19 09:36:09 crc kubenswrapper[4835]: I0319 09:36:09.263906 4835 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="ovnkube-controller" containerID="cri-o://f7beea1efc6f7ae2f60e690b578a0dc2d1eeda3a71590efa41f1df002699cf3c" gracePeriod=30 Mar 19 09:36:09 crc kubenswrapper[4835]: I0319 09:36:09.852221 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jl5x4_4ee35aaa-2819-432a-af95-f1078ad836fc/kube-multus/2.log" Mar 19 09:36:09 crc kubenswrapper[4835]: I0319 09:36:09.853035 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jl5x4_4ee35aaa-2819-432a-af95-f1078ad836fc/kube-multus/1.log" Mar 19 09:36:09 crc kubenswrapper[4835]: I0319 09:36:09.853091 4835 generic.go:334] "Generic (PLEG): container finished" podID="4ee35aaa-2819-432a-af95-f1078ad836fc" containerID="e9fa86028b1f00abb5fe0bde1d574b1a2d6aae726bddff744f4b1ac500cc935d" exitCode=2 Mar 19 09:36:09 crc kubenswrapper[4835]: I0319 09:36:09.853174 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jl5x4" event={"ID":"4ee35aaa-2819-432a-af95-f1078ad836fc","Type":"ContainerDied","Data":"e9fa86028b1f00abb5fe0bde1d574b1a2d6aae726bddff744f4b1ac500cc935d"} Mar 19 09:36:09 crc kubenswrapper[4835]: I0319 09:36:09.853212 4835 scope.go:117] "RemoveContainer" containerID="db21550c872812553e0b8009562ceb73b976b7505db3b7f995e014e7c5ec7124" Mar 19 09:36:09 crc kubenswrapper[4835]: I0319 09:36:09.853810 4835 scope.go:117] "RemoveContainer" containerID="e9fa86028b1f00abb5fe0bde1d574b1a2d6aae726bddff744f4b1ac500cc935d" Mar 19 09:36:09 crc kubenswrapper[4835]: E0319 09:36:09.854041 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-jl5x4_openshift-multus(4ee35aaa-2819-432a-af95-f1078ad836fc)\"" pod="openshift-multus/multus-jl5x4" podUID="4ee35aaa-2819-432a-af95-f1078ad836fc" Mar 19 09:36:09 
crc kubenswrapper[4835]: I0319 09:36:09.860968 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qk6hn_2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b/ovnkube-controller/3.log" Mar 19 09:36:09 crc kubenswrapper[4835]: I0319 09:36:09.864078 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qk6hn_2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b/ovn-acl-logging/0.log" Mar 19 09:36:09 crc kubenswrapper[4835]: I0319 09:36:09.865581 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qk6hn_2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b/ovn-controller/0.log" Mar 19 09:36:09 crc kubenswrapper[4835]: I0319 09:36:09.866113 4835 generic.go:334] "Generic (PLEG): container finished" podID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerID="f7beea1efc6f7ae2f60e690b578a0dc2d1eeda3a71590efa41f1df002699cf3c" exitCode=0 Mar 19 09:36:09 crc kubenswrapper[4835]: I0319 09:36:09.866222 4835 generic.go:334] "Generic (PLEG): container finished" podID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerID="fcb093569be5a825491f56bf1b08d66fe7fed5c8e244231dd7f79005cbe0c852" exitCode=0 Mar 19 09:36:09 crc kubenswrapper[4835]: I0319 09:36:09.866310 4835 generic.go:334] "Generic (PLEG): container finished" podID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerID="c9700b6f756c49fd4c80814e7ec8b3968b3a1a4a060af3366cfbcde9afe056d0" exitCode=0 Mar 19 09:36:09 crc kubenswrapper[4835]: I0319 09:36:09.866385 4835 generic.go:334] "Generic (PLEG): container finished" podID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerID="0f876fbdfacfa3d71e2cb5481c3cc94d66376d60260a5ec90e6700979a76b0a0" exitCode=0 Mar 19 09:36:09 crc kubenswrapper[4835]: I0319 09:36:09.866473 4835 generic.go:334] "Generic (PLEG): container finished" podID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerID="63e7dc755b5798310eac36921fcd00df32b34ba4b6088b84297f5966f6a1bb2c" exitCode=143 Mar 19 09:36:09 crc 
kubenswrapper[4835]: I0319 09:36:09.866555 4835 generic.go:334] "Generic (PLEG): container finished" podID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerID="c8db88b60673fb888e49ff1dcb9416a59ad4524a2a763f1be911053ceca827fd" exitCode=143 Mar 19 09:36:09 crc kubenswrapper[4835]: I0319 09:36:09.866664 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" event={"ID":"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b","Type":"ContainerDied","Data":"f7beea1efc6f7ae2f60e690b578a0dc2d1eeda3a71590efa41f1df002699cf3c"} Mar 19 09:36:09 crc kubenswrapper[4835]: I0319 09:36:09.866800 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" event={"ID":"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b","Type":"ContainerDied","Data":"fcb093569be5a825491f56bf1b08d66fe7fed5c8e244231dd7f79005cbe0c852"} Mar 19 09:36:09 crc kubenswrapper[4835]: I0319 09:36:09.866899 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" event={"ID":"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b","Type":"ContainerDied","Data":"c9700b6f756c49fd4c80814e7ec8b3968b3a1a4a060af3366cfbcde9afe056d0"} Mar 19 09:36:09 crc kubenswrapper[4835]: I0319 09:36:09.866993 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" event={"ID":"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b","Type":"ContainerDied","Data":"0f876fbdfacfa3d71e2cb5481c3cc94d66376d60260a5ec90e6700979a76b0a0"} Mar 19 09:36:09 crc kubenswrapper[4835]: I0319 09:36:09.867074 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" event={"ID":"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b","Type":"ContainerDied","Data":"63e7dc755b5798310eac36921fcd00df32b34ba4b6088b84297f5966f6a1bb2c"} Mar 19 09:36:09 crc kubenswrapper[4835]: I0319 09:36:09.867152 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" 
event={"ID":"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b","Type":"ContainerDied","Data":"c8db88b60673fb888e49ff1dcb9416a59ad4524a2a763f1be911053ceca827fd"} Mar 19 09:36:09 crc kubenswrapper[4835]: I0319 09:36:09.884534 4835 scope.go:117] "RemoveContainer" containerID="82a0b8e26d19af43ca8c4fa57d8add7c4c786e843bddb91f8cf39dad321bcd0a" Mar 19 09:36:10 crc kubenswrapper[4835]: I0319 09:36:10.880600 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qk6hn_2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b/ovn-acl-logging/0.log" Mar 19 09:36:10 crc kubenswrapper[4835]: I0319 09:36:10.881378 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qk6hn_2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b/ovn-controller/0.log" Mar 19 09:36:10 crc kubenswrapper[4835]: I0319 09:36:10.881799 4835 generic.go:334] "Generic (PLEG): container finished" podID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerID="8df0c2ec899dfefd835123bae49121907eb0ef122be26a92587f324e9cd8ada7" exitCode=0 Mar 19 09:36:10 crc kubenswrapper[4835]: I0319 09:36:10.881819 4835 generic.go:334] "Generic (PLEG): container finished" podID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerID="796b2dcca72564cdb668d41bb2a3de94b9a051b6d3cc6cff1abf790a290f2183" exitCode=0 Mar 19 09:36:10 crc kubenswrapper[4835]: I0319 09:36:10.881869 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" event={"ID":"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b","Type":"ContainerDied","Data":"8df0c2ec899dfefd835123bae49121907eb0ef122be26a92587f324e9cd8ada7"} Mar 19 09:36:10 crc kubenswrapper[4835]: I0319 09:36:10.881891 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" event={"ID":"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b","Type":"ContainerDied","Data":"796b2dcca72564cdb668d41bb2a3de94b9a051b6d3cc6cff1abf790a290f2183"} Mar 19 09:36:10 crc kubenswrapper[4835]: I0319 09:36:10.881901 
4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" event={"ID":"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b","Type":"ContainerDied","Data":"7ccee33ef15359626a0e9e795e0e1ffb358cfb7dc8834a42d6870e63f5411bb9"} Mar 19 09:36:10 crc kubenswrapper[4835]: I0319 09:36:10.881910 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ccee33ef15359626a0e9e795e0e1ffb358cfb7dc8834a42d6870e63f5411bb9" Mar 19 09:36:10 crc kubenswrapper[4835]: I0319 09:36:10.883413 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jl5x4_4ee35aaa-2819-432a-af95-f1078ad836fc/kube-multus/2.log" Mar 19 09:36:10 crc kubenswrapper[4835]: I0319 09:36:10.928264 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qk6hn_2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b/ovn-acl-logging/0.log" Mar 19 09:36:10 crc kubenswrapper[4835]: I0319 09:36:10.928849 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qk6hn_2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b/ovn-controller/0.log" Mar 19 09:36:10 crc kubenswrapper[4835]: I0319 09:36:10.929166 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.031788 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4jp9w"] Mar 19 09:36:11 crc kubenswrapper[4835]: E0319 09:36:11.032231 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="ovnkube-controller" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.032311 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="ovnkube-controller" Mar 19 09:36:11 crc kubenswrapper[4835]: E0319 09:36:11.032401 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="kube-rbac-proxy-ovn-metrics" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.032459 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="kube-rbac-proxy-ovn-metrics" Mar 19 09:36:11 crc kubenswrapper[4835]: E0319 09:36:11.032518 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="sbdb" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.032582 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="sbdb" Mar 19 09:36:11 crc kubenswrapper[4835]: E0319 09:36:11.032662 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="kubecfg-setup" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.032721 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="kubecfg-setup" Mar 19 09:36:11 crc kubenswrapper[4835]: E0319 09:36:11.032800 4835 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="ovn-acl-logging" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.032863 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="ovn-acl-logging" Mar 19 09:36:11 crc kubenswrapper[4835]: E0319 09:36:11.032944 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="northd" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.033006 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="northd" Mar 19 09:36:11 crc kubenswrapper[4835]: E0319 09:36:11.033079 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="nbdb" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.033142 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="nbdb" Mar 19 09:36:11 crc kubenswrapper[4835]: E0319 09:36:11.033208 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="ovn-controller" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.033268 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="ovn-controller" Mar 19 09:36:11 crc kubenswrapper[4835]: E0319 09:36:11.033335 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="ovnkube-controller" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.033397 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="ovnkube-controller" Mar 19 09:36:11 crc kubenswrapper[4835]: E0319 09:36:11.033466 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88674731-ebd8-49fa-b947-3143524a738c" 
containerName="util" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.033556 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="88674731-ebd8-49fa-b947-3143524a738c" containerName="util" Mar 19 09:36:11 crc kubenswrapper[4835]: E0319 09:36:11.033630 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0fee259-324e-4077-a06f-2186e9f3d832" containerName="oc" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.033701 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0fee259-324e-4077-a06f-2186e9f3d832" containerName="oc" Mar 19 09:36:11 crc kubenswrapper[4835]: E0319 09:36:11.033789 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="kube-rbac-proxy-node" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.033871 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="kube-rbac-proxy-node" Mar 19 09:36:11 crc kubenswrapper[4835]: E0319 09:36:11.033945 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88674731-ebd8-49fa-b947-3143524a738c" containerName="pull" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.034021 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="88674731-ebd8-49fa-b947-3143524a738c" containerName="pull" Mar 19 09:36:11 crc kubenswrapper[4835]: E0319 09:36:11.034089 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88674731-ebd8-49fa-b947-3143524a738c" containerName="extract" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.034149 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="88674731-ebd8-49fa-b947-3143524a738c" containerName="extract" Mar 19 09:36:11 crc kubenswrapper[4835]: E0319 09:36:11.034213 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="ovnkube-controller" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 
09:36:11.034270 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="ovnkube-controller" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.034438 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="kube-rbac-proxy-ovn-metrics" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.034519 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="ovnkube-controller" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.034592 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0fee259-324e-4077-a06f-2186e9f3d832" containerName="oc" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.034662 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="ovnkube-controller" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.034723 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="sbdb" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.034813 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="ovn-acl-logging" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.034884 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="ovnkube-controller" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.035019 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="kube-rbac-proxy-node" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.035086 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="nbdb" Mar 19 09:36:11 crc 
kubenswrapper[4835]: I0319 09:36:11.035147 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="northd" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.035209 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="ovn-controller" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.035269 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="ovnkube-controller" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.035329 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="88674731-ebd8-49fa-b947-3143524a738c" containerName="extract" Mar 19 09:36:11 crc kubenswrapper[4835]: E0319 09:36:11.035516 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="ovnkube-controller" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.035585 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="ovnkube-controller" Mar 19 09:36:11 crc kubenswrapper[4835]: E0319 09:36:11.035652 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="ovnkube-controller" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.035714 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="ovnkube-controller" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.035911 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" containerName="ovnkube-controller" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.038082 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.080536 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-var-lib-openvswitch\") pod \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.080581 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-run-ovn-kubernetes\") pod \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.080602 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-node-log\") pod \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.080633 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.080670 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-ovn-node-metrics-cert\") pod \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.080710 4835 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-ovnkube-script-lib\") pod \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.080786 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-run-netns\") pod \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.080822 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-etc-openvswitch\") pod \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.080865 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-systemd-units\") pod \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.080897 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-run-systemd\") pod \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.080920 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-log-socket\") pod \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\" 
(UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.080941 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-slash\") pod \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.080963 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-cni-bin\") pod \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.080981 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-run-ovn\") pod \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.081024 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-run-openvswitch\") pod \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.081057 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-kubelet\") pod \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.081080 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-cni-netd\") pod \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.081105 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-env-overrides\") pod \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.081135 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-ovnkube-config\") pod \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.081164 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48r2z\" (UniqueName: \"kubernetes.io/projected/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-kube-api-access-48r2z\") pod \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\" (UID: \"2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b\") " Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.081556 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" (UID: "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.081596 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" (UID: "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.081606 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" (UID: "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.081615 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" (UID: "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.081659 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-log-socket" (OuterVolumeSpecName: "log-socket") pod "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" (UID: "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.081649 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" (UID: "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.081689 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" (UID: "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.081666 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-node-log" (OuterVolumeSpecName: "node-log") pod "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" (UID: "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.081676 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-slash" (OuterVolumeSpecName: "host-slash") pod "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" (UID: "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.081707 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" (UID: "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.081731 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" (UID: "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.081724 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" (UID: "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.082186 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" (UID: "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.082441 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" (UID: "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.082734 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" (UID: "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.082772 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" (UID: "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.082811 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" (UID: "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.107641 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" (UID: "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.113386 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-kube-api-access-48r2z" (OuterVolumeSpecName: "kube-api-access-48r2z") pod "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" (UID: "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b"). InnerVolumeSpecName "kube-api-access-48r2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.121088 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" (UID: "2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.182876 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-host-kubelet\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.182914 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-host-run-ovn-kubernetes\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.182933 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-host-cni-bin\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.182946 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-systemd-units\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.182970 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-etc-openvswitch\") pod \"ovnkube-node-4jp9w\" (UID: 
\"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.182990 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.183015 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-env-overrides\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.183028 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-run-ovn\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.183043 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-host-run-netns\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.183059 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-ovn-node-metrics-cert\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.183081 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-ovnkube-config\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.183096 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-node-log\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.183115 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-log-socket\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.183139 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whncj\" (UniqueName: \"kubernetes.io/projected/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-kube-api-access-whncj\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.183154 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-var-lib-openvswitch\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.183167 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-host-cni-netd\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.183184 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-run-openvswitch\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.183197 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-ovnkube-script-lib\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.183213 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-run-systemd\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.183227 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-host-slash\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.183274 4835 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.183284 4835 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.183294 4835 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.183301 4835 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-log-socket\") on node \"crc\" DevicePath \"\"" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.183310 4835 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-slash\") on node \"crc\" DevicePath \"\"" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.183318 4835 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.183325 4835 reconciler_common.go:293] "Volume detached for 
volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.183333 4835 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.183340 4835 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.183348 4835 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.183356 4835 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.183363 4835 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.183371 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48r2z\" (UniqueName: \"kubernetes.io/projected/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-kube-api-access-48r2z\") on node \"crc\" DevicePath \"\"" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.183379 4835 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-var-lib-openvswitch\") 
on node \"crc\" DevicePath \"\"" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.183387 4835 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.183395 4835 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-node-log\") on node \"crc\" DevicePath \"\"" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.183403 4835 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.183412 4835 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.183421 4835 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.183429 4835 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.284347 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-var-lib-openvswitch\") pod 
\"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.284666 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whncj\" (UniqueName: \"kubernetes.io/projected/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-kube-api-access-whncj\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.284685 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-host-cni-netd\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.284704 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-run-openvswitch\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.284721 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-run-systemd\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.284735 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-ovnkube-script-lib\") pod \"ovnkube-node-4jp9w\" (UID: 
\"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.284782 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-host-slash\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.284811 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-host-kubelet\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.284827 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-host-run-ovn-kubernetes\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.284857 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-host-cni-bin\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.284872 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-systemd-units\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.284896 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-etc-openvswitch\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.284914 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.284938 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-env-overrides\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.284954 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-run-ovn\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.284971 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-host-run-netns\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.284989 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-ovn-node-metrics-cert\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.285009 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-ovnkube-config\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.285027 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-node-log\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.285045 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-log-socket\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.285108 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-log-socket\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 
09:36:11.285143 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-var-lib-openvswitch\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.285397 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-host-cni-netd\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.285422 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-run-openvswitch\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.285443 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-run-systemd\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.286066 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-ovnkube-script-lib\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.286110 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-host-slash\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.286131 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-host-kubelet\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.286151 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-host-run-ovn-kubernetes\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.286171 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-host-cni-bin\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.286191 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-systemd-units\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.286210 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-etc-openvswitch\") pod 
\"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.286230 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.286525 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-env-overrides\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.286566 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-run-ovn\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.286595 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-host-run-netns\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.287131 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-node-log\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.287470 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-ovnkube-config\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.290650 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-ovn-node-metrics-cert\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.302368 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whncj\" (UniqueName: \"kubernetes.io/projected/f7e81e94-e1a9-47df-86b2-91cdfd01ae1e-kube-api-access-whncj\") pod \"ovnkube-node-4jp9w\" (UID: \"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e\") " pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.353493 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.889614 4835 generic.go:334] "Generic (PLEG): container finished" podID="f7e81e94-e1a9-47df-86b2-91cdfd01ae1e" containerID="82726cd7bc3a894c15111202a476982f824d84a29d18e390de06753f58f26c48" exitCode=0 Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.889711 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" event={"ID":"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e","Type":"ContainerDied","Data":"82726cd7bc3a894c15111202a476982f824d84a29d18e390de06753f58f26c48"} Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.889794 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" event={"ID":"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e","Type":"ContainerStarted","Data":"33ad3e71840bcfe8af990bba29e937d855190915948ab1ebb24f754f9a039a75"} Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.889731 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qk6hn" Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.946393 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qk6hn"] Mar 19 09:36:11 crc kubenswrapper[4835]: I0319 09:36:11.964658 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qk6hn"] Mar 19 09:36:12 crc kubenswrapper[4835]: I0319 09:36:12.408123 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b" path="/var/lib/kubelet/pods/2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b/volumes" Mar 19 09:36:12 crc kubenswrapper[4835]: I0319 09:36:12.903154 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" event={"ID":"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e","Type":"ContainerStarted","Data":"6fd734d88d0d054d6ec44772f1993f057bd227d43fe588a4b7a3a67930fe0902"} Mar 19 09:36:12 crc kubenswrapper[4835]: I0319 09:36:12.903416 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" event={"ID":"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e","Type":"ContainerStarted","Data":"73faca8172e57ec943a8a78afb371bb1e80c2256857a0c4d8aa10b9039c09978"} Mar 19 09:36:12 crc kubenswrapper[4835]: I0319 09:36:12.903427 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" event={"ID":"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e","Type":"ContainerStarted","Data":"b50188f0a343846827c9a3b454ce812eed69e7a5e59e3f75f8fda8c68a78933a"} Mar 19 09:36:12 crc kubenswrapper[4835]: I0319 09:36:12.903436 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" event={"ID":"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e","Type":"ContainerStarted","Data":"d3f2e1f40b49da536480c76bd9f07609062e27607d5b72f76dc734f8639033cc"} Mar 19 09:36:12 crc kubenswrapper[4835]: I0319 
09:36:12.903446 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" event={"ID":"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e","Type":"ContainerStarted","Data":"cd39ca492e3e152dade914721ccd748bf2a55f3315889211001331163c69f64e"} Mar 19 09:36:12 crc kubenswrapper[4835]: I0319 09:36:12.903454 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" event={"ID":"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e","Type":"ContainerStarted","Data":"dcda569f771d44d6c0ca2e266792d5c32e82a411cfe1c3f6e835c1545ccb6e0f"} Mar 19 09:36:15 crc kubenswrapper[4835]: I0319 09:36:15.924671 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" event={"ID":"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e","Type":"ContainerStarted","Data":"27128d6e9f9616d16f2a79069c1d89d882d08b58633648da647115241c077688"} Mar 19 09:36:17 crc kubenswrapper[4835]: I0319 09:36:17.688120 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-mbzhh"] Mar 19 09:36:17 crc kubenswrapper[4835]: I0319 09:36:17.689236 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-mbzhh" Mar 19 09:36:17 crc kubenswrapper[4835]: I0319 09:36:17.691410 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 19 09:36:17 crc kubenswrapper[4835]: I0319 09:36:17.691533 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 19 09:36:17 crc kubenswrapper[4835]: I0319 09:36:17.691636 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-2fmxj" Mar 19 09:36:17 crc kubenswrapper[4835]: I0319 09:36:17.868634 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdrwb\" (UniqueName: \"kubernetes.io/projected/210e8b64-36bc-4abd-8620-43093946855b-kube-api-access-kdrwb\") pod \"obo-prometheus-operator-8ff7d675-mbzhh\" (UID: \"210e8b64-36bc-4abd-8620-43093946855b\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-mbzhh" Mar 19 09:36:17 crc kubenswrapper[4835]: I0319 09:36:17.940433 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" event={"ID":"f7e81e94-e1a9-47df-86b2-91cdfd01ae1e","Type":"ContainerStarted","Data":"de5c997dbba849e3fb31f79bc9bf5a678af758f2bc71f14eb186b32dac97f203"} Mar 19 09:36:17 crc kubenswrapper[4835]: I0319 09:36:17.940730 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:17 crc kubenswrapper[4835]: I0319 09:36:17.969953 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdrwb\" (UniqueName: \"kubernetes.io/projected/210e8b64-36bc-4abd-8620-43093946855b-kube-api-access-kdrwb\") pod \"obo-prometheus-operator-8ff7d675-mbzhh\" (UID: \"210e8b64-36bc-4abd-8620-43093946855b\") " 
pod="openshift-operators/obo-prometheus-operator-8ff7d675-mbzhh" Mar 19 09:36:17 crc kubenswrapper[4835]: I0319 09:36:17.978472 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" podStartSLOduration=6.978450913 podStartE2EDuration="6.978450913s" podCreationTimestamp="2026-03-19 09:36:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:36:17.976066788 +0000 UTC m=+832.824665385" watchObservedRunningTime="2026-03-19 09:36:17.978450913 +0000 UTC m=+832.827049500" Mar 19 09:36:17 crc kubenswrapper[4835]: I0319 09:36:17.993973 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.010820 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdrwb\" (UniqueName: \"kubernetes.io/projected/210e8b64-36bc-4abd-8620-43093946855b-kube-api-access-kdrwb\") pod \"obo-prometheus-operator-8ff7d675-mbzhh\" (UID: \"210e8b64-36bc-4abd-8620-43093946855b\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-mbzhh" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.218549 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-4zx5b"] Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.219309 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-4zx5b" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.223602 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.225150 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-s2wwv" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.231701 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-l992d"] Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.232438 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-l992d" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.303031 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-mbzhh" Mar 19 09:36:18 crc kubenswrapper[4835]: E0319 09:36:18.332195 4835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-mbzhh_openshift-operators_210e8b64-36bc-4abd-8620-43093946855b_0(6eaab2f4b8e5418b7b84124ebbc38eae50925bbaaf2a144173daf727171f9cfd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 19 09:36:18 crc kubenswrapper[4835]: E0319 09:36:18.332267 4835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-mbzhh_openshift-operators_210e8b64-36bc-4abd-8620-43093946855b_0(6eaab2f4b8e5418b7b84124ebbc38eae50925bbaaf2a144173daf727171f9cfd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-8ff7d675-mbzhh" Mar 19 09:36:18 crc kubenswrapper[4835]: E0319 09:36:18.332289 4835 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-mbzhh_openshift-operators_210e8b64-36bc-4abd-8620-43093946855b_0(6eaab2f4b8e5418b7b84124ebbc38eae50925bbaaf2a144173daf727171f9cfd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-8ff7d675-mbzhh" Mar 19 09:36:18 crc kubenswrapper[4835]: E0319 09:36:18.332331 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-8ff7d675-mbzhh_openshift-operators(210e8b64-36bc-4abd-8620-43093946855b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-8ff7d675-mbzhh_openshift-operators(210e8b64-36bc-4abd-8620-43093946855b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-mbzhh_openshift-operators_210e8b64-36bc-4abd-8620-43093946855b_0(6eaab2f4b8e5418b7b84124ebbc38eae50925bbaaf2a144173daf727171f9cfd): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-8ff7d675-mbzhh" podUID="210e8b64-36bc-4abd-8620-43093946855b" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.397391 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/73f088be-6bad-4a18-8353-10475ad7105d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-c467569cb-4zx5b\" (UID: \"73f088be-6bad-4a18-8353-10475ad7105d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-4zx5b" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.397451 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f26c226a-c8fa-4ff7-bda6-7eece297dd86-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-c467569cb-l992d\" (UID: \"f26c226a-c8fa-4ff7-bda6-7eece297dd86\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-l992d" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.397489 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/73f088be-6bad-4a18-8353-10475ad7105d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-c467569cb-4zx5b\" (UID: \"73f088be-6bad-4a18-8353-10475ad7105d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-4zx5b" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.397509 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f26c226a-c8fa-4ff7-bda6-7eece297dd86-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-c467569cb-l992d\" (UID: \"f26c226a-c8fa-4ff7-bda6-7eece297dd86\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-l992d" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.498521 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/73f088be-6bad-4a18-8353-10475ad7105d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-c467569cb-4zx5b\" (UID: \"73f088be-6bad-4a18-8353-10475ad7105d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-4zx5b" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.498571 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f26c226a-c8fa-4ff7-bda6-7eece297dd86-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-c467569cb-l992d\" (UID: \"f26c226a-c8fa-4ff7-bda6-7eece297dd86\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-l992d" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.498648 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/73f088be-6bad-4a18-8353-10475ad7105d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-c467569cb-4zx5b\" (UID: \"73f088be-6bad-4a18-8353-10475ad7105d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-4zx5b" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.499174 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f26c226a-c8fa-4ff7-bda6-7eece297dd86-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-c467569cb-l992d\" (UID: \"f26c226a-c8fa-4ff7-bda6-7eece297dd86\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-l992d" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.505469 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f26c226a-c8fa-4ff7-bda6-7eece297dd86-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-c467569cb-l992d\" (UID: \"f26c226a-c8fa-4ff7-bda6-7eece297dd86\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-l992d" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.505815 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/73f088be-6bad-4a18-8353-10475ad7105d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-c467569cb-4zx5b\" (UID: \"73f088be-6bad-4a18-8353-10475ad7105d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-4zx5b" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.505885 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f26c226a-c8fa-4ff7-bda6-7eece297dd86-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-c467569cb-l992d\" (UID: \"f26c226a-c8fa-4ff7-bda6-7eece297dd86\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-l992d" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.508166 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/73f088be-6bad-4a18-8353-10475ad7105d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-c467569cb-4zx5b\" (UID: \"73f088be-6bad-4a18-8353-10475ad7105d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-4zx5b" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.535167 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-4zx5b" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.545694 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-l992d" Mar 19 09:36:18 crc kubenswrapper[4835]: E0319 09:36:18.562439 4835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-c467569cb-4zx5b_openshift-operators_73f088be-6bad-4a18-8353-10475ad7105d_0(80363ed8f0932d3b73126e00d3dd4ecdabae69f481473b169110a8bf1924342b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 09:36:18 crc kubenswrapper[4835]: E0319 09:36:18.562508 4835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-c467569cb-4zx5b_openshift-operators_73f088be-6bad-4a18-8353-10475ad7105d_0(80363ed8f0932d3b73126e00d3dd4ecdabae69f481473b169110a8bf1924342b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-4zx5b" Mar 19 09:36:18 crc kubenswrapper[4835]: E0319 09:36:18.562529 4835 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-c467569cb-4zx5b_openshift-operators_73f088be-6bad-4a18-8353-10475ad7105d_0(80363ed8f0932d3b73126e00d3dd4ecdabae69f481473b169110a8bf1924342b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-4zx5b" Mar 19 09:36:18 crc kubenswrapper[4835]: E0319 09:36:18.562566 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-c467569cb-4zx5b_openshift-operators(73f088be-6bad-4a18-8353-10475ad7105d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-c467569cb-4zx5b_openshift-operators(73f088be-6bad-4a18-8353-10475ad7105d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-c467569cb-4zx5b_openshift-operators_73f088be-6bad-4a18-8353-10475ad7105d_0(80363ed8f0932d3b73126e00d3dd4ecdabae69f481473b169110a8bf1924342b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-4zx5b" podUID="73f088be-6bad-4a18-8353-10475ad7105d" Mar 19 09:36:18 crc kubenswrapper[4835]: E0319 09:36:18.571441 4835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-c467569cb-l992d_openshift-operators_f26c226a-c8fa-4ff7-bda6-7eece297dd86_0(c4fc1c22ff5807da8ce8fb3f825fa08451404de1f5f4ab58b4fa39a351e0b089): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 09:36:18 crc kubenswrapper[4835]: E0319 09:36:18.571492 4835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-c467569cb-l992d_openshift-operators_f26c226a-c8fa-4ff7-bda6-7eece297dd86_0(c4fc1c22ff5807da8ce8fb3f825fa08451404de1f5f4ab58b4fa39a351e0b089): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-l992d" Mar 19 09:36:18 crc kubenswrapper[4835]: E0319 09:36:18.571509 4835 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-c467569cb-l992d_openshift-operators_f26c226a-c8fa-4ff7-bda6-7eece297dd86_0(c4fc1c22ff5807da8ce8fb3f825fa08451404de1f5f4ab58b4fa39a351e0b089): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-l992d" Mar 19 09:36:18 crc kubenswrapper[4835]: E0319 09:36:18.571546 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-c467569cb-l992d_openshift-operators(f26c226a-c8fa-4ff7-bda6-7eece297dd86)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-c467569cb-l992d_openshift-operators(f26c226a-c8fa-4ff7-bda6-7eece297dd86)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-c467569cb-l992d_openshift-operators_f26c226a-c8fa-4ff7-bda6-7eece297dd86_0(c4fc1c22ff5807da8ce8fb3f825fa08451404de1f5f4ab58b4fa39a351e0b089): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-l992d" podUID="f26c226a-c8fa-4ff7-bda6-7eece297dd86" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.620078 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-l992d"] Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.629130 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-4zx5b"] Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.642597 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-mbzhh"] Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.692799 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-qsz5n"] Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.693758 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.705525 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-qfmmr" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.706041 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.710502 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-qsz5n"] Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.802716 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2bkf\" (UniqueName: \"kubernetes.io/projected/95fe9c35-69f6-4b60-a725-c2f0d8a34c99-kube-api-access-c2bkf\") pod \"observability-operator-6dd7dd855f-qsz5n\" (UID: \"95fe9c35-69f6-4b60-a725-c2f0d8a34c99\") " pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.803035 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/95fe9c35-69f6-4b60-a725-c2f0d8a34c99-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-qsz5n\" (UID: \"95fe9c35-69f6-4b60-a725-c2f0d8a34c99\") " pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.904647 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/95fe9c35-69f6-4b60-a725-c2f0d8a34c99-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-qsz5n\" (UID: \"95fe9c35-69f6-4b60-a725-c2f0d8a34c99\") " 
pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.904799 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2bkf\" (UniqueName: \"kubernetes.io/projected/95fe9c35-69f6-4b60-a725-c2f0d8a34c99-kube-api-access-c2bkf\") pod \"observability-operator-6dd7dd855f-qsz5n\" (UID: \"95fe9c35-69f6-4b60-a725-c2f0d8a34c99\") " pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.910332 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/95fe9c35-69f6-4b60-a725-c2f0d8a34c99-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-qsz5n\" (UID: \"95fe9c35-69f6-4b60-a725-c2f0d8a34c99\") " pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.921005 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2bkf\" (UniqueName: \"kubernetes.io/projected/95fe9c35-69f6-4b60-a725-c2f0d8a34c99-kube-api-access-c2bkf\") pod \"observability-operator-6dd7dd855f-qsz5n\" (UID: \"95fe9c35-69f6-4b60-a725-c2f0d8a34c99\") " pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.946001 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-mbzhh" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.946452 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-mbzhh" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.946561 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.946603 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.946695 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-4zx5b" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.946892 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-4zx5b" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.951611 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-l992d" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.951951 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-l992d" Mar 19 09:36:18 crc kubenswrapper[4835]: I0319 09:36:18.996417 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" Mar 19 09:36:19 crc kubenswrapper[4835]: E0319 09:36:19.006636 4835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-mbzhh_openshift-operators_210e8b64-36bc-4abd-8620-43093946855b_0(570c6379ed1e7d4ecc625fa6b6995c277e04585426e432bd5b4fdf63a5c01118): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 19 09:36:19 crc kubenswrapper[4835]: E0319 09:36:19.006684 4835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-mbzhh_openshift-operators_210e8b64-36bc-4abd-8620-43093946855b_0(570c6379ed1e7d4ecc625fa6b6995c277e04585426e432bd5b4fdf63a5c01118): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-8ff7d675-mbzhh" Mar 19 09:36:19 crc kubenswrapper[4835]: E0319 09:36:19.006707 4835 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-mbzhh_openshift-operators_210e8b64-36bc-4abd-8620-43093946855b_0(570c6379ed1e7d4ecc625fa6b6995c277e04585426e432bd5b4fdf63a5c01118): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-8ff7d675-mbzhh" Mar 19 09:36:19 crc kubenswrapper[4835]: E0319 09:36:19.006770 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-8ff7d675-mbzhh_openshift-operators(210e8b64-36bc-4abd-8620-43093946855b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-8ff7d675-mbzhh_openshift-operators(210e8b64-36bc-4abd-8620-43093946855b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-mbzhh_openshift-operators_210e8b64-36bc-4abd-8620-43093946855b_0(570c6379ed1e7d4ecc625fa6b6995c277e04585426e432bd5b4fdf63a5c01118): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-8ff7d675-mbzhh" podUID="210e8b64-36bc-4abd-8620-43093946855b" Mar 19 09:36:19 crc kubenswrapper[4835]: I0319 09:36:19.008635 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n" Mar 19 09:36:19 crc kubenswrapper[4835]: I0319 09:36:19.030071 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-54476d58cc-x6mtx"] Mar 19 09:36:19 crc kubenswrapper[4835]: I0319 09:36:19.030817 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54476d58cc-x6mtx" Mar 19 09:36:19 crc kubenswrapper[4835]: I0319 09:36:19.034236 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-service-cert" Mar 19 09:36:19 crc kubenswrapper[4835]: I0319 09:36:19.034516 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-w9ml4" Mar 19 09:36:19 crc kubenswrapper[4835]: E0319 09:36:19.055298 4835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-c467569cb-4zx5b_openshift-operators_73f088be-6bad-4a18-8353-10475ad7105d_0(e7aa2b5288d5a7ef4548440c24c6fff0df2e1cd7aadddacdb733a9f9e1de706b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 19 09:36:19 crc kubenswrapper[4835]: E0319 09:36:19.055346 4835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-c467569cb-4zx5b_openshift-operators_73f088be-6bad-4a18-8353-10475ad7105d_0(e7aa2b5288d5a7ef4548440c24c6fff0df2e1cd7aadddacdb733a9f9e1de706b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-4zx5b" Mar 19 09:36:19 crc kubenswrapper[4835]: E0319 09:36:19.055368 4835 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-c467569cb-4zx5b_openshift-operators_73f088be-6bad-4a18-8353-10475ad7105d_0(e7aa2b5288d5a7ef4548440c24c6fff0df2e1cd7aadddacdb733a9f9e1de706b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-4zx5b" Mar 19 09:36:19 crc kubenswrapper[4835]: E0319 09:36:19.055617 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-c467569cb-4zx5b_openshift-operators(73f088be-6bad-4a18-8353-10475ad7105d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-c467569cb-4zx5b_openshift-operators(73f088be-6bad-4a18-8353-10475ad7105d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-c467569cb-4zx5b_openshift-operators_73f088be-6bad-4a18-8353-10475ad7105d_0(e7aa2b5288d5a7ef4548440c24c6fff0df2e1cd7aadddacdb733a9f9e1de706b): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-4zx5b" podUID="73f088be-6bad-4a18-8353-10475ad7105d" Mar 19 09:36:19 crc kubenswrapper[4835]: E0319 09:36:19.095829 4835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-qsz5n_openshift-operators_95fe9c35-69f6-4b60-a725-c2f0d8a34c99_0(1f638898402a961c70f3241fd71d8e56ce74734a239dcb8de3823215c8805866): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 09:36:19 crc kubenswrapper[4835]: E0319 09:36:19.095906 4835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-qsz5n_openshift-operators_95fe9c35-69f6-4b60-a725-c2f0d8a34c99_0(1f638898402a961c70f3241fd71d8e56ce74734a239dcb8de3823215c8805866): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n" Mar 19 09:36:19 crc kubenswrapper[4835]: E0319 09:36:19.095933 4835 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-qsz5n_openshift-operators_95fe9c35-69f6-4b60-a725-c2f0d8a34c99_0(1f638898402a961c70f3241fd71d8e56ce74734a239dcb8de3823215c8805866): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n" Mar 19 09:36:19 crc kubenswrapper[4835]: E0319 09:36:19.095989 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-6dd7dd855f-qsz5n_openshift-operators(95fe9c35-69f6-4b60-a725-c2f0d8a34c99)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-6dd7dd855f-qsz5n_openshift-operators(95fe9c35-69f6-4b60-a725-c2f0d8a34c99)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-qsz5n_openshift-operators_95fe9c35-69f6-4b60-a725-c2f0d8a34c99_0(1f638898402a961c70f3241fd71d8e56ce74734a239dcb8de3823215c8805866): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n" podUID="95fe9c35-69f6-4b60-a725-c2f0d8a34c99" Mar 19 09:36:19 crc kubenswrapper[4835]: I0319 09:36:19.108402 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/41d1090f-7ab4-4820-a742-dca791692d0f-openshift-service-ca\") pod \"perses-operator-54476d58cc-x6mtx\" (UID: \"41d1090f-7ab4-4820-a742-dca791692d0f\") " pod="openshift-operators/perses-operator-54476d58cc-x6mtx" Mar 19 09:36:19 crc kubenswrapper[4835]: I0319 09:36:19.108509 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/41d1090f-7ab4-4820-a742-dca791692d0f-apiservice-cert\") pod \"perses-operator-54476d58cc-x6mtx\" (UID: \"41d1090f-7ab4-4820-a742-dca791692d0f\") " pod="openshift-operators/perses-operator-54476d58cc-x6mtx" Mar 19 09:36:19 crc kubenswrapper[4835]: I0319 09:36:19.108542 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/41d1090f-7ab4-4820-a742-dca791692d0f-webhook-cert\") pod \"perses-operator-54476d58cc-x6mtx\" (UID: \"41d1090f-7ab4-4820-a742-dca791692d0f\") " pod="openshift-operators/perses-operator-54476d58cc-x6mtx" Mar 19 09:36:19 crc kubenswrapper[4835]: I0319 09:36:19.108582 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drlr7\" (UniqueName: \"kubernetes.io/projected/41d1090f-7ab4-4820-a742-dca791692d0f-kube-api-access-drlr7\") pod \"perses-operator-54476d58cc-x6mtx\" (UID: \"41d1090f-7ab4-4820-a742-dca791692d0f\") " pod="openshift-operators/perses-operator-54476d58cc-x6mtx" Mar 19 09:36:19 crc kubenswrapper[4835]: I0319 09:36:19.109696 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54476d58cc-x6mtx"] Mar 19 09:36:19 crc kubenswrapper[4835]: E0319 09:36:19.121465 4835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-c467569cb-l992d_openshift-operators_f26c226a-c8fa-4ff7-bda6-7eece297dd86_0(89af1deaae626abc29267438c391559f3a9e2e28b4a34742f6c4e1fe0e0f2ef2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 09:36:19 crc kubenswrapper[4835]: E0319 09:36:19.121590 4835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-c467569cb-l992d_openshift-operators_f26c226a-c8fa-4ff7-bda6-7eece297dd86_0(89af1deaae626abc29267438c391559f3a9e2e28b4a34742f6c4e1fe0e0f2ef2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-l992d" Mar 19 09:36:19 crc kubenswrapper[4835]: E0319 09:36:19.121661 4835 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-c467569cb-l992d_openshift-operators_f26c226a-c8fa-4ff7-bda6-7eece297dd86_0(89af1deaae626abc29267438c391559f3a9e2e28b4a34742f6c4e1fe0e0f2ef2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-l992d" Mar 19 09:36:19 crc kubenswrapper[4835]: E0319 09:36:19.121769 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-c467569cb-l992d_openshift-operators(f26c226a-c8fa-4ff7-bda6-7eece297dd86)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-c467569cb-l992d_openshift-operators(f26c226a-c8fa-4ff7-bda6-7eece297dd86)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-c467569cb-l992d_openshift-operators_f26c226a-c8fa-4ff7-bda6-7eece297dd86_0(89af1deaae626abc29267438c391559f3a9e2e28b4a34742f6c4e1fe0e0f2ef2): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-l992d" podUID="f26c226a-c8fa-4ff7-bda6-7eece297dd86" Mar 19 09:36:19 crc kubenswrapper[4835]: I0319 09:36:19.209645 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/41d1090f-7ab4-4820-a742-dca791692d0f-apiservice-cert\") pod \"perses-operator-54476d58cc-x6mtx\" (UID: \"41d1090f-7ab4-4820-a742-dca791692d0f\") " pod="openshift-operators/perses-operator-54476d58cc-x6mtx" Mar 19 09:36:19 crc kubenswrapper[4835]: I0319 09:36:19.209709 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/41d1090f-7ab4-4820-a742-dca791692d0f-webhook-cert\") pod \"perses-operator-54476d58cc-x6mtx\" (UID: \"41d1090f-7ab4-4820-a742-dca791692d0f\") " pod="openshift-operators/perses-operator-54476d58cc-x6mtx" Mar 19 09:36:19 crc kubenswrapper[4835]: I0319 09:36:19.209783 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drlr7\" (UniqueName: \"kubernetes.io/projected/41d1090f-7ab4-4820-a742-dca791692d0f-kube-api-access-drlr7\") pod \"perses-operator-54476d58cc-x6mtx\" (UID: \"41d1090f-7ab4-4820-a742-dca791692d0f\") " pod="openshift-operators/perses-operator-54476d58cc-x6mtx" Mar 19 09:36:19 crc kubenswrapper[4835]: I0319 09:36:19.209816 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/41d1090f-7ab4-4820-a742-dca791692d0f-openshift-service-ca\") pod \"perses-operator-54476d58cc-x6mtx\" (UID: \"41d1090f-7ab4-4820-a742-dca791692d0f\") " pod="openshift-operators/perses-operator-54476d58cc-x6mtx" Mar 19 09:36:19 crc kubenswrapper[4835]: I0319 09:36:19.210834 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/41d1090f-7ab4-4820-a742-dca791692d0f-openshift-service-ca\") pod \"perses-operator-54476d58cc-x6mtx\" (UID: \"41d1090f-7ab4-4820-a742-dca791692d0f\") " pod="openshift-operators/perses-operator-54476d58cc-x6mtx" Mar 19 09:36:19 crc kubenswrapper[4835]: I0319 09:36:19.215188 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/41d1090f-7ab4-4820-a742-dca791692d0f-apiservice-cert\") pod \"perses-operator-54476d58cc-x6mtx\" (UID: \"41d1090f-7ab4-4820-a742-dca791692d0f\") " pod="openshift-operators/perses-operator-54476d58cc-x6mtx" Mar 19 09:36:19 crc kubenswrapper[4835]: I0319 09:36:19.215918 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/41d1090f-7ab4-4820-a742-dca791692d0f-webhook-cert\") pod \"perses-operator-54476d58cc-x6mtx\" (UID: \"41d1090f-7ab4-4820-a742-dca791692d0f\") " pod="openshift-operators/perses-operator-54476d58cc-x6mtx" Mar 19 09:36:19 crc kubenswrapper[4835]: I0319 09:36:19.244377 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drlr7\" (UniqueName: \"kubernetes.io/projected/41d1090f-7ab4-4820-a742-dca791692d0f-kube-api-access-drlr7\") pod \"perses-operator-54476d58cc-x6mtx\" (UID: \"41d1090f-7ab4-4820-a742-dca791692d0f\") " pod="openshift-operators/perses-operator-54476d58cc-x6mtx" Mar 19 09:36:19 crc kubenswrapper[4835]: I0319 09:36:19.352361 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54476d58cc-x6mtx" Mar 19 09:36:19 crc kubenswrapper[4835]: E0319 09:36:19.376456 4835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54476d58cc-x6mtx_openshift-operators_41d1090f-7ab4-4820-a742-dca791692d0f_0(b32ef4b32456b80a58828e399e245362ce30f1bab6394f283b6e2d625b41091d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 09:36:19 crc kubenswrapper[4835]: E0319 09:36:19.376826 4835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54476d58cc-x6mtx_openshift-operators_41d1090f-7ab4-4820-a742-dca791692d0f_0(b32ef4b32456b80a58828e399e245362ce30f1bab6394f283b6e2d625b41091d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-54476d58cc-x6mtx" Mar 19 09:36:19 crc kubenswrapper[4835]: E0319 09:36:19.376913 4835 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54476d58cc-x6mtx_openshift-operators_41d1090f-7ab4-4820-a742-dca791692d0f_0(b32ef4b32456b80a58828e399e245362ce30f1bab6394f283b6e2d625b41091d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-54476d58cc-x6mtx" Mar 19 09:36:19 crc kubenswrapper[4835]: E0319 09:36:19.377040 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-54476d58cc-x6mtx_openshift-operators(41d1090f-7ab4-4820-a742-dca791692d0f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-54476d58cc-x6mtx_openshift-operators(41d1090f-7ab4-4820-a742-dca791692d0f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54476d58cc-x6mtx_openshift-operators_41d1090f-7ab4-4820-a742-dca791692d0f_0(b32ef4b32456b80a58828e399e245362ce30f1bab6394f283b6e2d625b41091d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-54476d58cc-x6mtx" podUID="41d1090f-7ab4-4820-a742-dca791692d0f" Mar 19 09:36:19 crc kubenswrapper[4835]: I0319 09:36:19.949951 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54476d58cc-x6mtx" Mar 19 09:36:19 crc kubenswrapper[4835]: I0319 09:36:19.949988 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n" Mar 19 09:36:19 crc kubenswrapper[4835]: I0319 09:36:19.950365 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54476d58cc-x6mtx" Mar 19 09:36:19 crc kubenswrapper[4835]: I0319 09:36:19.950666 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n" Mar 19 09:36:19 crc kubenswrapper[4835]: E0319 09:36:19.988402 4835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-qsz5n_openshift-operators_95fe9c35-69f6-4b60-a725-c2f0d8a34c99_0(0e50c67801d9ab16965ca5bbd4bd2469696ed26ffe7748dda7b868699452fd05): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 09:36:19 crc kubenswrapper[4835]: E0319 09:36:19.988478 4835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-qsz5n_openshift-operators_95fe9c35-69f6-4b60-a725-c2f0d8a34c99_0(0e50c67801d9ab16965ca5bbd4bd2469696ed26ffe7748dda7b868699452fd05): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n" Mar 19 09:36:19 crc kubenswrapper[4835]: E0319 09:36:19.988512 4835 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-qsz5n_openshift-operators_95fe9c35-69f6-4b60-a725-c2f0d8a34c99_0(0e50c67801d9ab16965ca5bbd4bd2469696ed26ffe7748dda7b868699452fd05): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n" Mar 19 09:36:19 crc kubenswrapper[4835]: E0319 09:36:19.988559 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-6dd7dd855f-qsz5n_openshift-operators(95fe9c35-69f6-4b60-a725-c2f0d8a34c99)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-6dd7dd855f-qsz5n_openshift-operators(95fe9c35-69f6-4b60-a725-c2f0d8a34c99)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-qsz5n_openshift-operators_95fe9c35-69f6-4b60-a725-c2f0d8a34c99_0(0e50c67801d9ab16965ca5bbd4bd2469696ed26ffe7748dda7b868699452fd05): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n" podUID="95fe9c35-69f6-4b60-a725-c2f0d8a34c99" Mar 19 09:36:20 crc kubenswrapper[4835]: E0319 09:36:19.999972 4835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54476d58cc-x6mtx_openshift-operators_41d1090f-7ab4-4820-a742-dca791692d0f_0(274f7dffc9886a01625badaa2b67d971708e324d07755625ca7a61e45782b9af): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 09:36:20 crc kubenswrapper[4835]: E0319 09:36:20.000033 4835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54476d58cc-x6mtx_openshift-operators_41d1090f-7ab4-4820-a742-dca791692d0f_0(274f7dffc9886a01625badaa2b67d971708e324d07755625ca7a61e45782b9af): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-54476d58cc-x6mtx"
Mar 19 09:36:20 crc kubenswrapper[4835]: E0319 09:36:20.000059 4835 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54476d58cc-x6mtx_openshift-operators_41d1090f-7ab4-4820-a742-dca791692d0f_0(274f7dffc9886a01625badaa2b67d971708e324d07755625ca7a61e45782b9af): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-54476d58cc-x6mtx"
Mar 19 09:36:20 crc kubenswrapper[4835]: E0319 09:36:20.000111 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-54476d58cc-x6mtx_openshift-operators(41d1090f-7ab4-4820-a742-dca791692d0f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-54476d58cc-x6mtx_openshift-operators(41d1090f-7ab4-4820-a742-dca791692d0f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-54476d58cc-x6mtx_openshift-operators_41d1090f-7ab4-4820-a742-dca791692d0f_0(274f7dffc9886a01625badaa2b67d971708e324d07755625ca7a61e45782b9af): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-54476d58cc-x6mtx" podUID="41d1090f-7ab4-4820-a742-dca791692d0f"
Mar 19 09:36:22 crc kubenswrapper[4835]: I0319 09:36:22.402258 4835 scope.go:117] "RemoveContainer" containerID="e9fa86028b1f00abb5fe0bde1d574b1a2d6aae726bddff744f4b1ac500cc935d"
Mar 19 09:36:22 crc kubenswrapper[4835]: I0319 09:36:22.969589 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jl5x4_4ee35aaa-2819-432a-af95-f1078ad836fc/kube-multus/2.log"
Mar 19 09:36:22 crc kubenswrapper[4835]: I0319 09:36:22.970027 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jl5x4" event={"ID":"4ee35aaa-2819-432a-af95-f1078ad836fc","Type":"ContainerStarted","Data":"96769bd13eb8203aefc7373e2d9f18fffc437b97e77a2f4129d037de7f1bdfc9"}
Mar 19 09:36:26 crc kubenswrapper[4835]: I0319 09:36:26.961278 4835 scope.go:117] "RemoveContainer" containerID="c9700b6f756c49fd4c80814e7ec8b3968b3a1a4a060af3366cfbcde9afe056d0"
Mar 19 09:36:26 crc kubenswrapper[4835]: I0319 09:36:26.979068 4835 scope.go:117] "RemoveContainer" containerID="796b2dcca72564cdb668d41bb2a3de94b9a051b6d3cc6cff1abf790a290f2183"
Mar 19 09:36:26 crc kubenswrapper[4835]: I0319 09:36:26.993158 4835 scope.go:117] "RemoveContainer" containerID="d46be6ef8ac3fa0e2aeeff1fb29f9aeb82b6ffed9557e4acb6ce29be721d314f"
Mar 19 09:36:26 crc kubenswrapper[4835]: E0319 09:36:26.994658 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"796b2dcca72564cdb668d41bb2a3de94b9a051b6d3cc6cff1abf790a290f2183\": container with ID starting with 796b2dcca72564cdb668d41bb2a3de94b9a051b6d3cc6cff1abf790a290f2183 not found: ID does not exist" containerID="796b2dcca72564cdb668d41bb2a3de94b9a051b6d3cc6cff1abf790a290f2183"
Mar 19 09:36:26 crc kubenswrapper[4835]: I0319 09:36:26.995261 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qk6hn_2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b/ovn-acl-logging/0.log"
Mar 19 09:36:26 crc kubenswrapper[4835]: I0319 09:36:26.995694 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qk6hn_2e27fcad-b6f8-4ce8-9f2b-e112f8ae138b/ovn-controller/0.log"
Mar 19 09:36:27 crc kubenswrapper[4835]: I0319 09:36:27.052497 4835 scope.go:117] "RemoveContainer" containerID="f7beea1efc6f7ae2f60e690b578a0dc2d1eeda3a71590efa41f1df002699cf3c"
Mar 19 09:36:27 crc kubenswrapper[4835]: I0319 09:36:27.077285 4835 scope.go:117] "RemoveContainer" containerID="c8db88b60673fb888e49ff1dcb9416a59ad4524a2a763f1be911053ceca827fd"
Mar 19 09:36:27 crc kubenswrapper[4835]: I0319 09:36:27.091845 4835 scope.go:117] "RemoveContainer" containerID="8df0c2ec899dfefd835123bae49121907eb0ef122be26a92587f324e9cd8ada7"
Mar 19 09:36:27 crc kubenswrapper[4835]: I0319 09:36:27.107313 4835 scope.go:117] "RemoveContainer" containerID="b1c96a9fe91a913dd283c69748d721f2e56bb5d66e7a8b9cec126b7f24d55a6a"
Mar 19 09:36:27 crc kubenswrapper[4835]: I0319 09:36:27.138464 4835 scope.go:117] "RemoveContainer" containerID="30d47a6b8406ac16d60bcc10c7e20efa6771c67100f3c8357c5a9a2794ccdcba"
Mar 19 09:36:27 crc kubenswrapper[4835]: I0319 09:36:27.181173 4835 scope.go:117] "RemoveContainer" containerID="fcb093569be5a825491f56bf1b08d66fe7fed5c8e244231dd7f79005cbe0c852"
Mar 19 09:36:27 crc kubenswrapper[4835]: I0319 09:36:27.196912 4835 scope.go:117] "RemoveContainer" containerID="0f876fbdfacfa3d71e2cb5481c3cc94d66376d60260a5ec90e6700979a76b0a0"
Mar 19 09:36:27 crc kubenswrapper[4835]: I0319 09:36:27.215753 4835 scope.go:117] "RemoveContainer" containerID="63e7dc755b5798310eac36921fcd00df32b34ba4b6088b84297f5966f6a1bb2c"
Mar 19 09:36:30 crc kubenswrapper[4835]: I0319 09:36:30.401112 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-l992d"
Mar 19 09:36:30 crc kubenswrapper[4835]: I0319 09:36:30.402152 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-l992d"
Mar 19 09:36:30 crc kubenswrapper[4835]: I0319 09:36:30.629732 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-l992d"]
Mar 19 09:36:31 crc kubenswrapper[4835]: I0319 09:36:31.018610 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-l992d" event={"ID":"f26c226a-c8fa-4ff7-bda6-7eece297dd86","Type":"ContainerStarted","Data":"4837e007563ee9ed9471664dbdd5d876906ebedb56243ed58f4c9638063c6734"}
Mar 19 09:36:32 crc kubenswrapper[4835]: I0319 09:36:32.401942 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-4zx5b"
Mar 19 09:36:32 crc kubenswrapper[4835]: I0319 09:36:32.402211 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-mbzhh"
Mar 19 09:36:32 crc kubenswrapper[4835]: I0319 09:36:32.402823 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-4zx5b"
Mar 19 09:36:32 crc kubenswrapper[4835]: I0319 09:36:32.402869 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-mbzhh"
Mar 19 09:36:32 crc kubenswrapper[4835]: I0319 09:36:32.955885 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-mbzhh"]
Mar 19 09:36:32 crc kubenswrapper[4835]: I0319 09:36:32.962365 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-4zx5b"]
Mar 19 09:36:32 crc kubenswrapper[4835]: W0319 09:36:32.972791 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod210e8b64_36bc_4abd_8620_43093946855b.slice/crio-eab2e88b39f9aaf99d237b659fb19544f21e6792b49f19474a3678d54fc94752 WatchSource:0}: Error finding container eab2e88b39f9aaf99d237b659fb19544f21e6792b49f19474a3678d54fc94752: Status 404 returned error can't find the container with id eab2e88b39f9aaf99d237b659fb19544f21e6792b49f19474a3678d54fc94752
Mar 19 09:36:32 crc kubenswrapper[4835]: W0319 09:36:32.980100 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73f088be_6bad_4a18_8353_10475ad7105d.slice/crio-1b61d358626bb3b401dd78e76833b65572207b6918ee406e6bba05b0d06053b0 WatchSource:0}: Error finding container 1b61d358626bb3b401dd78e76833b65572207b6918ee406e6bba05b0d06053b0: Status 404 returned error can't find the container with id 1b61d358626bb3b401dd78e76833b65572207b6918ee406e6bba05b0d06053b0
Mar 19 09:36:33 crc kubenswrapper[4835]: I0319 09:36:33.037613 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-4zx5b" event={"ID":"73f088be-6bad-4a18-8353-10475ad7105d","Type":"ContainerStarted","Data":"1b61d358626bb3b401dd78e76833b65572207b6918ee406e6bba05b0d06053b0"}
Mar 19 09:36:33 crc kubenswrapper[4835]: I0319 09:36:33.042348 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-mbzhh" event={"ID":"210e8b64-36bc-4abd-8620-43093946855b","Type":"ContainerStarted","Data":"eab2e88b39f9aaf99d237b659fb19544f21e6792b49f19474a3678d54fc94752"}
Mar 19 09:36:33 crc kubenswrapper[4835]: I0319 09:36:33.401678 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54476d58cc-x6mtx"
Mar 19 09:36:33 crc kubenswrapper[4835]: I0319 09:36:33.402145 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54476d58cc-x6mtx"
Mar 19 09:36:33 crc kubenswrapper[4835]: I0319 09:36:33.753928 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54476d58cc-x6mtx"]
Mar 19 09:36:33 crc kubenswrapper[4835]: I0319 09:36:33.800869 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 19 09:36:34 crc kubenswrapper[4835]: I0319 09:36:34.062842 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54476d58cc-x6mtx" event={"ID":"41d1090f-7ab4-4820-a742-dca791692d0f","Type":"ContainerStarted","Data":"dd17de1beca7f13f042f968edfbd8a9586125c3d57b5abf10aabe2a894ef1c40"}
Mar 19 09:36:35 crc kubenswrapper[4835]: I0319 09:36:35.402231 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n"
Mar 19 09:36:35 crc kubenswrapper[4835]: I0319 09:36:35.402423 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n"
Mar 19 09:36:36 crc kubenswrapper[4835]: I0319 09:36:36.428268 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 09:36:36 crc kubenswrapper[4835]: I0319 09:36:36.429122 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 09:36:37 crc kubenswrapper[4835]: I0319 09:36:37.067659 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-qsz5n"]
Mar 19 09:36:37 crc kubenswrapper[4835]: I0319 09:36:37.122816 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-4zx5b" event={"ID":"73f088be-6bad-4a18-8353-10475ad7105d","Type":"ContainerStarted","Data":"4ed560cc508e1eab8ec7652e7fdb4c2e67b05226672e9fa7ea3bfced98d0ef54"}
Mar 19 09:36:37 crc kubenswrapper[4835]: I0319 09:36:37.143169 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-l992d" event={"ID":"f26c226a-c8fa-4ff7-bda6-7eece297dd86","Type":"ContainerStarted","Data":"c04c082d963573e693e1affaad28993e7f3ba844fff1f88be30f72913e6c7128"}
Mar 19 09:36:37 crc kubenswrapper[4835]: I0319 09:36:37.152236 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-4zx5b" podStartSLOduration=15.408726565 podStartE2EDuration="19.152219938s" podCreationTimestamp="2026-03-19 09:36:18 +0000 UTC" firstStartedPulling="2026-03-19 09:36:32.987799698 +0000 UTC m=+847.836398285" lastFinishedPulling="2026-03-19 09:36:36.731293071 +0000 UTC m=+851.579891658" observedRunningTime="2026-03-19 09:36:37.151512968 +0000 UTC m=+852.000111565" watchObservedRunningTime="2026-03-19 09:36:37.152219938 +0000 UTC m=+852.000818525"
Mar 19 09:36:37 crc kubenswrapper[4835]: I0319 09:36:37.203582 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-c467569cb-l992d" podStartSLOduration=13.138972438 podStartE2EDuration="19.203563092s" podCreationTimestamp="2026-03-19 09:36:18 +0000 UTC" firstStartedPulling="2026-03-19 09:36:30.642615185 +0000 UTC m=+845.491213772" lastFinishedPulling="2026-03-19 09:36:36.707205839 +0000 UTC m=+851.555804426" observedRunningTime="2026-03-19 09:36:37.193671058 +0000 UTC m=+852.042269665" watchObservedRunningTime="2026-03-19 09:36:37.203563092 +0000 UTC m=+852.052161679"
Mar 19 09:36:38 crc kubenswrapper[4835]: W0319 09:36:38.018549 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95fe9c35_69f6_4b60_a725_c2f0d8a34c99.slice/crio-3160b6f7237988f4bf4494fc878e64c51361cfb216f3eb4ae6e04664ec6b1eb9 WatchSource:0}: Error finding container 3160b6f7237988f4bf4494fc878e64c51361cfb216f3eb4ae6e04664ec6b1eb9: Status 404 returned error can't find the container with id 3160b6f7237988f4bf4494fc878e64c51361cfb216f3eb4ae6e04664ec6b1eb9
Mar 19 09:36:38 crc kubenswrapper[4835]: I0319 09:36:38.151022 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n" event={"ID":"95fe9c35-69f6-4b60-a725-c2f0d8a34c99","Type":"ContainerStarted","Data":"3160b6f7237988f4bf4494fc878e64c51361cfb216f3eb4ae6e04664ec6b1eb9"}
Mar 19 09:36:39 crc kubenswrapper[4835]: I0319 09:36:39.157526 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54476d58cc-x6mtx" event={"ID":"41d1090f-7ab4-4820-a742-dca791692d0f","Type":"ContainerStarted","Data":"deea10ce425b259a93ad497237b4fa11de1972aead7a53284398772effada227"}
Mar 19 09:36:39 crc kubenswrapper[4835]: I0319 09:36:39.157931 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54476d58cc-x6mtx"
Mar 19 09:36:39 crc kubenswrapper[4835]: I0319 09:36:39.180483 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54476d58cc-x6mtx" podStartSLOduration=16.870021475 podStartE2EDuration="21.180447148s" podCreationTimestamp="2026-03-19 09:36:18 +0000 UTC" firstStartedPulling="2026-03-19 09:36:33.800536925 +0000 UTC m=+848.649135522" lastFinishedPulling="2026-03-19 09:36:38.110962608 +0000 UTC m=+852.959561195" observedRunningTime="2026-03-19 09:36:39.178884974 +0000 UTC m=+854.027483561" watchObservedRunningTime="2026-03-19 09:36:39.180447148 +0000 UTC m=+854.029045735"
Mar 19 09:36:40 crc kubenswrapper[4835]: I0319 09:36:40.164251 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-mbzhh" event={"ID":"210e8b64-36bc-4abd-8620-43093946855b","Type":"ContainerStarted","Data":"dd2113703069e246521889bbcfd847dc54afcd06c5672d7651f4330c15120c64"}
Mar 19 09:36:40 crc kubenswrapper[4835]: I0319 09:36:40.185464 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-8ff7d675-mbzhh" podStartSLOduration=16.576028778 podStartE2EDuration="23.185441996s" podCreationTimestamp="2026-03-19 09:36:17 +0000 UTC" firstStartedPulling="2026-03-19 09:36:32.977934254 +0000 UTC m=+847.826532841" lastFinishedPulling="2026-03-19 09:36:39.587347472 +0000 UTC m=+854.435946059" observedRunningTime="2026-03-19 09:36:40.179902837 +0000 UTC m=+855.028501434" watchObservedRunningTime="2026-03-19 09:36:40.185441996 +0000 UTC m=+855.034040583"
Mar 19 09:36:41 crc kubenswrapper[4835]: I0319 09:36:41.381463 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w"
Mar 19 09:36:44 crc kubenswrapper[4835]: I0319 09:36:44.191551 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n" event={"ID":"95fe9c35-69f6-4b60-a725-c2f0d8a34c99","Type":"ContainerStarted","Data":"18e2ca242ad34c1140edf13b90a427133c3ebb9b0d73b6c4d2b51b8adcc39204"}
Mar 19 09:36:44 crc kubenswrapper[4835]: I0319 09:36:44.192021 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n"
Mar 19 09:36:44 crc kubenswrapper[4835]: I0319 09:36:44.194022 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n"
Mar 19 09:36:44 crc kubenswrapper[4835]: I0319 09:36:44.217424 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n" podStartSLOduration=20.925356074 podStartE2EDuration="26.217408523s" podCreationTimestamp="2026-03-19 09:36:18 +0000 UTC" firstStartedPulling="2026-03-19 09:36:38.027235523 +0000 UTC m=+852.875834111" lastFinishedPulling="2026-03-19 09:36:43.319287973 +0000 UTC m=+858.167886560" observedRunningTime="2026-03-19 09:36:44.21489003 +0000 UTC m=+859.063488617" watchObservedRunningTime="2026-03-19 09:36:44.217408523 +0000 UTC m=+859.066007110"
Mar 19 09:36:44 crc kubenswrapper[4835]: I0319 09:36:44.883274 4835 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 19 09:36:49 crc kubenswrapper[4835]: I0319 09:36:49.356939 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54476d58cc-x6mtx"
Mar 19 09:36:52 crc kubenswrapper[4835]: I0319 09:36:52.563036 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-s8pcl"]
Mar 19 09:36:52 crc kubenswrapper[4835]: I0319 09:36:52.564235 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-s8pcl"
Mar 19 09:36:52 crc kubenswrapper[4835]: I0319 09:36:52.568897 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Mar 19 09:36:52 crc kubenswrapper[4835]: I0319 09:36:52.568899 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Mar 19 09:36:52 crc kubenswrapper[4835]: I0319 09:36:52.569201 4835 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-csjjw"
Mar 19 09:36:52 crc kubenswrapper[4835]: I0319 09:36:52.570043 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-8pv2v"]
Mar 19 09:36:52 crc kubenswrapper[4835]: I0319 09:36:52.570888 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-8pv2v"
Mar 19 09:36:52 crc kubenswrapper[4835]: I0319 09:36:52.572200 4835 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-vrd47"
Mar 19 09:36:52 crc kubenswrapper[4835]: I0319 09:36:52.575866 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-s8pcl"]
Mar 19 09:36:52 crc kubenswrapper[4835]: I0319 09:36:52.592455 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-rd6zz"]
Mar 19 09:36:52 crc kubenswrapper[4835]: I0319 09:36:52.593273 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-rd6zz"
Mar 19 09:36:52 crc kubenswrapper[4835]: I0319 09:36:52.600945 4835 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-jw4v9"
Mar 19 09:36:52 crc kubenswrapper[4835]: I0319 09:36:52.614025 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-8pv2v"]
Mar 19 09:36:52 crc kubenswrapper[4835]: I0319 09:36:52.626821 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-rd6zz"]
Mar 19 09:36:52 crc kubenswrapper[4835]: I0319 09:36:52.631052 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm4zv\" (UniqueName: \"kubernetes.io/projected/f0524d82-6920-4648-a4a8-d25a5c5a75f3-kube-api-access-wm4zv\") pod \"cert-manager-858654f9db-s8pcl\" (UID: \"f0524d82-6920-4648-a4a8-d25a5c5a75f3\") " pod="cert-manager/cert-manager-858654f9db-s8pcl"
Mar 19 09:36:52 crc kubenswrapper[4835]: I0319 09:36:52.631126 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-492rd\" (UniqueName: \"kubernetes.io/projected/e4392ff6-094c-408b-aa06-c6b6133c9941-kube-api-access-492rd\") pod \"cert-manager-cainjector-cf98fcc89-8pv2v\" (UID: \"e4392ff6-094c-408b-aa06-c6b6133c9941\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-8pv2v"
Mar 19 09:36:52 crc kubenswrapper[4835]: I0319 09:36:52.631164 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96w28\" (UniqueName: \"kubernetes.io/projected/3cb1a290-5e83-4b84-80ab-a7d86e85ce98-kube-api-access-96w28\") pod \"cert-manager-webhook-687f57d79b-rd6zz\" (UID: \"3cb1a290-5e83-4b84-80ab-a7d86e85ce98\") " pod="cert-manager/cert-manager-webhook-687f57d79b-rd6zz"
Mar 19 09:36:52 crc kubenswrapper[4835]: I0319 09:36:52.732704 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96w28\" (UniqueName: \"kubernetes.io/projected/3cb1a290-5e83-4b84-80ab-a7d86e85ce98-kube-api-access-96w28\") pod \"cert-manager-webhook-687f57d79b-rd6zz\" (UID: \"3cb1a290-5e83-4b84-80ab-a7d86e85ce98\") " pod="cert-manager/cert-manager-webhook-687f57d79b-rd6zz"
Mar 19 09:36:52 crc kubenswrapper[4835]: I0319 09:36:52.732809 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm4zv\" (UniqueName: \"kubernetes.io/projected/f0524d82-6920-4648-a4a8-d25a5c5a75f3-kube-api-access-wm4zv\") pod \"cert-manager-858654f9db-s8pcl\" (UID: \"f0524d82-6920-4648-a4a8-d25a5c5a75f3\") " pod="cert-manager/cert-manager-858654f9db-s8pcl"
Mar 19 09:36:52 crc kubenswrapper[4835]: I0319 09:36:52.732887 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-492rd\" (UniqueName: \"kubernetes.io/projected/e4392ff6-094c-408b-aa06-c6b6133c9941-kube-api-access-492rd\") pod \"cert-manager-cainjector-cf98fcc89-8pv2v\" (UID: \"e4392ff6-094c-408b-aa06-c6b6133c9941\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-8pv2v"
Mar 19 09:36:52 crc kubenswrapper[4835]: I0319 09:36:52.754209 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96w28\" (UniqueName: \"kubernetes.io/projected/3cb1a290-5e83-4b84-80ab-a7d86e85ce98-kube-api-access-96w28\") pod \"cert-manager-webhook-687f57d79b-rd6zz\" (UID: \"3cb1a290-5e83-4b84-80ab-a7d86e85ce98\") " pod="cert-manager/cert-manager-webhook-687f57d79b-rd6zz"
Mar 19 09:36:52 crc kubenswrapper[4835]: I0319 09:36:52.754737 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm4zv\" (UniqueName: \"kubernetes.io/projected/f0524d82-6920-4648-a4a8-d25a5c5a75f3-kube-api-access-wm4zv\") pod \"cert-manager-858654f9db-s8pcl\" (UID: \"f0524d82-6920-4648-a4a8-d25a5c5a75f3\") " pod="cert-manager/cert-manager-858654f9db-s8pcl"
Mar 19 09:36:52 crc kubenswrapper[4835]: I0319 09:36:52.756195 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-492rd\" (UniqueName: \"kubernetes.io/projected/e4392ff6-094c-408b-aa06-c6b6133c9941-kube-api-access-492rd\") pod \"cert-manager-cainjector-cf98fcc89-8pv2v\" (UID: \"e4392ff6-094c-408b-aa06-c6b6133c9941\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-8pv2v"
Mar 19 09:36:52 crc kubenswrapper[4835]: I0319 09:36:52.889346 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-s8pcl"
Mar 19 09:36:52 crc kubenswrapper[4835]: I0319 09:36:52.909893 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-8pv2v"
Mar 19 09:36:52 crc kubenswrapper[4835]: I0319 09:36:52.924529 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-rd6zz"
Mar 19 09:36:53 crc kubenswrapper[4835]: I0319 09:36:53.195432 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-8pv2v"]
Mar 19 09:36:53 crc kubenswrapper[4835]: I0319 09:36:53.249800 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-8pv2v" event={"ID":"e4392ff6-094c-408b-aa06-c6b6133c9941","Type":"ContainerStarted","Data":"e4142394ca86ac94720c96340e147002598d060424a5003fa3b857498675a3af"}
Mar 19 09:36:53 crc kubenswrapper[4835]: I0319 09:36:53.432692 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-s8pcl"]
Mar 19 09:36:53 crc kubenswrapper[4835]: W0319 09:36:53.438970 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0524d82_6920_4648_a4a8_d25a5c5a75f3.slice/crio-4d97dc245f8c26fcf051c271e4ab6ea41b15a553033d981f8835c25a3d47ef94 WatchSource:0}: Error finding container 4d97dc245f8c26fcf051c271e4ab6ea41b15a553033d981f8835c25a3d47ef94: Status 404 returned error can't find the container with id 4d97dc245f8c26fcf051c271e4ab6ea41b15a553033d981f8835c25a3d47ef94
Mar 19 09:36:53 crc kubenswrapper[4835]: I0319 09:36:53.462508 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-rd6zz"]
Mar 19 09:36:54 crc kubenswrapper[4835]: I0319 09:36:54.260682 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-rd6zz" event={"ID":"3cb1a290-5e83-4b84-80ab-a7d86e85ce98","Type":"ContainerStarted","Data":"0accc2ef2f70ed0abc65f000b1060e890ff6ad360c90ace3bc9f8fdcbff00fce"}
Mar 19 09:36:54 crc kubenswrapper[4835]: I0319 09:36:54.263455 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-s8pcl" event={"ID":"f0524d82-6920-4648-a4a8-d25a5c5a75f3","Type":"ContainerStarted","Data":"4d97dc245f8c26fcf051c271e4ab6ea41b15a553033d981f8835c25a3d47ef94"}
Mar 19 09:36:56 crc kubenswrapper[4835]: I0319 09:36:56.276763 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-8pv2v" event={"ID":"e4392ff6-094c-408b-aa06-c6b6133c9941","Type":"ContainerStarted","Data":"4a6ba535a5f6a8095897113e2efcd54a9b030310af5542f982310ba225d5fe7b"}
Mar 19 09:36:56 crc kubenswrapper[4835]: I0319 09:36:56.303124 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-8pv2v" podStartSLOduration=1.670266 podStartE2EDuration="4.303108861s" podCreationTimestamp="2026-03-19 09:36:52 +0000 UTC" firstStartedPulling="2026-03-19 09:36:53.204799914 +0000 UTC m=+868.053398501" lastFinishedPulling="2026-03-19 09:36:55.837642775 +0000 UTC m=+870.686241362" observedRunningTime="2026-03-19 09:36:56.299093456 +0000 UTC m=+871.147692033" watchObservedRunningTime="2026-03-19 09:36:56.303108861 +0000 UTC m=+871.151707448"
Mar 19 09:36:58 crc kubenswrapper[4835]: I0319 09:36:58.290249 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-s8pcl" event={"ID":"f0524d82-6920-4648-a4a8-d25a5c5a75f3","Type":"ContainerStarted","Data":"f4fd4b1a827a69c2dc9f653b1d60afebe89c616f1c27cfb90da1103705eda5ed"}
Mar 19 09:36:58 crc kubenswrapper[4835]: I0319 09:36:58.306445 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-s8pcl" podStartSLOduration=1.69571293 podStartE2EDuration="6.306425796s" podCreationTimestamp="2026-03-19 09:36:52 +0000 UTC" firstStartedPulling="2026-03-19 09:36:53.441446289 +0000 UTC m=+868.290044876" lastFinishedPulling="2026-03-19 09:36:58.052159155 +0000 UTC m=+872.900757742" observedRunningTime="2026-03-19 09:36:58.303371908 +0000 UTC m=+873.151970515" watchObservedRunningTime="2026-03-19 09:36:58.306425796 +0000 UTC m=+873.155024383"
Mar 19 09:36:59 crc kubenswrapper[4835]: I0319 09:36:59.299364 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-rd6zz" event={"ID":"3cb1a290-5e83-4b84-80ab-a7d86e85ce98","Type":"ContainerStarted","Data":"63f718cf12490119880352552b3fa1ebc2a1d7df50613fd5f3461e45e839ec1b"}
Mar 19 09:36:59 crc kubenswrapper[4835]: I0319 09:36:59.317724 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-rd6zz" podStartSLOduration=2.739339547 podStartE2EDuration="7.317705364s" podCreationTimestamp="2026-03-19 09:36:52 +0000 UTC" firstStartedPulling="2026-03-19 09:36:53.46620655 +0000 UTC m=+868.314805147" lastFinishedPulling="2026-03-19 09:36:58.044572377 +0000 UTC m=+872.893170964" observedRunningTime="2026-03-19 09:36:59.314983876 +0000 UTC m=+874.163582493" watchObservedRunningTime="2026-03-19 09:36:59.317705364 +0000 UTC m=+874.166303961"
Mar 19 09:37:00 crc kubenswrapper[4835]: I0319 09:37:00.306085 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-rd6zz"
Mar 19 09:37:06 crc kubenswrapper[4835]: I0319 09:37:06.422717 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 09:37:06 crc kubenswrapper[4835]: I0319 09:37:06.423639 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 09:37:06 crc kubenswrapper[4835]: I0319 09:37:06.423703 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bk84k"
Mar 19 09:37:06 crc kubenswrapper[4835]: I0319 09:37:06.424696 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8048ba6999fcc9a0f2b426a9033f119cd447f58778521ba0373e7a0bc81270c5"} pod="openshift-machine-config-operator/machine-config-daemon-bk84k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 19 09:37:06 crc kubenswrapper[4835]: I0319 09:37:06.424824 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" containerID="cri-o://8048ba6999fcc9a0f2b426a9033f119cd447f58778521ba0373e7a0bc81270c5" gracePeriod=600
Mar 19 09:37:07 crc kubenswrapper[4835]: I0319 09:37:07.359577 4835 generic.go:334] "Generic (PLEG): container finished" podID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerID="8048ba6999fcc9a0f2b426a9033f119cd447f58778521ba0373e7a0bc81270c5" exitCode=0
Mar 19 09:37:07 crc kubenswrapper[4835]: I0319 09:37:07.359616 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerDied","Data":"8048ba6999fcc9a0f2b426a9033f119cd447f58778521ba0373e7a0bc81270c5"}
Mar 19 09:37:07 crc kubenswrapper[4835]: I0319 09:37:07.359957 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerStarted","Data":"ad7bf0e681d0b5c56ea72d8f084e643a38eb7b2896b2748995289d8ab657401b"}
Mar 19 09:37:07 crc kubenswrapper[4835]: I0319 09:37:07.359974 4835 scope.go:117] "RemoveContainer" containerID="77af185870c05156a519ecf05e5136747068885eeb6e51d17ba26ef920e74b4d"
Mar 19 09:37:07 crc kubenswrapper[4835]: I0319 09:37:07.927438 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-rd6zz"
Mar 19 09:37:37 crc kubenswrapper[4835]: I0319 09:37:37.782239 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf"]
Mar 19 09:37:37 crc kubenswrapper[4835]: I0319 09:37:37.783873 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf"
Mar 19 09:37:37 crc kubenswrapper[4835]: I0319 09:37:37.785823 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 19 09:37:37 crc kubenswrapper[4835]: I0319 09:37:37.796035 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf"]
Mar 19 09:37:37 crc kubenswrapper[4835]: I0319 09:37:37.916466 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a59f270d-4331-4428-a35a-5426a9cb3676-bundle\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf\" (UID: \"a59f270d-4331-4428-a35a-5426a9cb3676\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf"
Mar 19 09:37:37 crc kubenswrapper[4835]: I0319 09:37:37.916547 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a59f270d-4331-4428-a35a-5426a9cb3676-util\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf\" (UID: \"a59f270d-4331-4428-a35a-5426a9cb3676\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf"
Mar 19 09:37:37 crc kubenswrapper[4835]: I0319 09:37:37.916648 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h89c9\" (UniqueName: \"kubernetes.io/projected/a59f270d-4331-4428-a35a-5426a9cb3676-kube-api-access-h89c9\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf\" (UID: \"a59f270d-4331-4428-a35a-5426a9cb3676\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf"
Mar 19 09:37:37 crc kubenswrapper[4835]: I0319 09:37:37.960985 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx"]
Mar 19 09:37:37 crc kubenswrapper[4835]: I0319 09:37:37.962334 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx"
Mar 19 09:37:37 crc kubenswrapper[4835]: I0319 09:37:37.976047 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx"]
Mar 19 09:37:38 crc kubenswrapper[4835]: I0319 09:37:38.018297 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a59f270d-4331-4428-a35a-5426a9cb3676-bundle\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf\" (UID: \"a59f270d-4331-4428-a35a-5426a9cb3676\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf"
Mar 19 09:37:38 crc kubenswrapper[4835]: I0319 09:37:38.018353 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a59f270d-4331-4428-a35a-5426a9cb3676-util\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf\" (UID: \"a59f270d-4331-4428-a35a-5426a9cb3676\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf"
Mar 19 09:37:38 crc kubenswrapper[4835]: I0319 09:37:38.018378 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h89c9\" (UniqueName: \"kubernetes.io/projected/a59f270d-4331-4428-a35a-5426a9cb3676-kube-api-access-h89c9\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf\" (UID: \"a59f270d-4331-4428-a35a-5426a9cb3676\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf"
Mar 19 09:37:38 crc kubenswrapper[4835]: I0319 09:37:38.018857 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a59f270d-4331-4428-a35a-5426a9cb3676-bundle\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf\" (UID: \"a59f270d-4331-4428-a35a-5426a9cb3676\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf" Mar 19 09:37:38 crc kubenswrapper[4835]: I0319 09:37:38.019042 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a59f270d-4331-4428-a35a-5426a9cb3676-util\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf\" (UID: \"a59f270d-4331-4428-a35a-5426a9cb3676\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf" Mar 19 09:37:38 crc kubenswrapper[4835]: I0319 09:37:38.043610 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h89c9\" (UniqueName: \"kubernetes.io/projected/a59f270d-4331-4428-a35a-5426a9cb3676-kube-api-access-h89c9\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf\" (UID: \"a59f270d-4331-4428-a35a-5426a9cb3676\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf" Mar 19 09:37:38 crc kubenswrapper[4835]: I0319 09:37:38.101224 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf" Mar 19 09:37:38 crc kubenswrapper[4835]: I0319 09:37:38.119365 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzf6h\" (UniqueName: \"kubernetes.io/projected/c0252447-92bc-4bc8-a2fe-0be1262b6c77-kube-api-access-nzf6h\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx\" (UID: \"c0252447-92bc-4bc8-a2fe-0be1262b6c77\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx" Mar 19 09:37:38 crc kubenswrapper[4835]: I0319 09:37:38.119414 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c0252447-92bc-4bc8-a2fe-0be1262b6c77-util\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx\" (UID: \"c0252447-92bc-4bc8-a2fe-0be1262b6c77\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx" Mar 19 09:37:38 crc kubenswrapper[4835]: I0319 09:37:38.119459 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c0252447-92bc-4bc8-a2fe-0be1262b6c77-bundle\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx\" (UID: \"c0252447-92bc-4bc8-a2fe-0be1262b6c77\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx" Mar 19 09:37:38 crc kubenswrapper[4835]: I0319 09:37:38.220777 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzf6h\" (UniqueName: \"kubernetes.io/projected/c0252447-92bc-4bc8-a2fe-0be1262b6c77-kube-api-access-nzf6h\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx\" (UID: \"c0252447-92bc-4bc8-a2fe-0be1262b6c77\") " 
pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx" Mar 19 09:37:38 crc kubenswrapper[4835]: I0319 09:37:38.221098 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c0252447-92bc-4bc8-a2fe-0be1262b6c77-util\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx\" (UID: \"c0252447-92bc-4bc8-a2fe-0be1262b6c77\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx" Mar 19 09:37:38 crc kubenswrapper[4835]: I0319 09:37:38.221147 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c0252447-92bc-4bc8-a2fe-0be1262b6c77-bundle\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx\" (UID: \"c0252447-92bc-4bc8-a2fe-0be1262b6c77\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx" Mar 19 09:37:38 crc kubenswrapper[4835]: I0319 09:37:38.221843 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c0252447-92bc-4bc8-a2fe-0be1262b6c77-bundle\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx\" (UID: \"c0252447-92bc-4bc8-a2fe-0be1262b6c77\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx" Mar 19 09:37:38 crc kubenswrapper[4835]: I0319 09:37:38.222024 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c0252447-92bc-4bc8-a2fe-0be1262b6c77-util\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx\" (UID: \"c0252447-92bc-4bc8-a2fe-0be1262b6c77\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx" Mar 19 09:37:38 crc kubenswrapper[4835]: I0319 09:37:38.245975 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-nzf6h\" (UniqueName: \"kubernetes.io/projected/c0252447-92bc-4bc8-a2fe-0be1262b6c77-kube-api-access-nzf6h\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx\" (UID: \"c0252447-92bc-4bc8-a2fe-0be1262b6c77\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx" Mar 19 09:37:38 crc kubenswrapper[4835]: I0319 09:37:38.282967 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx" Mar 19 09:37:38 crc kubenswrapper[4835]: I0319 09:37:38.544705 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf"] Mar 19 09:37:38 crc kubenswrapper[4835]: I0319 09:37:38.561618 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx"] Mar 19 09:37:38 crc kubenswrapper[4835]: W0319 09:37:38.564832 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0252447_92bc_4bc8_a2fe_0be1262b6c77.slice/crio-1041f220a65fe1699c28cb0a9043e19012dc5d0ad79f7267d4bc7474ac6be974 WatchSource:0}: Error finding container 1041f220a65fe1699c28cb0a9043e19012dc5d0ad79f7267d4bc7474ac6be974: Status 404 returned error can't find the container with id 1041f220a65fe1699c28cb0a9043e19012dc5d0ad79f7267d4bc7474ac6be974 Mar 19 09:37:38 crc kubenswrapper[4835]: I0319 09:37:38.598775 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf" event={"ID":"a59f270d-4331-4428-a35a-5426a9cb3676","Type":"ContainerStarted","Data":"77a1f06c86f331d459f9c5265121322faac4224f528daadef7003f9878981ee5"} Mar 19 09:37:38 crc kubenswrapper[4835]: I0319 09:37:38.600013 
4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx" event={"ID":"c0252447-92bc-4bc8-a2fe-0be1262b6c77","Type":"ContainerStarted","Data":"1041f220a65fe1699c28cb0a9043e19012dc5d0ad79f7267d4bc7474ac6be974"} Mar 19 09:37:39 crc kubenswrapper[4835]: I0319 09:37:39.607513 4835 generic.go:334] "Generic (PLEG): container finished" podID="a59f270d-4331-4428-a35a-5426a9cb3676" containerID="31336d1e04f2fb00e1ca5c1f8952e714fd2b943632da9a7eb1a7c68dcf82f162" exitCode=0 Mar 19 09:37:39 crc kubenswrapper[4835]: I0319 09:37:39.607589 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf" event={"ID":"a59f270d-4331-4428-a35a-5426a9cb3676","Type":"ContainerDied","Data":"31336d1e04f2fb00e1ca5c1f8952e714fd2b943632da9a7eb1a7c68dcf82f162"} Mar 19 09:37:39 crc kubenswrapper[4835]: I0319 09:37:39.609284 4835 generic.go:334] "Generic (PLEG): container finished" podID="c0252447-92bc-4bc8-a2fe-0be1262b6c77" containerID="1b90a79c137a671e0140b52d7a1167ccc9fad94a891293d9ca2624ad8a285e8b" exitCode=0 Mar 19 09:37:39 crc kubenswrapper[4835]: I0319 09:37:39.609327 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx" event={"ID":"c0252447-92bc-4bc8-a2fe-0be1262b6c77","Type":"ContainerDied","Data":"1b90a79c137a671e0140b52d7a1167ccc9fad94a891293d9ca2624ad8a285e8b"} Mar 19 09:37:41 crc kubenswrapper[4835]: I0319 09:37:41.520558 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xcstw"] Mar 19 09:37:41 crc kubenswrapper[4835]: I0319 09:37:41.522404 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xcstw" Mar 19 09:37:41 crc kubenswrapper[4835]: I0319 09:37:41.534927 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xcstw"] Mar 19 09:37:41 crc kubenswrapper[4835]: I0319 09:37:41.626435 4835 generic.go:334] "Generic (PLEG): container finished" podID="a59f270d-4331-4428-a35a-5426a9cb3676" containerID="0abb216c904594962efd2ca4641c157cbe528fdbd7314f358e868a5fe5c40266" exitCode=0 Mar 19 09:37:41 crc kubenswrapper[4835]: I0319 09:37:41.626503 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf" event={"ID":"a59f270d-4331-4428-a35a-5426a9cb3676","Type":"ContainerDied","Data":"0abb216c904594962efd2ca4641c157cbe528fdbd7314f358e868a5fe5c40266"} Mar 19 09:37:41 crc kubenswrapper[4835]: I0319 09:37:41.628849 4835 generic.go:334] "Generic (PLEG): container finished" podID="c0252447-92bc-4bc8-a2fe-0be1262b6c77" containerID="8881342004a741d9eb9f56c99624cafbf1850c40344ee1af346e3d66538425f4" exitCode=0 Mar 19 09:37:41 crc kubenswrapper[4835]: I0319 09:37:41.628886 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx" event={"ID":"c0252447-92bc-4bc8-a2fe-0be1262b6c77","Type":"ContainerDied","Data":"8881342004a741d9eb9f56c99624cafbf1850c40344ee1af346e3d66538425f4"} Mar 19 09:37:41 crc kubenswrapper[4835]: I0319 09:37:41.678404 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79e225de-9ba7-4160-8282-5e93557e3281-utilities\") pod \"redhat-operators-xcstw\" (UID: \"79e225de-9ba7-4160-8282-5e93557e3281\") " pod="openshift-marketplace/redhat-operators-xcstw" Mar 19 09:37:41 crc kubenswrapper[4835]: I0319 09:37:41.678494 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfmd9\" (UniqueName: \"kubernetes.io/projected/79e225de-9ba7-4160-8282-5e93557e3281-kube-api-access-wfmd9\") pod \"redhat-operators-xcstw\" (UID: \"79e225de-9ba7-4160-8282-5e93557e3281\") " pod="openshift-marketplace/redhat-operators-xcstw" Mar 19 09:37:41 crc kubenswrapper[4835]: I0319 09:37:41.678570 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79e225de-9ba7-4160-8282-5e93557e3281-catalog-content\") pod \"redhat-operators-xcstw\" (UID: \"79e225de-9ba7-4160-8282-5e93557e3281\") " pod="openshift-marketplace/redhat-operators-xcstw" Mar 19 09:37:41 crc kubenswrapper[4835]: I0319 09:37:41.779645 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79e225de-9ba7-4160-8282-5e93557e3281-catalog-content\") pod \"redhat-operators-xcstw\" (UID: \"79e225de-9ba7-4160-8282-5e93557e3281\") " pod="openshift-marketplace/redhat-operators-xcstw" Mar 19 09:37:41 crc kubenswrapper[4835]: I0319 09:37:41.780077 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79e225de-9ba7-4160-8282-5e93557e3281-utilities\") pod \"redhat-operators-xcstw\" (UID: \"79e225de-9ba7-4160-8282-5e93557e3281\") " pod="openshift-marketplace/redhat-operators-xcstw" Mar 19 09:37:41 crc kubenswrapper[4835]: I0319 09:37:41.780198 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfmd9\" (UniqueName: \"kubernetes.io/projected/79e225de-9ba7-4160-8282-5e93557e3281-kube-api-access-wfmd9\") pod \"redhat-operators-xcstw\" (UID: \"79e225de-9ba7-4160-8282-5e93557e3281\") " pod="openshift-marketplace/redhat-operators-xcstw" Mar 19 09:37:41 crc kubenswrapper[4835]: I0319 09:37:41.780254 4835 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79e225de-9ba7-4160-8282-5e93557e3281-catalog-content\") pod \"redhat-operators-xcstw\" (UID: \"79e225de-9ba7-4160-8282-5e93557e3281\") " pod="openshift-marketplace/redhat-operators-xcstw" Mar 19 09:37:41 crc kubenswrapper[4835]: I0319 09:37:41.780540 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79e225de-9ba7-4160-8282-5e93557e3281-utilities\") pod \"redhat-operators-xcstw\" (UID: \"79e225de-9ba7-4160-8282-5e93557e3281\") " pod="openshift-marketplace/redhat-operators-xcstw" Mar 19 09:37:41 crc kubenswrapper[4835]: I0319 09:37:41.804900 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfmd9\" (UniqueName: \"kubernetes.io/projected/79e225de-9ba7-4160-8282-5e93557e3281-kube-api-access-wfmd9\") pod \"redhat-operators-xcstw\" (UID: \"79e225de-9ba7-4160-8282-5e93557e3281\") " pod="openshift-marketplace/redhat-operators-xcstw" Mar 19 09:37:41 crc kubenswrapper[4835]: I0319 09:37:41.841005 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xcstw" Mar 19 09:37:42 crc kubenswrapper[4835]: I0319 09:37:42.318249 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xcstw"] Mar 19 09:37:42 crc kubenswrapper[4835]: W0319 09:37:42.322431 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79e225de_9ba7_4160_8282_5e93557e3281.slice/crio-f9f8186494ba6e9ebac54b71bd8554449df5a62faa0c544afca1b53d242244a0 WatchSource:0}: Error finding container f9f8186494ba6e9ebac54b71bd8554449df5a62faa0c544afca1b53d242244a0: Status 404 returned error can't find the container with id f9f8186494ba6e9ebac54b71bd8554449df5a62faa0c544afca1b53d242244a0 Mar 19 09:37:42 crc kubenswrapper[4835]: I0319 09:37:42.636970 4835 generic.go:334] "Generic (PLEG): container finished" podID="a59f270d-4331-4428-a35a-5426a9cb3676" containerID="d63a4e775922e087f45798f6cd3c44d3237beb03e54a8a2906c22c5fdff79813" exitCode=0 Mar 19 09:37:42 crc kubenswrapper[4835]: I0319 09:37:42.637032 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf" event={"ID":"a59f270d-4331-4428-a35a-5426a9cb3676","Type":"ContainerDied","Data":"d63a4e775922e087f45798f6cd3c44d3237beb03e54a8a2906c22c5fdff79813"} Mar 19 09:37:42 crc kubenswrapper[4835]: I0319 09:37:42.639174 4835 generic.go:334] "Generic (PLEG): container finished" podID="c0252447-92bc-4bc8-a2fe-0be1262b6c77" containerID="381fb0d12a6dd0e71d74f4b291efa7225c0e96a5562102cc9a01e7d5b1cb1b36" exitCode=0 Mar 19 09:37:42 crc kubenswrapper[4835]: I0319 09:37:42.639210 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx" 
event={"ID":"c0252447-92bc-4bc8-a2fe-0be1262b6c77","Type":"ContainerDied","Data":"381fb0d12a6dd0e71d74f4b291efa7225c0e96a5562102cc9a01e7d5b1cb1b36"} Mar 19 09:37:42 crc kubenswrapper[4835]: I0319 09:37:42.640376 4835 generic.go:334] "Generic (PLEG): container finished" podID="79e225de-9ba7-4160-8282-5e93557e3281" containerID="14db21b081d11adc040dfb0559c9be424a1480495447e15e0923f02c9e412016" exitCode=0 Mar 19 09:37:42 crc kubenswrapper[4835]: I0319 09:37:42.640411 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xcstw" event={"ID":"79e225de-9ba7-4160-8282-5e93557e3281","Type":"ContainerDied","Data":"14db21b081d11adc040dfb0559c9be424a1480495447e15e0923f02c9e412016"} Mar 19 09:37:42 crc kubenswrapper[4835]: I0319 09:37:42.640433 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xcstw" event={"ID":"79e225de-9ba7-4160-8282-5e93557e3281","Type":"ContainerStarted","Data":"f9f8186494ba6e9ebac54b71bd8554449df5a62faa0c544afca1b53d242244a0"} Mar 19 09:37:43 crc kubenswrapper[4835]: I0319 09:37:43.659343 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xcstw" event={"ID":"79e225de-9ba7-4160-8282-5e93557e3281","Type":"ContainerStarted","Data":"a07a49e26141f30c73db9c247be11aa2b7e3b4ad6d8547b06dd22b451f5bbcdc"} Mar 19 09:37:43 crc kubenswrapper[4835]: I0319 09:37:43.947102 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf" Mar 19 09:37:43 crc kubenswrapper[4835]: I0319 09:37:43.953732 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx" Mar 19 09:37:44 crc kubenswrapper[4835]: I0319 09:37:44.114937 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a59f270d-4331-4428-a35a-5426a9cb3676-util\") pod \"a59f270d-4331-4428-a35a-5426a9cb3676\" (UID: \"a59f270d-4331-4428-a35a-5426a9cb3676\") " Mar 19 09:37:44 crc kubenswrapper[4835]: I0319 09:37:44.114999 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzf6h\" (UniqueName: \"kubernetes.io/projected/c0252447-92bc-4bc8-a2fe-0be1262b6c77-kube-api-access-nzf6h\") pod \"c0252447-92bc-4bc8-a2fe-0be1262b6c77\" (UID: \"c0252447-92bc-4bc8-a2fe-0be1262b6c77\") " Mar 19 09:37:44 crc kubenswrapper[4835]: I0319 09:37:44.115046 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c0252447-92bc-4bc8-a2fe-0be1262b6c77-util\") pod \"c0252447-92bc-4bc8-a2fe-0be1262b6c77\" (UID: \"c0252447-92bc-4bc8-a2fe-0be1262b6c77\") " Mar 19 09:37:44 crc kubenswrapper[4835]: I0319 09:37:44.115111 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h89c9\" (UniqueName: \"kubernetes.io/projected/a59f270d-4331-4428-a35a-5426a9cb3676-kube-api-access-h89c9\") pod \"a59f270d-4331-4428-a35a-5426a9cb3676\" (UID: \"a59f270d-4331-4428-a35a-5426a9cb3676\") " Mar 19 09:37:44 crc kubenswrapper[4835]: I0319 09:37:44.115150 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a59f270d-4331-4428-a35a-5426a9cb3676-bundle\") pod \"a59f270d-4331-4428-a35a-5426a9cb3676\" (UID: \"a59f270d-4331-4428-a35a-5426a9cb3676\") " Mar 19 09:37:44 crc kubenswrapper[4835]: I0319 09:37:44.115191 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c0252447-92bc-4bc8-a2fe-0be1262b6c77-bundle\") pod \"c0252447-92bc-4bc8-a2fe-0be1262b6c77\" (UID: \"c0252447-92bc-4bc8-a2fe-0be1262b6c77\") " Mar 19 09:37:44 crc kubenswrapper[4835]: I0319 09:37:44.117870 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a59f270d-4331-4428-a35a-5426a9cb3676-bundle" (OuterVolumeSpecName: "bundle") pod "a59f270d-4331-4428-a35a-5426a9cb3676" (UID: "a59f270d-4331-4428-a35a-5426a9cb3676"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:37:44 crc kubenswrapper[4835]: I0319 09:37:44.118917 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0252447-92bc-4bc8-a2fe-0be1262b6c77-bundle" (OuterVolumeSpecName: "bundle") pod "c0252447-92bc-4bc8-a2fe-0be1262b6c77" (UID: "c0252447-92bc-4bc8-a2fe-0be1262b6c77"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:37:44 crc kubenswrapper[4835]: I0319 09:37:44.123997 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0252447-92bc-4bc8-a2fe-0be1262b6c77-kube-api-access-nzf6h" (OuterVolumeSpecName: "kube-api-access-nzf6h") pod "c0252447-92bc-4bc8-a2fe-0be1262b6c77" (UID: "c0252447-92bc-4bc8-a2fe-0be1262b6c77"). InnerVolumeSpecName "kube-api-access-nzf6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:37:44 crc kubenswrapper[4835]: I0319 09:37:44.124061 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a59f270d-4331-4428-a35a-5426a9cb3676-kube-api-access-h89c9" (OuterVolumeSpecName: "kube-api-access-h89c9") pod "a59f270d-4331-4428-a35a-5426a9cb3676" (UID: "a59f270d-4331-4428-a35a-5426a9cb3676"). InnerVolumeSpecName "kube-api-access-h89c9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:37:44 crc kubenswrapper[4835]: I0319 09:37:44.216618 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h89c9\" (UniqueName: \"kubernetes.io/projected/a59f270d-4331-4428-a35a-5426a9cb3676-kube-api-access-h89c9\") on node \"crc\" DevicePath \"\"" Mar 19 09:37:44 crc kubenswrapper[4835]: I0319 09:37:44.216679 4835 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a59f270d-4331-4428-a35a-5426a9cb3676-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:37:44 crc kubenswrapper[4835]: I0319 09:37:44.216689 4835 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c0252447-92bc-4bc8-a2fe-0be1262b6c77-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:37:44 crc kubenswrapper[4835]: I0319 09:37:44.216697 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzf6h\" (UniqueName: \"kubernetes.io/projected/c0252447-92bc-4bc8-a2fe-0be1262b6c77-kube-api-access-nzf6h\") on node \"crc\" DevicePath \"\"" Mar 19 09:37:44 crc kubenswrapper[4835]: I0319 09:37:44.675401 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf" event={"ID":"a59f270d-4331-4428-a35a-5426a9cb3676","Type":"ContainerDied","Data":"77a1f06c86f331d459f9c5265121322faac4224f528daadef7003f9878981ee5"} Mar 19 09:37:44 crc kubenswrapper[4835]: I0319 09:37:44.675451 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77a1f06c86f331d459f9c5265121322faac4224f528daadef7003f9878981ee5" Mar 19 09:37:44 crc kubenswrapper[4835]: I0319 09:37:44.675420 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf" Mar 19 09:37:44 crc kubenswrapper[4835]: I0319 09:37:44.678392 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx" Mar 19 09:37:44 crc kubenswrapper[4835]: I0319 09:37:44.678418 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx" event={"ID":"c0252447-92bc-4bc8-a2fe-0be1262b6c77","Type":"ContainerDied","Data":"1041f220a65fe1699c28cb0a9043e19012dc5d0ad79f7267d4bc7474ac6be974"} Mar 19 09:37:44 crc kubenswrapper[4835]: I0319 09:37:44.678465 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1041f220a65fe1699c28cb0a9043e19012dc5d0ad79f7267d4bc7474ac6be974" Mar 19 09:37:44 crc kubenswrapper[4835]: I0319 09:37:44.680874 4835 generic.go:334] "Generic (PLEG): container finished" podID="79e225de-9ba7-4160-8282-5e93557e3281" containerID="a07a49e26141f30c73db9c247be11aa2b7e3b4ad6d8547b06dd22b451f5bbcdc" exitCode=0 Mar 19 09:37:44 crc kubenswrapper[4835]: I0319 09:37:44.680924 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xcstw" event={"ID":"79e225de-9ba7-4160-8282-5e93557e3281","Type":"ContainerDied","Data":"a07a49e26141f30c73db9c247be11aa2b7e3b4ad6d8547b06dd22b451f5bbcdc"} Mar 19 09:37:45 crc kubenswrapper[4835]: I0319 09:37:45.058887 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a59f270d-4331-4428-a35a-5426a9cb3676-util" (OuterVolumeSpecName: "util") pod "a59f270d-4331-4428-a35a-5426a9cb3676" (UID: "a59f270d-4331-4428-a35a-5426a9cb3676"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:37:45 crc kubenswrapper[4835]: I0319 09:37:45.078237 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0252447-92bc-4bc8-a2fe-0be1262b6c77-util" (OuterVolumeSpecName: "util") pod "c0252447-92bc-4bc8-a2fe-0be1262b6c77" (UID: "c0252447-92bc-4bc8-a2fe-0be1262b6c77"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:37:45 crc kubenswrapper[4835]: I0319 09:37:45.130146 4835 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a59f270d-4331-4428-a35a-5426a9cb3676-util\") on node \"crc\" DevicePath \"\"" Mar 19 09:37:45 crc kubenswrapper[4835]: I0319 09:37:45.130183 4835 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c0252447-92bc-4bc8-a2fe-0be1262b6c77-util\") on node \"crc\" DevicePath \"\"" Mar 19 09:37:45 crc kubenswrapper[4835]: I0319 09:37:45.687540 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xcstw" event={"ID":"79e225de-9ba7-4160-8282-5e93557e3281","Type":"ContainerStarted","Data":"ba56852db4a856bb1b3ecba059331ec6431894f42876c3b1b09b4d1e859f8dcd"} Mar 19 09:37:45 crc kubenswrapper[4835]: I0319 09:37:45.714049 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xcstw" podStartSLOduration=2.128628629 podStartE2EDuration="4.714030356s" podCreationTimestamp="2026-03-19 09:37:41 +0000 UTC" firstStartedPulling="2026-03-19 09:37:42.641732649 +0000 UTC m=+917.490331236" lastFinishedPulling="2026-03-19 09:37:45.227134356 +0000 UTC m=+920.075732963" observedRunningTime="2026-03-19 09:37:45.710811537 +0000 UTC m=+920.559410134" watchObservedRunningTime="2026-03-19 09:37:45.714030356 +0000 UTC m=+920.562628953" Mar 19 09:37:51 crc kubenswrapper[4835]: I0319 09:37:51.841316 4835 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xcstw" Mar 19 09:37:51 crc kubenswrapper[4835]: I0319 09:37:51.841983 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xcstw" Mar 19 09:37:51 crc kubenswrapper[4835]: I0319 09:37:51.898308 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xcstw" Mar 19 09:37:52 crc kubenswrapper[4835]: I0319 09:37:52.806256 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xcstw" Mar 19 09:37:54 crc kubenswrapper[4835]: I0319 09:37:54.513118 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xcstw"] Mar 19 09:37:54 crc kubenswrapper[4835]: I0319 09:37:54.748917 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xcstw" podUID="79e225de-9ba7-4160-8282-5e93557e3281" containerName="registry-server" containerID="cri-o://ba56852db4a856bb1b3ecba059331ec6431894f42876c3b1b09b4d1e859f8dcd" gracePeriod=2 Mar 19 09:37:55 crc kubenswrapper[4835]: I0319 09:37:55.181559 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xcstw" Mar 19 09:37:55 crc kubenswrapper[4835]: I0319 09:37:55.272369 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79e225de-9ba7-4160-8282-5e93557e3281-utilities\") pod \"79e225de-9ba7-4160-8282-5e93557e3281\" (UID: \"79e225de-9ba7-4160-8282-5e93557e3281\") " Mar 19 09:37:55 crc kubenswrapper[4835]: I0319 09:37:55.272492 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfmd9\" (UniqueName: \"kubernetes.io/projected/79e225de-9ba7-4160-8282-5e93557e3281-kube-api-access-wfmd9\") pod \"79e225de-9ba7-4160-8282-5e93557e3281\" (UID: \"79e225de-9ba7-4160-8282-5e93557e3281\") " Mar 19 09:37:55 crc kubenswrapper[4835]: I0319 09:37:55.272514 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79e225de-9ba7-4160-8282-5e93557e3281-catalog-content\") pod \"79e225de-9ba7-4160-8282-5e93557e3281\" (UID: \"79e225de-9ba7-4160-8282-5e93557e3281\") " Mar 19 09:37:55 crc kubenswrapper[4835]: I0319 09:37:55.273485 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79e225de-9ba7-4160-8282-5e93557e3281-utilities" (OuterVolumeSpecName: "utilities") pod "79e225de-9ba7-4160-8282-5e93557e3281" (UID: "79e225de-9ba7-4160-8282-5e93557e3281"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:37:55 crc kubenswrapper[4835]: I0319 09:37:55.277199 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79e225de-9ba7-4160-8282-5e93557e3281-kube-api-access-wfmd9" (OuterVolumeSpecName: "kube-api-access-wfmd9") pod "79e225de-9ba7-4160-8282-5e93557e3281" (UID: "79e225de-9ba7-4160-8282-5e93557e3281"). InnerVolumeSpecName "kube-api-access-wfmd9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:37:55 crc kubenswrapper[4835]: I0319 09:37:55.374316 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79e225de-9ba7-4160-8282-5e93557e3281-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 09:37:55 crc kubenswrapper[4835]: I0319 09:37:55.374345 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfmd9\" (UniqueName: \"kubernetes.io/projected/79e225de-9ba7-4160-8282-5e93557e3281-kube-api-access-wfmd9\") on node \"crc\" DevicePath \"\"" Mar 19 09:37:55 crc kubenswrapper[4835]: I0319 09:37:55.406233 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79e225de-9ba7-4160-8282-5e93557e3281-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "79e225de-9ba7-4160-8282-5e93557e3281" (UID: "79e225de-9ba7-4160-8282-5e93557e3281"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:37:55 crc kubenswrapper[4835]: I0319 09:37:55.475965 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79e225de-9ba7-4160-8282-5e93557e3281-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 09:37:55 crc kubenswrapper[4835]: I0319 09:37:55.755573 4835 generic.go:334] "Generic (PLEG): container finished" podID="79e225de-9ba7-4160-8282-5e93557e3281" containerID="ba56852db4a856bb1b3ecba059331ec6431894f42876c3b1b09b4d1e859f8dcd" exitCode=0 Mar 19 09:37:55 crc kubenswrapper[4835]: I0319 09:37:55.755609 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xcstw" event={"ID":"79e225de-9ba7-4160-8282-5e93557e3281","Type":"ContainerDied","Data":"ba56852db4a856bb1b3ecba059331ec6431894f42876c3b1b09b4d1e859f8dcd"} Mar 19 09:37:55 crc kubenswrapper[4835]: I0319 09:37:55.755658 4835 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-xcstw" event={"ID":"79e225de-9ba7-4160-8282-5e93557e3281","Type":"ContainerDied","Data":"f9f8186494ba6e9ebac54b71bd8554449df5a62faa0c544afca1b53d242244a0"} Mar 19 09:37:55 crc kubenswrapper[4835]: I0319 09:37:55.755679 4835 scope.go:117] "RemoveContainer" containerID="ba56852db4a856bb1b3ecba059331ec6431894f42876c3b1b09b4d1e859f8dcd" Mar 19 09:37:55 crc kubenswrapper[4835]: I0319 09:37:55.755625 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xcstw" Mar 19 09:37:55 crc kubenswrapper[4835]: I0319 09:37:55.778464 4835 scope.go:117] "RemoveContainer" containerID="a07a49e26141f30c73db9c247be11aa2b7e3b4ad6d8547b06dd22b451f5bbcdc" Mar 19 09:37:55 crc kubenswrapper[4835]: I0319 09:37:55.794808 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xcstw"] Mar 19 09:37:55 crc kubenswrapper[4835]: I0319 09:37:55.800391 4835 scope.go:117] "RemoveContainer" containerID="14db21b081d11adc040dfb0559c9be424a1480495447e15e0923f02c9e412016" Mar 19 09:37:55 crc kubenswrapper[4835]: I0319 09:37:55.802707 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xcstw"] Mar 19 09:37:55 crc kubenswrapper[4835]: I0319 09:37:55.827668 4835 scope.go:117] "RemoveContainer" containerID="ba56852db4a856bb1b3ecba059331ec6431894f42876c3b1b09b4d1e859f8dcd" Mar 19 09:37:55 crc kubenswrapper[4835]: E0319 09:37:55.828131 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba56852db4a856bb1b3ecba059331ec6431894f42876c3b1b09b4d1e859f8dcd\": container with ID starting with ba56852db4a856bb1b3ecba059331ec6431894f42876c3b1b09b4d1e859f8dcd not found: ID does not exist" containerID="ba56852db4a856bb1b3ecba059331ec6431894f42876c3b1b09b4d1e859f8dcd" Mar 19 09:37:55 crc kubenswrapper[4835]: I0319 09:37:55.828173 4835 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba56852db4a856bb1b3ecba059331ec6431894f42876c3b1b09b4d1e859f8dcd"} err="failed to get container status \"ba56852db4a856bb1b3ecba059331ec6431894f42876c3b1b09b4d1e859f8dcd\": rpc error: code = NotFound desc = could not find container \"ba56852db4a856bb1b3ecba059331ec6431894f42876c3b1b09b4d1e859f8dcd\": container with ID starting with ba56852db4a856bb1b3ecba059331ec6431894f42876c3b1b09b4d1e859f8dcd not found: ID does not exist" Mar 19 09:37:55 crc kubenswrapper[4835]: I0319 09:37:55.828199 4835 scope.go:117] "RemoveContainer" containerID="a07a49e26141f30c73db9c247be11aa2b7e3b4ad6d8547b06dd22b451f5bbcdc" Mar 19 09:37:55 crc kubenswrapper[4835]: E0319 09:37:55.828517 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a07a49e26141f30c73db9c247be11aa2b7e3b4ad6d8547b06dd22b451f5bbcdc\": container with ID starting with a07a49e26141f30c73db9c247be11aa2b7e3b4ad6d8547b06dd22b451f5bbcdc not found: ID does not exist" containerID="a07a49e26141f30c73db9c247be11aa2b7e3b4ad6d8547b06dd22b451f5bbcdc" Mar 19 09:37:55 crc kubenswrapper[4835]: I0319 09:37:55.828563 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a07a49e26141f30c73db9c247be11aa2b7e3b4ad6d8547b06dd22b451f5bbcdc"} err="failed to get container status \"a07a49e26141f30c73db9c247be11aa2b7e3b4ad6d8547b06dd22b451f5bbcdc\": rpc error: code = NotFound desc = could not find container \"a07a49e26141f30c73db9c247be11aa2b7e3b4ad6d8547b06dd22b451f5bbcdc\": container with ID starting with a07a49e26141f30c73db9c247be11aa2b7e3b4ad6d8547b06dd22b451f5bbcdc not found: ID does not exist" Mar 19 09:37:55 crc kubenswrapper[4835]: I0319 09:37:55.828595 4835 scope.go:117] "RemoveContainer" containerID="14db21b081d11adc040dfb0559c9be424a1480495447e15e0923f02c9e412016" Mar 19 09:37:55 crc kubenswrapper[4835]: E0319 
09:37:55.828952 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14db21b081d11adc040dfb0559c9be424a1480495447e15e0923f02c9e412016\": container with ID starting with 14db21b081d11adc040dfb0559c9be424a1480495447e15e0923f02c9e412016 not found: ID does not exist" containerID="14db21b081d11adc040dfb0559c9be424a1480495447e15e0923f02c9e412016" Mar 19 09:37:55 crc kubenswrapper[4835]: I0319 09:37:55.828984 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14db21b081d11adc040dfb0559c9be424a1480495447e15e0923f02c9e412016"} err="failed to get container status \"14db21b081d11adc040dfb0559c9be424a1480495447e15e0923f02c9e412016\": rpc error: code = NotFound desc = could not find container \"14db21b081d11adc040dfb0559c9be424a1480495447e15e0923f02c9e412016\": container with ID starting with 14db21b081d11adc040dfb0559c9be424a1480495447e15e0923f02c9e412016 not found: ID does not exist" Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.082221 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/cluster-logging-operator-66689c4bbf-vjzsc"] Mar 19 09:37:56 crc kubenswrapper[4835]: E0319 09:37:56.082526 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a59f270d-4331-4428-a35a-5426a9cb3676" containerName="extract" Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.082539 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a59f270d-4331-4428-a35a-5426a9cb3676" containerName="extract" Mar 19 09:37:56 crc kubenswrapper[4835]: E0319 09:37:56.082555 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0252447-92bc-4bc8-a2fe-0be1262b6c77" containerName="extract" Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.082560 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0252447-92bc-4bc8-a2fe-0be1262b6c77" containerName="extract" Mar 19 09:37:56 crc kubenswrapper[4835]: 
E0319 09:37:56.082568 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79e225de-9ba7-4160-8282-5e93557e3281" containerName="registry-server" Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.082574 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="79e225de-9ba7-4160-8282-5e93557e3281" containerName="registry-server" Mar 19 09:37:56 crc kubenswrapper[4835]: E0319 09:37:56.082587 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a59f270d-4331-4428-a35a-5426a9cb3676" containerName="util" Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.082593 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a59f270d-4331-4428-a35a-5426a9cb3676" containerName="util" Mar 19 09:37:56 crc kubenswrapper[4835]: E0319 09:37:56.082603 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a59f270d-4331-4428-a35a-5426a9cb3676" containerName="pull" Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.082608 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a59f270d-4331-4428-a35a-5426a9cb3676" containerName="pull" Mar 19 09:37:56 crc kubenswrapper[4835]: E0319 09:37:56.082623 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0252447-92bc-4bc8-a2fe-0be1262b6c77" containerName="util" Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.082631 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0252447-92bc-4bc8-a2fe-0be1262b6c77" containerName="util" Mar 19 09:37:56 crc kubenswrapper[4835]: E0319 09:37:56.082644 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0252447-92bc-4bc8-a2fe-0be1262b6c77" containerName="pull" Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.082651 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0252447-92bc-4bc8-a2fe-0be1262b6c77" containerName="pull" Mar 19 09:37:56 crc kubenswrapper[4835]: E0319 09:37:56.082663 4835 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="79e225de-9ba7-4160-8282-5e93557e3281" containerName="extract-content" Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.082671 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="79e225de-9ba7-4160-8282-5e93557e3281" containerName="extract-content" Mar 19 09:37:56 crc kubenswrapper[4835]: E0319 09:37:56.082684 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79e225de-9ba7-4160-8282-5e93557e3281" containerName="extract-utilities" Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.082692 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="79e225de-9ba7-4160-8282-5e93557e3281" containerName="extract-utilities" Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.082843 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="79e225de-9ba7-4160-8282-5e93557e3281" containerName="registry-server" Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.082858 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="a59f270d-4331-4428-a35a-5426a9cb3676" containerName="extract" Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.082877 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0252447-92bc-4bc8-a2fe-0be1262b6c77" containerName="extract" Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.083375 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/cluster-logging-operator-66689c4bbf-vjzsc" Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.086730 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"kube-root-ca.crt" Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.087141 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"cluster-logging-operator-dockercfg-cdp4q" Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.087322 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"openshift-service-ca.crt" Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.091190 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-66689c4bbf-vjzsc"] Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.185735 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gw54\" (UniqueName: \"kubernetes.io/projected/435b5515-e762-42eb-b71f-afaee5fddc4a-kube-api-access-2gw54\") pod \"cluster-logging-operator-66689c4bbf-vjzsc\" (UID: \"435b5515-e762-42eb-b71f-afaee5fddc4a\") " pod="openshift-logging/cluster-logging-operator-66689c4bbf-vjzsc" Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.287033 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gw54\" (UniqueName: \"kubernetes.io/projected/435b5515-e762-42eb-b71f-afaee5fddc4a-kube-api-access-2gw54\") pod \"cluster-logging-operator-66689c4bbf-vjzsc\" (UID: \"435b5515-e762-42eb-b71f-afaee5fddc4a\") " pod="openshift-logging/cluster-logging-operator-66689c4bbf-vjzsc" Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.304608 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gw54\" (UniqueName: \"kubernetes.io/projected/435b5515-e762-42eb-b71f-afaee5fddc4a-kube-api-access-2gw54\") pod 
\"cluster-logging-operator-66689c4bbf-vjzsc\" (UID: \"435b5515-e762-42eb-b71f-afaee5fddc4a\") " pod="openshift-logging/cluster-logging-operator-66689c4bbf-vjzsc" Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.410649 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/cluster-logging-operator-66689c4bbf-vjzsc" Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.410889 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79e225de-9ba7-4160-8282-5e93557e3281" path="/var/lib/kubelet/pods/79e225de-9ba7-4160-8282-5e93557e3281/volumes" Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.858411 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-668b645cb5-fhzgr"] Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.860491 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-668b645cb5-fhzgr" Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.867695 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-668b645cb5-fhzgr"] Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.872016 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.872339 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.872408 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-vgncl" Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.872443 4835 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operators-redhat"/"loki-operator-metrics" Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.872543 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.873885 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.999614 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3a2991b5-2e25-4afa-9941-d955aad0dc37-webhook-cert\") pod \"loki-operator-controller-manager-668b645cb5-fhzgr\" (UID: \"3a2991b5-2e25-4afa-9941-d955aad0dc37\") " pod="openshift-operators-redhat/loki-operator-controller-manager-668b645cb5-fhzgr" Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.999698 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/3a2991b5-2e25-4afa-9941-d955aad0dc37-manager-config\") pod \"loki-operator-controller-manager-668b645cb5-fhzgr\" (UID: \"3a2991b5-2e25-4afa-9941-d955aad0dc37\") " pod="openshift-operators-redhat/loki-operator-controller-manager-668b645cb5-fhzgr" Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.999716 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsmc2\" (UniqueName: \"kubernetes.io/projected/3a2991b5-2e25-4afa-9941-d955aad0dc37-kube-api-access-jsmc2\") pod \"loki-operator-controller-manager-668b645cb5-fhzgr\" (UID: \"3a2991b5-2e25-4afa-9941-d955aad0dc37\") " pod="openshift-operators-redhat/loki-operator-controller-manager-668b645cb5-fhzgr" Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.999768 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3a2991b5-2e25-4afa-9941-d955aad0dc37-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-668b645cb5-fhzgr\" (UID: \"3a2991b5-2e25-4afa-9941-d955aad0dc37\") " pod="openshift-operators-redhat/loki-operator-controller-manager-668b645cb5-fhzgr" Mar 19 09:37:56 crc kubenswrapper[4835]: I0319 09:37:56.999800 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3a2991b5-2e25-4afa-9941-d955aad0dc37-apiservice-cert\") pod \"loki-operator-controller-manager-668b645cb5-fhzgr\" (UID: \"3a2991b5-2e25-4afa-9941-d955aad0dc37\") " pod="openshift-operators-redhat/loki-operator-controller-manager-668b645cb5-fhzgr" Mar 19 09:37:57 crc kubenswrapper[4835]: I0319 09:37:57.072375 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-66689c4bbf-vjzsc"] Mar 19 09:37:57 crc kubenswrapper[4835]: I0319 09:37:57.100658 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsmc2\" (UniqueName: \"kubernetes.io/projected/3a2991b5-2e25-4afa-9941-d955aad0dc37-kube-api-access-jsmc2\") pod \"loki-operator-controller-manager-668b645cb5-fhzgr\" (UID: \"3a2991b5-2e25-4afa-9941-d955aad0dc37\") " pod="openshift-operators-redhat/loki-operator-controller-manager-668b645cb5-fhzgr" Mar 19 09:37:57 crc kubenswrapper[4835]: I0319 09:37:57.100707 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/3a2991b5-2e25-4afa-9941-d955aad0dc37-manager-config\") pod \"loki-operator-controller-manager-668b645cb5-fhzgr\" (UID: \"3a2991b5-2e25-4afa-9941-d955aad0dc37\") " pod="openshift-operators-redhat/loki-operator-controller-manager-668b645cb5-fhzgr" Mar 19 09:37:57 crc kubenswrapper[4835]: I0319 09:37:57.100769 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3a2991b5-2e25-4afa-9941-d955aad0dc37-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-668b645cb5-fhzgr\" (UID: \"3a2991b5-2e25-4afa-9941-d955aad0dc37\") " pod="openshift-operators-redhat/loki-operator-controller-manager-668b645cb5-fhzgr" Mar 19 09:37:57 crc kubenswrapper[4835]: I0319 09:37:57.100806 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3a2991b5-2e25-4afa-9941-d955aad0dc37-apiservice-cert\") pod \"loki-operator-controller-manager-668b645cb5-fhzgr\" (UID: \"3a2991b5-2e25-4afa-9941-d955aad0dc37\") " pod="openshift-operators-redhat/loki-operator-controller-manager-668b645cb5-fhzgr" Mar 19 09:37:57 crc kubenswrapper[4835]: I0319 09:37:57.100868 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3a2991b5-2e25-4afa-9941-d955aad0dc37-webhook-cert\") pod \"loki-operator-controller-manager-668b645cb5-fhzgr\" (UID: \"3a2991b5-2e25-4afa-9941-d955aad0dc37\") " pod="openshift-operators-redhat/loki-operator-controller-manager-668b645cb5-fhzgr" Mar 19 09:37:57 crc kubenswrapper[4835]: I0319 09:37:57.101715 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/3a2991b5-2e25-4afa-9941-d955aad0dc37-manager-config\") pod \"loki-operator-controller-manager-668b645cb5-fhzgr\" (UID: \"3a2991b5-2e25-4afa-9941-d955aad0dc37\") " pod="openshift-operators-redhat/loki-operator-controller-manager-668b645cb5-fhzgr" Mar 19 09:37:57 crc kubenswrapper[4835]: I0319 09:37:57.108433 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3a2991b5-2e25-4afa-9941-d955aad0dc37-loki-operator-metrics-cert\") pod 
\"loki-operator-controller-manager-668b645cb5-fhzgr\" (UID: \"3a2991b5-2e25-4afa-9941-d955aad0dc37\") " pod="openshift-operators-redhat/loki-operator-controller-manager-668b645cb5-fhzgr" Mar 19 09:37:57 crc kubenswrapper[4835]: I0319 09:37:57.129214 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsmc2\" (UniqueName: \"kubernetes.io/projected/3a2991b5-2e25-4afa-9941-d955aad0dc37-kube-api-access-jsmc2\") pod \"loki-operator-controller-manager-668b645cb5-fhzgr\" (UID: \"3a2991b5-2e25-4afa-9941-d955aad0dc37\") " pod="openshift-operators-redhat/loki-operator-controller-manager-668b645cb5-fhzgr" Mar 19 09:37:57 crc kubenswrapper[4835]: I0319 09:37:57.219227 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3a2991b5-2e25-4afa-9941-d955aad0dc37-webhook-cert\") pod \"loki-operator-controller-manager-668b645cb5-fhzgr\" (UID: \"3a2991b5-2e25-4afa-9941-d955aad0dc37\") " pod="openshift-operators-redhat/loki-operator-controller-manager-668b645cb5-fhzgr" Mar 19 09:37:57 crc kubenswrapper[4835]: I0319 09:37:57.219611 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3a2991b5-2e25-4afa-9941-d955aad0dc37-apiservice-cert\") pod \"loki-operator-controller-manager-668b645cb5-fhzgr\" (UID: \"3a2991b5-2e25-4afa-9941-d955aad0dc37\") " pod="openshift-operators-redhat/loki-operator-controller-manager-668b645cb5-fhzgr" Mar 19 09:37:57 crc kubenswrapper[4835]: I0319 09:37:57.220127 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-668b645cb5-fhzgr" Mar 19 09:37:57 crc kubenswrapper[4835]: I0319 09:37:57.439718 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-668b645cb5-fhzgr"] Mar 19 09:37:57 crc kubenswrapper[4835]: I0319 09:37:57.810790 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-668b645cb5-fhzgr" event={"ID":"3a2991b5-2e25-4afa-9941-d955aad0dc37","Type":"ContainerStarted","Data":"59d3373703e426637baebf32b0fe0431ca90dba616a8c13215f7154b1885bb45"} Mar 19 09:37:57 crc kubenswrapper[4835]: I0319 09:37:57.818496 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-66689c4bbf-vjzsc" event={"ID":"435b5515-e762-42eb-b71f-afaee5fddc4a","Type":"ContainerStarted","Data":"1feeb159029ea520b348c17ad06e3424a50e641decb77db21c9305313c1bc034"} Mar 19 09:38:00 crc kubenswrapper[4835]: I0319 09:38:00.128796 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565218-5w5qk"] Mar 19 09:38:00 crc kubenswrapper[4835]: I0319 09:38:00.130141 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565218-5w5qk" Mar 19 09:38:00 crc kubenswrapper[4835]: I0319 09:38:00.134259 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565218-5w5qk"] Mar 19 09:38:00 crc kubenswrapper[4835]: I0319 09:38:00.134794 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 09:38:00 crc kubenswrapper[4835]: I0319 09:38:00.135395 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 09:38:00 crc kubenswrapper[4835]: I0319 09:38:00.139066 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 09:38:00 crc kubenswrapper[4835]: I0319 09:38:00.260760 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59wt7\" (UniqueName: \"kubernetes.io/projected/ab247fa5-c2c2-4a53-9ab5-a16ef485d1b2-kube-api-access-59wt7\") pod \"auto-csr-approver-29565218-5w5qk\" (UID: \"ab247fa5-c2c2-4a53-9ab5-a16ef485d1b2\") " pod="openshift-infra/auto-csr-approver-29565218-5w5qk" Mar 19 09:38:00 crc kubenswrapper[4835]: I0319 09:38:00.362142 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59wt7\" (UniqueName: \"kubernetes.io/projected/ab247fa5-c2c2-4a53-9ab5-a16ef485d1b2-kube-api-access-59wt7\") pod \"auto-csr-approver-29565218-5w5qk\" (UID: \"ab247fa5-c2c2-4a53-9ab5-a16ef485d1b2\") " pod="openshift-infra/auto-csr-approver-29565218-5w5qk" Mar 19 09:38:00 crc kubenswrapper[4835]: I0319 09:38:00.382342 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59wt7\" (UniqueName: \"kubernetes.io/projected/ab247fa5-c2c2-4a53-9ab5-a16ef485d1b2-kube-api-access-59wt7\") pod \"auto-csr-approver-29565218-5w5qk\" (UID: \"ab247fa5-c2c2-4a53-9ab5-a16ef485d1b2\") " 
pod="openshift-infra/auto-csr-approver-29565218-5w5qk" Mar 19 09:38:00 crc kubenswrapper[4835]: I0319 09:38:00.465093 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565218-5w5qk" Mar 19 09:38:00 crc kubenswrapper[4835]: I0319 09:38:00.935502 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565218-5w5qk"] Mar 19 09:38:04 crc kubenswrapper[4835]: I0319 09:38:04.884467 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565218-5w5qk" event={"ID":"ab247fa5-c2c2-4a53-9ab5-a16ef485d1b2","Type":"ContainerStarted","Data":"e1befc31d697c56dd21160e096728f2d7c54e7f91b932cccf9e87744e82c4402"} Mar 19 09:38:09 crc kubenswrapper[4835]: I0319 09:38:09.926271 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-66689c4bbf-vjzsc" event={"ID":"435b5515-e762-42eb-b71f-afaee5fddc4a","Type":"ContainerStarted","Data":"50573e4c2d0251c7321fae56900af883c634256ff93da04a9a68ab33d99ed533"} Mar 19 09:38:09 crc kubenswrapper[4835]: I0319 09:38:09.927946 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565218-5w5qk" event={"ID":"ab247fa5-c2c2-4a53-9ab5-a16ef485d1b2","Type":"ContainerStarted","Data":"204778b8018b471746e0832d0c509e15e163f11dd502ece7270d0707d5ce24b9"} Mar 19 09:38:09 crc kubenswrapper[4835]: I0319 09:38:09.930269 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-668b645cb5-fhzgr" event={"ID":"3a2991b5-2e25-4afa-9941-d955aad0dc37","Type":"ContainerStarted","Data":"577a1eac5da3d9739a9fe6c383debba943cf50c2556f163670be9b23dfb5b54c"} Mar 19 09:38:10 crc kubenswrapper[4835]: I0319 09:38:10.938066 4835 generic.go:334] "Generic (PLEG): container finished" podID="ab247fa5-c2c2-4a53-9ab5-a16ef485d1b2" 
containerID="204778b8018b471746e0832d0c509e15e163f11dd502ece7270d0707d5ce24b9" exitCode=0 Mar 19 09:38:10 crc kubenswrapper[4835]: I0319 09:38:10.938143 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565218-5w5qk" event={"ID":"ab247fa5-c2c2-4a53-9ab5-a16ef485d1b2","Type":"ContainerDied","Data":"204778b8018b471746e0832d0c509e15e163f11dd502ece7270d0707d5ce24b9"} Mar 19 09:38:10 crc kubenswrapper[4835]: I0319 09:38:10.953977 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/cluster-logging-operator-66689c4bbf-vjzsc" podStartSLOduration=2.502078758 podStartE2EDuration="14.953958134s" podCreationTimestamp="2026-03-19 09:37:56 +0000 UTC" firstStartedPulling="2026-03-19 09:37:57.088305212 +0000 UTC m=+931.936903809" lastFinishedPulling="2026-03-19 09:38:09.540184598 +0000 UTC m=+944.388783185" observedRunningTime="2026-03-19 09:38:10.952774932 +0000 UTC m=+945.801373539" watchObservedRunningTime="2026-03-19 09:38:10.953958134 +0000 UTC m=+945.802556721" Mar 19 09:38:12 crc kubenswrapper[4835]: I0319 09:38:12.196295 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565218-5w5qk" Mar 19 09:38:12 crc kubenswrapper[4835]: I0319 09:38:12.354933 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59wt7\" (UniqueName: \"kubernetes.io/projected/ab247fa5-c2c2-4a53-9ab5-a16ef485d1b2-kube-api-access-59wt7\") pod \"ab247fa5-c2c2-4a53-9ab5-a16ef485d1b2\" (UID: \"ab247fa5-c2c2-4a53-9ab5-a16ef485d1b2\") " Mar 19 09:38:12 crc kubenswrapper[4835]: I0319 09:38:12.359516 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab247fa5-c2c2-4a53-9ab5-a16ef485d1b2-kube-api-access-59wt7" (OuterVolumeSpecName: "kube-api-access-59wt7") pod "ab247fa5-c2c2-4a53-9ab5-a16ef485d1b2" (UID: "ab247fa5-c2c2-4a53-9ab5-a16ef485d1b2"). 
InnerVolumeSpecName "kube-api-access-59wt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:38:12 crc kubenswrapper[4835]: I0319 09:38:12.456759 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59wt7\" (UniqueName: \"kubernetes.io/projected/ab247fa5-c2c2-4a53-9ab5-a16ef485d1b2-kube-api-access-59wt7\") on node \"crc\" DevicePath \"\""
Mar 19 09:38:12 crc kubenswrapper[4835]: I0319 09:38:12.950060 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565218-5w5qk" event={"ID":"ab247fa5-c2c2-4a53-9ab5-a16ef485d1b2","Type":"ContainerDied","Data":"e1befc31d697c56dd21160e096728f2d7c54e7f91b932cccf9e87744e82c4402"}
Mar 19 09:38:12 crc kubenswrapper[4835]: I0319 09:38:12.950096 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1befc31d697c56dd21160e096728f2d7c54e7f91b932cccf9e87744e82c4402"
Mar 19 09:38:12 crc kubenswrapper[4835]: I0319 09:38:12.950112 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565218-5w5qk"
Mar 19 09:38:13 crc kubenswrapper[4835]: I0319 09:38:13.245142 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565212-xrbm4"]
Mar 19 09:38:13 crc kubenswrapper[4835]: I0319 09:38:13.249888 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565212-xrbm4"]
Mar 19 09:38:14 crc kubenswrapper[4835]: I0319 09:38:14.411571 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="545bb00f-039b-40bc-923f-496674c4b221" path="/var/lib/kubelet/pods/545bb00f-039b-40bc-923f-496674c4b221/volumes"
Mar 19 09:38:27 crc kubenswrapper[4835]: I0319 09:38:27.043139 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-668b645cb5-fhzgr" event={"ID":"3a2991b5-2e25-4afa-9941-d955aad0dc37","Type":"ContainerStarted","Data":"4a61baa78d92c0acf6c6f622229189ca32b6260b71301ddbadce9bb65bd3d3f9"}
Mar 19 09:38:27 crc kubenswrapper[4835]: I0319 09:38:27.043723 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-668b645cb5-fhzgr"
Mar 19 09:38:27 crc kubenswrapper[4835]: I0319 09:38:27.046379 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-668b645cb5-fhzgr"
Mar 19 09:38:27 crc kubenswrapper[4835]: I0319 09:38:27.072064 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-668b645cb5-fhzgr" podStartSLOduration=1.635127384 podStartE2EDuration="31.072040739s" podCreationTimestamp="2026-03-19 09:37:56 +0000 UTC" firstStartedPulling="2026-03-19 09:37:57.445351765 +0000 UTC m=+932.293950352" lastFinishedPulling="2026-03-19 09:38:26.88226512 +0000 UTC m=+961.730863707" observedRunningTime="2026-03-19 09:38:27.068471629 +0000 UTC m=+961.917070216" watchObservedRunningTime="2026-03-19 09:38:27.072040739 +0000 UTC m=+961.920639326"
Mar 19 09:38:27 crc kubenswrapper[4835]: I0319 09:38:27.280177 4835 scope.go:117] "RemoveContainer" containerID="7d5d6c63cc1fe44b111a2a6b05400858e8dbb7d8e396e04fd0e5d1b4d01c1877"
Mar 19 09:38:30 crc kubenswrapper[4835]: I0319 09:38:30.721241 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"]
Mar 19 09:38:30 crc kubenswrapper[4835]: E0319 09:38:30.721804 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab247fa5-c2c2-4a53-9ab5-a16ef485d1b2" containerName="oc"
Mar 19 09:38:30 crc kubenswrapper[4835]: I0319 09:38:30.721822 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab247fa5-c2c2-4a53-9ab5-a16ef485d1b2" containerName="oc"
Mar 19 09:38:30 crc kubenswrapper[4835]: I0319 09:38:30.721965 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab247fa5-c2c2-4a53-9ab5-a16ef485d1b2" containerName="oc"
Mar 19 09:38:30 crc kubenswrapper[4835]: I0319 09:38:30.722374 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio"
Mar 19 09:38:30 crc kubenswrapper[4835]: I0319 09:38:30.727425 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt"
Mar 19 09:38:30 crc kubenswrapper[4835]: I0319 09:38:30.727805 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt"
Mar 19 09:38:30 crc kubenswrapper[4835]: I0319 09:38:30.728057 4835 reflector.go:368] Caches populated for *v1.Secret from object-"minio-dev"/"default-dockercfg-bmbs8"
Mar 19 09:38:30 crc kubenswrapper[4835]: I0319 09:38:30.732661 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"]
Mar 19 09:38:31 crc kubenswrapper[4835]: I0319 09:38:31.360256 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2tx7\" (UniqueName: \"kubernetes.io/projected/11bc80a2-608a-4a50-8b5b-28f6deef8df2-kube-api-access-h2tx7\") pod \"minio\" (UID: \"11bc80a2-608a-4a50-8b5b-28f6deef8df2\") " pod="minio-dev/minio"
Mar 19 09:38:31 crc kubenswrapper[4835]: I0319 09:38:31.360342 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f143f096-70d0-4eb8-90d6-a9670999a9a2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f143f096-70d0-4eb8-90d6-a9670999a9a2\") pod \"minio\" (UID: \"11bc80a2-608a-4a50-8b5b-28f6deef8df2\") " pod="minio-dev/minio"
Mar 19 09:38:31 crc kubenswrapper[4835]: I0319 09:38:31.461358 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2tx7\" (UniqueName: \"kubernetes.io/projected/11bc80a2-608a-4a50-8b5b-28f6deef8df2-kube-api-access-h2tx7\") pod \"minio\" (UID: \"11bc80a2-608a-4a50-8b5b-28f6deef8df2\") " pod="minio-dev/minio"
Mar 19 09:38:31 crc kubenswrapper[4835]: I0319 09:38:31.461686 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f143f096-70d0-4eb8-90d6-a9670999a9a2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f143f096-70d0-4eb8-90d6-a9670999a9a2\") pod \"minio\" (UID: \"11bc80a2-608a-4a50-8b5b-28f6deef8df2\") " pod="minio-dev/minio"
Mar 19 09:38:31 crc kubenswrapper[4835]: I0319 09:38:31.465076 4835 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 19 09:38:31 crc kubenswrapper[4835]: I0319 09:38:31.465125 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f143f096-70d0-4eb8-90d6-a9670999a9a2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f143f096-70d0-4eb8-90d6-a9670999a9a2\") pod \"minio\" (UID: \"11bc80a2-608a-4a50-8b5b-28f6deef8df2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8b6c58389b94f5376e26c4670bbe64eefcce4f0f570a3ecd3868c0a651c1eaf9/globalmount\"" pod="minio-dev/minio"
Mar 19 09:38:31 crc kubenswrapper[4835]: I0319 09:38:31.483277 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2tx7\" (UniqueName: \"kubernetes.io/projected/11bc80a2-608a-4a50-8b5b-28f6deef8df2-kube-api-access-h2tx7\") pod \"minio\" (UID: \"11bc80a2-608a-4a50-8b5b-28f6deef8df2\") " pod="minio-dev/minio"
Mar 19 09:38:31 crc kubenswrapper[4835]: I0319 09:38:31.490663 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f143f096-70d0-4eb8-90d6-a9670999a9a2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f143f096-70d0-4eb8-90d6-a9670999a9a2\") pod \"minio\" (UID: \"11bc80a2-608a-4a50-8b5b-28f6deef8df2\") " pod="minio-dev/minio"
Mar 19 09:38:31 crc kubenswrapper[4835]: I0319 09:38:31.639360 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio"
Mar 19 09:38:32 crc kubenswrapper[4835]: I0319 09:38:32.058722 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"]
Mar 19 09:38:32 crc kubenswrapper[4835]: I0319 09:38:32.413925 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"11bc80a2-608a-4a50-8b5b-28f6deef8df2","Type":"ContainerStarted","Data":"7df72297cc0c4a876a21a3697aae4b074ee63413b82e75390fc444b9af52d1cd"}
Mar 19 09:38:32 crc kubenswrapper[4835]: I0319 09:38:32.737821 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vrl7f"]
Mar 19 09:38:32 crc kubenswrapper[4835]: I0319 09:38:32.739267 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vrl7f"
Mar 19 09:38:32 crc kubenswrapper[4835]: I0319 09:38:32.756187 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vrl7f"]
Mar 19 09:38:33 crc kubenswrapper[4835]: I0319 09:38:33.193860 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2f8v\" (UniqueName: \"kubernetes.io/projected/1bbeb7b3-b185-41a6-a8fc-508a48c81dd8-kube-api-access-g2f8v\") pod \"redhat-marketplace-vrl7f\" (UID: \"1bbeb7b3-b185-41a6-a8fc-508a48c81dd8\") " pod="openshift-marketplace/redhat-marketplace-vrl7f"
Mar 19 09:38:33 crc kubenswrapper[4835]: I0319 09:38:33.194221 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bbeb7b3-b185-41a6-a8fc-508a48c81dd8-utilities\") pod \"redhat-marketplace-vrl7f\" (UID: \"1bbeb7b3-b185-41a6-a8fc-508a48c81dd8\") " pod="openshift-marketplace/redhat-marketplace-vrl7f"
Mar 19 09:38:33 crc kubenswrapper[4835]: I0319 09:38:33.194383 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bbeb7b3-b185-41a6-a8fc-508a48c81dd8-catalog-content\") pod \"redhat-marketplace-vrl7f\" (UID: \"1bbeb7b3-b185-41a6-a8fc-508a48c81dd8\") " pod="openshift-marketplace/redhat-marketplace-vrl7f"
Mar 19 09:38:33 crc kubenswrapper[4835]: I0319 09:38:33.295015 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2f8v\" (UniqueName: \"kubernetes.io/projected/1bbeb7b3-b185-41a6-a8fc-508a48c81dd8-kube-api-access-g2f8v\") pod \"redhat-marketplace-vrl7f\" (UID: \"1bbeb7b3-b185-41a6-a8fc-508a48c81dd8\") " pod="openshift-marketplace/redhat-marketplace-vrl7f"
Mar 19 09:38:33 crc kubenswrapper[4835]: I0319 09:38:33.295070 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bbeb7b3-b185-41a6-a8fc-508a48c81dd8-utilities\") pod \"redhat-marketplace-vrl7f\" (UID: \"1bbeb7b3-b185-41a6-a8fc-508a48c81dd8\") " pod="openshift-marketplace/redhat-marketplace-vrl7f"
Mar 19 09:38:33 crc kubenswrapper[4835]: I0319 09:38:33.295138 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bbeb7b3-b185-41a6-a8fc-508a48c81dd8-catalog-content\") pod \"redhat-marketplace-vrl7f\" (UID: \"1bbeb7b3-b185-41a6-a8fc-508a48c81dd8\") " pod="openshift-marketplace/redhat-marketplace-vrl7f"
Mar 19 09:38:33 crc kubenswrapper[4835]: I0319 09:38:33.295778 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bbeb7b3-b185-41a6-a8fc-508a48c81dd8-catalog-content\") pod \"redhat-marketplace-vrl7f\" (UID: \"1bbeb7b3-b185-41a6-a8fc-508a48c81dd8\") " pod="openshift-marketplace/redhat-marketplace-vrl7f"
Mar 19 09:38:33 crc kubenswrapper[4835]: I0319 09:38:33.295855 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bbeb7b3-b185-41a6-a8fc-508a48c81dd8-utilities\") pod \"redhat-marketplace-vrl7f\" (UID: \"1bbeb7b3-b185-41a6-a8fc-508a48c81dd8\") " pod="openshift-marketplace/redhat-marketplace-vrl7f"
Mar 19 09:38:33 crc kubenswrapper[4835]: I0319 09:38:33.312561 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2f8v\" (UniqueName: \"kubernetes.io/projected/1bbeb7b3-b185-41a6-a8fc-508a48c81dd8-kube-api-access-g2f8v\") pod \"redhat-marketplace-vrl7f\" (UID: \"1bbeb7b3-b185-41a6-a8fc-508a48c81dd8\") " pod="openshift-marketplace/redhat-marketplace-vrl7f"
Mar 19 09:38:33 crc kubenswrapper[4835]: I0319 09:38:33.492852 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vrl7f"
Mar 19 09:38:33 crc kubenswrapper[4835]: I0319 09:38:33.948692 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vrl7f"]
Mar 19 09:38:34 crc kubenswrapper[4835]: I0319 09:38:34.426818 4835 generic.go:334] "Generic (PLEG): container finished" podID="1bbeb7b3-b185-41a6-a8fc-508a48c81dd8" containerID="fb73248a002dd555c71e6bb42adc33f54ebb9b30333391c9898106959c439e92" exitCode=0
Mar 19 09:38:34 crc kubenswrapper[4835]: I0319 09:38:34.426907 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vrl7f" event={"ID":"1bbeb7b3-b185-41a6-a8fc-508a48c81dd8","Type":"ContainerDied","Data":"fb73248a002dd555c71e6bb42adc33f54ebb9b30333391c9898106959c439e92"}
Mar 19 09:38:34 crc kubenswrapper[4835]: I0319 09:38:34.427137 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vrl7f" event={"ID":"1bbeb7b3-b185-41a6-a8fc-508a48c81dd8","Type":"ContainerStarted","Data":"f8f7d857f1fa0749dd40c4525b9d37545253064133d2d5d5165cd12ae2f931ec"}
Mar 19 09:38:37 crc kubenswrapper[4835]: I0319 09:38:37.446537 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"11bc80a2-608a-4a50-8b5b-28f6deef8df2","Type":"ContainerStarted","Data":"a9c56029d337cd05ab65e23678d14fc2f490ca841ed8e02a3725c8962b0b5c99"}
Mar 19 09:38:37 crc kubenswrapper[4835]: I0319 09:38:37.449553 4835 generic.go:334] "Generic (PLEG): container finished" podID="1bbeb7b3-b185-41a6-a8fc-508a48c81dd8" containerID="cfa08e955ec6673e6842a73dc4592a83d22120a984779cdfbb0da42e2cb81235" exitCode=0
Mar 19 09:38:37 crc kubenswrapper[4835]: I0319 09:38:37.449615 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vrl7f" event={"ID":"1bbeb7b3-b185-41a6-a8fc-508a48c81dd8","Type":"ContainerDied","Data":"cfa08e955ec6673e6842a73dc4592a83d22120a984779cdfbb0da42e2cb81235"}
Mar 19 09:38:37 crc kubenswrapper[4835]: I0319 09:38:37.466394 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=4.964957826 podStartE2EDuration="9.466373155s" podCreationTimestamp="2026-03-19 09:38:28 +0000 UTC" firstStartedPulling="2026-03-19 09:38:32.069191943 +0000 UTC m=+966.917790550" lastFinishedPulling="2026-03-19 09:38:36.570607282 +0000 UTC m=+971.419205879" observedRunningTime="2026-03-19 09:38:37.464140353 +0000 UTC m=+972.312738940" watchObservedRunningTime="2026-03-19 09:38:37.466373155 +0000 UTC m=+972.314971742"
Mar 19 09:38:38 crc kubenswrapper[4835]: I0319 09:38:38.457550 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vrl7f" event={"ID":"1bbeb7b3-b185-41a6-a8fc-508a48c81dd8","Type":"ContainerStarted","Data":"b1fe3e13822a0112721e4d62c73e4902473e06199fe12cfc6ad0365254fb53fa"}
Mar 19 09:38:38 crc kubenswrapper[4835]: I0319 09:38:38.488595 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vrl7f" podStartSLOduration=3.489096163 podStartE2EDuration="6.488578788s" podCreationTimestamp="2026-03-19 09:38:32 +0000 UTC" firstStartedPulling="2026-03-19 09:38:34.837571752 +0000 UTC m=+969.686170329" lastFinishedPulling="2026-03-19 09:38:37.837054377 +0000 UTC m=+972.685652954" observedRunningTime="2026-03-19 09:38:38.486005497 +0000 UTC m=+973.334604084" watchObservedRunningTime="2026-03-19 09:38:38.488578788 +0000 UTC m=+973.337177375"
Mar 19 09:38:41 crc kubenswrapper[4835]: I0319 09:38:41.890166 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-distributor-9c6b6d984-rznlv"]
Mar 19 09:38:41 crc kubenswrapper[4835]: I0319 09:38:41.891534 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-distributor-9c6b6d984-rznlv"
Mar 19 09:38:41 crc kubenswrapper[4835]: I0319 09:38:41.893731 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-dockercfg-9sxfg"
Mar 19 09:38:41 crc kubenswrapper[4835]: I0319 09:38:41.894225 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-config"
Mar 19 09:38:41 crc kubenswrapper[4835]: I0319 09:38:41.894550 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-grpc"
Mar 19 09:38:41 crc kubenswrapper[4835]: I0319 09:38:41.894718 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-http"
Mar 19 09:38:41 crc kubenswrapper[4835]: I0319 09:38:41.895625 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-ca-bundle"
Mar 19 09:38:41 crc kubenswrapper[4835]: I0319 09:38:41.908004 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-9c6b6d984-rznlv"]
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.028677 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/82e92071-6291-4ff2-971a-c658d2e001ed-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-9c6b6d984-rznlv\" (UID: \"82e92071-6291-4ff2-971a-c658d2e001ed\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-rznlv"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.029009 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/82e92071-6291-4ff2-971a-c658d2e001ed-logging-loki-distributor-http\") pod \"logging-loki-distributor-9c6b6d984-rznlv\" (UID: \"82e92071-6291-4ff2-971a-c658d2e001ed\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-rznlv"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.029037 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82e92071-6291-4ff2-971a-c658d2e001ed-logging-loki-ca-bundle\") pod \"logging-loki-distributor-9c6b6d984-rznlv\" (UID: \"82e92071-6291-4ff2-971a-c658d2e001ed\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-rznlv"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.029136 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97rlq\" (UniqueName: \"kubernetes.io/projected/82e92071-6291-4ff2-971a-c658d2e001ed-kube-api-access-97rlq\") pod \"logging-loki-distributor-9c6b6d984-rznlv\" (UID: \"82e92071-6291-4ff2-971a-c658d2e001ed\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-rznlv"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.029164 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82e92071-6291-4ff2-971a-c658d2e001ed-config\") pod \"logging-loki-distributor-9c6b6d984-rznlv\" (UID: \"82e92071-6291-4ff2-971a-c658d2e001ed\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-rznlv"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.038283 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-querier-6dcbdf8bb8-cm5jv"]
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.043473 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-cm5jv"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.046143 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-grpc"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.046344 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-s3"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.046449 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-http"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.060143 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-6dcbdf8bb8-cm5jv"]
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.103533 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-query-frontend-ff66c4dc9-5qfh9"]
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.112236 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-ff66c4dc9-5qfh9"]
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.112357 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5qfh9"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.118061 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-grpc"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.118165 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-http"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.130127 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/82e92071-6291-4ff2-971a-c658d2e001ed-logging-loki-distributor-http\") pod \"logging-loki-distributor-9c6b6d984-rznlv\" (UID: \"82e92071-6291-4ff2-971a-c658d2e001ed\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-rznlv"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.130189 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82e92071-6291-4ff2-971a-c658d2e001ed-logging-loki-ca-bundle\") pod \"logging-loki-distributor-9c6b6d984-rznlv\" (UID: \"82e92071-6291-4ff2-971a-c658d2e001ed\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-rznlv"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.130269 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97rlq\" (UniqueName: \"kubernetes.io/projected/82e92071-6291-4ff2-971a-c658d2e001ed-kube-api-access-97rlq\") pod \"logging-loki-distributor-9c6b6d984-rznlv\" (UID: \"82e92071-6291-4ff2-971a-c658d2e001ed\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-rznlv"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.130309 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82e92071-6291-4ff2-971a-c658d2e001ed-config\") pod \"logging-loki-distributor-9c6b6d984-rznlv\" (UID: \"82e92071-6291-4ff2-971a-c658d2e001ed\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-rznlv"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.130365 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/82e92071-6291-4ff2-971a-c658d2e001ed-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-9c6b6d984-rznlv\" (UID: \"82e92071-6291-4ff2-971a-c658d2e001ed\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-rznlv"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.131310 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82e92071-6291-4ff2-971a-c658d2e001ed-logging-loki-ca-bundle\") pod \"logging-loki-distributor-9c6b6d984-rznlv\" (UID: \"82e92071-6291-4ff2-971a-c658d2e001ed\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-rznlv"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.133175 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82e92071-6291-4ff2-971a-c658d2e001ed-config\") pod \"logging-loki-distributor-9c6b6d984-rznlv\" (UID: \"82e92071-6291-4ff2-971a-c658d2e001ed\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-rznlv"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.142884 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/82e92071-6291-4ff2-971a-c658d2e001ed-logging-loki-distributor-http\") pod \"logging-loki-distributor-9c6b6d984-rznlv\" (UID: \"82e92071-6291-4ff2-971a-c658d2e001ed\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-rznlv"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.146420 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/82e92071-6291-4ff2-971a-c658d2e001ed-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-9c6b6d984-rznlv\" (UID: \"82e92071-6291-4ff2-971a-c658d2e001ed\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-rznlv"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.159028 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97rlq\" (UniqueName: \"kubernetes.io/projected/82e92071-6291-4ff2-971a-c658d2e001ed-kube-api-access-97rlq\") pod \"logging-loki-distributor-9c6b6d984-rznlv\" (UID: \"82e92071-6291-4ff2-971a-c658d2e001ed\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-rznlv"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.209164 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c"]
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.210141 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.212700 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway-ca-bundle"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.212913 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-http"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.212926 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.215458 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.215485 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-dockercfg-h7ncv"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.216028 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-distributor-9c6b6d984-rznlv"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.231288 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kdqg\" (UniqueName: \"kubernetes.io/projected/e0bc0a8b-4a2d-4460-b984-7a2279e1a424-kube-api-access-9kdqg\") pod \"logging-loki-query-frontend-ff66c4dc9-5qfh9\" (UID: \"e0bc0a8b-4a2d-4460-b984-7a2279e1a424\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5qfh9"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.231355 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/e0bc0a8b-4a2d-4460-b984-7a2279e1a424-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-ff66c4dc9-5qfh9\" (UID: \"e0bc0a8b-4a2d-4460-b984-7a2279e1a424\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5qfh9"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.231384 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b90ec5a-5d56-4267-bb4e-1fbcdecff021-logging-loki-ca-bundle\") pod \"logging-loki-querier-6dcbdf8bb8-cm5jv\" (UID: \"9b90ec5a-5d56-4267-bb4e-1fbcdecff021\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-cm5jv"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.231406 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0bc0a8b-4a2d-4460-b984-7a2279e1a424-config\") pod \"logging-loki-query-frontend-ff66c4dc9-5qfh9\" (UID: \"e0bc0a8b-4a2d-4460-b984-7a2279e1a424\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5qfh9"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.231427 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b90ec5a-5d56-4267-bb4e-1fbcdecff021-config\") pod \"logging-loki-querier-6dcbdf8bb8-cm5jv\" (UID: \"9b90ec5a-5d56-4267-bb4e-1fbcdecff021\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-cm5jv"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.231455 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/e0bc0a8b-4a2d-4460-b984-7a2279e1a424-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-ff66c4dc9-5qfh9\" (UID: \"e0bc0a8b-4a2d-4460-b984-7a2279e1a424\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5qfh9"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.231493 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tht5g\" (UniqueName: \"kubernetes.io/projected/9b90ec5a-5d56-4267-bb4e-1fbcdecff021-kube-api-access-tht5g\") pod \"logging-loki-querier-6dcbdf8bb8-cm5jv\" (UID: \"9b90ec5a-5d56-4267-bb4e-1fbcdecff021\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-cm5jv"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.231519 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/9b90ec5a-5d56-4267-bb4e-1fbcdecff021-logging-loki-querier-http\") pod \"logging-loki-querier-6dcbdf8bb8-cm5jv\" (UID: \"9b90ec5a-5d56-4267-bb4e-1fbcdecff021\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-cm5jv"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.231540 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/9b90ec5a-5d56-4267-bb4e-1fbcdecff021-logging-loki-s3\") pod \"logging-loki-querier-6dcbdf8bb8-cm5jv\" (UID: \"9b90ec5a-5d56-4267-bb4e-1fbcdecff021\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-cm5jv"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.231559 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/9b90ec5a-5d56-4267-bb4e-1fbcdecff021-logging-loki-querier-grpc\") pod \"logging-loki-querier-6dcbdf8bb8-cm5jv\" (UID: \"9b90ec5a-5d56-4267-bb4e-1fbcdecff021\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-cm5jv"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.231580 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0bc0a8b-4a2d-4460-b984-7a2279e1a424-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-ff66c4dc9-5qfh9\" (UID: \"e0bc0a8b-4a2d-4460-b984-7a2279e1a424\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5qfh9"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.244961 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-client-http"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.247616 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57"]
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.274188 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.304337 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c"]
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.328455 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57"]
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.342509 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/2ff291e5-8364-4627-be0f-51c9532e46ee-tenants\") pod \"logging-loki-gateway-5d45f4dcf6-4f49c\" (UID: \"2ff291e5-8364-4627-be0f-51c9532e46ee\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.342856 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tht5g\" (UniqueName: \"kubernetes.io/projected/9b90ec5a-5d56-4267-bb4e-1fbcdecff021-kube-api-access-tht5g\") pod \"logging-loki-querier-6dcbdf8bb8-cm5jv\" (UID: \"9b90ec5a-5d56-4267-bb4e-1fbcdecff021\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-cm5jv"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.342883 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/9b90ec5a-5d56-4267-bb4e-1fbcdecff021-logging-loki-querier-http\") pod \"logging-loki-querier-6dcbdf8bb8-cm5jv\" (UID: \"9b90ec5a-5d56-4267-bb4e-1fbcdecff021\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-cm5jv"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.342904 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ff291e5-8364-4627-be0f-51c9532e46ee-logging-loki-ca-bundle\") pod \"logging-loki-gateway-5d45f4dcf6-4f49c\" (UID: \"2ff291e5-8364-4627-be0f-51c9532e46ee\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.342927 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/2ff291e5-8364-4627-be0f-51c9532e46ee-rbac\") pod \"logging-loki-gateway-5d45f4dcf6-4f49c\" (UID: \"2ff291e5-8364-4627-be0f-51c9532e46ee\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.342948 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/9b90ec5a-5d56-4267-bb4e-1fbcdecff021-logging-loki-s3\") pod \"logging-loki-querier-6dcbdf8bb8-cm5jv\" (UID: \"9b90ec5a-5d56-4267-bb4e-1fbcdecff021\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-cm5jv"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.342973 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/9b90ec5a-5d56-4267-bb4e-1fbcdecff021-logging-loki-querier-grpc\") pod \"logging-loki-querier-6dcbdf8bb8-cm5jv\" (UID: \"9b90ec5a-5d56-4267-bb4e-1fbcdecff021\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-cm5jv"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.343000 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0bc0a8b-4a2d-4460-b984-7a2279e1a424-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-ff66c4dc9-5qfh9\" (UID: \"e0bc0a8b-4a2d-4460-b984-7a2279e1a424\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5qfh9"
Mar 19 09:38:42 crc kubenswrapper[4835]: I0319
09:38:42.343019 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kdqg\" (UniqueName: \"kubernetes.io/projected/e0bc0a8b-4a2d-4460-b984-7a2279e1a424-kube-api-access-9kdqg\") pod \"logging-loki-query-frontend-ff66c4dc9-5qfh9\" (UID: \"e0bc0a8b-4a2d-4460-b984-7a2279e1a424\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5qfh9" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.343042 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/2ff291e5-8364-4627-be0f-51c9532e46ee-tls-secret\") pod \"logging-loki-gateway-5d45f4dcf6-4f49c\" (UID: \"2ff291e5-8364-4627-be0f-51c9532e46ee\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.343062 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/2ff291e5-8364-4627-be0f-51c9532e46ee-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-5d45f4dcf6-4f49c\" (UID: \"2ff291e5-8364-4627-be0f-51c9532e46ee\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.343081 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksvt5\" (UniqueName: \"kubernetes.io/projected/2ff291e5-8364-4627-be0f-51c9532e46ee-kube-api-access-ksvt5\") pod \"logging-loki-gateway-5d45f4dcf6-4f49c\" (UID: \"2ff291e5-8364-4627-be0f-51c9532e46ee\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.343099 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: 
\"kubernetes.io/secret/e0bc0a8b-4a2d-4460-b984-7a2279e1a424-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-ff66c4dc9-5qfh9\" (UID: \"e0bc0a8b-4a2d-4460-b984-7a2279e1a424\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5qfh9" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.343123 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b90ec5a-5d56-4267-bb4e-1fbcdecff021-logging-loki-ca-bundle\") pod \"logging-loki-querier-6dcbdf8bb8-cm5jv\" (UID: \"9b90ec5a-5d56-4267-bb4e-1fbcdecff021\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-cm5jv" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.343140 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0bc0a8b-4a2d-4460-b984-7a2279e1a424-config\") pod \"logging-loki-query-frontend-ff66c4dc9-5qfh9\" (UID: \"e0bc0a8b-4a2d-4460-b984-7a2279e1a424\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5qfh9" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.343158 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b90ec5a-5d56-4267-bb4e-1fbcdecff021-config\") pod \"logging-loki-querier-6dcbdf8bb8-cm5jv\" (UID: \"9b90ec5a-5d56-4267-bb4e-1fbcdecff021\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-cm5jv" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.343181 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ff291e5-8364-4627-be0f-51c9532e46ee-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-5d45f4dcf6-4f49c\" (UID: \"2ff291e5-8364-4627-be0f-51c9532e46ee\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" Mar 19 09:38:42 crc 
kubenswrapper[4835]: I0319 09:38:42.343208 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/e0bc0a8b-4a2d-4460-b984-7a2279e1a424-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-ff66c4dc9-5qfh9\" (UID: \"e0bc0a8b-4a2d-4460-b984-7a2279e1a424\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5qfh9" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.343232 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/2ff291e5-8364-4627-be0f-51c9532e46ee-lokistack-gateway\") pod \"logging-loki-gateway-5d45f4dcf6-4f49c\" (UID: \"2ff291e5-8364-4627-be0f-51c9532e46ee\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.345737 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0bc0a8b-4a2d-4460-b984-7a2279e1a424-config\") pod \"logging-loki-query-frontend-ff66c4dc9-5qfh9\" (UID: \"e0bc0a8b-4a2d-4460-b984-7a2279e1a424\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5qfh9" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.347414 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b90ec5a-5d56-4267-bb4e-1fbcdecff021-config\") pod \"logging-loki-querier-6dcbdf8bb8-cm5jv\" (UID: \"9b90ec5a-5d56-4267-bb4e-1fbcdecff021\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-cm5jv" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.348241 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b90ec5a-5d56-4267-bb4e-1fbcdecff021-logging-loki-ca-bundle\") pod \"logging-loki-querier-6dcbdf8bb8-cm5jv\" (UID: 
\"9b90ec5a-5d56-4267-bb4e-1fbcdecff021\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-cm5jv" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.348878 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0bc0a8b-4a2d-4460-b984-7a2279e1a424-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-ff66c4dc9-5qfh9\" (UID: \"e0bc0a8b-4a2d-4460-b984-7a2279e1a424\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5qfh9" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.353258 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/e0bc0a8b-4a2d-4460-b984-7a2279e1a424-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-ff66c4dc9-5qfh9\" (UID: \"e0bc0a8b-4a2d-4460-b984-7a2279e1a424\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5qfh9" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.354300 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/e0bc0a8b-4a2d-4460-b984-7a2279e1a424-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-ff66c4dc9-5qfh9\" (UID: \"e0bc0a8b-4a2d-4460-b984-7a2279e1a424\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5qfh9" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.354795 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/9b90ec5a-5d56-4267-bb4e-1fbcdecff021-logging-loki-querier-grpc\") pod \"logging-loki-querier-6dcbdf8bb8-cm5jv\" (UID: \"9b90ec5a-5d56-4267-bb4e-1fbcdecff021\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-cm5jv" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.356889 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/9b90ec5a-5d56-4267-bb4e-1fbcdecff021-logging-loki-querier-http\") pod \"logging-loki-querier-6dcbdf8bb8-cm5jv\" (UID: \"9b90ec5a-5d56-4267-bb4e-1fbcdecff021\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-cm5jv" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.357588 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/9b90ec5a-5d56-4267-bb4e-1fbcdecff021-logging-loki-s3\") pod \"logging-loki-querier-6dcbdf8bb8-cm5jv\" (UID: \"9b90ec5a-5d56-4267-bb4e-1fbcdecff021\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-cm5jv" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.361957 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tht5g\" (UniqueName: \"kubernetes.io/projected/9b90ec5a-5d56-4267-bb4e-1fbcdecff021-kube-api-access-tht5g\") pod \"logging-loki-querier-6dcbdf8bb8-cm5jv\" (UID: \"9b90ec5a-5d56-4267-bb4e-1fbcdecff021\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-cm5jv" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.371288 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-cm5jv" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.381085 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kdqg\" (UniqueName: \"kubernetes.io/projected/e0bc0a8b-4a2d-4460-b984-7a2279e1a424-kube-api-access-9kdqg\") pod \"logging-loki-query-frontend-ff66c4dc9-5qfh9\" (UID: \"e0bc0a8b-4a2d-4460-b984-7a2279e1a424\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5qfh9" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.434184 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5qfh9" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.446434 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75ce358f-5f03-401f-bdf8-27a7e5309227-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-5d45f4dcf6-hkx57\" (UID: \"75ce358f-5f03-401f-bdf8-27a7e5309227\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.446504 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/2ff291e5-8364-4627-be0f-51c9532e46ee-tls-secret\") pod \"logging-loki-gateway-5d45f4dcf6-4f49c\" (UID: \"2ff291e5-8364-4627-be0f-51c9532e46ee\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.446540 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/2ff291e5-8364-4627-be0f-51c9532e46ee-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-5d45f4dcf6-4f49c\" (UID: \"2ff291e5-8364-4627-be0f-51c9532e46ee\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.446565 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksvt5\" (UniqueName: \"kubernetes.io/projected/2ff291e5-8364-4627-be0f-51c9532e46ee-kube-api-access-ksvt5\") pod \"logging-loki-gateway-5d45f4dcf6-4f49c\" (UID: \"2ff291e5-8364-4627-be0f-51c9532e46ee\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.446615 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ff291e5-8364-4627-be0f-51c9532e46ee-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-5d45f4dcf6-4f49c\" (UID: \"2ff291e5-8364-4627-be0f-51c9532e46ee\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.446651 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/75ce358f-5f03-401f-bdf8-27a7e5309227-tenants\") pod \"logging-loki-gateway-5d45f4dcf6-hkx57\" (UID: \"75ce358f-5f03-401f-bdf8-27a7e5309227\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.446679 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/75ce358f-5f03-401f-bdf8-27a7e5309227-rbac\") pod \"logging-loki-gateway-5d45f4dcf6-hkx57\" (UID: \"75ce358f-5f03-401f-bdf8-27a7e5309227\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.446705 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/75ce358f-5f03-401f-bdf8-27a7e5309227-tls-secret\") pod \"logging-loki-gateway-5d45f4dcf6-hkx57\" (UID: \"75ce358f-5f03-401f-bdf8-27a7e5309227\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.446725 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/2ff291e5-8364-4627-be0f-51c9532e46ee-lokistack-gateway\") pod \"logging-loki-gateway-5d45f4dcf6-4f49c\" (UID: \"2ff291e5-8364-4627-be0f-51c9532e46ee\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" Mar 19 09:38:42 crc 
kubenswrapper[4835]: I0319 09:38:42.446853 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/75ce358f-5f03-401f-bdf8-27a7e5309227-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-5d45f4dcf6-hkx57\" (UID: \"75ce358f-5f03-401f-bdf8-27a7e5309227\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.446880 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/75ce358f-5f03-401f-bdf8-27a7e5309227-lokistack-gateway\") pod \"logging-loki-gateway-5d45f4dcf6-hkx57\" (UID: \"75ce358f-5f03-401f-bdf8-27a7e5309227\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.446914 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/2ff291e5-8364-4627-be0f-51c9532e46ee-tenants\") pod \"logging-loki-gateway-5d45f4dcf6-4f49c\" (UID: \"2ff291e5-8364-4627-be0f-51c9532e46ee\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.446941 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ff291e5-8364-4627-be0f-51c9532e46ee-logging-loki-ca-bundle\") pod \"logging-loki-gateway-5d45f4dcf6-4f49c\" (UID: \"2ff291e5-8364-4627-be0f-51c9532e46ee\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.446966 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/2ff291e5-8364-4627-be0f-51c9532e46ee-rbac\") pod \"logging-loki-gateway-5d45f4dcf6-4f49c\" 
(UID: \"2ff291e5-8364-4627-be0f-51c9532e46ee\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.447015 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75ce358f-5f03-401f-bdf8-27a7e5309227-logging-loki-ca-bundle\") pod \"logging-loki-gateway-5d45f4dcf6-hkx57\" (UID: \"75ce358f-5f03-401f-bdf8-27a7e5309227\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.447045 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk75x\" (UniqueName: \"kubernetes.io/projected/75ce358f-5f03-401f-bdf8-27a7e5309227-kube-api-access-rk75x\") pod \"logging-loki-gateway-5d45f4dcf6-hkx57\" (UID: \"75ce358f-5f03-401f-bdf8-27a7e5309227\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.449032 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ff291e5-8364-4627-be0f-51c9532e46ee-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-5d45f4dcf6-4f49c\" (UID: \"2ff291e5-8364-4627-be0f-51c9532e46ee\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.450436 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/2ff291e5-8364-4627-be0f-51c9532e46ee-rbac\") pod \"logging-loki-gateway-5d45f4dcf6-4f49c\" (UID: \"2ff291e5-8364-4627-be0f-51c9532e46ee\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.449667 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" 
(UniqueName: \"kubernetes.io/configmap/2ff291e5-8364-4627-be0f-51c9532e46ee-lokistack-gateway\") pod \"logging-loki-gateway-5d45f4dcf6-4f49c\" (UID: \"2ff291e5-8364-4627-be0f-51c9532e46ee\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.452299 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/2ff291e5-8364-4627-be0f-51c9532e46ee-tls-secret\") pod \"logging-loki-gateway-5d45f4dcf6-4f49c\" (UID: \"2ff291e5-8364-4627-be0f-51c9532e46ee\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.453149 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ff291e5-8364-4627-be0f-51c9532e46ee-logging-loki-ca-bundle\") pod \"logging-loki-gateway-5d45f4dcf6-4f49c\" (UID: \"2ff291e5-8364-4627-be0f-51c9532e46ee\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.454461 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/2ff291e5-8364-4627-be0f-51c9532e46ee-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-5d45f4dcf6-4f49c\" (UID: \"2ff291e5-8364-4627-be0f-51c9532e46ee\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.455201 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/2ff291e5-8364-4627-be0f-51c9532e46ee-tenants\") pod \"logging-loki-gateway-5d45f4dcf6-4f49c\" (UID: \"2ff291e5-8364-4627-be0f-51c9532e46ee\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.466462 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ksvt5\" (UniqueName: \"kubernetes.io/projected/2ff291e5-8364-4627-be0f-51c9532e46ee-kube-api-access-ksvt5\") pod \"logging-loki-gateway-5d45f4dcf6-4f49c\" (UID: \"2ff291e5-8364-4627-be0f-51c9532e46ee\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.534378 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.548189 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75ce358f-5f03-401f-bdf8-27a7e5309227-logging-loki-ca-bundle\") pod \"logging-loki-gateway-5d45f4dcf6-hkx57\" (UID: \"75ce358f-5f03-401f-bdf8-27a7e5309227\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.548247 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk75x\" (UniqueName: \"kubernetes.io/projected/75ce358f-5f03-401f-bdf8-27a7e5309227-kube-api-access-rk75x\") pod \"logging-loki-gateway-5d45f4dcf6-hkx57\" (UID: \"75ce358f-5f03-401f-bdf8-27a7e5309227\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.548267 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75ce358f-5f03-401f-bdf8-27a7e5309227-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-5d45f4dcf6-hkx57\" (UID: \"75ce358f-5f03-401f-bdf8-27a7e5309227\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.548336 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" 
(UniqueName: \"kubernetes.io/secret/75ce358f-5f03-401f-bdf8-27a7e5309227-tenants\") pod \"logging-loki-gateway-5d45f4dcf6-hkx57\" (UID: \"75ce358f-5f03-401f-bdf8-27a7e5309227\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.548358 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/75ce358f-5f03-401f-bdf8-27a7e5309227-rbac\") pod \"logging-loki-gateway-5d45f4dcf6-hkx57\" (UID: \"75ce358f-5f03-401f-bdf8-27a7e5309227\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.548376 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/75ce358f-5f03-401f-bdf8-27a7e5309227-tls-secret\") pod \"logging-loki-gateway-5d45f4dcf6-hkx57\" (UID: \"75ce358f-5f03-401f-bdf8-27a7e5309227\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.548398 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/75ce358f-5f03-401f-bdf8-27a7e5309227-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-5d45f4dcf6-hkx57\" (UID: \"75ce358f-5f03-401f-bdf8-27a7e5309227\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.548430 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/75ce358f-5f03-401f-bdf8-27a7e5309227-lokistack-gateway\") pod \"logging-loki-gateway-5d45f4dcf6-hkx57\" (UID: \"75ce358f-5f03-401f-bdf8-27a7e5309227\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.549723 4835 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75ce358f-5f03-401f-bdf8-27a7e5309227-logging-loki-ca-bundle\") pod \"logging-loki-gateway-5d45f4dcf6-hkx57\" (UID: \"75ce358f-5f03-401f-bdf8-27a7e5309227\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.550061 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75ce358f-5f03-401f-bdf8-27a7e5309227-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-5d45f4dcf6-hkx57\" (UID: \"75ce358f-5f03-401f-bdf8-27a7e5309227\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.550289 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/75ce358f-5f03-401f-bdf8-27a7e5309227-lokistack-gateway\") pod \"logging-loki-gateway-5d45f4dcf6-hkx57\" (UID: \"75ce358f-5f03-401f-bdf8-27a7e5309227\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.550532 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/75ce358f-5f03-401f-bdf8-27a7e5309227-rbac\") pod \"logging-loki-gateway-5d45f4dcf6-hkx57\" (UID: \"75ce358f-5f03-401f-bdf8-27a7e5309227\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.553295 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/75ce358f-5f03-401f-bdf8-27a7e5309227-tenants\") pod \"logging-loki-gateway-5d45f4dcf6-hkx57\" (UID: \"75ce358f-5f03-401f-bdf8-27a7e5309227\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" Mar 19 09:38:42 
crc kubenswrapper[4835]: I0319 09:38:42.555231 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/75ce358f-5f03-401f-bdf8-27a7e5309227-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-5d45f4dcf6-hkx57\" (UID: \"75ce358f-5f03-401f-bdf8-27a7e5309227\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.555273 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/75ce358f-5f03-401f-bdf8-27a7e5309227-tls-secret\") pod \"logging-loki-gateway-5d45f4dcf6-hkx57\" (UID: \"75ce358f-5f03-401f-bdf8-27a7e5309227\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.571055 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk75x\" (UniqueName: \"kubernetes.io/projected/75ce358f-5f03-401f-bdf8-27a7e5309227-kube-api-access-rk75x\") pod \"logging-loki-gateway-5d45f4dcf6-hkx57\" (UID: \"75ce358f-5f03-401f-bdf8-27a7e5309227\") " pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.603936 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.703515 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-9c6b6d984-rznlv"] Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.831012 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-ff66c4dc9-5qfh9"] Mar 19 09:38:42 crc kubenswrapper[4835]: W0319 09:38:42.833501 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0bc0a8b_4a2d_4460_b984_7a2279e1a424.slice/crio-6f5bffd5a326c2c63a2c97e869e7a3b2bf84d8db25d57500a342bb6c1dd5f21a WatchSource:0}: Error finding container 6f5bffd5a326c2c63a2c97e869e7a3b2bf84d8db25d57500a342bb6c1dd5f21a: Status 404 returned error can't find the container with id 6f5bffd5a326c2c63a2c97e869e7a3b2bf84d8db25d57500a342bb6c1dd5f21a Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.857424 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c"] Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.861810 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-6dcbdf8bb8-cm5jv"] Mar 19 09:38:42 crc kubenswrapper[4835]: W0319 09:38:42.865126 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ff291e5_8364_4627_be0f_51c9532e46ee.slice/crio-730a6cf85c75a72ace3d0e0a60c583d13f8de27b1b38ed8ffc1d78cdc0d2a459 WatchSource:0}: Error finding container 730a6cf85c75a72ace3d0e0a60c583d13f8de27b1b38ed8ffc1d78cdc0d2a459: Status 404 returned error can't find the container with id 730a6cf85c75a72ace3d0e0a60c583d13f8de27b1b38ed8ffc1d78cdc0d2a459 Mar 19 09:38:42 crc kubenswrapper[4835]: I0319 09:38:42.952877 4835 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57"] Mar 19 09:38:42 crc kubenswrapper[4835]: W0319 09:38:42.957953 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75ce358f_5f03_401f_bdf8_27a7e5309227.slice/crio-a2e80c6bb637ad0456634db0d3c27bd7df1918c4e393d1223859b37a336df521 WatchSource:0}: Error finding container a2e80c6bb637ad0456634db0d3c27bd7df1918c4e393d1223859b37a336df521: Status 404 returned error can't find the container with id a2e80c6bb637ad0456634db0d3c27bd7df1918c4e393d1223859b37a336df521 Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.050848 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.051694 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.054429 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-grpc" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.057303 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-http" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.067419 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.091104 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.092228 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.097446 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-grpc" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.097577 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-http" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.123032 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.159284 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.159421 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.159466 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ad328127-08a3-4d3c-b4cd-28a836727a41\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ad328127-08a3-4d3c-b4cd-28a836727a41\") pod \"logging-loki-ingester-0\" (UID: \"b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.159538 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ff21c03e-2884-4274-adbc-d0ca27fd85b2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ff21c03e-2884-4274-adbc-d0ca27fd85b2\") pod \"logging-loki-ingester-0\" (UID: \"b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.159560 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.159594 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztqbj\" (UniqueName: \"kubernetes.io/projected/b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57-kube-api-access-ztqbj\") pod \"logging-loki-ingester-0\" (UID: \"b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.159624 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57-config\") pod \"logging-loki-ingester-0\" (UID: \"b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.159639 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57\") " 
pod="openshift-logging/logging-loki-ingester-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.170965 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.171876 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.174397 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-http" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.174405 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-grpc" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.186805 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.261018 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b54655c1-accb-4df5-98b0-f01dbcfe83f7-config\") pod \"logging-loki-index-gateway-0\" (UID: \"b54655c1-accb-4df5-98b0-f01dbcfe83f7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.261085 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.261110 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b54655c1-accb-4df5-98b0-f01dbcfe83f7-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"b54655c1-accb-4df5-98b0-f01dbcfe83f7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.261147 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3cd47372-0856-425d-9bfa-2996bfa9f6b1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3cd47372-0856-425d-9bfa-2996bfa9f6b1\") pod \"logging-loki-compactor-0\" (UID: \"a0a07ea9-b7e5-4679-9b63-c6027e42f279\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.261177 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.261216 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztqbj\" (UniqueName: \"kubernetes.io/projected/b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57-kube-api-access-ztqbj\") pod \"logging-loki-ingester-0\" (UID: \"b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.261242 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/a0a07ea9-b7e5-4679-9b63-c6027e42f279-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"a0a07ea9-b7e5-4679-9b63-c6027e42f279\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.261263 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-20552b06-44ca-49b7-b773-634d5b66321b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-20552b06-44ca-49b7-b773-634d5b66321b\") pod \"logging-loki-index-gateway-0\" (UID: \"b54655c1-accb-4df5-98b0-f01dbcfe83f7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.262651 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.263173 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57-config\") pod \"logging-loki-ingester-0\" (UID: \"b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.261733 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57-config\") pod \"logging-loki-ingester-0\" (UID: \"b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.263258 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-247d9\" (UniqueName: \"kubernetes.io/projected/a0a07ea9-b7e5-4679-9b63-c6027e42f279-kube-api-access-247d9\") pod \"logging-loki-compactor-0\" (UID: \"a0a07ea9-b7e5-4679-9b63-c6027e42f279\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.263337 4835 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0a07ea9-b7e5-4679-9b63-c6027e42f279-config\") pod \"logging-loki-compactor-0\" (UID: \"a0a07ea9-b7e5-4679-9b63-c6027e42f279\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.263389 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/a0a07ea9-b7e5-4679-9b63-c6027e42f279-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"a0a07ea9-b7e5-4679-9b63-c6027e42f279\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.263453 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/b54655c1-accb-4df5-98b0-f01dbcfe83f7-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"b54655c1-accb-4df5-98b0-f01dbcfe83f7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.263520 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0a07ea9-b7e5-4679-9b63-c6027e42f279-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"a0a07ea9-b7e5-4679-9b63-c6027e42f279\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.263606 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ad328127-08a3-4d3c-b4cd-28a836727a41\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ad328127-08a3-4d3c-b4cd-28a836727a41\") pod \"logging-loki-ingester-0\" (UID: \"b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57\") " 
pod="openshift-logging/logging-loki-ingester-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.263644 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/b54655c1-accb-4df5-98b0-f01dbcfe83f7-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"b54655c1-accb-4df5-98b0-f01dbcfe83f7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.263686 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/a0a07ea9-b7e5-4679-9b63-c6027e42f279-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"a0a07ea9-b7e5-4679-9b63-c6027e42f279\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.263795 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ff21c03e-2884-4274-adbc-d0ca27fd85b2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ff21c03e-2884-4274-adbc-d0ca27fd85b2\") pod \"logging-loki-ingester-0\" (UID: \"b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.263842 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzcgd\" (UniqueName: \"kubernetes.io/projected/b54655c1-accb-4df5-98b0-f01dbcfe83f7-kube-api-access-wzcgd\") pod \"logging-loki-index-gateway-0\" (UID: \"b54655c1-accb-4df5-98b0-f01dbcfe83f7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.263873 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: 
\"kubernetes.io/secret/b54655c1-accb-4df5-98b0-f01dbcfe83f7-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"b54655c1-accb-4df5-98b0-f01dbcfe83f7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.263914 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.264033 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.267512 4835 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.267571 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ff21c03e-2884-4274-adbc-d0ca27fd85b2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ff21c03e-2884-4274-adbc-d0ca27fd85b2\") pod \"logging-loki-ingester-0\" (UID: \"b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/758eef72140d0f0b777ae5bee4a50d57eb765b581651cd45bee610c9ca9df9ef/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.268525 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.268631 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.269150 4835 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.269197 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ad328127-08a3-4d3c-b4cd-28a836727a41\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ad328127-08a3-4d3c-b4cd-28a836727a41\") pod \"logging-loki-ingester-0\" (UID: \"b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/dfcd5a9513fe04e37c6285c62716dc2fdcef1c5bae0e2a47907a55c46d6483f6/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.270479 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.283398 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztqbj\" (UniqueName: \"kubernetes.io/projected/b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57-kube-api-access-ztqbj\") pod \"logging-loki-ingester-0\" (UID: \"b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.291024 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ad328127-08a3-4d3c-b4cd-28a836727a41\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ad328127-08a3-4d3c-b4cd-28a836727a41\") pod \"logging-loki-ingester-0\" (UID: \"b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.291139 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ff21c03e-2884-4274-adbc-d0ca27fd85b2\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ff21c03e-2884-4274-adbc-d0ca27fd85b2\") pod \"logging-loki-ingester-0\" (UID: \"b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57\") " pod="openshift-logging/logging-loki-ingester-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.365059 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b54655c1-accb-4df5-98b0-f01dbcfe83f7-config\") pod \"logging-loki-index-gateway-0\" (UID: \"b54655c1-accb-4df5-98b0-f01dbcfe83f7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.365110 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b54655c1-accb-4df5-98b0-f01dbcfe83f7-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"b54655c1-accb-4df5-98b0-f01dbcfe83f7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.365132 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3cd47372-0856-425d-9bfa-2996bfa9f6b1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3cd47372-0856-425d-9bfa-2996bfa9f6b1\") pod \"logging-loki-compactor-0\" (UID: \"a0a07ea9-b7e5-4679-9b63-c6027e42f279\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.365172 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/a0a07ea9-b7e5-4679-9b63-c6027e42f279-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"a0a07ea9-b7e5-4679-9b63-c6027e42f279\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.365192 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-20552b06-44ca-49b7-b773-634d5b66321b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-20552b06-44ca-49b7-b773-634d5b66321b\") pod \"logging-loki-index-gateway-0\" (UID: \"b54655c1-accb-4df5-98b0-f01dbcfe83f7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.365213 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-247d9\" (UniqueName: \"kubernetes.io/projected/a0a07ea9-b7e5-4679-9b63-c6027e42f279-kube-api-access-247d9\") pod \"logging-loki-compactor-0\" (UID: \"a0a07ea9-b7e5-4679-9b63-c6027e42f279\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.365240 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0a07ea9-b7e5-4679-9b63-c6027e42f279-config\") pod \"logging-loki-compactor-0\" (UID: \"a0a07ea9-b7e5-4679-9b63-c6027e42f279\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.365263 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/a0a07ea9-b7e5-4679-9b63-c6027e42f279-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"a0a07ea9-b7e5-4679-9b63-c6027e42f279\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.365280 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/b54655c1-accb-4df5-98b0-f01dbcfe83f7-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"b54655c1-accb-4df5-98b0-f01dbcfe83f7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.365296 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0a07ea9-b7e5-4679-9b63-c6027e42f279-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"a0a07ea9-b7e5-4679-9b63-c6027e42f279\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.365313 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/b54655c1-accb-4df5-98b0-f01dbcfe83f7-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"b54655c1-accb-4df5-98b0-f01dbcfe83f7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.365337 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/a0a07ea9-b7e5-4679-9b63-c6027e42f279-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"a0a07ea9-b7e5-4679-9b63-c6027e42f279\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.365358 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzcgd\" (UniqueName: \"kubernetes.io/projected/b54655c1-accb-4df5-98b0-f01dbcfe83f7-kube-api-access-wzcgd\") pod \"logging-loki-index-gateway-0\" (UID: \"b54655c1-accb-4df5-98b0-f01dbcfe83f7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.365373 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/b54655c1-accb-4df5-98b0-f01dbcfe83f7-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"b54655c1-accb-4df5-98b0-f01dbcfe83f7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 
09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.366794 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b54655c1-accb-4df5-98b0-f01dbcfe83f7-config\") pod \"logging-loki-index-gateway-0\" (UID: \"b54655c1-accb-4df5-98b0-f01dbcfe83f7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.367954 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.369249 4835 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.369257 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0a07ea9-b7e5-4679-9b63-c6027e42f279-config\") pod \"logging-loki-compactor-0\" (UID: \"a0a07ea9-b7e5-4679-9b63-c6027e42f279\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.369283 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-20552b06-44ca-49b7-b773-634d5b66321b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-20552b06-44ca-49b7-b773-634d5b66321b\") pod \"logging-loki-index-gateway-0\" (UID: \"b54655c1-accb-4df5-98b0-f01dbcfe83f7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/84887e2dc72c994a4821331179331ef48f238c5f9e5b1d6bb94c2963edb9dd85/globalmount\"" pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.370174 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b54655c1-accb-4df5-98b0-f01dbcfe83f7-logging-loki-ca-bundle\") pod 
\"logging-loki-index-gateway-0\" (UID: \"b54655c1-accb-4df5-98b0-f01dbcfe83f7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.370440 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/b54655c1-accb-4df5-98b0-f01dbcfe83f7-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"b54655c1-accb-4df5-98b0-f01dbcfe83f7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.370507 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0a07ea9-b7e5-4679-9b63-c6027e42f279-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"a0a07ea9-b7e5-4679-9b63-c6027e42f279\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.371863 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/b54655c1-accb-4df5-98b0-f01dbcfe83f7-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"b54655c1-accb-4df5-98b0-f01dbcfe83f7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.372622 4835 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.372656 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3cd47372-0856-425d-9bfa-2996bfa9f6b1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3cd47372-0856-425d-9bfa-2996bfa9f6b1\") pod \"logging-loki-compactor-0\" (UID: \"a0a07ea9-b7e5-4679-9b63-c6027e42f279\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b54bd213326b682cd8d0ef92e5224c7641f7478e05b884dc64aec4fb02084687/globalmount\"" pod="openshift-logging/logging-loki-compactor-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.372790 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/b54655c1-accb-4df5-98b0-f01dbcfe83f7-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"b54655c1-accb-4df5-98b0-f01dbcfe83f7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.373980 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/a0a07ea9-b7e5-4679-9b63-c6027e42f279-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"a0a07ea9-b7e5-4679-9b63-c6027e42f279\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.375360 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/a0a07ea9-b7e5-4679-9b63-c6027e42f279-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"a0a07ea9-b7e5-4679-9b63-c6027e42f279\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.381884 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-grpc\" (UniqueName: 
\"kubernetes.io/secret/a0a07ea9-b7e5-4679-9b63-c6027e42f279-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"a0a07ea9-b7e5-4679-9b63-c6027e42f279\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.388359 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzcgd\" (UniqueName: \"kubernetes.io/projected/b54655c1-accb-4df5-98b0-f01dbcfe83f7-kube-api-access-wzcgd\") pod \"logging-loki-index-gateway-0\" (UID: \"b54655c1-accb-4df5-98b0-f01dbcfe83f7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.390668 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-247d9\" (UniqueName: \"kubernetes.io/projected/a0a07ea9-b7e5-4679-9b63-c6027e42f279-kube-api-access-247d9\") pod \"logging-loki-compactor-0\" (UID: \"a0a07ea9-b7e5-4679-9b63-c6027e42f279\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.405656 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-20552b06-44ca-49b7-b773-634d5b66321b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-20552b06-44ca-49b7-b773-634d5b66321b\") pod \"logging-loki-index-gateway-0\" (UID: \"b54655c1-accb-4df5-98b0-f01dbcfe83f7\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.406959 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3cd47372-0856-425d-9bfa-2996bfa9f6b1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3cd47372-0856-425d-9bfa-2996bfa9f6b1\") pod \"logging-loki-compactor-0\" (UID: \"a0a07ea9-b7e5-4679-9b63-c6027e42f279\") " pod="openshift-logging/logging-loki-compactor-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.435393 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.493231 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vrl7f" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.493422 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vrl7f" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.501191 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.522976 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-cm5jv" event={"ID":"9b90ec5a-5d56-4267-bb4e-1fbcdecff021","Type":"ContainerStarted","Data":"1d3fb4cc638aa68392b2709fc2de99654ee34c8d52d205e387940517a5986d51"} Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.531896 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5qfh9" event={"ID":"e0bc0a8b-4a2d-4460-b984-7a2279e1a424","Type":"ContainerStarted","Data":"6f5bffd5a326c2c63a2c97e869e7a3b2bf84d8db25d57500a342bb6c1dd5f21a"} Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.537270 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" event={"ID":"75ce358f-5f03-401f-bdf8-27a7e5309227","Type":"ContainerStarted","Data":"a2e80c6bb637ad0456634db0d3c27bd7df1918c4e393d1223859b37a336df521"} Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.543652 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" event={"ID":"2ff291e5-8364-4627-be0f-51c9532e46ee","Type":"ContainerStarted","Data":"730a6cf85c75a72ace3d0e0a60c583d13f8de27b1b38ed8ffc1d78cdc0d2a459"} Mar 19 09:38:43 crc 
kubenswrapper[4835]: I0319 09:38:43.546553 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-9c6b6d984-rznlv" event={"ID":"82e92071-6291-4ff2-971a-c658d2e001ed","Type":"ContainerStarted","Data":"b23e4c8dda81524b8891161ba7df393d50ae6a9c99d545df96509d5dc309742b"} Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.586101 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vrl7f" Mar 19 09:38:43 crc kubenswrapper[4835]: I0319 09:38:43.738497 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Mar 19 09:38:43 crc kubenswrapper[4835]: W0319 09:38:43.796412 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6f256cf_c5bb_4e9b_aa0b_c1d1737c1d57.slice/crio-8e53ec4317d646e2719186b9813f2f22e83b473813afb39f2c6e4daa944c9a91 WatchSource:0}: Error finding container 8e53ec4317d646e2719186b9813f2f22e83b473813afb39f2c6e4daa944c9a91: Status 404 returned error can't find the container with id 8e53ec4317d646e2719186b9813f2f22e83b473813afb39f2c6e4daa944c9a91 Mar 19 09:38:44 crc kubenswrapper[4835]: I0319 09:38:44.057900 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Mar 19 09:38:44 crc kubenswrapper[4835]: I0319 09:38:44.138118 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Mar 19 09:38:44 crc kubenswrapper[4835]: W0319 09:38:44.144612 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb54655c1_accb_4df5_98b0_f01dbcfe83f7.slice/crio-59379b7a73aad80424c94717b24820671cc36b4e82b929a121cbdf4927937493 WatchSource:0}: Error finding container 59379b7a73aad80424c94717b24820671cc36b4e82b929a121cbdf4927937493: Status 404 returned error 
can't find the container with id 59379b7a73aad80424c94717b24820671cc36b4e82b929a121cbdf4927937493 Mar 19 09:38:44 crc kubenswrapper[4835]: I0319 09:38:44.554108 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"b54655c1-accb-4df5-98b0-f01dbcfe83f7","Type":"ContainerStarted","Data":"59379b7a73aad80424c94717b24820671cc36b4e82b929a121cbdf4927937493"} Mar 19 09:38:44 crc kubenswrapper[4835]: I0319 09:38:44.556469 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57","Type":"ContainerStarted","Data":"8e53ec4317d646e2719186b9813f2f22e83b473813afb39f2c6e4daa944c9a91"} Mar 19 09:38:44 crc kubenswrapper[4835]: I0319 09:38:44.557806 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"a0a07ea9-b7e5-4679-9b63-c6027e42f279","Type":"ContainerStarted","Data":"2d671c64bdc2137fc62c9ab975813989e431f52156e06247ded185e9a59d0400"} Mar 19 09:38:44 crc kubenswrapper[4835]: I0319 09:38:44.616337 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vrl7f" Mar 19 09:38:44 crc kubenswrapper[4835]: I0319 09:38:44.661585 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vrl7f"] Mar 19 09:38:46 crc kubenswrapper[4835]: I0319 09:38:46.630284 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vrl7f" podUID="1bbeb7b3-b185-41a6-a8fc-508a48c81dd8" containerName="registry-server" containerID="cri-o://b1fe3e13822a0112721e4d62c73e4902473e06199fe12cfc6ad0365254fb53fa" gracePeriod=2 Mar 19 09:38:46 crc kubenswrapper[4835]: E0319 09:38:46.831932 4835 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bbeb7b3_b185_41a6_a8fc_508a48c81dd8.slice/crio-b1fe3e13822a0112721e4d62c73e4902473e06199fe12cfc6ad0365254fb53fa.scope\": RecentStats: unable to find data in memory cache]" Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.065928 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vrl7f" Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.134068 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2f8v\" (UniqueName: \"kubernetes.io/projected/1bbeb7b3-b185-41a6-a8fc-508a48c81dd8-kube-api-access-g2f8v\") pod \"1bbeb7b3-b185-41a6-a8fc-508a48c81dd8\" (UID: \"1bbeb7b3-b185-41a6-a8fc-508a48c81dd8\") " Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.134106 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bbeb7b3-b185-41a6-a8fc-508a48c81dd8-utilities\") pod \"1bbeb7b3-b185-41a6-a8fc-508a48c81dd8\" (UID: \"1bbeb7b3-b185-41a6-a8fc-508a48c81dd8\") " Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.134266 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bbeb7b3-b185-41a6-a8fc-508a48c81dd8-catalog-content\") pod \"1bbeb7b3-b185-41a6-a8fc-508a48c81dd8\" (UID: \"1bbeb7b3-b185-41a6-a8fc-508a48c81dd8\") " Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.136818 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bbeb7b3-b185-41a6-a8fc-508a48c81dd8-utilities" (OuterVolumeSpecName: "utilities") pod "1bbeb7b3-b185-41a6-a8fc-508a48c81dd8" (UID: "1bbeb7b3-b185-41a6-a8fc-508a48c81dd8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.140869 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bbeb7b3-b185-41a6-a8fc-508a48c81dd8-kube-api-access-g2f8v" (OuterVolumeSpecName: "kube-api-access-g2f8v") pod "1bbeb7b3-b185-41a6-a8fc-508a48c81dd8" (UID: "1bbeb7b3-b185-41a6-a8fc-508a48c81dd8"). InnerVolumeSpecName "kube-api-access-g2f8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.177790 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bbeb7b3-b185-41a6-a8fc-508a48c81dd8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1bbeb7b3-b185-41a6-a8fc-508a48c81dd8" (UID: "1bbeb7b3-b185-41a6-a8fc-508a48c81dd8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.235675 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bbeb7b3-b185-41a6-a8fc-508a48c81dd8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.235709 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2f8v\" (UniqueName: \"kubernetes.io/projected/1bbeb7b3-b185-41a6-a8fc-508a48c81dd8-kube-api-access-g2f8v\") on node \"crc\" DevicePath \"\"" Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.235719 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bbeb7b3-b185-41a6-a8fc-508a48c81dd8-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.642733 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" 
event={"ID":"75ce358f-5f03-401f-bdf8-27a7e5309227","Type":"ContainerStarted","Data":"7f5e0035c14a9d90821f9dcaa7b4fbd728c6e6bac5e9cbc9b0627478b58d11e9"} Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.646212 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" event={"ID":"2ff291e5-8364-4627-be0f-51c9532e46ee","Type":"ContainerStarted","Data":"dd0113f855b559e5f47acd0bb6a2879488de6f8564ee513ef380e961b92d178a"} Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.648697 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"b54655c1-accb-4df5-98b0-f01dbcfe83f7","Type":"ContainerStarted","Data":"27a2cfd3d7ca21fdf3953a3713fbda34687808d2ae3ebe6f1a8b6f0b02dd2ce5"} Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.648805 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.653027 4835 generic.go:334] "Generic (PLEG): container finished" podID="1bbeb7b3-b185-41a6-a8fc-508a48c81dd8" containerID="b1fe3e13822a0112721e4d62c73e4902473e06199fe12cfc6ad0365254fb53fa" exitCode=0 Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.653121 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vrl7f" Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.653140 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vrl7f" event={"ID":"1bbeb7b3-b185-41a6-a8fc-508a48c81dd8","Type":"ContainerDied","Data":"b1fe3e13822a0112721e4d62c73e4902473e06199fe12cfc6ad0365254fb53fa"} Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.653175 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vrl7f" event={"ID":"1bbeb7b3-b185-41a6-a8fc-508a48c81dd8","Type":"ContainerDied","Data":"f8f7d857f1fa0749dd40c4525b9d37545253064133d2d5d5165cd12ae2f931ec"} Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.653207 4835 scope.go:117] "RemoveContainer" containerID="b1fe3e13822a0112721e4d62c73e4902473e06199fe12cfc6ad0365254fb53fa" Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.657139 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5qfh9" event={"ID":"e0bc0a8b-4a2d-4460-b984-7a2279e1a424","Type":"ContainerStarted","Data":"d487d8c72852648e930223e7e6f274187f976b73e5d2b8573b1fa60f4abff642"} Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.657282 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5qfh9" Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.662642 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57","Type":"ContainerStarted","Data":"2d1aed5f8a00203a1c4a4e918fcbbf00324560381d7653fdce4e09ce3a5f2976"} Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.663249 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-ingester-0" Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.667116 
4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"a0a07ea9-b7e5-4679-9b63-c6027e42f279","Type":"ContainerStarted","Data":"6bafe0808fb9b9e6c0f5d28905c9bd62a5265166f2a191b0d77a03eb63b88157"} Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.667486 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-compactor-0" Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.668858 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-9c6b6d984-rznlv" event={"ID":"82e92071-6291-4ff2-971a-c658d2e001ed","Type":"ContainerStarted","Data":"c085fa1ec651a809a36df7b589270bfb736c532768adb2e3da03f2c238155feb"} Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.669190 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-9c6b6d984-rznlv" Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.671014 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-cm5jv" event={"ID":"9b90ec5a-5d56-4267-bb4e-1fbcdecff021","Type":"ContainerStarted","Data":"6541c17ffa7612a7b50119f155aee282e26b36d5eb96ccc784d53343563fa55a"} Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.671236 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-cm5jv" Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.692180 4835 scope.go:117] "RemoveContainer" containerID="cfa08e955ec6673e6842a73dc4592a83d22120a984779cdfbb0da42e2cb81235" Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.702543 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-index-gateway-0" podStartSLOduration=3.3962790910000002 podStartE2EDuration="5.702483157s" podCreationTimestamp="2026-03-19 09:38:42 +0000 UTC" 
firstStartedPulling="2026-03-19 09:38:44.147174418 +0000 UTC m=+978.995773005" lastFinishedPulling="2026-03-19 09:38:46.453378474 +0000 UTC m=+981.301977071" observedRunningTime="2026-03-19 09:38:47.679206671 +0000 UTC m=+982.527805308" watchObservedRunningTime="2026-03-19 09:38:47.702483157 +0000 UTC m=+982.551081784" Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.710188 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-cm5jv" podStartSLOduration=2.12780105 podStartE2EDuration="5.710165161s" podCreationTimestamp="2026-03-19 09:38:42 +0000 UTC" firstStartedPulling="2026-03-19 09:38:42.871047184 +0000 UTC m=+977.719645771" lastFinishedPulling="2026-03-19 09:38:46.453411295 +0000 UTC m=+981.302009882" observedRunningTime="2026-03-19 09:38:47.697981152 +0000 UTC m=+982.546579769" watchObservedRunningTime="2026-03-19 09:38:47.710165161 +0000 UTC m=+982.558763778" Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.734869 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5qfh9" podStartSLOduration=2.1274595 podStartE2EDuration="5.734849356s" podCreationTimestamp="2026-03-19 09:38:42 +0000 UTC" firstStartedPulling="2026-03-19 09:38:42.8395609 +0000 UTC m=+977.688159487" lastFinishedPulling="2026-03-19 09:38:46.446950746 +0000 UTC m=+981.295549343" observedRunningTime="2026-03-19 09:38:47.72741561 +0000 UTC m=+982.576014207" watchObservedRunningTime="2026-03-19 09:38:47.734849356 +0000 UTC m=+982.583447953" Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.749823 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vrl7f"] Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.754187 4835 scope.go:117] "RemoveContainer" containerID="fb73248a002dd555c71e6bb42adc33f54ebb9b30333391c9898106959c439e92" Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 
09:38:47.772526 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-distributor-9c6b6d984-rznlv" podStartSLOduration=3.113638708 podStartE2EDuration="6.772509802s" podCreationTimestamp="2026-03-19 09:38:41 +0000 UTC" firstStartedPulling="2026-03-19 09:38:42.708243954 +0000 UTC m=+977.556842531" lastFinishedPulling="2026-03-19 09:38:46.367115028 +0000 UTC m=+981.215713625" observedRunningTime="2026-03-19 09:38:47.760841728 +0000 UTC m=+982.609440325" watchObservedRunningTime="2026-03-19 09:38:47.772509802 +0000 UTC m=+982.621108389" Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.772802 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vrl7f"] Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.792384 4835 scope.go:117] "RemoveContainer" containerID="b1fe3e13822a0112721e4d62c73e4902473e06199fe12cfc6ad0365254fb53fa" Mar 19 09:38:47 crc kubenswrapper[4835]: E0319 09:38:47.799899 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1fe3e13822a0112721e4d62c73e4902473e06199fe12cfc6ad0365254fb53fa\": container with ID starting with b1fe3e13822a0112721e4d62c73e4902473e06199fe12cfc6ad0365254fb53fa not found: ID does not exist" containerID="b1fe3e13822a0112721e4d62c73e4902473e06199fe12cfc6ad0365254fb53fa" Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.799947 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1fe3e13822a0112721e4d62c73e4902473e06199fe12cfc6ad0365254fb53fa"} err="failed to get container status \"b1fe3e13822a0112721e4d62c73e4902473e06199fe12cfc6ad0365254fb53fa\": rpc error: code = NotFound desc = could not find container \"b1fe3e13822a0112721e4d62c73e4902473e06199fe12cfc6ad0365254fb53fa\": container with ID starting with b1fe3e13822a0112721e4d62c73e4902473e06199fe12cfc6ad0365254fb53fa not found: ID does not 
exist" Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.799975 4835 scope.go:117] "RemoveContainer" containerID="cfa08e955ec6673e6842a73dc4592a83d22120a984779cdfbb0da42e2cb81235" Mar 19 09:38:47 crc kubenswrapper[4835]: E0319 09:38:47.800426 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfa08e955ec6673e6842a73dc4592a83d22120a984779cdfbb0da42e2cb81235\": container with ID starting with cfa08e955ec6673e6842a73dc4592a83d22120a984779cdfbb0da42e2cb81235 not found: ID does not exist" containerID="cfa08e955ec6673e6842a73dc4592a83d22120a984779cdfbb0da42e2cb81235" Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.800461 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfa08e955ec6673e6842a73dc4592a83d22120a984779cdfbb0da42e2cb81235"} err="failed to get container status \"cfa08e955ec6673e6842a73dc4592a83d22120a984779cdfbb0da42e2cb81235\": rpc error: code = NotFound desc = could not find container \"cfa08e955ec6673e6842a73dc4592a83d22120a984779cdfbb0da42e2cb81235\": container with ID starting with cfa08e955ec6673e6842a73dc4592a83d22120a984779cdfbb0da42e2cb81235 not found: ID does not exist" Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.800482 4835 scope.go:117] "RemoveContainer" containerID="fb73248a002dd555c71e6bb42adc33f54ebb9b30333391c9898106959c439e92" Mar 19 09:38:47 crc kubenswrapper[4835]: E0319 09:38:47.800828 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb73248a002dd555c71e6bb42adc33f54ebb9b30333391c9898106959c439e92\": container with ID starting with fb73248a002dd555c71e6bb42adc33f54ebb9b30333391c9898106959c439e92 not found: ID does not exist" containerID="fb73248a002dd555c71e6bb42adc33f54ebb9b30333391c9898106959c439e92" Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.800901 4835 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb73248a002dd555c71e6bb42adc33f54ebb9b30333391c9898106959c439e92"} err="failed to get container status \"fb73248a002dd555c71e6bb42adc33f54ebb9b30333391c9898106959c439e92\": rpc error: code = NotFound desc = could not find container \"fb73248a002dd555c71e6bb42adc33f54ebb9b30333391c9898106959c439e92\": container with ID starting with fb73248a002dd555c71e6bb42adc33f54ebb9b30333391c9898106959c439e92 not found: ID does not exist" Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.812080 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-ingester-0" podStartSLOduration=4.170454602 podStartE2EDuration="6.81206126s" podCreationTimestamp="2026-03-19 09:38:41 +0000 UTC" firstStartedPulling="2026-03-19 09:38:43.804605007 +0000 UTC m=+978.653203594" lastFinishedPulling="2026-03-19 09:38:46.446211665 +0000 UTC m=+981.294810252" observedRunningTime="2026-03-19 09:38:47.791109839 +0000 UTC m=+982.639708426" watchObservedRunningTime="2026-03-19 09:38:47.81206126 +0000 UTC m=+982.660659847" Mar 19 09:38:47 crc kubenswrapper[4835]: I0319 09:38:47.812187 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-compactor-0" podStartSLOduration=3.443139043 podStartE2EDuration="5.812181243s" podCreationTimestamp="2026-03-19 09:38:42 +0000 UTC" firstStartedPulling="2026-03-19 09:38:44.079198251 +0000 UTC m=+978.927796838" lastFinishedPulling="2026-03-19 09:38:46.448240451 +0000 UTC m=+981.296839038" observedRunningTime="2026-03-19 09:38:47.810351763 +0000 UTC m=+982.658950350" watchObservedRunningTime="2026-03-19 09:38:47.812181243 +0000 UTC m=+982.660779830" Mar 19 09:38:48 crc kubenswrapper[4835]: I0319 09:38:48.413013 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bbeb7b3-b185-41a6-a8fc-508a48c81dd8" path="/var/lib/kubelet/pods/1bbeb7b3-b185-41a6-a8fc-508a48c81dd8/volumes" Mar 
19 09:38:49 crc kubenswrapper[4835]: I0319 09:38:49.693422 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" event={"ID":"75ce358f-5f03-401f-bdf8-27a7e5309227","Type":"ContainerStarted","Data":"c1c298689befa0b91e97174781417e88f571fb60130552f2ca186ebf2ba549df"} Mar 19 09:38:49 crc kubenswrapper[4835]: I0319 09:38:49.695912 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" event={"ID":"2ff291e5-8364-4627-be0f-51c9532e46ee","Type":"ContainerStarted","Data":"8401c37166c7850a93666282955a2ea10725ca9b6bdd70e7cf55ea5b97021376"} Mar 19 09:38:49 crc kubenswrapper[4835]: I0319 09:38:49.696259 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" Mar 19 09:38:49 crc kubenswrapper[4835]: I0319 09:38:49.716355 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" Mar 19 09:38:49 crc kubenswrapper[4835]: I0319 09:38:49.723422 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" podStartSLOduration=1.551129228 podStartE2EDuration="7.723406231s" podCreationTimestamp="2026-03-19 09:38:42 +0000 UTC" firstStartedPulling="2026-03-19 09:38:42.959890221 +0000 UTC m=+977.808488808" lastFinishedPulling="2026-03-19 09:38:49.132167214 +0000 UTC m=+983.980765811" observedRunningTime="2026-03-19 09:38:49.721661823 +0000 UTC m=+984.570260460" watchObservedRunningTime="2026-03-19 09:38:49.723406231 +0000 UTC m=+984.572004828" Mar 19 09:38:49 crc kubenswrapper[4835]: I0319 09:38:49.766158 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" podStartSLOduration=1.506821609 podStartE2EDuration="7.766126628s" podCreationTimestamp="2026-03-19 09:38:42 
+0000 UTC" firstStartedPulling="2026-03-19 09:38:42.8661691 +0000 UTC m=+977.714767687" lastFinishedPulling="2026-03-19 09:38:49.125474119 +0000 UTC m=+983.974072706" observedRunningTime="2026-03-19 09:38:49.757120147 +0000 UTC m=+984.605718744" watchObservedRunningTime="2026-03-19 09:38:49.766126628 +0000 UTC m=+984.614725255" Mar 19 09:38:50 crc kubenswrapper[4835]: I0319 09:38:50.707413 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" Mar 19 09:38:50 crc kubenswrapper[4835]: I0319 09:38:50.707936 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" Mar 19 09:38:50 crc kubenswrapper[4835]: I0319 09:38:50.707960 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" Mar 19 09:38:50 crc kubenswrapper[4835]: I0319 09:38:50.718679 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" Mar 19 09:38:50 crc kubenswrapper[4835]: I0319 09:38:50.722247 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" Mar 19 09:38:50 crc kubenswrapper[4835]: I0319 09:38:50.726416 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" Mar 19 09:39:02 crc kubenswrapper[4835]: I0319 09:39:02.227084 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-9c6b6d984-rznlv" Mar 19 09:39:02 crc kubenswrapper[4835]: I0319 09:39:02.379669 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-cm5jv" Mar 19 09:39:02 crc kubenswrapper[4835]: I0319 09:39:02.440098 4835 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5qfh9" Mar 19 09:39:03 crc kubenswrapper[4835]: I0319 09:39:03.379412 4835 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Mar 19 09:39:03 crc kubenswrapper[4835]: I0319 09:39:03.379491 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 19 09:39:03 crc kubenswrapper[4835]: I0319 09:39:03.441692 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-compactor-0" Mar 19 09:39:03 crc kubenswrapper[4835]: I0319 09:39:03.508398 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-index-gateway-0" Mar 19 09:39:06 crc kubenswrapper[4835]: I0319 09:39:06.421953 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 09:39:06 crc kubenswrapper[4835]: I0319 09:39:06.422540 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 09:39:13 crc kubenswrapper[4835]: I0319 09:39:13.378091 4835 patch_prober.go:28] interesting 
pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Mar 19 09:39:13 crc kubenswrapper[4835]: I0319 09:39:13.378809 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 19 09:39:21 crc kubenswrapper[4835]: I0319 09:39:21.049000 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-94fb9"] Mar 19 09:39:21 crc kubenswrapper[4835]: E0319 09:39:21.049990 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bbeb7b3-b185-41a6-a8fc-508a48c81dd8" containerName="registry-server" Mar 19 09:39:21 crc kubenswrapper[4835]: I0319 09:39:21.050010 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bbeb7b3-b185-41a6-a8fc-508a48c81dd8" containerName="registry-server" Mar 19 09:39:21 crc kubenswrapper[4835]: E0319 09:39:21.050040 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bbeb7b3-b185-41a6-a8fc-508a48c81dd8" containerName="extract-utilities" Mar 19 09:39:21 crc kubenswrapper[4835]: I0319 09:39:21.050049 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bbeb7b3-b185-41a6-a8fc-508a48c81dd8" containerName="extract-utilities" Mar 19 09:39:21 crc kubenswrapper[4835]: E0319 09:39:21.050067 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bbeb7b3-b185-41a6-a8fc-508a48c81dd8" containerName="extract-content" Mar 19 09:39:21 crc kubenswrapper[4835]: I0319 09:39:21.050075 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bbeb7b3-b185-41a6-a8fc-508a48c81dd8" containerName="extract-content" Mar 19 09:39:21 crc kubenswrapper[4835]: I0319 09:39:21.050220 4835 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="1bbeb7b3-b185-41a6-a8fc-508a48c81dd8" containerName="registry-server" Mar 19 09:39:21 crc kubenswrapper[4835]: I0319 09:39:21.051421 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-94fb9" Mar 19 09:39:21 crc kubenswrapper[4835]: I0319 09:39:21.060838 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-94fb9"] Mar 19 09:39:21 crc kubenswrapper[4835]: I0319 09:39:21.132544 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdltg\" (UniqueName: \"kubernetes.io/projected/da82d898-ab8d-479e-8f76-60ef5cf320df-kube-api-access-rdltg\") pod \"community-operators-94fb9\" (UID: \"da82d898-ab8d-479e-8f76-60ef5cf320df\") " pod="openshift-marketplace/community-operators-94fb9" Mar 19 09:39:21 crc kubenswrapper[4835]: I0319 09:39:21.132719 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da82d898-ab8d-479e-8f76-60ef5cf320df-catalog-content\") pod \"community-operators-94fb9\" (UID: \"da82d898-ab8d-479e-8f76-60ef5cf320df\") " pod="openshift-marketplace/community-operators-94fb9" Mar 19 09:39:21 crc kubenswrapper[4835]: I0319 09:39:21.132922 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da82d898-ab8d-479e-8f76-60ef5cf320df-utilities\") pod \"community-operators-94fb9\" (UID: \"da82d898-ab8d-479e-8f76-60ef5cf320df\") " pod="openshift-marketplace/community-operators-94fb9" Mar 19 09:39:21 crc kubenswrapper[4835]: I0319 09:39:21.239274 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdltg\" (UniqueName: \"kubernetes.io/projected/da82d898-ab8d-479e-8f76-60ef5cf320df-kube-api-access-rdltg\") 
pod \"community-operators-94fb9\" (UID: \"da82d898-ab8d-479e-8f76-60ef5cf320df\") " pod="openshift-marketplace/community-operators-94fb9" Mar 19 09:39:21 crc kubenswrapper[4835]: I0319 09:39:21.239434 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da82d898-ab8d-479e-8f76-60ef5cf320df-catalog-content\") pod \"community-operators-94fb9\" (UID: \"da82d898-ab8d-479e-8f76-60ef5cf320df\") " pod="openshift-marketplace/community-operators-94fb9" Mar 19 09:39:21 crc kubenswrapper[4835]: I0319 09:39:21.239532 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da82d898-ab8d-479e-8f76-60ef5cf320df-utilities\") pod \"community-operators-94fb9\" (UID: \"da82d898-ab8d-479e-8f76-60ef5cf320df\") " pod="openshift-marketplace/community-operators-94fb9" Mar 19 09:39:21 crc kubenswrapper[4835]: I0319 09:39:21.240337 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da82d898-ab8d-479e-8f76-60ef5cf320df-utilities\") pod \"community-operators-94fb9\" (UID: \"da82d898-ab8d-479e-8f76-60ef5cf320df\") " pod="openshift-marketplace/community-operators-94fb9" Mar 19 09:39:21 crc kubenswrapper[4835]: I0319 09:39:21.241316 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da82d898-ab8d-479e-8f76-60ef5cf320df-catalog-content\") pod \"community-operators-94fb9\" (UID: \"da82d898-ab8d-479e-8f76-60ef5cf320df\") " pod="openshift-marketplace/community-operators-94fb9" Mar 19 09:39:21 crc kubenswrapper[4835]: I0319 09:39:21.263017 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdltg\" (UniqueName: \"kubernetes.io/projected/da82d898-ab8d-479e-8f76-60ef5cf320df-kube-api-access-rdltg\") pod \"community-operators-94fb9\" (UID: 
\"da82d898-ab8d-479e-8f76-60ef5cf320df\") " pod="openshift-marketplace/community-operators-94fb9" Mar 19 09:39:21 crc kubenswrapper[4835]: I0319 09:39:21.407042 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-94fb9" Mar 19 09:39:21 crc kubenswrapper[4835]: I0319 09:39:21.871957 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-94fb9"] Mar 19 09:39:21 crc kubenswrapper[4835]: I0319 09:39:21.985024 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94fb9" event={"ID":"da82d898-ab8d-479e-8f76-60ef5cf320df","Type":"ContainerStarted","Data":"d7983beb8328e70f324349d6164e8a680f24068ee3b6e4b4336ecaf256fef33d"} Mar 19 09:39:22 crc kubenswrapper[4835]: I0319 09:39:22.999017 4835 generic.go:334] "Generic (PLEG): container finished" podID="da82d898-ab8d-479e-8f76-60ef5cf320df" containerID="8393afd5a369ad18ac9b9490e43276b8b8c7a0d245b046556213387c8fde5890" exitCode=0 Mar 19 09:39:22 crc kubenswrapper[4835]: I0319 09:39:22.999092 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94fb9" event={"ID":"da82d898-ab8d-479e-8f76-60ef5cf320df","Type":"ContainerDied","Data":"8393afd5a369ad18ac9b9490e43276b8b8c7a0d245b046556213387c8fde5890"} Mar 19 09:39:23 crc kubenswrapper[4835]: I0319 09:39:23.373342 4835 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Mar 19 09:39:23 crc kubenswrapper[4835]: I0319 09:39:23.373406 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with 
statuscode: 503" Mar 19 09:39:24 crc kubenswrapper[4835]: I0319 09:39:24.008548 4835 generic.go:334] "Generic (PLEG): container finished" podID="da82d898-ab8d-479e-8f76-60ef5cf320df" containerID="ea6703af152c4343ef76ce030b9a63db72c8384418fde62271b81a5d85fb57b4" exitCode=0 Mar 19 09:39:24 crc kubenswrapper[4835]: I0319 09:39:24.008607 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94fb9" event={"ID":"da82d898-ab8d-479e-8f76-60ef5cf320df","Type":"ContainerDied","Data":"ea6703af152c4343ef76ce030b9a63db72c8384418fde62271b81a5d85fb57b4"} Mar 19 09:39:25 crc kubenswrapper[4835]: I0319 09:39:25.017821 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94fb9" event={"ID":"da82d898-ab8d-479e-8f76-60ef5cf320df","Type":"ContainerStarted","Data":"d3cebcebf6d09b7c4f4384c19aeb7e0c4713c110cde966f56f69a17f6f312c1f"} Mar 19 09:39:25 crc kubenswrapper[4835]: I0319 09:39:25.045022 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-94fb9" podStartSLOduration=2.5449973630000002 podStartE2EDuration="4.044997023s" podCreationTimestamp="2026-03-19 09:39:21 +0000 UTC" firstStartedPulling="2026-03-19 09:39:23.001577284 +0000 UTC m=+1017.850175871" lastFinishedPulling="2026-03-19 09:39:24.501576924 +0000 UTC m=+1019.350175531" observedRunningTime="2026-03-19 09:39:25.040134267 +0000 UTC m=+1019.888732864" watchObservedRunningTime="2026-03-19 09:39:25.044997023 +0000 UTC m=+1019.893595630" Mar 19 09:39:25 crc kubenswrapper[4835]: I0319 09:39:25.239783 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h4rb7"] Mar 19 09:39:25 crc kubenswrapper[4835]: I0319 09:39:25.243931 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h4rb7" Mar 19 09:39:25 crc kubenswrapper[4835]: I0319 09:39:25.251561 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h4rb7"] Mar 19 09:39:25 crc kubenswrapper[4835]: I0319 09:39:25.306251 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bb44ba3-be2c-4a74-aefd-53493fd0b3b6-utilities\") pod \"certified-operators-h4rb7\" (UID: \"3bb44ba3-be2c-4a74-aefd-53493fd0b3b6\") " pod="openshift-marketplace/certified-operators-h4rb7" Mar 19 09:39:25 crc kubenswrapper[4835]: I0319 09:39:25.306338 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4vfv\" (UniqueName: \"kubernetes.io/projected/3bb44ba3-be2c-4a74-aefd-53493fd0b3b6-kube-api-access-m4vfv\") pod \"certified-operators-h4rb7\" (UID: \"3bb44ba3-be2c-4a74-aefd-53493fd0b3b6\") " pod="openshift-marketplace/certified-operators-h4rb7" Mar 19 09:39:25 crc kubenswrapper[4835]: I0319 09:39:25.306389 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bb44ba3-be2c-4a74-aefd-53493fd0b3b6-catalog-content\") pod \"certified-operators-h4rb7\" (UID: \"3bb44ba3-be2c-4a74-aefd-53493fd0b3b6\") " pod="openshift-marketplace/certified-operators-h4rb7" Mar 19 09:39:25 crc kubenswrapper[4835]: I0319 09:39:25.408191 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4vfv\" (UniqueName: \"kubernetes.io/projected/3bb44ba3-be2c-4a74-aefd-53493fd0b3b6-kube-api-access-m4vfv\") pod \"certified-operators-h4rb7\" (UID: \"3bb44ba3-be2c-4a74-aefd-53493fd0b3b6\") " pod="openshift-marketplace/certified-operators-h4rb7" Mar 19 09:39:25 crc kubenswrapper[4835]: I0319 09:39:25.408262 4835 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bb44ba3-be2c-4a74-aefd-53493fd0b3b6-catalog-content\") pod \"certified-operators-h4rb7\" (UID: \"3bb44ba3-be2c-4a74-aefd-53493fd0b3b6\") " pod="openshift-marketplace/certified-operators-h4rb7" Mar 19 09:39:25 crc kubenswrapper[4835]: I0319 09:39:25.408336 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bb44ba3-be2c-4a74-aefd-53493fd0b3b6-utilities\") pod \"certified-operators-h4rb7\" (UID: \"3bb44ba3-be2c-4a74-aefd-53493fd0b3b6\") " pod="openshift-marketplace/certified-operators-h4rb7" Mar 19 09:39:25 crc kubenswrapper[4835]: I0319 09:39:25.408796 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bb44ba3-be2c-4a74-aefd-53493fd0b3b6-utilities\") pod \"certified-operators-h4rb7\" (UID: \"3bb44ba3-be2c-4a74-aefd-53493fd0b3b6\") " pod="openshift-marketplace/certified-operators-h4rb7" Mar 19 09:39:25 crc kubenswrapper[4835]: I0319 09:39:25.409181 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bb44ba3-be2c-4a74-aefd-53493fd0b3b6-catalog-content\") pod \"certified-operators-h4rb7\" (UID: \"3bb44ba3-be2c-4a74-aefd-53493fd0b3b6\") " pod="openshift-marketplace/certified-operators-h4rb7" Mar 19 09:39:25 crc kubenswrapper[4835]: I0319 09:39:25.435386 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4vfv\" (UniqueName: \"kubernetes.io/projected/3bb44ba3-be2c-4a74-aefd-53493fd0b3b6-kube-api-access-m4vfv\") pod \"certified-operators-h4rb7\" (UID: \"3bb44ba3-be2c-4a74-aefd-53493fd0b3b6\") " pod="openshift-marketplace/certified-operators-h4rb7" Mar 19 09:39:25 crc kubenswrapper[4835]: I0319 09:39:25.569654 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h4rb7" Mar 19 09:39:26 crc kubenswrapper[4835]: I0319 09:39:26.082506 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h4rb7"] Mar 19 09:39:26 crc kubenswrapper[4835]: W0319 09:39:26.088193 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bb44ba3_be2c_4a74_aefd_53493fd0b3b6.slice/crio-e23c8a6461faaca7bd736ed1499eee6ba779cee66a27cc1f986d84e7137b6cf9 WatchSource:0}: Error finding container e23c8a6461faaca7bd736ed1499eee6ba779cee66a27cc1f986d84e7137b6cf9: Status 404 returned error can't find the container with id e23c8a6461faaca7bd736ed1499eee6ba779cee66a27cc1f986d84e7137b6cf9 Mar 19 09:39:27 crc kubenswrapper[4835]: I0319 09:39:27.032074 4835 generic.go:334] "Generic (PLEG): container finished" podID="3bb44ba3-be2c-4a74-aefd-53493fd0b3b6" containerID="2150c3b09a4680aef24dc52f77353795d956c8c5c7103ad6afa2b44aaf2d93f8" exitCode=0 Mar 19 09:39:27 crc kubenswrapper[4835]: I0319 09:39:27.032177 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h4rb7" event={"ID":"3bb44ba3-be2c-4a74-aefd-53493fd0b3b6","Type":"ContainerDied","Data":"2150c3b09a4680aef24dc52f77353795d956c8c5c7103ad6afa2b44aaf2d93f8"} Mar 19 09:39:27 crc kubenswrapper[4835]: I0319 09:39:27.032345 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h4rb7" event={"ID":"3bb44ba3-be2c-4a74-aefd-53493fd0b3b6","Type":"ContainerStarted","Data":"e23c8a6461faaca7bd736ed1499eee6ba779cee66a27cc1f986d84e7137b6cf9"} Mar 19 09:39:29 crc kubenswrapper[4835]: I0319 09:39:29.053827 4835 generic.go:334] "Generic (PLEG): container finished" podID="3bb44ba3-be2c-4a74-aefd-53493fd0b3b6" containerID="9d1f588bc518bbcfee6323cdb9c787a926c29d5fd0ca784600e6e6ed027accc1" exitCode=0 Mar 19 09:39:29 crc kubenswrapper[4835]: I0319 
09:39:29.054073 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h4rb7" event={"ID":"3bb44ba3-be2c-4a74-aefd-53493fd0b3b6","Type":"ContainerDied","Data":"9d1f588bc518bbcfee6323cdb9c787a926c29d5fd0ca784600e6e6ed027accc1"} Mar 19 09:39:30 crc kubenswrapper[4835]: I0319 09:39:30.064710 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h4rb7" event={"ID":"3bb44ba3-be2c-4a74-aefd-53493fd0b3b6","Type":"ContainerStarted","Data":"6605ddab0c5b9cc278f2aecfffb3dc8c9e9932e46132573dfa0491638d7182c1"} Mar 19 09:39:30 crc kubenswrapper[4835]: I0319 09:39:30.082045 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-h4rb7" podStartSLOduration=2.633061444 podStartE2EDuration="5.082025384s" podCreationTimestamp="2026-03-19 09:39:25 +0000 UTC" firstStartedPulling="2026-03-19 09:39:27.034024401 +0000 UTC m=+1021.882622998" lastFinishedPulling="2026-03-19 09:39:29.482988311 +0000 UTC m=+1024.331586938" observedRunningTime="2026-03-19 09:39:30.079946766 +0000 UTC m=+1024.928545413" watchObservedRunningTime="2026-03-19 09:39:30.082025384 +0000 UTC m=+1024.930623981" Mar 19 09:39:31 crc kubenswrapper[4835]: I0319 09:39:31.407538 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-94fb9" Mar 19 09:39:31 crc kubenswrapper[4835]: I0319 09:39:31.407952 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-94fb9" Mar 19 09:39:31 crc kubenswrapper[4835]: I0319 09:39:31.454382 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-94fb9" Mar 19 09:39:32 crc kubenswrapper[4835]: I0319 09:39:32.145703 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-94fb9" Mar 
19 09:39:32 crc kubenswrapper[4835]: I0319 09:39:32.623297 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-94fb9"] Mar 19 09:39:33 crc kubenswrapper[4835]: I0319 09:39:33.376866 4835 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Mar 19 09:39:33 crc kubenswrapper[4835]: I0319 09:39:33.377423 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 19 09:39:34 crc kubenswrapper[4835]: I0319 09:39:34.106770 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-94fb9" podUID="da82d898-ab8d-479e-8f76-60ef5cf320df" containerName="registry-server" containerID="cri-o://d3cebcebf6d09b7c4f4384c19aeb7e0c4713c110cde966f56f69a17f6f312c1f" gracePeriod=2 Mar 19 09:39:35 crc kubenswrapper[4835]: I0319 09:39:35.116989 4835 generic.go:334] "Generic (PLEG): container finished" podID="da82d898-ab8d-479e-8f76-60ef5cf320df" containerID="d3cebcebf6d09b7c4f4384c19aeb7e0c4713c110cde966f56f69a17f6f312c1f" exitCode=0 Mar 19 09:39:35 crc kubenswrapper[4835]: I0319 09:39:35.117072 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94fb9" event={"ID":"da82d898-ab8d-479e-8f76-60ef5cf320df","Type":"ContainerDied","Data":"d3cebcebf6d09b7c4f4384c19aeb7e0c4713c110cde966f56f69a17f6f312c1f"} Mar 19 09:39:35 crc kubenswrapper[4835]: I0319 09:39:35.570278 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-h4rb7" Mar 19 09:39:35 crc kubenswrapper[4835]: I0319 
09:39:35.571492 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-h4rb7" Mar 19 09:39:35 crc kubenswrapper[4835]: I0319 09:39:35.625480 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h4rb7" Mar 19 09:39:35 crc kubenswrapper[4835]: I0319 09:39:35.840517 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-94fb9" Mar 19 09:39:35 crc kubenswrapper[4835]: I0319 09:39:35.894559 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da82d898-ab8d-479e-8f76-60ef5cf320df-catalog-content\") pod \"da82d898-ab8d-479e-8f76-60ef5cf320df\" (UID: \"da82d898-ab8d-479e-8f76-60ef5cf320df\") " Mar 19 09:39:35 crc kubenswrapper[4835]: I0319 09:39:35.894655 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdltg\" (UniqueName: \"kubernetes.io/projected/da82d898-ab8d-479e-8f76-60ef5cf320df-kube-api-access-rdltg\") pod \"da82d898-ab8d-479e-8f76-60ef5cf320df\" (UID: \"da82d898-ab8d-479e-8f76-60ef5cf320df\") " Mar 19 09:39:35 crc kubenswrapper[4835]: I0319 09:39:35.894731 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da82d898-ab8d-479e-8f76-60ef5cf320df-utilities\") pod \"da82d898-ab8d-479e-8f76-60ef5cf320df\" (UID: \"da82d898-ab8d-479e-8f76-60ef5cf320df\") " Mar 19 09:39:35 crc kubenswrapper[4835]: I0319 09:39:35.896955 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da82d898-ab8d-479e-8f76-60ef5cf320df-utilities" (OuterVolumeSpecName: "utilities") pod "da82d898-ab8d-479e-8f76-60ef5cf320df" (UID: "da82d898-ab8d-479e-8f76-60ef5cf320df"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:39:35 crc kubenswrapper[4835]: I0319 09:39:35.901848 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da82d898-ab8d-479e-8f76-60ef5cf320df-kube-api-access-rdltg" (OuterVolumeSpecName: "kube-api-access-rdltg") pod "da82d898-ab8d-479e-8f76-60ef5cf320df" (UID: "da82d898-ab8d-479e-8f76-60ef5cf320df"). InnerVolumeSpecName "kube-api-access-rdltg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:39:35 crc kubenswrapper[4835]: I0319 09:39:35.948864 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da82d898-ab8d-479e-8f76-60ef5cf320df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da82d898-ab8d-479e-8f76-60ef5cf320df" (UID: "da82d898-ab8d-479e-8f76-60ef5cf320df"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:39:35 crc kubenswrapper[4835]: I0319 09:39:35.996944 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da82d898-ab8d-479e-8f76-60ef5cf320df-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 09:39:35 crc kubenswrapper[4835]: I0319 09:39:35.996982 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdltg\" (UniqueName: \"kubernetes.io/projected/da82d898-ab8d-479e-8f76-60ef5cf320df-kube-api-access-rdltg\") on node \"crc\" DevicePath \"\"" Mar 19 09:39:35 crc kubenswrapper[4835]: I0319 09:39:35.996994 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da82d898-ab8d-479e-8f76-60ef5cf320df-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 09:39:36 crc kubenswrapper[4835]: I0319 09:39:36.124923 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-94fb9" 
event={"ID":"da82d898-ab8d-479e-8f76-60ef5cf320df","Type":"ContainerDied","Data":"d7983beb8328e70f324349d6164e8a680f24068ee3b6e4b4336ecaf256fef33d"} Mar 19 09:39:36 crc kubenswrapper[4835]: I0319 09:39:36.124999 4835 scope.go:117] "RemoveContainer" containerID="d3cebcebf6d09b7c4f4384c19aeb7e0c4713c110cde966f56f69a17f6f312c1f" Mar 19 09:39:36 crc kubenswrapper[4835]: I0319 09:39:36.124945 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-94fb9" Mar 19 09:39:36 crc kubenswrapper[4835]: I0319 09:39:36.139427 4835 scope.go:117] "RemoveContainer" containerID="ea6703af152c4343ef76ce030b9a63db72c8384418fde62271b81a5d85fb57b4" Mar 19 09:39:36 crc kubenswrapper[4835]: I0319 09:39:36.161886 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-94fb9"] Mar 19 09:39:36 crc kubenswrapper[4835]: I0319 09:39:36.161970 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-h4rb7" Mar 19 09:39:36 crc kubenswrapper[4835]: I0319 09:39:36.173538 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-94fb9"] Mar 19 09:39:36 crc kubenswrapper[4835]: I0319 09:39:36.174903 4835 scope.go:117] "RemoveContainer" containerID="8393afd5a369ad18ac9b9490e43276b8b8c7a0d245b046556213387c8fde5890" Mar 19 09:39:36 crc kubenswrapper[4835]: I0319 09:39:36.415000 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da82d898-ab8d-479e-8f76-60ef5cf320df" path="/var/lib/kubelet/pods/da82d898-ab8d-479e-8f76-60ef5cf320df/volumes" Mar 19 09:39:36 crc kubenswrapper[4835]: I0319 09:39:36.421987 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 19 09:39:36 crc kubenswrapper[4835]: I0319 09:39:36.422055 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 09:39:38 crc kubenswrapper[4835]: I0319 09:39:38.427822 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h4rb7"] Mar 19 09:39:39 crc kubenswrapper[4835]: I0319 09:39:39.148001 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h4rb7" podUID="3bb44ba3-be2c-4a74-aefd-53493fd0b3b6" containerName="registry-server" containerID="cri-o://6605ddab0c5b9cc278f2aecfffb3dc8c9e9932e46132573dfa0491638d7182c1" gracePeriod=2 Mar 19 09:39:39 crc kubenswrapper[4835]: I0319 09:39:39.555603 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h4rb7" Mar 19 09:39:39 crc kubenswrapper[4835]: I0319 09:39:39.651970 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bb44ba3-be2c-4a74-aefd-53493fd0b3b6-utilities\") pod \"3bb44ba3-be2c-4a74-aefd-53493fd0b3b6\" (UID: \"3bb44ba3-be2c-4a74-aefd-53493fd0b3b6\") " Mar 19 09:39:39 crc kubenswrapper[4835]: I0319 09:39:39.652037 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4vfv\" (UniqueName: \"kubernetes.io/projected/3bb44ba3-be2c-4a74-aefd-53493fd0b3b6-kube-api-access-m4vfv\") pod \"3bb44ba3-be2c-4a74-aefd-53493fd0b3b6\" (UID: \"3bb44ba3-be2c-4a74-aefd-53493fd0b3b6\") " Mar 19 09:39:39 crc kubenswrapper[4835]: I0319 09:39:39.652115 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bb44ba3-be2c-4a74-aefd-53493fd0b3b6-catalog-content\") pod \"3bb44ba3-be2c-4a74-aefd-53493fd0b3b6\" (UID: \"3bb44ba3-be2c-4a74-aefd-53493fd0b3b6\") " Mar 19 09:39:39 crc kubenswrapper[4835]: I0319 09:39:39.653085 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bb44ba3-be2c-4a74-aefd-53493fd0b3b6-utilities" (OuterVolumeSpecName: "utilities") pod "3bb44ba3-be2c-4a74-aefd-53493fd0b3b6" (UID: "3bb44ba3-be2c-4a74-aefd-53493fd0b3b6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:39:39 crc kubenswrapper[4835]: I0319 09:39:39.657016 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bb44ba3-be2c-4a74-aefd-53493fd0b3b6-kube-api-access-m4vfv" (OuterVolumeSpecName: "kube-api-access-m4vfv") pod "3bb44ba3-be2c-4a74-aefd-53493fd0b3b6" (UID: "3bb44ba3-be2c-4a74-aefd-53493fd0b3b6"). InnerVolumeSpecName "kube-api-access-m4vfv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:39:39 crc kubenswrapper[4835]: I0319 09:39:39.706697 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bb44ba3-be2c-4a74-aefd-53493fd0b3b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3bb44ba3-be2c-4a74-aefd-53493fd0b3b6" (UID: "3bb44ba3-be2c-4a74-aefd-53493fd0b3b6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:39:39 crc kubenswrapper[4835]: I0319 09:39:39.752988 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bb44ba3-be2c-4a74-aefd-53493fd0b3b6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 09:39:39 crc kubenswrapper[4835]: I0319 09:39:39.753024 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bb44ba3-be2c-4a74-aefd-53493fd0b3b6-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 09:39:39 crc kubenswrapper[4835]: I0319 09:39:39.753072 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4vfv\" (UniqueName: \"kubernetes.io/projected/3bb44ba3-be2c-4a74-aefd-53493fd0b3b6-kube-api-access-m4vfv\") on node \"crc\" DevicePath \"\"" Mar 19 09:39:40 crc kubenswrapper[4835]: I0319 09:39:40.163544 4835 generic.go:334] "Generic (PLEG): container finished" podID="3bb44ba3-be2c-4a74-aefd-53493fd0b3b6" containerID="6605ddab0c5b9cc278f2aecfffb3dc8c9e9932e46132573dfa0491638d7182c1" exitCode=0 Mar 19 09:39:40 crc kubenswrapper[4835]: I0319 09:39:40.163805 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h4rb7" event={"ID":"3bb44ba3-be2c-4a74-aefd-53493fd0b3b6","Type":"ContainerDied","Data":"6605ddab0c5b9cc278f2aecfffb3dc8c9e9932e46132573dfa0491638d7182c1"} Mar 19 09:39:40 crc kubenswrapper[4835]: I0319 09:39:40.164191 4835 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-h4rb7" event={"ID":"3bb44ba3-be2c-4a74-aefd-53493fd0b3b6","Type":"ContainerDied","Data":"e23c8a6461faaca7bd736ed1499eee6ba779cee66a27cc1f986d84e7137b6cf9"} Mar 19 09:39:40 crc kubenswrapper[4835]: I0319 09:39:40.163905 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h4rb7" Mar 19 09:39:40 crc kubenswrapper[4835]: I0319 09:39:40.164240 4835 scope.go:117] "RemoveContainer" containerID="6605ddab0c5b9cc278f2aecfffb3dc8c9e9932e46132573dfa0491638d7182c1" Mar 19 09:39:40 crc kubenswrapper[4835]: I0319 09:39:40.216947 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h4rb7"] Mar 19 09:39:40 crc kubenswrapper[4835]: I0319 09:39:40.220238 4835 scope.go:117] "RemoveContainer" containerID="9d1f588bc518bbcfee6323cdb9c787a926c29d5fd0ca784600e6e6ed027accc1" Mar 19 09:39:40 crc kubenswrapper[4835]: I0319 09:39:40.224468 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-h4rb7"] Mar 19 09:39:40 crc kubenswrapper[4835]: I0319 09:39:40.248427 4835 scope.go:117] "RemoveContainer" containerID="2150c3b09a4680aef24dc52f77353795d956c8c5c7103ad6afa2b44aaf2d93f8" Mar 19 09:39:40 crc kubenswrapper[4835]: I0319 09:39:40.285603 4835 scope.go:117] "RemoveContainer" containerID="6605ddab0c5b9cc278f2aecfffb3dc8c9e9932e46132573dfa0491638d7182c1" Mar 19 09:39:40 crc kubenswrapper[4835]: E0319 09:39:40.286224 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6605ddab0c5b9cc278f2aecfffb3dc8c9e9932e46132573dfa0491638d7182c1\": container with ID starting with 6605ddab0c5b9cc278f2aecfffb3dc8c9e9932e46132573dfa0491638d7182c1 not found: ID does not exist" containerID="6605ddab0c5b9cc278f2aecfffb3dc8c9e9932e46132573dfa0491638d7182c1" Mar 19 09:39:40 crc kubenswrapper[4835]: I0319 
09:39:40.286255 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6605ddab0c5b9cc278f2aecfffb3dc8c9e9932e46132573dfa0491638d7182c1"} err="failed to get container status \"6605ddab0c5b9cc278f2aecfffb3dc8c9e9932e46132573dfa0491638d7182c1\": rpc error: code = NotFound desc = could not find container \"6605ddab0c5b9cc278f2aecfffb3dc8c9e9932e46132573dfa0491638d7182c1\": container with ID starting with 6605ddab0c5b9cc278f2aecfffb3dc8c9e9932e46132573dfa0491638d7182c1 not found: ID does not exist" Mar 19 09:39:40 crc kubenswrapper[4835]: I0319 09:39:40.286279 4835 scope.go:117] "RemoveContainer" containerID="9d1f588bc518bbcfee6323cdb9c787a926c29d5fd0ca784600e6e6ed027accc1" Mar 19 09:39:40 crc kubenswrapper[4835]: E0319 09:39:40.286571 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d1f588bc518bbcfee6323cdb9c787a926c29d5fd0ca784600e6e6ed027accc1\": container with ID starting with 9d1f588bc518bbcfee6323cdb9c787a926c29d5fd0ca784600e6e6ed027accc1 not found: ID does not exist" containerID="9d1f588bc518bbcfee6323cdb9c787a926c29d5fd0ca784600e6e6ed027accc1" Mar 19 09:39:40 crc kubenswrapper[4835]: I0319 09:39:40.286600 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d1f588bc518bbcfee6323cdb9c787a926c29d5fd0ca784600e6e6ed027accc1"} err="failed to get container status \"9d1f588bc518bbcfee6323cdb9c787a926c29d5fd0ca784600e6e6ed027accc1\": rpc error: code = NotFound desc = could not find container \"9d1f588bc518bbcfee6323cdb9c787a926c29d5fd0ca784600e6e6ed027accc1\": container with ID starting with 9d1f588bc518bbcfee6323cdb9c787a926c29d5fd0ca784600e6e6ed027accc1 not found: ID does not exist" Mar 19 09:39:40 crc kubenswrapper[4835]: I0319 09:39:40.286617 4835 scope.go:117] "RemoveContainer" containerID="2150c3b09a4680aef24dc52f77353795d956c8c5c7103ad6afa2b44aaf2d93f8" Mar 19 09:39:40 crc 
kubenswrapper[4835]: E0319 09:39:40.286949 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2150c3b09a4680aef24dc52f77353795d956c8c5c7103ad6afa2b44aaf2d93f8\": container with ID starting with 2150c3b09a4680aef24dc52f77353795d956c8c5c7103ad6afa2b44aaf2d93f8 not found: ID does not exist" containerID="2150c3b09a4680aef24dc52f77353795d956c8c5c7103ad6afa2b44aaf2d93f8" Mar 19 09:39:40 crc kubenswrapper[4835]: I0319 09:39:40.286975 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2150c3b09a4680aef24dc52f77353795d956c8c5c7103ad6afa2b44aaf2d93f8"} err="failed to get container status \"2150c3b09a4680aef24dc52f77353795d956c8c5c7103ad6afa2b44aaf2d93f8\": rpc error: code = NotFound desc = could not find container \"2150c3b09a4680aef24dc52f77353795d956c8c5c7103ad6afa2b44aaf2d93f8\": container with ID starting with 2150c3b09a4680aef24dc52f77353795d956c8c5c7103ad6afa2b44aaf2d93f8 not found: ID does not exist" Mar 19 09:39:40 crc kubenswrapper[4835]: I0319 09:39:40.425285 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bb44ba3-be2c-4a74-aefd-53493fd0b3b6" path="/var/lib/kubelet/pods/3bb44ba3-be2c-4a74-aefd-53493fd0b3b6/volumes" Mar 19 09:39:43 crc kubenswrapper[4835]: I0319 09:39:43.378042 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-ingester-0" Mar 19 09:40:00 crc kubenswrapper[4835]: I0319 09:40:00.149431 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565220-dvxr7"] Mar 19 09:40:00 crc kubenswrapper[4835]: E0319 09:40:00.150411 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bb44ba3-be2c-4a74-aefd-53493fd0b3b6" containerName="extract-utilities" Mar 19 09:40:00 crc kubenswrapper[4835]: I0319 09:40:00.150427 4835 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3bb44ba3-be2c-4a74-aefd-53493fd0b3b6" containerName="extract-utilities" Mar 19 09:40:00 crc kubenswrapper[4835]: E0319 09:40:00.150444 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da82d898-ab8d-479e-8f76-60ef5cf320df" containerName="extract-utilities" Mar 19 09:40:00 crc kubenswrapper[4835]: I0319 09:40:00.150453 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="da82d898-ab8d-479e-8f76-60ef5cf320df" containerName="extract-utilities" Mar 19 09:40:00 crc kubenswrapper[4835]: E0319 09:40:00.150478 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bb44ba3-be2c-4a74-aefd-53493fd0b3b6" containerName="registry-server" Mar 19 09:40:00 crc kubenswrapper[4835]: I0319 09:40:00.150487 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bb44ba3-be2c-4a74-aefd-53493fd0b3b6" containerName="registry-server" Mar 19 09:40:00 crc kubenswrapper[4835]: E0319 09:40:00.150499 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da82d898-ab8d-479e-8f76-60ef5cf320df" containerName="registry-server" Mar 19 09:40:00 crc kubenswrapper[4835]: I0319 09:40:00.150507 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="da82d898-ab8d-479e-8f76-60ef5cf320df" containerName="registry-server" Mar 19 09:40:00 crc kubenswrapper[4835]: E0319 09:40:00.150523 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bb44ba3-be2c-4a74-aefd-53493fd0b3b6" containerName="extract-content" Mar 19 09:40:00 crc kubenswrapper[4835]: I0319 09:40:00.150531 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bb44ba3-be2c-4a74-aefd-53493fd0b3b6" containerName="extract-content" Mar 19 09:40:00 crc kubenswrapper[4835]: E0319 09:40:00.150558 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da82d898-ab8d-479e-8f76-60ef5cf320df" containerName="extract-content" Mar 19 09:40:00 crc kubenswrapper[4835]: I0319 09:40:00.150568 4835 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="da82d898-ab8d-479e-8f76-60ef5cf320df" containerName="extract-content" Mar 19 09:40:00 crc kubenswrapper[4835]: I0319 09:40:00.150721 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="da82d898-ab8d-479e-8f76-60ef5cf320df" containerName="registry-server" Mar 19 09:40:00 crc kubenswrapper[4835]: I0319 09:40:00.150737 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bb44ba3-be2c-4a74-aefd-53493fd0b3b6" containerName="registry-server" Mar 19 09:40:00 crc kubenswrapper[4835]: I0319 09:40:00.151433 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565220-dvxr7" Mar 19 09:40:00 crc kubenswrapper[4835]: I0319 09:40:00.153868 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 09:40:00 crc kubenswrapper[4835]: I0319 09:40:00.154926 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 09:40:00 crc kubenswrapper[4835]: I0319 09:40:00.155549 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 09:40:00 crc kubenswrapper[4835]: I0319 09:40:00.167883 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565220-dvxr7"] Mar 19 09:40:00 crc kubenswrapper[4835]: I0319 09:40:00.209005 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h59mf\" (UniqueName: \"kubernetes.io/projected/064a6812-d853-4ec4-9c7e-9de71456672b-kube-api-access-h59mf\") pod \"auto-csr-approver-29565220-dvxr7\" (UID: \"064a6812-d853-4ec4-9c7e-9de71456672b\") " pod="openshift-infra/auto-csr-approver-29565220-dvxr7" Mar 19 09:40:00 crc kubenswrapper[4835]: I0319 09:40:00.309960 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h59mf\" 
(UniqueName: \"kubernetes.io/projected/064a6812-d853-4ec4-9c7e-9de71456672b-kube-api-access-h59mf\") pod \"auto-csr-approver-29565220-dvxr7\" (UID: \"064a6812-d853-4ec4-9c7e-9de71456672b\") " pod="openshift-infra/auto-csr-approver-29565220-dvxr7" Mar 19 09:40:00 crc kubenswrapper[4835]: I0319 09:40:00.342913 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h59mf\" (UniqueName: \"kubernetes.io/projected/064a6812-d853-4ec4-9c7e-9de71456672b-kube-api-access-h59mf\") pod \"auto-csr-approver-29565220-dvxr7\" (UID: \"064a6812-d853-4ec4-9c7e-9de71456672b\") " pod="openshift-infra/auto-csr-approver-29565220-dvxr7" Mar 19 09:40:00 crc kubenswrapper[4835]: I0319 09:40:00.468135 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565220-dvxr7" Mar 19 09:40:00 crc kubenswrapper[4835]: I0319 09:40:00.883888 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565220-dvxr7"] Mar 19 09:40:00 crc kubenswrapper[4835]: W0319 09:40:00.897891 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod064a6812_d853_4ec4_9c7e_9de71456672b.slice/crio-7725940a8bafe25392d0aa31f13293b645c734cd86767dbbaf65fc5501aec69c WatchSource:0}: Error finding container 7725940a8bafe25392d0aa31f13293b645c734cd86767dbbaf65fc5501aec69c: Status 404 returned error can't find the container with id 7725940a8bafe25392d0aa31f13293b645c734cd86767dbbaf65fc5501aec69c Mar 19 09:40:01 crc kubenswrapper[4835]: I0319 09:40:01.357413 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565220-dvxr7" event={"ID":"064a6812-d853-4ec4-9c7e-9de71456672b","Type":"ContainerStarted","Data":"7725940a8bafe25392d0aa31f13293b645c734cd86767dbbaf65fc5501aec69c"} Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.365619 4835 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-infra/auto-csr-approver-29565220-dvxr7" event={"ID":"064a6812-d853-4ec4-9c7e-9de71456672b","Type":"ContainerStarted","Data":"a5740e3c8b5d0c8823d092baafda22605046c988cb61715f4229d4a9f01f5a9b"} Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.384501 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565220-dvxr7" podStartSLOduration=1.322538987 podStartE2EDuration="2.384477323s" podCreationTimestamp="2026-03-19 09:40:00 +0000 UTC" firstStartedPulling="2026-03-19 09:40:00.901566138 +0000 UTC m=+1055.750164745" lastFinishedPulling="2026-03-19 09:40:01.963504454 +0000 UTC m=+1056.812103081" observedRunningTime="2026-03-19 09:40:02.380359799 +0000 UTC m=+1057.228958386" watchObservedRunningTime="2026-03-19 09:40:02.384477323 +0000 UTC m=+1057.233075920" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.479056 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-rxpth"] Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.480653 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-rxpth" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.487302 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.487550 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.487666 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.487851 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-nlzv6" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.488021 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.495934 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.499607 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-rxpth"] Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.564182 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/ce374e56-b9ca-4d92-89e5-31ade726d265-datadir\") pod \"collector-rxpth\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " pod="openshift-logging/collector-rxpth" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.564396 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/ce374e56-b9ca-4d92-89e5-31ade726d265-collector-syslog-receiver\") pod \"collector-rxpth\" (UID: 
\"ce374e56-b9ca-4d92-89e5-31ade726d265\") " pod="openshift-logging/collector-rxpth" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.564563 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/ce374e56-b9ca-4d92-89e5-31ade726d265-sa-token\") pod \"collector-rxpth\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " pod="openshift-logging/collector-rxpth" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.564631 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/ce374e56-b9ca-4d92-89e5-31ade726d265-metrics\") pod \"collector-rxpth\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " pod="openshift-logging/collector-rxpth" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.564684 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/ce374e56-b9ca-4d92-89e5-31ade726d265-entrypoint\") pod \"collector-rxpth\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " pod="openshift-logging/collector-rxpth" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.564792 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce374e56-b9ca-4d92-89e5-31ade726d265-trusted-ca\") pod \"collector-rxpth\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " pod="openshift-logging/collector-rxpth" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.564826 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/ce374e56-b9ca-4d92-89e5-31ade726d265-config-openshift-service-cacrt\") pod \"collector-rxpth\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " 
pod="openshift-logging/collector-rxpth" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.564880 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce374e56-b9ca-4d92-89e5-31ade726d265-config\") pod \"collector-rxpth\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " pod="openshift-logging/collector-rxpth" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.564946 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ce374e56-b9ca-4d92-89e5-31ade726d265-tmp\") pod \"collector-rxpth\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " pod="openshift-logging/collector-rxpth" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.565005 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9ddj\" (UniqueName: \"kubernetes.io/projected/ce374e56-b9ca-4d92-89e5-31ade726d265-kube-api-access-t9ddj\") pod \"collector-rxpth\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " pod="openshift-logging/collector-rxpth" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.565075 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/ce374e56-b9ca-4d92-89e5-31ade726d265-collector-token\") pod \"collector-rxpth\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " pod="openshift-logging/collector-rxpth" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.644280 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-rxpth"] Mar 19 09:40:02 crc kubenswrapper[4835]: E0319 09:40:02.644797 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[collector-syslog-receiver collector-token config config-openshift-service-cacrt datadir entrypoint kube-api-access-t9ddj 
metrics sa-token tmp trusted-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-logging/collector-rxpth" podUID="ce374e56-b9ca-4d92-89e5-31ade726d265" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.666367 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/ce374e56-b9ca-4d92-89e5-31ade726d265-collector-token\") pod \"collector-rxpth\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " pod="openshift-logging/collector-rxpth" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.666498 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/ce374e56-b9ca-4d92-89e5-31ade726d265-datadir\") pod \"collector-rxpth\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " pod="openshift-logging/collector-rxpth" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.666526 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/ce374e56-b9ca-4d92-89e5-31ade726d265-collector-syslog-receiver\") pod \"collector-rxpth\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " pod="openshift-logging/collector-rxpth" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.666601 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/ce374e56-b9ca-4d92-89e5-31ade726d265-datadir\") pod \"collector-rxpth\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " pod="openshift-logging/collector-rxpth" Mar 19 09:40:02 crc kubenswrapper[4835]: E0319 09:40:02.666705 4835 secret.go:188] Couldn't get secret openshift-logging/collector-syslog-receiver: secret "collector-syslog-receiver" not found Mar 19 09:40:02 crc kubenswrapper[4835]: E0319 09:40:02.666805 4835 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/ce374e56-b9ca-4d92-89e5-31ade726d265-collector-syslog-receiver podName:ce374e56-b9ca-4d92-89e5-31ade726d265 nodeName:}" failed. No retries permitted until 2026-03-19 09:40:03.166789322 +0000 UTC m=+1058.015387909 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "collector-syslog-receiver" (UniqueName: "kubernetes.io/secret/ce374e56-b9ca-4d92-89e5-31ade726d265-collector-syslog-receiver") pod "collector-rxpth" (UID: "ce374e56-b9ca-4d92-89e5-31ade726d265") : secret "collector-syslog-receiver" not found Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.667976 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/ce374e56-b9ca-4d92-89e5-31ade726d265-sa-token\") pod \"collector-rxpth\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " pod="openshift-logging/collector-rxpth" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.668093 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/ce374e56-b9ca-4d92-89e5-31ade726d265-metrics\") pod \"collector-rxpth\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " pod="openshift-logging/collector-rxpth" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.668128 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/ce374e56-b9ca-4d92-89e5-31ade726d265-entrypoint\") pod \"collector-rxpth\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " pod="openshift-logging/collector-rxpth" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.668224 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce374e56-b9ca-4d92-89e5-31ade726d265-trusted-ca\") pod \"collector-rxpth\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " 
pod="openshift-logging/collector-rxpth" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.668254 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/ce374e56-b9ca-4d92-89e5-31ade726d265-config-openshift-service-cacrt\") pod \"collector-rxpth\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " pod="openshift-logging/collector-rxpth" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.668317 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce374e56-b9ca-4d92-89e5-31ade726d265-config\") pod \"collector-rxpth\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " pod="openshift-logging/collector-rxpth" Mar 19 09:40:02 crc kubenswrapper[4835]: E0319 09:40:02.668343 4835 secret.go:188] Couldn't get secret openshift-logging/collector-metrics: secret "collector-metrics" not found Mar 19 09:40:02 crc kubenswrapper[4835]: E0319 09:40:02.668373 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce374e56-b9ca-4d92-89e5-31ade726d265-metrics podName:ce374e56-b9ca-4d92-89e5-31ade726d265 nodeName:}" failed. No retries permitted until 2026-03-19 09:40:03.168365655 +0000 UTC m=+1058.016964242 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics" (UniqueName: "kubernetes.io/secret/ce374e56-b9ca-4d92-89e5-31ade726d265-metrics") pod "collector-rxpth" (UID: "ce374e56-b9ca-4d92-89e5-31ade726d265") : secret "collector-metrics" not found Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.668388 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ce374e56-b9ca-4d92-89e5-31ade726d265-tmp\") pod \"collector-rxpth\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " pod="openshift-logging/collector-rxpth" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.668416 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9ddj\" (UniqueName: \"kubernetes.io/projected/ce374e56-b9ca-4d92-89e5-31ade726d265-kube-api-access-t9ddj\") pod \"collector-rxpth\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " pod="openshift-logging/collector-rxpth" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.669279 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce374e56-b9ca-4d92-89e5-31ade726d265-config\") pod \"collector-rxpth\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " pod="openshift-logging/collector-rxpth" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.669928 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/ce374e56-b9ca-4d92-89e5-31ade726d265-entrypoint\") pod \"collector-rxpth\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " pod="openshift-logging/collector-rxpth" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.670672 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce374e56-b9ca-4d92-89e5-31ade726d265-trusted-ca\") pod \"collector-rxpth\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " 
pod="openshift-logging/collector-rxpth" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.671156 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/ce374e56-b9ca-4d92-89e5-31ade726d265-config-openshift-service-cacrt\") pod \"collector-rxpth\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " pod="openshift-logging/collector-rxpth" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.671466 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ce374e56-b9ca-4d92-89e5-31ade726d265-tmp\") pod \"collector-rxpth\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " pod="openshift-logging/collector-rxpth" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.684907 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9ddj\" (UniqueName: \"kubernetes.io/projected/ce374e56-b9ca-4d92-89e5-31ade726d265-kube-api-access-t9ddj\") pod \"collector-rxpth\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " pod="openshift-logging/collector-rxpth" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.687279 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/ce374e56-b9ca-4d92-89e5-31ade726d265-collector-token\") pod \"collector-rxpth\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " pod="openshift-logging/collector-rxpth" Mar 19 09:40:02 crc kubenswrapper[4835]: I0319 09:40:02.688218 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/ce374e56-b9ca-4d92-89e5-31ade726d265-sa-token\") pod \"collector-rxpth\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " pod="openshift-logging/collector-rxpth" Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 09:40:03.172793 4835 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/ce374e56-b9ca-4d92-89e5-31ade726d265-metrics\") pod \"collector-rxpth\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " pod="openshift-logging/collector-rxpth" Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 09:40:03.172920 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/ce374e56-b9ca-4d92-89e5-31ade726d265-collector-syslog-receiver\") pod \"collector-rxpth\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " pod="openshift-logging/collector-rxpth" Mar 19 09:40:03 crc kubenswrapper[4835]: E0319 09:40:03.173008 4835 secret.go:188] Couldn't get secret openshift-logging/collector-metrics: secret "collector-metrics" not found Mar 19 09:40:03 crc kubenswrapper[4835]: E0319 09:40:03.173100 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce374e56-b9ca-4d92-89e5-31ade726d265-metrics podName:ce374e56-b9ca-4d92-89e5-31ade726d265 nodeName:}" failed. No retries permitted until 2026-03-19 09:40:04.173075949 +0000 UTC m=+1059.021674546 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics" (UniqueName: "kubernetes.io/secret/ce374e56-b9ca-4d92-89e5-31ade726d265-metrics") pod "collector-rxpth" (UID: "ce374e56-b9ca-4d92-89e5-31ade726d265") : secret "collector-metrics" not found Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 09:40:03.179305 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/ce374e56-b9ca-4d92-89e5-31ade726d265-collector-syslog-receiver\") pod \"collector-rxpth\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " pod="openshift-logging/collector-rxpth" Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 09:40:03.377289 4835 generic.go:334] "Generic (PLEG): container finished" podID="064a6812-d853-4ec4-9c7e-9de71456672b" containerID="a5740e3c8b5d0c8823d092baafda22605046c988cb61715f4229d4a9f01f5a9b" exitCode=0 Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 09:40:03.377334 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565220-dvxr7" event={"ID":"064a6812-d853-4ec4-9c7e-9de71456672b","Type":"ContainerDied","Data":"a5740e3c8b5d0c8823d092baafda22605046c988cb61715f4229d4a9f01f5a9b"} Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 09:40:03.377397 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-rxpth" Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 09:40:03.386940 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-rxpth" Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 09:40:03.578228 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/ce374e56-b9ca-4d92-89e5-31ade726d265-sa-token\") pod \"ce374e56-b9ca-4d92-89e5-31ade726d265\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 09:40:03.578319 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/ce374e56-b9ca-4d92-89e5-31ade726d265-config-openshift-service-cacrt\") pod \"ce374e56-b9ca-4d92-89e5-31ade726d265\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 09:40:03.578379 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/ce374e56-b9ca-4d92-89e5-31ade726d265-collector-syslog-receiver\") pod \"ce374e56-b9ca-4d92-89e5-31ade726d265\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 09:40:03.578471 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce374e56-b9ca-4d92-89e5-31ade726d265-trusted-ca\") pod \"ce374e56-b9ca-4d92-89e5-31ade726d265\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 09:40:03.578555 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9ddj\" (UniqueName: \"kubernetes.io/projected/ce374e56-b9ca-4d92-89e5-31ade726d265-kube-api-access-t9ddj\") pod \"ce374e56-b9ca-4d92-89e5-31ade726d265\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 09:40:03.578709 4835 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/ce374e56-b9ca-4d92-89e5-31ade726d265-collector-token\") pod \"ce374e56-b9ca-4d92-89e5-31ade726d265\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 09:40:03.578781 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce374e56-b9ca-4d92-89e5-31ade726d265-config\") pod \"ce374e56-b9ca-4d92-89e5-31ade726d265\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 09:40:03.578833 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ce374e56-b9ca-4d92-89e5-31ade726d265-tmp\") pod \"ce374e56-b9ca-4d92-89e5-31ade726d265\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 09:40:03.578889 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/ce374e56-b9ca-4d92-89e5-31ade726d265-datadir\") pod \"ce374e56-b9ca-4d92-89e5-31ade726d265\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 09:40:03.578972 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce374e56-b9ca-4d92-89e5-31ade726d265-config-openshift-service-cacrt" (OuterVolumeSpecName: "config-openshift-service-cacrt") pod "ce374e56-b9ca-4d92-89e5-31ade726d265" (UID: "ce374e56-b9ca-4d92-89e5-31ade726d265"). InnerVolumeSpecName "config-openshift-service-cacrt". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 09:40:03.579003 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/ce374e56-b9ca-4d92-89e5-31ade726d265-entrypoint\") pod \"ce374e56-b9ca-4d92-89e5-31ade726d265\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 09:40:03.579688 4835 reconciler_common.go:293] "Volume detached for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/ce374e56-b9ca-4d92-89e5-31ade726d265-config-openshift-service-cacrt\") on node \"crc\" DevicePath \"\"" Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 09:40:03.580435 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ce374e56-b9ca-4d92-89e5-31ade726d265-datadir" (OuterVolumeSpecName: "datadir") pod "ce374e56-b9ca-4d92-89e5-31ade726d265" (UID: "ce374e56-b9ca-4d92-89e5-31ade726d265"). InnerVolumeSpecName "datadir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 09:40:03.580704 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce374e56-b9ca-4d92-89e5-31ade726d265-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "ce374e56-b9ca-4d92-89e5-31ade726d265" (UID: "ce374e56-b9ca-4d92-89e5-31ade726d265"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 09:40:03.581016 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce374e56-b9ca-4d92-89e5-31ade726d265-config" (OuterVolumeSpecName: "config") pod "ce374e56-b9ca-4d92-89e5-31ade726d265" (UID: "ce374e56-b9ca-4d92-89e5-31ade726d265"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 09:40:03.581207 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce374e56-b9ca-4d92-89e5-31ade726d265-entrypoint" (OuterVolumeSpecName: "entrypoint") pod "ce374e56-b9ca-4d92-89e5-31ade726d265" (UID: "ce374e56-b9ca-4d92-89e5-31ade726d265"). InnerVolumeSpecName "entrypoint". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 09:40:03.582259 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce374e56-b9ca-4d92-89e5-31ade726d265-sa-token" (OuterVolumeSpecName: "sa-token") pod "ce374e56-b9ca-4d92-89e5-31ade726d265" (UID: "ce374e56-b9ca-4d92-89e5-31ade726d265"). InnerVolumeSpecName "sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 09:40:03.582724 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce374e56-b9ca-4d92-89e5-31ade726d265-collector-token" (OuterVolumeSpecName: "collector-token") pod "ce374e56-b9ca-4d92-89e5-31ade726d265" (UID: "ce374e56-b9ca-4d92-89e5-31ade726d265"). InnerVolumeSpecName "collector-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 09:40:03.582984 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce374e56-b9ca-4d92-89e5-31ade726d265-collector-syslog-receiver" (OuterVolumeSpecName: "collector-syslog-receiver") pod "ce374e56-b9ca-4d92-89e5-31ade726d265" (UID: "ce374e56-b9ca-4d92-89e5-31ade726d265"). InnerVolumeSpecName "collector-syslog-receiver". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 09:40:03.583569 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce374e56-b9ca-4d92-89e5-31ade726d265-tmp" (OuterVolumeSpecName: "tmp") pod "ce374e56-b9ca-4d92-89e5-31ade726d265" (UID: "ce374e56-b9ca-4d92-89e5-31ade726d265"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 09:40:03.583795 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce374e56-b9ca-4d92-89e5-31ade726d265-kube-api-access-t9ddj" (OuterVolumeSpecName: "kube-api-access-t9ddj") pod "ce374e56-b9ca-4d92-89e5-31ade726d265" (UID: "ce374e56-b9ca-4d92-89e5-31ade726d265"). InnerVolumeSpecName "kube-api-access-t9ddj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 09:40:03.681280 4835 reconciler_common.go:293] "Volume detached for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/ce374e56-b9ca-4d92-89e5-31ade726d265-collector-token\") on node \"crc\" DevicePath \"\"" Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 09:40:03.681674 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce374e56-b9ca-4d92-89e5-31ade726d265-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 09:40:03.681690 4835 reconciler_common.go:293] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ce374e56-b9ca-4d92-89e5-31ade726d265-tmp\") on node \"crc\" DevicePath \"\"" Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 09:40:03.681703 4835 reconciler_common.go:293] "Volume detached for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/ce374e56-b9ca-4d92-89e5-31ade726d265-datadir\") on node \"crc\" DevicePath \"\"" Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 
09:40:03.681715 4835 reconciler_common.go:293] "Volume detached for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/ce374e56-b9ca-4d92-89e5-31ade726d265-entrypoint\") on node \"crc\" DevicePath \"\"" Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 09:40:03.681728 4835 reconciler_common.go:293] "Volume detached for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/ce374e56-b9ca-4d92-89e5-31ade726d265-sa-token\") on node \"crc\" DevicePath \"\"" Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 09:40:03.681766 4835 reconciler_common.go:293] "Volume detached for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/ce374e56-b9ca-4d92-89e5-31ade726d265-collector-syslog-receiver\") on node \"crc\" DevicePath \"\"" Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 09:40:03.681781 4835 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce374e56-b9ca-4d92-89e5-31ade726d265-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 09:40:03 crc kubenswrapper[4835]: I0319 09:40:03.681794 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9ddj\" (UniqueName: \"kubernetes.io/projected/ce374e56-b9ca-4d92-89e5-31ade726d265-kube-api-access-t9ddj\") on node \"crc\" DevicePath \"\"" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.208536 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/ce374e56-b9ca-4d92-89e5-31ade726d265-metrics\") pod \"collector-rxpth\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " pod="openshift-logging/collector-rxpth" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.216315 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/ce374e56-b9ca-4d92-89e5-31ade726d265-metrics\") pod \"collector-rxpth\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " pod="openshift-logging/collector-rxpth" 
Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.310210 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/ce374e56-b9ca-4d92-89e5-31ade726d265-metrics\") pod \"ce374e56-b9ca-4d92-89e5-31ade726d265\" (UID: \"ce374e56-b9ca-4d92-89e5-31ade726d265\") " Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.314023 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce374e56-b9ca-4d92-89e5-31ade726d265-metrics" (OuterVolumeSpecName: "metrics") pod "ce374e56-b9ca-4d92-89e5-31ade726d265" (UID: "ce374e56-b9ca-4d92-89e5-31ade726d265"). InnerVolumeSpecName "metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.383687 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-rxpth" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.414166 4835 reconciler_common.go:293] "Volume detached for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/ce374e56-b9ca-4d92-89e5-31ade726d265-metrics\") on node \"crc\" DevicePath \"\"" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.440929 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-rxpth"] Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.451713 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-logging/collector-rxpth"] Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.457071 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-txhkk"] Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.458167 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-txhkk" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.468355 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.468373 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-nlzv6" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.468689 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.468922 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.471909 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.473719 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.475157 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-txhkk"] Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.514714 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/84ff8168-f128-476f-9eef-a5b976025bed-config-openshift-service-cacrt\") pod \"collector-txhkk\" (UID: \"84ff8168-f128-476f-9eef-a5b976025bed\") " pod="openshift-logging/collector-txhkk" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.514792 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/84ff8168-f128-476f-9eef-a5b976025bed-metrics\") pod \"collector-txhkk\" (UID: 
\"84ff8168-f128-476f-9eef-a5b976025bed\") " pod="openshift-logging/collector-txhkk" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.514823 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84ff8168-f128-476f-9eef-a5b976025bed-config\") pod \"collector-txhkk\" (UID: \"84ff8168-f128-476f-9eef-a5b976025bed\") " pod="openshift-logging/collector-txhkk" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.514859 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/84ff8168-f128-476f-9eef-a5b976025bed-collector-token\") pod \"collector-txhkk\" (UID: \"84ff8168-f128-476f-9eef-a5b976025bed\") " pod="openshift-logging/collector-txhkk" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.515289 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/84ff8168-f128-476f-9eef-a5b976025bed-tmp\") pod \"collector-txhkk\" (UID: \"84ff8168-f128-476f-9eef-a5b976025bed\") " pod="openshift-logging/collector-txhkk" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.515423 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/84ff8168-f128-476f-9eef-a5b976025bed-sa-token\") pod \"collector-txhkk\" (UID: \"84ff8168-f128-476f-9eef-a5b976025bed\") " pod="openshift-logging/collector-txhkk" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.515458 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/84ff8168-f128-476f-9eef-a5b976025bed-datadir\") pod \"collector-txhkk\" (UID: \"84ff8168-f128-476f-9eef-a5b976025bed\") " pod="openshift-logging/collector-txhkk" Mar 19 09:40:04 crc 
kubenswrapper[4835]: I0319 09:40:04.515507 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2h7g\" (UniqueName: \"kubernetes.io/projected/84ff8168-f128-476f-9eef-a5b976025bed-kube-api-access-q2h7g\") pod \"collector-txhkk\" (UID: \"84ff8168-f128-476f-9eef-a5b976025bed\") " pod="openshift-logging/collector-txhkk" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.515544 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/84ff8168-f128-476f-9eef-a5b976025bed-entrypoint\") pod \"collector-txhkk\" (UID: \"84ff8168-f128-476f-9eef-a5b976025bed\") " pod="openshift-logging/collector-txhkk" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.515573 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84ff8168-f128-476f-9eef-a5b976025bed-trusted-ca\") pod \"collector-txhkk\" (UID: \"84ff8168-f128-476f-9eef-a5b976025bed\") " pod="openshift-logging/collector-txhkk" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.515657 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/84ff8168-f128-476f-9eef-a5b976025bed-collector-syslog-receiver\") pod \"collector-txhkk\" (UID: \"84ff8168-f128-476f-9eef-a5b976025bed\") " pod="openshift-logging/collector-txhkk" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.617431 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/84ff8168-f128-476f-9eef-a5b976025bed-metrics\") pod \"collector-txhkk\" (UID: \"84ff8168-f128-476f-9eef-a5b976025bed\") " pod="openshift-logging/collector-txhkk" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.617849 4835 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84ff8168-f128-476f-9eef-a5b976025bed-config\") pod \"collector-txhkk\" (UID: \"84ff8168-f128-476f-9eef-a5b976025bed\") " pod="openshift-logging/collector-txhkk" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.617893 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/84ff8168-f128-476f-9eef-a5b976025bed-collector-token\") pod \"collector-txhkk\" (UID: \"84ff8168-f128-476f-9eef-a5b976025bed\") " pod="openshift-logging/collector-txhkk" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.617935 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/84ff8168-f128-476f-9eef-a5b976025bed-tmp\") pod \"collector-txhkk\" (UID: \"84ff8168-f128-476f-9eef-a5b976025bed\") " pod="openshift-logging/collector-txhkk" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.618006 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/84ff8168-f128-476f-9eef-a5b976025bed-sa-token\") pod \"collector-txhkk\" (UID: \"84ff8168-f128-476f-9eef-a5b976025bed\") " pod="openshift-logging/collector-txhkk" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.618029 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/84ff8168-f128-476f-9eef-a5b976025bed-datadir\") pod \"collector-txhkk\" (UID: \"84ff8168-f128-476f-9eef-a5b976025bed\") " pod="openshift-logging/collector-txhkk" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.618059 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2h7g\" (UniqueName: \"kubernetes.io/projected/84ff8168-f128-476f-9eef-a5b976025bed-kube-api-access-q2h7g\") pod 
\"collector-txhkk\" (UID: \"84ff8168-f128-476f-9eef-a5b976025bed\") " pod="openshift-logging/collector-txhkk" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.618091 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/84ff8168-f128-476f-9eef-a5b976025bed-entrypoint\") pod \"collector-txhkk\" (UID: \"84ff8168-f128-476f-9eef-a5b976025bed\") " pod="openshift-logging/collector-txhkk" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.618123 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84ff8168-f128-476f-9eef-a5b976025bed-trusted-ca\") pod \"collector-txhkk\" (UID: \"84ff8168-f128-476f-9eef-a5b976025bed\") " pod="openshift-logging/collector-txhkk" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.618165 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/84ff8168-f128-476f-9eef-a5b976025bed-collector-syslog-receiver\") pod \"collector-txhkk\" (UID: \"84ff8168-f128-476f-9eef-a5b976025bed\") " pod="openshift-logging/collector-txhkk" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.618198 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/84ff8168-f128-476f-9eef-a5b976025bed-config-openshift-service-cacrt\") pod \"collector-txhkk\" (UID: \"84ff8168-f128-476f-9eef-a5b976025bed\") " pod="openshift-logging/collector-txhkk" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.618937 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/84ff8168-f128-476f-9eef-a5b976025bed-datadir\") pod \"collector-txhkk\" (UID: \"84ff8168-f128-476f-9eef-a5b976025bed\") " pod="openshift-logging/collector-txhkk" Mar 19 09:40:04 crc 
kubenswrapper[4835]: I0319 09:40:04.618988 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/84ff8168-f128-476f-9eef-a5b976025bed-config-openshift-service-cacrt\") pod \"collector-txhkk\" (UID: \"84ff8168-f128-476f-9eef-a5b976025bed\") " pod="openshift-logging/collector-txhkk" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.620382 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84ff8168-f128-476f-9eef-a5b976025bed-trusted-ca\") pod \"collector-txhkk\" (UID: \"84ff8168-f128-476f-9eef-a5b976025bed\") " pod="openshift-logging/collector-txhkk" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.620587 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/84ff8168-f128-476f-9eef-a5b976025bed-entrypoint\") pod \"collector-txhkk\" (UID: \"84ff8168-f128-476f-9eef-a5b976025bed\") " pod="openshift-logging/collector-txhkk" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.621962 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84ff8168-f128-476f-9eef-a5b976025bed-config\") pod \"collector-txhkk\" (UID: \"84ff8168-f128-476f-9eef-a5b976025bed\") " pod="openshift-logging/collector-txhkk" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.623623 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/84ff8168-f128-476f-9eef-a5b976025bed-tmp\") pod \"collector-txhkk\" (UID: \"84ff8168-f128-476f-9eef-a5b976025bed\") " pod="openshift-logging/collector-txhkk" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.631486 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: 
\"kubernetes.io/secret/84ff8168-f128-476f-9eef-a5b976025bed-metrics\") pod \"collector-txhkk\" (UID: \"84ff8168-f128-476f-9eef-a5b976025bed\") " pod="openshift-logging/collector-txhkk" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.631826 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/84ff8168-f128-476f-9eef-a5b976025bed-collector-token\") pod \"collector-txhkk\" (UID: \"84ff8168-f128-476f-9eef-a5b976025bed\") " pod="openshift-logging/collector-txhkk" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.632445 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/84ff8168-f128-476f-9eef-a5b976025bed-collector-syslog-receiver\") pod \"collector-txhkk\" (UID: \"84ff8168-f128-476f-9eef-a5b976025bed\") " pod="openshift-logging/collector-txhkk" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.636021 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/84ff8168-f128-476f-9eef-a5b976025bed-sa-token\") pod \"collector-txhkk\" (UID: \"84ff8168-f128-476f-9eef-a5b976025bed\") " pod="openshift-logging/collector-txhkk" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.640094 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2h7g\" (UniqueName: \"kubernetes.io/projected/84ff8168-f128-476f-9eef-a5b976025bed-kube-api-access-q2h7g\") pod \"collector-txhkk\" (UID: \"84ff8168-f128-476f-9eef-a5b976025bed\") " pod="openshift-logging/collector-txhkk" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.703753 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565220-dvxr7" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.719459 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h59mf\" (UniqueName: \"kubernetes.io/projected/064a6812-d853-4ec4-9c7e-9de71456672b-kube-api-access-h59mf\") pod \"064a6812-d853-4ec4-9c7e-9de71456672b\" (UID: \"064a6812-d853-4ec4-9c7e-9de71456672b\") " Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.723916 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/064a6812-d853-4ec4-9c7e-9de71456672b-kube-api-access-h59mf" (OuterVolumeSpecName: "kube-api-access-h59mf") pod "064a6812-d853-4ec4-9c7e-9de71456672b" (UID: "064a6812-d853-4ec4-9c7e-9de71456672b"). InnerVolumeSpecName "kube-api-access-h59mf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.815461 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-txhkk" Mar 19 09:40:04 crc kubenswrapper[4835]: I0319 09:40:04.821084 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h59mf\" (UniqueName: \"kubernetes.io/projected/064a6812-d853-4ec4-9c7e-9de71456672b-kube-api-access-h59mf\") on node \"crc\" DevicePath \"\"" Mar 19 09:40:05 crc kubenswrapper[4835]: I0319 09:40:05.109984 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-txhkk"] Mar 19 09:40:05 crc kubenswrapper[4835]: W0319 09:40:05.128252 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84ff8168_f128_476f_9eef_a5b976025bed.slice/crio-fa79bc98b2e2f7b54edaa987a2f3d0ea3bacf5c957d95839433afa722c434891 WatchSource:0}: Error finding container fa79bc98b2e2f7b54edaa987a2f3d0ea3bacf5c957d95839433afa722c434891: Status 404 returned error can't find the container with id fa79bc98b2e2f7b54edaa987a2f3d0ea3bacf5c957d95839433afa722c434891 Mar 19 09:40:05 crc kubenswrapper[4835]: I0319 09:40:05.399373 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565220-dvxr7" Mar 19 09:40:05 crc kubenswrapper[4835]: I0319 09:40:05.399588 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565220-dvxr7" event={"ID":"064a6812-d853-4ec4-9c7e-9de71456672b","Type":"ContainerDied","Data":"7725940a8bafe25392d0aa31f13293b645c734cd86767dbbaf65fc5501aec69c"} Mar 19 09:40:05 crc kubenswrapper[4835]: I0319 09:40:05.399793 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7725940a8bafe25392d0aa31f13293b645c734cd86767dbbaf65fc5501aec69c" Mar 19 09:40:05 crc kubenswrapper[4835]: I0319 09:40:05.400820 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-txhkk" event={"ID":"84ff8168-f128-476f-9eef-a5b976025bed","Type":"ContainerStarted","Data":"fa79bc98b2e2f7b54edaa987a2f3d0ea3bacf5c957d95839433afa722c434891"} Mar 19 09:40:05 crc kubenswrapper[4835]: I0319 09:40:05.435853 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565214-zs45x"] Mar 19 09:40:05 crc kubenswrapper[4835]: I0319 09:40:05.443036 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565214-zs45x"] Mar 19 09:40:06 crc kubenswrapper[4835]: I0319 09:40:06.418193 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15e011a6-0f30-4160-8740-591cd740d2c5" path="/var/lib/kubelet/pods/15e011a6-0f30-4160-8740-591cd740d2c5/volumes" Mar 19 09:40:06 crc kubenswrapper[4835]: I0319 09:40:06.420240 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce374e56-b9ca-4d92-89e5-31ade726d265" path="/var/lib/kubelet/pods/ce374e56-b9ca-4d92-89e5-31ade726d265/volumes" Mar 19 09:40:06 crc kubenswrapper[4835]: I0319 09:40:06.422487 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 09:40:06 crc kubenswrapper[4835]: I0319 09:40:06.422566 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 09:40:06 crc kubenswrapper[4835]: I0319 09:40:06.422632 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" Mar 19 09:40:06 crc kubenswrapper[4835]: I0319 09:40:06.423848 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ad7bf0e681d0b5c56ea72d8f084e643a38eb7b2896b2748995289d8ab657401b"} pod="openshift-machine-config-operator/machine-config-daemon-bk84k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 09:40:06 crc kubenswrapper[4835]: I0319 09:40:06.424065 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" containerID="cri-o://ad7bf0e681d0b5c56ea72d8f084e643a38eb7b2896b2748995289d8ab657401b" gracePeriod=600 Mar 19 09:40:07 crc kubenswrapper[4835]: I0319 09:40:07.420059 4835 generic.go:334] "Generic (PLEG): container finished" podID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerID="ad7bf0e681d0b5c56ea72d8f084e643a38eb7b2896b2748995289d8ab657401b" exitCode=0 Mar 19 09:40:07 crc kubenswrapper[4835]: I0319 09:40:07.420112 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerDied","Data":"ad7bf0e681d0b5c56ea72d8f084e643a38eb7b2896b2748995289d8ab657401b"} Mar 19 09:40:07 crc kubenswrapper[4835]: I0319 09:40:07.420154 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerStarted","Data":"0972a6e7a053fa2e9ebcd3097c3b2861379219ae12b073a6c01213db93a74f6f"} Mar 19 09:40:07 crc kubenswrapper[4835]: I0319 09:40:07.420175 4835 scope.go:117] "RemoveContainer" containerID="8048ba6999fcc9a0f2b426a9033f119cd447f58778521ba0373e7a0bc81270c5" Mar 19 09:40:09 crc kubenswrapper[4835]: I0319 09:40:09.440398 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-txhkk" event={"ID":"84ff8168-f128-476f-9eef-a5b976025bed","Type":"ContainerStarted","Data":"41b2669cc23bd0f177a7e12f38a885729663d2f6552c95f34882900e32d1d711"} Mar 19 09:40:09 crc kubenswrapper[4835]: I0319 09:40:09.466915 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/collector-txhkk" podStartSLOduration=1.820993412 podStartE2EDuration="5.466890857s" podCreationTimestamp="2026-03-19 09:40:04 +0000 UTC" firstStartedPulling="2026-03-19 09:40:05.13049093 +0000 UTC m=+1059.979089527" lastFinishedPulling="2026-03-19 09:40:08.776388365 +0000 UTC m=+1063.624986972" observedRunningTime="2026-03-19 09:40:09.465436857 +0000 UTC m=+1064.314035454" watchObservedRunningTime="2026-03-19 09:40:09.466890857 +0000 UTC m=+1064.315489464" Mar 19 09:40:27 crc kubenswrapper[4835]: I0319 09:40:27.366088 4835 scope.go:117] "RemoveContainer" containerID="b91ff655aba92781f9f919767c85bc6c4185fb285487606b19627243eb45cbf4" Mar 19 09:40:45 crc kubenswrapper[4835]: I0319 09:40:45.515989 4835 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk"] Mar 19 09:40:45 crc kubenswrapper[4835]: E0319 09:40:45.518650 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="064a6812-d853-4ec4-9c7e-9de71456672b" containerName="oc" Mar 19 09:40:45 crc kubenswrapper[4835]: I0319 09:40:45.518809 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="064a6812-d853-4ec4-9c7e-9de71456672b" containerName="oc" Mar 19 09:40:45 crc kubenswrapper[4835]: I0319 09:40:45.519093 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="064a6812-d853-4ec4-9c7e-9de71456672b" containerName="oc" Mar 19 09:40:45 crc kubenswrapper[4835]: I0319 09:40:45.520379 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk" Mar 19 09:40:45 crc kubenswrapper[4835]: I0319 09:40:45.522941 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk"] Mar 19 09:40:45 crc kubenswrapper[4835]: I0319 09:40:45.528866 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 19 09:40:45 crc kubenswrapper[4835]: I0319 09:40:45.619646 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xthmb\" (UniqueName: \"kubernetes.io/projected/86081666-b147-49d0-bed1-df369027ca65-kube-api-access-xthmb\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk\" (UID: \"86081666-b147-49d0-bed1-df369027ca65\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk" Mar 19 09:40:45 crc kubenswrapper[4835]: I0319 09:40:45.619740 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/86081666-b147-49d0-bed1-df369027ca65-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk\" (UID: \"86081666-b147-49d0-bed1-df369027ca65\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk" Mar 19 09:40:45 crc kubenswrapper[4835]: I0319 09:40:45.619825 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86081666-b147-49d0-bed1-df369027ca65-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk\" (UID: \"86081666-b147-49d0-bed1-df369027ca65\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk" Mar 19 09:40:45 crc kubenswrapper[4835]: I0319 09:40:45.720988 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86081666-b147-49d0-bed1-df369027ca65-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk\" (UID: \"86081666-b147-49d0-bed1-df369027ca65\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk" Mar 19 09:40:45 crc kubenswrapper[4835]: I0319 09:40:45.721099 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86081666-b147-49d0-bed1-df369027ca65-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk\" (UID: \"86081666-b147-49d0-bed1-df369027ca65\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk" Mar 19 09:40:45 crc kubenswrapper[4835]: I0319 09:40:45.721152 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xthmb\" (UniqueName: \"kubernetes.io/projected/86081666-b147-49d0-bed1-df369027ca65-kube-api-access-xthmb\") pod 
\"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk\" (UID: \"86081666-b147-49d0-bed1-df369027ca65\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk" Mar 19 09:40:45 crc kubenswrapper[4835]: I0319 09:40:45.722000 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86081666-b147-49d0-bed1-df369027ca65-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk\" (UID: \"86081666-b147-49d0-bed1-df369027ca65\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk" Mar 19 09:40:45 crc kubenswrapper[4835]: I0319 09:40:45.722080 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86081666-b147-49d0-bed1-df369027ca65-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk\" (UID: \"86081666-b147-49d0-bed1-df369027ca65\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk" Mar 19 09:40:45 crc kubenswrapper[4835]: I0319 09:40:45.743103 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xthmb\" (UniqueName: \"kubernetes.io/projected/86081666-b147-49d0-bed1-df369027ca65-kube-api-access-xthmb\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk\" (UID: \"86081666-b147-49d0-bed1-df369027ca65\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk" Mar 19 09:40:45 crc kubenswrapper[4835]: I0319 09:40:45.891917 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk" Mar 19 09:40:46 crc kubenswrapper[4835]: I0319 09:40:46.303870 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk"] Mar 19 09:40:46 crc kubenswrapper[4835]: I0319 09:40:46.726066 4835 generic.go:334] "Generic (PLEG): container finished" podID="86081666-b147-49d0-bed1-df369027ca65" containerID="e820abc498f6bf5118167bc0a4fbadbddf5f7aa981a0cd70d92064d96c3ea842" exitCode=0 Mar 19 09:40:46 crc kubenswrapper[4835]: I0319 09:40:46.726118 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk" event={"ID":"86081666-b147-49d0-bed1-df369027ca65","Type":"ContainerDied","Data":"e820abc498f6bf5118167bc0a4fbadbddf5f7aa981a0cd70d92064d96c3ea842"} Mar 19 09:40:46 crc kubenswrapper[4835]: I0319 09:40:46.726345 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk" event={"ID":"86081666-b147-49d0-bed1-df369027ca65","Type":"ContainerStarted","Data":"a9eb206445de03360eabf2073f1ab8db39215183f3682db83d1a8615b82ff4a2"} Mar 19 09:40:48 crc kubenswrapper[4835]: I0319 09:40:48.742422 4835 generic.go:334] "Generic (PLEG): container finished" podID="86081666-b147-49d0-bed1-df369027ca65" containerID="97b837746d007967261449671e2df35e779561ece0105e17964babe27f536020" exitCode=0 Mar 19 09:40:48 crc kubenswrapper[4835]: I0319 09:40:48.742470 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk" event={"ID":"86081666-b147-49d0-bed1-df369027ca65","Type":"ContainerDied","Data":"97b837746d007967261449671e2df35e779561ece0105e17964babe27f536020"} Mar 19 09:40:49 crc kubenswrapper[4835]: I0319 09:40:49.750528 4835 
generic.go:334] "Generic (PLEG): container finished" podID="86081666-b147-49d0-bed1-df369027ca65" containerID="34e8add3bb7706586df74bfc6269439a0475ce09c788498323f44db5f2f40ab7" exitCode=0 Mar 19 09:40:49 crc kubenswrapper[4835]: I0319 09:40:49.750961 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk" event={"ID":"86081666-b147-49d0-bed1-df369027ca65","Type":"ContainerDied","Data":"34e8add3bb7706586df74bfc6269439a0475ce09c788498323f44db5f2f40ab7"} Mar 19 09:40:51 crc kubenswrapper[4835]: I0319 09:40:51.126428 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk" Mar 19 09:40:51 crc kubenswrapper[4835]: I0319 09:40:51.220161 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86081666-b147-49d0-bed1-df369027ca65-bundle\") pod \"86081666-b147-49d0-bed1-df369027ca65\" (UID: \"86081666-b147-49d0-bed1-df369027ca65\") " Mar 19 09:40:51 crc kubenswrapper[4835]: I0319 09:40:51.220786 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86081666-b147-49d0-bed1-df369027ca65-bundle" (OuterVolumeSpecName: "bundle") pod "86081666-b147-49d0-bed1-df369027ca65" (UID: "86081666-b147-49d0-bed1-df369027ca65"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:40:51 crc kubenswrapper[4835]: I0319 09:40:51.321347 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xthmb\" (UniqueName: \"kubernetes.io/projected/86081666-b147-49d0-bed1-df369027ca65-kube-api-access-xthmb\") pod \"86081666-b147-49d0-bed1-df369027ca65\" (UID: \"86081666-b147-49d0-bed1-df369027ca65\") " Mar 19 09:40:51 crc kubenswrapper[4835]: I0319 09:40:51.321401 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86081666-b147-49d0-bed1-df369027ca65-util\") pod \"86081666-b147-49d0-bed1-df369027ca65\" (UID: \"86081666-b147-49d0-bed1-df369027ca65\") " Mar 19 09:40:51 crc kubenswrapper[4835]: I0319 09:40:51.321665 4835 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86081666-b147-49d0-bed1-df369027ca65-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:40:51 crc kubenswrapper[4835]: I0319 09:40:51.326479 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86081666-b147-49d0-bed1-df369027ca65-kube-api-access-xthmb" (OuterVolumeSpecName: "kube-api-access-xthmb") pod "86081666-b147-49d0-bed1-df369027ca65" (UID: "86081666-b147-49d0-bed1-df369027ca65"). InnerVolumeSpecName "kube-api-access-xthmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:40:51 crc kubenswrapper[4835]: I0319 09:40:51.337930 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86081666-b147-49d0-bed1-df369027ca65-util" (OuterVolumeSpecName: "util") pod "86081666-b147-49d0-bed1-df369027ca65" (UID: "86081666-b147-49d0-bed1-df369027ca65"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:40:51 crc kubenswrapper[4835]: I0319 09:40:51.424315 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xthmb\" (UniqueName: \"kubernetes.io/projected/86081666-b147-49d0-bed1-df369027ca65-kube-api-access-xthmb\") on node \"crc\" DevicePath \"\"" Mar 19 09:40:51 crc kubenswrapper[4835]: I0319 09:40:51.424364 4835 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86081666-b147-49d0-bed1-df369027ca65-util\") on node \"crc\" DevicePath \"\"" Mar 19 09:40:51 crc kubenswrapper[4835]: I0319 09:40:51.767686 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk" event={"ID":"86081666-b147-49d0-bed1-df369027ca65","Type":"ContainerDied","Data":"a9eb206445de03360eabf2073f1ab8db39215183f3682db83d1a8615b82ff4a2"} Mar 19 09:40:51 crc kubenswrapper[4835]: I0319 09:40:51.768039 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9eb206445de03360eabf2073f1ab8db39215183f3682db83d1a8615b82ff4a2" Mar 19 09:40:51 crc kubenswrapper[4835]: I0319 09:40:51.767761 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk" Mar 19 09:40:54 crc kubenswrapper[4835]: I0319 09:40:54.427599 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-m8zlt"] Mar 19 09:40:54 crc kubenswrapper[4835]: E0319 09:40:54.428300 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86081666-b147-49d0-bed1-df369027ca65" containerName="util" Mar 19 09:40:54 crc kubenswrapper[4835]: I0319 09:40:54.428318 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="86081666-b147-49d0-bed1-df369027ca65" containerName="util" Mar 19 09:40:54 crc kubenswrapper[4835]: E0319 09:40:54.428337 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86081666-b147-49d0-bed1-df369027ca65" containerName="pull" Mar 19 09:40:54 crc kubenswrapper[4835]: I0319 09:40:54.428345 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="86081666-b147-49d0-bed1-df369027ca65" containerName="pull" Mar 19 09:40:54 crc kubenswrapper[4835]: E0319 09:40:54.428356 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86081666-b147-49d0-bed1-df369027ca65" containerName="extract" Mar 19 09:40:54 crc kubenswrapper[4835]: I0319 09:40:54.428363 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="86081666-b147-49d0-bed1-df369027ca65" containerName="extract" Mar 19 09:40:54 crc kubenswrapper[4835]: I0319 09:40:54.428478 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="86081666-b147-49d0-bed1-df369027ca65" containerName="extract" Mar 19 09:40:54 crc kubenswrapper[4835]: I0319 09:40:54.429050 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-m8zlt" Mar 19 09:40:54 crc kubenswrapper[4835]: I0319 09:40:54.431715 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 19 09:40:54 crc kubenswrapper[4835]: I0319 09:40:54.432397 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 19 09:40:54 crc kubenswrapper[4835]: I0319 09:40:54.432441 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-ddrlq" Mar 19 09:40:54 crc kubenswrapper[4835]: I0319 09:40:54.437724 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-m8zlt"] Mar 19 09:40:54 crc kubenswrapper[4835]: I0319 09:40:54.470441 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmklh\" (UniqueName: \"kubernetes.io/projected/74033176-5a6a-4727-9f45-9083cb1dbd3e-kube-api-access-nmklh\") pod \"nmstate-operator-796d4cfff4-m8zlt\" (UID: \"74033176-5a6a-4727-9f45-9083cb1dbd3e\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-m8zlt" Mar 19 09:40:54 crc kubenswrapper[4835]: I0319 09:40:54.572425 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmklh\" (UniqueName: \"kubernetes.io/projected/74033176-5a6a-4727-9f45-9083cb1dbd3e-kube-api-access-nmklh\") pod \"nmstate-operator-796d4cfff4-m8zlt\" (UID: \"74033176-5a6a-4727-9f45-9083cb1dbd3e\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-m8zlt" Mar 19 09:40:54 crc kubenswrapper[4835]: I0319 09:40:54.592268 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmklh\" (UniqueName: \"kubernetes.io/projected/74033176-5a6a-4727-9f45-9083cb1dbd3e-kube-api-access-nmklh\") pod \"nmstate-operator-796d4cfff4-m8zlt\" (UID: 
\"74033176-5a6a-4727-9f45-9083cb1dbd3e\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-m8zlt" Mar 19 09:40:54 crc kubenswrapper[4835]: I0319 09:40:54.747603 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-m8zlt" Mar 19 09:40:55 crc kubenswrapper[4835]: I0319 09:40:55.257635 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-m8zlt"] Mar 19 09:40:55 crc kubenswrapper[4835]: W0319 09:40:55.261671 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74033176_5a6a_4727_9f45_9083cb1dbd3e.slice/crio-87d51f34bf4c52486312d9e0e285e9d9dda331816560aceab491a628254c5919 WatchSource:0}: Error finding container 87d51f34bf4c52486312d9e0e285e9d9dda331816560aceab491a628254c5919: Status 404 returned error can't find the container with id 87d51f34bf4c52486312d9e0e285e9d9dda331816560aceab491a628254c5919 Mar 19 09:40:55 crc kubenswrapper[4835]: I0319 09:40:55.797010 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-m8zlt" event={"ID":"74033176-5a6a-4727-9f45-9083cb1dbd3e","Type":"ContainerStarted","Data":"87d51f34bf4c52486312d9e0e285e9d9dda331816560aceab491a628254c5919"} Mar 19 09:40:58 crc kubenswrapper[4835]: I0319 09:40:58.816032 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-m8zlt" event={"ID":"74033176-5a6a-4727-9f45-9083cb1dbd3e","Type":"ContainerStarted","Data":"1365e769037b6472bba1dd98045dc036735359024d04f887f949a9597be91d81"} Mar 19 09:40:58 crc kubenswrapper[4835]: I0319 09:40:58.833157 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-m8zlt" podStartSLOduration=2.086491835 podStartE2EDuration="4.83313494s" podCreationTimestamp="2026-03-19 09:40:54 +0000 UTC" 
firstStartedPulling="2026-03-19 09:40:55.263527143 +0000 UTC m=+1110.112125730" lastFinishedPulling="2026-03-19 09:40:58.010170248 +0000 UTC m=+1112.858768835" observedRunningTime="2026-03-19 09:40:58.829512133 +0000 UTC m=+1113.678110720" watchObservedRunningTime="2026-03-19 09:40:58.83313494 +0000 UTC m=+1113.681733527" Mar 19 09:40:59 crc kubenswrapper[4835]: I0319 09:40:59.835580 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-8gq4d"] Mar 19 09:40:59 crc kubenswrapper[4835]: I0319 09:40:59.837070 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-8gq4d" Mar 19 09:40:59 crc kubenswrapper[4835]: I0319 09:40:59.840346 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-zgrls" Mar 19 09:40:59 crc kubenswrapper[4835]: I0319 09:40:59.847088 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-8gq4d"] Mar 19 09:40:59 crc kubenswrapper[4835]: I0319 09:40:59.864171 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-zpbgp"] Mar 19 09:40:59 crc kubenswrapper[4835]: I0319 09:40:59.865161 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-zpbgp" Mar 19 09:40:59 crc kubenswrapper[4835]: I0319 09:40:59.871724 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 19 09:40:59 crc kubenswrapper[4835]: I0319 09:40:59.871759 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-69n76"] Mar 19 09:40:59 crc kubenswrapper[4835]: I0319 09:40:59.873292 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-69n76" Mar 19 09:40:59 crc kubenswrapper[4835]: I0319 09:40:59.912467 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-zpbgp"] Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.027224 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/86182cb8-0ad6-40f7-ac1c-7ccf8cf5bc13-ovs-socket\") pod \"nmstate-handler-69n76\" (UID: \"86182cb8-0ad6-40f7-ac1c-7ccf8cf5bc13\") " pod="openshift-nmstate/nmstate-handler-69n76" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.027278 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt4tr\" (UniqueName: \"kubernetes.io/projected/514e58bd-bd21-45bf-8543-90439bf1c630-kube-api-access-wt4tr\") pod \"nmstate-metrics-9b8c8685d-8gq4d\" (UID: \"514e58bd-bd21-45bf-8543-90439bf1c630\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-8gq4d" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.027321 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vs8c\" (UniqueName: \"kubernetes.io/projected/a5ae9a03-b125-4072-9ec8-fddd19b7002e-kube-api-access-5vs8c\") pod \"nmstate-webhook-5f558f5558-zpbgp\" (UID: \"a5ae9a03-b125-4072-9ec8-fddd19b7002e\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-zpbgp" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.027349 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/86182cb8-0ad6-40f7-ac1c-7ccf8cf5bc13-nmstate-lock\") pod \"nmstate-handler-69n76\" (UID: \"86182cb8-0ad6-40f7-ac1c-7ccf8cf5bc13\") " pod="openshift-nmstate/nmstate-handler-69n76" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.027430 4835 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/86182cb8-0ad6-40f7-ac1c-7ccf8cf5bc13-dbus-socket\") pod \"nmstate-handler-69n76\" (UID: \"86182cb8-0ad6-40f7-ac1c-7ccf8cf5bc13\") " pod="openshift-nmstate/nmstate-handler-69n76" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.027461 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pj2f\" (UniqueName: \"kubernetes.io/projected/86182cb8-0ad6-40f7-ac1c-7ccf8cf5bc13-kube-api-access-5pj2f\") pod \"nmstate-handler-69n76\" (UID: \"86182cb8-0ad6-40f7-ac1c-7ccf8cf5bc13\") " pod="openshift-nmstate/nmstate-handler-69n76" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.027482 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a5ae9a03-b125-4072-9ec8-fddd19b7002e-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-zpbgp\" (UID: \"a5ae9a03-b125-4072-9ec8-fddd19b7002e\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-zpbgp" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.039874 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-wtgrg"] Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.041021 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-wtgrg" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.043347 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.043525 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-mllbd" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.052056 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.060970 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-wtgrg"] Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.129095 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/86182cb8-0ad6-40f7-ac1c-7ccf8cf5bc13-ovs-socket\") pod \"nmstate-handler-69n76\" (UID: \"86182cb8-0ad6-40f7-ac1c-7ccf8cf5bc13\") " pod="openshift-nmstate/nmstate-handler-69n76" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.129150 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt4tr\" (UniqueName: \"kubernetes.io/projected/514e58bd-bd21-45bf-8543-90439bf1c630-kube-api-access-wt4tr\") pod \"nmstate-metrics-9b8c8685d-8gq4d\" (UID: \"514e58bd-bd21-45bf-8543-90439bf1c630\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-8gq4d" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.129193 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vs8c\" (UniqueName: \"kubernetes.io/projected/a5ae9a03-b125-4072-9ec8-fddd19b7002e-kube-api-access-5vs8c\") pod \"nmstate-webhook-5f558f5558-zpbgp\" (UID: \"a5ae9a03-b125-4072-9ec8-fddd19b7002e\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-zpbgp" Mar 
19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.129221 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/86182cb8-0ad6-40f7-ac1c-7ccf8cf5bc13-nmstate-lock\") pod \"nmstate-handler-69n76\" (UID: \"86182cb8-0ad6-40f7-ac1c-7ccf8cf5bc13\") " pod="openshift-nmstate/nmstate-handler-69n76" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.129301 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/86182cb8-0ad6-40f7-ac1c-7ccf8cf5bc13-dbus-socket\") pod \"nmstate-handler-69n76\" (UID: \"86182cb8-0ad6-40f7-ac1c-7ccf8cf5bc13\") " pod="openshift-nmstate/nmstate-handler-69n76" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.129330 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pj2f\" (UniqueName: \"kubernetes.io/projected/86182cb8-0ad6-40f7-ac1c-7ccf8cf5bc13-kube-api-access-5pj2f\") pod \"nmstate-handler-69n76\" (UID: \"86182cb8-0ad6-40f7-ac1c-7ccf8cf5bc13\") " pod="openshift-nmstate/nmstate-handler-69n76" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.129353 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a5ae9a03-b125-4072-9ec8-fddd19b7002e-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-zpbgp\" (UID: \"a5ae9a03-b125-4072-9ec8-fddd19b7002e\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-zpbgp" Mar 19 09:41:00 crc kubenswrapper[4835]: E0319 09:41:00.129496 4835 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 19 09:41:00 crc kubenswrapper[4835]: E0319 09:41:00.129553 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5ae9a03-b125-4072-9ec8-fddd19b7002e-tls-key-pair podName:a5ae9a03-b125-4072-9ec8-fddd19b7002e nodeName:}" 
failed. No retries permitted until 2026-03-19 09:41:00.62953336 +0000 UTC m=+1115.478131947 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/a5ae9a03-b125-4072-9ec8-fddd19b7002e-tls-key-pair") pod "nmstate-webhook-5f558f5558-zpbgp" (UID: "a5ae9a03-b125-4072-9ec8-fddd19b7002e") : secret "openshift-nmstate-webhook" not found Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.129805 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/86182cb8-0ad6-40f7-ac1c-7ccf8cf5bc13-ovs-socket\") pod \"nmstate-handler-69n76\" (UID: \"86182cb8-0ad6-40f7-ac1c-7ccf8cf5bc13\") " pod="openshift-nmstate/nmstate-handler-69n76" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.130158 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/86182cb8-0ad6-40f7-ac1c-7ccf8cf5bc13-nmstate-lock\") pod \"nmstate-handler-69n76\" (UID: \"86182cb8-0ad6-40f7-ac1c-7ccf8cf5bc13\") " pod="openshift-nmstate/nmstate-handler-69n76" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.130400 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/86182cb8-0ad6-40f7-ac1c-7ccf8cf5bc13-dbus-socket\") pod \"nmstate-handler-69n76\" (UID: \"86182cb8-0ad6-40f7-ac1c-7ccf8cf5bc13\") " pod="openshift-nmstate/nmstate-handler-69n76" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.155140 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt4tr\" (UniqueName: \"kubernetes.io/projected/514e58bd-bd21-45bf-8543-90439bf1c630-kube-api-access-wt4tr\") pod \"nmstate-metrics-9b8c8685d-8gq4d\" (UID: \"514e58bd-bd21-45bf-8543-90439bf1c630\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-8gq4d" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.155440 4835 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-8gq4d" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.156398 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vs8c\" (UniqueName: \"kubernetes.io/projected/a5ae9a03-b125-4072-9ec8-fddd19b7002e-kube-api-access-5vs8c\") pod \"nmstate-webhook-5f558f5558-zpbgp\" (UID: \"a5ae9a03-b125-4072-9ec8-fddd19b7002e\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-zpbgp" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.162337 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pj2f\" (UniqueName: \"kubernetes.io/projected/86182cb8-0ad6-40f7-ac1c-7ccf8cf5bc13-kube-api-access-5pj2f\") pod \"nmstate-handler-69n76\" (UID: \"86182cb8-0ad6-40f7-ac1c-7ccf8cf5bc13\") " pod="openshift-nmstate/nmstate-handler-69n76" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.196611 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-69n76" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.233653 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/9b6aeb7d-73be-42b9-93ef-72be8a26f4f7-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-wtgrg\" (UID: \"9b6aeb7d-73be-42b9-93ef-72be8a26f4f7\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-wtgrg" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.233857 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9b6aeb7d-73be-42b9-93ef-72be8a26f4f7-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-wtgrg\" (UID: \"9b6aeb7d-73be-42b9-93ef-72be8a26f4f7\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-wtgrg" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.233956 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bgkk\" (UniqueName: \"kubernetes.io/projected/9b6aeb7d-73be-42b9-93ef-72be8a26f4f7-kube-api-access-2bgkk\") pod \"nmstate-console-plugin-86f58fcf4-wtgrg\" (UID: \"9b6aeb7d-73be-42b9-93ef-72be8a26f4f7\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-wtgrg" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.294801 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5b76d796f4-ht42f"] Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.300187 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5b76d796f4-ht42f" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.316136 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b76d796f4-ht42f"] Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.335995 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/590e8c8f-f969-46f8-9b98-7247dd0b2601-console-oauth-config\") pod \"console-5b76d796f4-ht42f\" (UID: \"590e8c8f-f969-46f8-9b98-7247dd0b2601\") " pod="openshift-console/console-5b76d796f4-ht42f" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.336056 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/9b6aeb7d-73be-42b9-93ef-72be8a26f4f7-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-wtgrg\" (UID: \"9b6aeb7d-73be-42b9-93ef-72be8a26f4f7\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-wtgrg" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.336086 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw5f7\" (UniqueName: \"kubernetes.io/projected/590e8c8f-f969-46f8-9b98-7247dd0b2601-kube-api-access-cw5f7\") pod \"console-5b76d796f4-ht42f\" (UID: \"590e8c8f-f969-46f8-9b98-7247dd0b2601\") " pod="openshift-console/console-5b76d796f4-ht42f" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.336103 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/590e8c8f-f969-46f8-9b98-7247dd0b2601-console-serving-cert\") pod \"console-5b76d796f4-ht42f\" (UID: \"590e8c8f-f969-46f8-9b98-7247dd0b2601\") " pod="openshift-console/console-5b76d796f4-ht42f" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.336132 4835 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/590e8c8f-f969-46f8-9b98-7247dd0b2601-oauth-serving-cert\") pod \"console-5b76d796f4-ht42f\" (UID: \"590e8c8f-f969-46f8-9b98-7247dd0b2601\") " pod="openshift-console/console-5b76d796f4-ht42f" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.336149 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9b6aeb7d-73be-42b9-93ef-72be8a26f4f7-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-wtgrg\" (UID: \"9b6aeb7d-73be-42b9-93ef-72be8a26f4f7\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-wtgrg" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.336186 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/590e8c8f-f969-46f8-9b98-7247dd0b2601-console-config\") pod \"console-5b76d796f4-ht42f\" (UID: \"590e8c8f-f969-46f8-9b98-7247dd0b2601\") " pod="openshift-console/console-5b76d796f4-ht42f" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.336219 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bgkk\" (UniqueName: \"kubernetes.io/projected/9b6aeb7d-73be-42b9-93ef-72be8a26f4f7-kube-api-access-2bgkk\") pod \"nmstate-console-plugin-86f58fcf4-wtgrg\" (UID: \"9b6aeb7d-73be-42b9-93ef-72be8a26f4f7\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-wtgrg" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.336251 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/590e8c8f-f969-46f8-9b98-7247dd0b2601-trusted-ca-bundle\") pod \"console-5b76d796f4-ht42f\" (UID: \"590e8c8f-f969-46f8-9b98-7247dd0b2601\") " 
pod="openshift-console/console-5b76d796f4-ht42f" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.336274 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/590e8c8f-f969-46f8-9b98-7247dd0b2601-service-ca\") pod \"console-5b76d796f4-ht42f\" (UID: \"590e8c8f-f969-46f8-9b98-7247dd0b2601\") " pod="openshift-console/console-5b76d796f4-ht42f" Mar 19 09:41:00 crc kubenswrapper[4835]: E0319 09:41:00.336397 4835 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Mar 19 09:41:00 crc kubenswrapper[4835]: E0319 09:41:00.336437 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b6aeb7d-73be-42b9-93ef-72be8a26f4f7-plugin-serving-cert podName:9b6aeb7d-73be-42b9-93ef-72be8a26f4f7 nodeName:}" failed. No retries permitted until 2026-03-19 09:41:00.836421821 +0000 UTC m=+1115.685020408 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/9b6aeb7d-73be-42b9-93ef-72be8a26f4f7-plugin-serving-cert") pod "nmstate-console-plugin-86f58fcf4-wtgrg" (UID: "9b6aeb7d-73be-42b9-93ef-72be8a26f4f7") : secret "plugin-serving-cert" not found Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.337282 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9b6aeb7d-73be-42b9-93ef-72be8a26f4f7-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-wtgrg\" (UID: \"9b6aeb7d-73be-42b9-93ef-72be8a26f4f7\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-wtgrg" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.357855 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bgkk\" (UniqueName: \"kubernetes.io/projected/9b6aeb7d-73be-42b9-93ef-72be8a26f4f7-kube-api-access-2bgkk\") pod \"nmstate-console-plugin-86f58fcf4-wtgrg\" (UID: \"9b6aeb7d-73be-42b9-93ef-72be8a26f4f7\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-wtgrg" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.438936 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/590e8c8f-f969-46f8-9b98-7247dd0b2601-console-config\") pod \"console-5b76d796f4-ht42f\" (UID: \"590e8c8f-f969-46f8-9b98-7247dd0b2601\") " pod="openshift-console/console-5b76d796f4-ht42f" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.439313 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/590e8c8f-f969-46f8-9b98-7247dd0b2601-trusted-ca-bundle\") pod \"console-5b76d796f4-ht42f\" (UID: \"590e8c8f-f969-46f8-9b98-7247dd0b2601\") " pod="openshift-console/console-5b76d796f4-ht42f" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.439349 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/590e8c8f-f969-46f8-9b98-7247dd0b2601-service-ca\") pod \"console-5b76d796f4-ht42f\" (UID: \"590e8c8f-f969-46f8-9b98-7247dd0b2601\") " pod="openshift-console/console-5b76d796f4-ht42f" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.439371 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/590e8c8f-f969-46f8-9b98-7247dd0b2601-console-oauth-config\") pod \"console-5b76d796f4-ht42f\" (UID: \"590e8c8f-f969-46f8-9b98-7247dd0b2601\") " pod="openshift-console/console-5b76d796f4-ht42f" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.439451 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw5f7\" (UniqueName: \"kubernetes.io/projected/590e8c8f-f969-46f8-9b98-7247dd0b2601-kube-api-access-cw5f7\") pod \"console-5b76d796f4-ht42f\" (UID: \"590e8c8f-f969-46f8-9b98-7247dd0b2601\") " pod="openshift-console/console-5b76d796f4-ht42f" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.439475 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/590e8c8f-f969-46f8-9b98-7247dd0b2601-console-serving-cert\") pod \"console-5b76d796f4-ht42f\" (UID: \"590e8c8f-f969-46f8-9b98-7247dd0b2601\") " pod="openshift-console/console-5b76d796f4-ht42f" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.439524 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/590e8c8f-f969-46f8-9b98-7247dd0b2601-oauth-serving-cert\") pod \"console-5b76d796f4-ht42f\" (UID: \"590e8c8f-f969-46f8-9b98-7247dd0b2601\") " pod="openshift-console/console-5b76d796f4-ht42f" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.441785 4835 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/590e8c8f-f969-46f8-9b98-7247dd0b2601-oauth-serving-cert\") pod \"console-5b76d796f4-ht42f\" (UID: \"590e8c8f-f969-46f8-9b98-7247dd0b2601\") " pod="openshift-console/console-5b76d796f4-ht42f" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.442004 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/590e8c8f-f969-46f8-9b98-7247dd0b2601-console-config\") pod \"console-5b76d796f4-ht42f\" (UID: \"590e8c8f-f969-46f8-9b98-7247dd0b2601\") " pod="openshift-console/console-5b76d796f4-ht42f" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.443665 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/590e8c8f-f969-46f8-9b98-7247dd0b2601-service-ca\") pod \"console-5b76d796f4-ht42f\" (UID: \"590e8c8f-f969-46f8-9b98-7247dd0b2601\") " pod="openshift-console/console-5b76d796f4-ht42f" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.443844 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/590e8c8f-f969-46f8-9b98-7247dd0b2601-trusted-ca-bundle\") pod \"console-5b76d796f4-ht42f\" (UID: \"590e8c8f-f969-46f8-9b98-7247dd0b2601\") " pod="openshift-console/console-5b76d796f4-ht42f" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.446198 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/590e8c8f-f969-46f8-9b98-7247dd0b2601-console-serving-cert\") pod \"console-5b76d796f4-ht42f\" (UID: \"590e8c8f-f969-46f8-9b98-7247dd0b2601\") " pod="openshift-console/console-5b76d796f4-ht42f" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.455449 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/590e8c8f-f969-46f8-9b98-7247dd0b2601-console-oauth-config\") pod \"console-5b76d796f4-ht42f\" (UID: \"590e8c8f-f969-46f8-9b98-7247dd0b2601\") " pod="openshift-console/console-5b76d796f4-ht42f" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.459857 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw5f7\" (UniqueName: \"kubernetes.io/projected/590e8c8f-f969-46f8-9b98-7247dd0b2601-kube-api-access-cw5f7\") pod \"console-5b76d796f4-ht42f\" (UID: \"590e8c8f-f969-46f8-9b98-7247dd0b2601\") " pod="openshift-console/console-5b76d796f4-ht42f" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.619559 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b76d796f4-ht42f" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.642466 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a5ae9a03-b125-4072-9ec8-fddd19b7002e-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-zpbgp\" (UID: \"a5ae9a03-b125-4072-9ec8-fddd19b7002e\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-zpbgp" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.645624 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a5ae9a03-b125-4072-9ec8-fddd19b7002e-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-zpbgp\" (UID: \"a5ae9a03-b125-4072-9ec8-fddd19b7002e\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-zpbgp" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.758566 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-8gq4d"] Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.787309 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-zpbgp" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.837287 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-69n76" event={"ID":"86182cb8-0ad6-40f7-ac1c-7ccf8cf5bc13","Type":"ContainerStarted","Data":"cc3fd76debf217e10157d38e8c84002b9d4405da20490bf78ea68b18bcea8d0e"} Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.840441 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-8gq4d" event={"ID":"514e58bd-bd21-45bf-8543-90439bf1c630","Type":"ContainerStarted","Data":"f21294f3b4e724d7beb7e34387d86f579ea92ac576edd4b2d097b602dca15d63"} Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.846195 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/9b6aeb7d-73be-42b9-93ef-72be8a26f4f7-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-wtgrg\" (UID: \"9b6aeb7d-73be-42b9-93ef-72be8a26f4f7\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-wtgrg" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.850314 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/9b6aeb7d-73be-42b9-93ef-72be8a26f4f7-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-wtgrg\" (UID: \"9b6aeb7d-73be-42b9-93ef-72be8a26f4f7\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-wtgrg" Mar 19 09:41:00 crc kubenswrapper[4835]: I0319 09:41:00.966469 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-wtgrg" Mar 19 09:41:01 crc kubenswrapper[4835]: I0319 09:41:01.091706 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b76d796f4-ht42f"] Mar 19 09:41:01 crc kubenswrapper[4835]: W0319 09:41:01.104617 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod590e8c8f_f969_46f8_9b98_7247dd0b2601.slice/crio-59dec698dc06d39d8847df9c271311c59797a54403b8b603cce64aeee037b67a WatchSource:0}: Error finding container 59dec698dc06d39d8847df9c271311c59797a54403b8b603cce64aeee037b67a: Status 404 returned error can't find the container with id 59dec698dc06d39d8847df9c271311c59797a54403b8b603cce64aeee037b67a Mar 19 09:41:01 crc kubenswrapper[4835]: I0319 09:41:01.181317 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-zpbgp"] Mar 19 09:41:01 crc kubenswrapper[4835]: W0319 09:41:01.187220 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5ae9a03_b125_4072_9ec8_fddd19b7002e.slice/crio-3f6f1f7410d22195151cc56bb9d61fe37d7c9065aead796288027e992b6bd5b5 WatchSource:0}: Error finding container 3f6f1f7410d22195151cc56bb9d61fe37d7c9065aead796288027e992b6bd5b5: Status 404 returned error can't find the container with id 3f6f1f7410d22195151cc56bb9d61fe37d7c9065aead796288027e992b6bd5b5 Mar 19 09:41:01 crc kubenswrapper[4835]: I0319 09:41:01.391445 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-wtgrg"] Mar 19 09:41:01 crc kubenswrapper[4835]: W0319 09:41:01.404883 4835 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b6aeb7d_73be_42b9_93ef_72be8a26f4f7.slice/crio-e0e27d0686f0d6acd5e8a0f1296cc3c90df8aca26d19ece4bb732e2815d1395f WatchSource:0}: Error finding container e0e27d0686f0d6acd5e8a0f1296cc3c90df8aca26d19ece4bb732e2815d1395f: Status 404 returned error can't find the container with id e0e27d0686f0d6acd5e8a0f1296cc3c90df8aca26d19ece4bb732e2815d1395f Mar 19 09:41:01 crc kubenswrapper[4835]: I0319 09:41:01.848456 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b76d796f4-ht42f" event={"ID":"590e8c8f-f969-46f8-9b98-7247dd0b2601","Type":"ContainerStarted","Data":"a2b6e706ebbd2b4ed42bbc6ea9b0c06261bd4a4e924f30a3f3193061b774e635"} Mar 19 09:41:01 crc kubenswrapper[4835]: I0319 09:41:01.848501 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b76d796f4-ht42f" event={"ID":"590e8c8f-f969-46f8-9b98-7247dd0b2601","Type":"ContainerStarted","Data":"59dec698dc06d39d8847df9c271311c59797a54403b8b603cce64aeee037b67a"} Mar 19 09:41:01 crc kubenswrapper[4835]: I0319 09:41:01.851442 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-wtgrg" event={"ID":"9b6aeb7d-73be-42b9-93ef-72be8a26f4f7","Type":"ContainerStarted","Data":"e0e27d0686f0d6acd5e8a0f1296cc3c90df8aca26d19ece4bb732e2815d1395f"} Mar 19 09:41:01 crc kubenswrapper[4835]: I0319 09:41:01.852630 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-zpbgp" event={"ID":"a5ae9a03-b125-4072-9ec8-fddd19b7002e","Type":"ContainerStarted","Data":"3f6f1f7410d22195151cc56bb9d61fe37d7c9065aead796288027e992b6bd5b5"} Mar 19 09:41:01 crc kubenswrapper[4835]: I0319 09:41:01.870787 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5b76d796f4-ht42f" podStartSLOduration=1.870768465 podStartE2EDuration="1.870768465s" podCreationTimestamp="2026-03-19 
09:41:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:41:01.869995723 +0000 UTC m=+1116.718594330" watchObservedRunningTime="2026-03-19 09:41:01.870768465 +0000 UTC m=+1116.719367052" Mar 19 09:41:03 crc kubenswrapper[4835]: I0319 09:41:03.868905 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-69n76" event={"ID":"86182cb8-0ad6-40f7-ac1c-7ccf8cf5bc13","Type":"ContainerStarted","Data":"3c141bf9c11ac75d078cf4f2100eb54b3e16c3580f223efcd68b5be50de85464"} Mar 19 09:41:03 crc kubenswrapper[4835]: I0319 09:41:03.869390 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-69n76" Mar 19 09:41:03 crc kubenswrapper[4835]: I0319 09:41:03.872175 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-8gq4d" event={"ID":"514e58bd-bd21-45bf-8543-90439bf1c630","Type":"ContainerStarted","Data":"e8c00ede1f67441523137a54fd29a4ea84b131ffd3012939f5c7ff02accb07f3"} Mar 19 09:41:03 crc kubenswrapper[4835]: I0319 09:41:03.874234 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-zpbgp" event={"ID":"a5ae9a03-b125-4072-9ec8-fddd19b7002e","Type":"ContainerStarted","Data":"45c14915363e7401db5e0911bcde582a3453755e8661ec1bf58884c1107129a5"} Mar 19 09:41:03 crc kubenswrapper[4835]: I0319 09:41:03.874389 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-zpbgp" Mar 19 09:41:03 crc kubenswrapper[4835]: I0319 09:41:03.896414 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-69n76" podStartSLOduration=1.948858832 podStartE2EDuration="4.896394499s" podCreationTimestamp="2026-03-19 09:40:59 +0000 UTC" firstStartedPulling="2026-03-19 09:41:00.295687337 +0000 UTC 
m=+1115.144285924" lastFinishedPulling="2026-03-19 09:41:03.243222994 +0000 UTC m=+1118.091821591" observedRunningTime="2026-03-19 09:41:03.893893161 +0000 UTC m=+1118.742491768" watchObservedRunningTime="2026-03-19 09:41:03.896394499 +0000 UTC m=+1118.744993086" Mar 19 09:41:03 crc kubenswrapper[4835]: I0319 09:41:03.910449 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-zpbgp" podStartSLOduration=2.880857834 podStartE2EDuration="4.910434715s" podCreationTimestamp="2026-03-19 09:40:59 +0000 UTC" firstStartedPulling="2026-03-19 09:41:01.189779814 +0000 UTC m=+1116.038378401" lastFinishedPulling="2026-03-19 09:41:03.219356695 +0000 UTC m=+1118.067955282" observedRunningTime="2026-03-19 09:41:03.909652034 +0000 UTC m=+1118.758250621" watchObservedRunningTime="2026-03-19 09:41:03.910434715 +0000 UTC m=+1118.759033302" Mar 19 09:41:04 crc kubenswrapper[4835]: I0319 09:41:04.881266 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-wtgrg" event={"ID":"9b6aeb7d-73be-42b9-93ef-72be8a26f4f7","Type":"ContainerStarted","Data":"597efd84e45aefcde3b0f08e82102cc8ab5b50f3e76cda0765737a7b5a0dd9f8"} Mar 19 09:41:04 crc kubenswrapper[4835]: I0319 09:41:04.895317 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-wtgrg" podStartSLOduration=2.029533954 podStartE2EDuration="4.895298467s" podCreationTimestamp="2026-03-19 09:41:00 +0000 UTC" firstStartedPulling="2026-03-19 09:41:01.406260733 +0000 UTC m=+1116.254859320" lastFinishedPulling="2026-03-19 09:41:04.272025246 +0000 UTC m=+1119.120623833" observedRunningTime="2026-03-19 09:41:04.894842645 +0000 UTC m=+1119.743441252" watchObservedRunningTime="2026-03-19 09:41:04.895298467 +0000 UTC m=+1119.743897054" Mar 19 09:41:06 crc kubenswrapper[4835]: I0319 09:41:06.900686 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-metrics-9b8c8685d-8gq4d" event={"ID":"514e58bd-bd21-45bf-8543-90439bf1c630","Type":"ContainerStarted","Data":"77eb3e4dc28dd9045a2c808b55db016bb14e4303c0056ea841c099aaebd15a0e"} Mar 19 09:41:06 crc kubenswrapper[4835]: I0319 09:41:06.920467 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-8gq4d" podStartSLOduration=2.705812598 podStartE2EDuration="7.920446148s" podCreationTimestamp="2026-03-19 09:40:59 +0000 UTC" firstStartedPulling="2026-03-19 09:41:00.775175871 +0000 UTC m=+1115.623774458" lastFinishedPulling="2026-03-19 09:41:05.989809421 +0000 UTC m=+1120.838408008" observedRunningTime="2026-03-19 09:41:06.916805541 +0000 UTC m=+1121.765404148" watchObservedRunningTime="2026-03-19 09:41:06.920446148 +0000 UTC m=+1121.769044735" Mar 19 09:41:10 crc kubenswrapper[4835]: I0319 09:41:10.228045 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-69n76" Mar 19 09:41:10 crc kubenswrapper[4835]: I0319 09:41:10.620284 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5b76d796f4-ht42f" Mar 19 09:41:10 crc kubenswrapper[4835]: I0319 09:41:10.620703 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5b76d796f4-ht42f" Mar 19 09:41:10 crc kubenswrapper[4835]: I0319 09:41:10.625025 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5b76d796f4-ht42f" Mar 19 09:41:10 crc kubenswrapper[4835]: I0319 09:41:10.949162 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5b76d796f4-ht42f" Mar 19 09:41:11 crc kubenswrapper[4835]: I0319 09:41:11.042022 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7d449f8d68-6snn9"] Mar 19 09:41:20 crc kubenswrapper[4835]: I0319 09:41:20.793854 
4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-zpbgp" Mar 19 09:41:36 crc kubenswrapper[4835]: I0319 09:41:36.090898 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-7d449f8d68-6snn9" podUID="c12895ad-7a70-45e4-90ac-100694d94ca5" containerName="console" containerID="cri-o://5b5d79c0b8fa080fcd1af690f6da33fa9dfc0b14e3a9dfe7b4777bf9a321c43c" gracePeriod=15 Mar 19 09:41:36 crc kubenswrapper[4835]: I0319 09:41:36.515985 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d449f8d68-6snn9_c12895ad-7a70-45e4-90ac-100694d94ca5/console/0.log" Mar 19 09:41:36 crc kubenswrapper[4835]: I0319 09:41:36.516328 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d449f8d68-6snn9" Mar 19 09:41:36 crc kubenswrapper[4835]: I0319 09:41:36.566125 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c12895ad-7a70-45e4-90ac-100694d94ca5-oauth-serving-cert\") pod \"c12895ad-7a70-45e4-90ac-100694d94ca5\" (UID: \"c12895ad-7a70-45e4-90ac-100694d94ca5\") " Mar 19 09:41:36 crc kubenswrapper[4835]: I0319 09:41:36.566224 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c12895ad-7a70-45e4-90ac-100694d94ca5-console-serving-cert\") pod \"c12895ad-7a70-45e4-90ac-100694d94ca5\" (UID: \"c12895ad-7a70-45e4-90ac-100694d94ca5\") " Mar 19 09:41:36 crc kubenswrapper[4835]: I0319 09:41:36.566267 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c12895ad-7a70-45e4-90ac-100694d94ca5-console-config\") pod \"c12895ad-7a70-45e4-90ac-100694d94ca5\" (UID: \"c12895ad-7a70-45e4-90ac-100694d94ca5\") " Mar 19 
09:41:36 crc kubenswrapper[4835]: I0319 09:41:36.566493 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c12895ad-7a70-45e4-90ac-100694d94ca5-console-oauth-config\") pod \"c12895ad-7a70-45e4-90ac-100694d94ca5\" (UID: \"c12895ad-7a70-45e4-90ac-100694d94ca5\") " Mar 19 09:41:36 crc kubenswrapper[4835]: I0319 09:41:36.566513 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqkkx\" (UniqueName: \"kubernetes.io/projected/c12895ad-7a70-45e4-90ac-100694d94ca5-kube-api-access-sqkkx\") pod \"c12895ad-7a70-45e4-90ac-100694d94ca5\" (UID: \"c12895ad-7a70-45e4-90ac-100694d94ca5\") " Mar 19 09:41:36 crc kubenswrapper[4835]: I0319 09:41:36.566575 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c12895ad-7a70-45e4-90ac-100694d94ca5-trusted-ca-bundle\") pod \"c12895ad-7a70-45e4-90ac-100694d94ca5\" (UID: \"c12895ad-7a70-45e4-90ac-100694d94ca5\") " Mar 19 09:41:36 crc kubenswrapper[4835]: I0319 09:41:36.566606 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c12895ad-7a70-45e4-90ac-100694d94ca5-service-ca\") pod \"c12895ad-7a70-45e4-90ac-100694d94ca5\" (UID: \"c12895ad-7a70-45e4-90ac-100694d94ca5\") " Mar 19 09:41:36 crc kubenswrapper[4835]: I0319 09:41:36.567409 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c12895ad-7a70-45e4-90ac-100694d94ca5-console-config" (OuterVolumeSpecName: "console-config") pod "c12895ad-7a70-45e4-90ac-100694d94ca5" (UID: "c12895ad-7a70-45e4-90ac-100694d94ca5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:41:36 crc kubenswrapper[4835]: I0319 09:41:36.567442 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c12895ad-7a70-45e4-90ac-100694d94ca5-service-ca" (OuterVolumeSpecName: "service-ca") pod "c12895ad-7a70-45e4-90ac-100694d94ca5" (UID: "c12895ad-7a70-45e4-90ac-100694d94ca5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:41:36 crc kubenswrapper[4835]: I0319 09:41:36.567546 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c12895ad-7a70-45e4-90ac-100694d94ca5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c12895ad-7a70-45e4-90ac-100694d94ca5" (UID: "c12895ad-7a70-45e4-90ac-100694d94ca5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:41:36 crc kubenswrapper[4835]: I0319 09:41:36.567540 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c12895ad-7a70-45e4-90ac-100694d94ca5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c12895ad-7a70-45e4-90ac-100694d94ca5" (UID: "c12895ad-7a70-45e4-90ac-100694d94ca5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:41:36 crc kubenswrapper[4835]: I0319 09:41:36.572110 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c12895ad-7a70-45e4-90ac-100694d94ca5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c12895ad-7a70-45e4-90ac-100694d94ca5" (UID: "c12895ad-7a70-45e4-90ac-100694d94ca5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:41:36 crc kubenswrapper[4835]: I0319 09:41:36.572168 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c12895ad-7a70-45e4-90ac-100694d94ca5-kube-api-access-sqkkx" (OuterVolumeSpecName: "kube-api-access-sqkkx") pod "c12895ad-7a70-45e4-90ac-100694d94ca5" (UID: "c12895ad-7a70-45e4-90ac-100694d94ca5"). InnerVolumeSpecName "kube-api-access-sqkkx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:41:36 crc kubenswrapper[4835]: I0319 09:41:36.572282 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c12895ad-7a70-45e4-90ac-100694d94ca5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c12895ad-7a70-45e4-90ac-100694d94ca5" (UID: "c12895ad-7a70-45e4-90ac-100694d94ca5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:41:36 crc kubenswrapper[4835]: I0319 09:41:36.668755 4835 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c12895ad-7a70-45e4-90ac-100694d94ca5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:41:36 crc kubenswrapper[4835]: I0319 09:41:36.669112 4835 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c12895ad-7a70-45e4-90ac-100694d94ca5-console-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:41:36 crc kubenswrapper[4835]: I0319 09:41:36.669124 4835 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c12895ad-7a70-45e4-90ac-100694d94ca5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:41:36 crc kubenswrapper[4835]: I0319 09:41:36.669136 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqkkx\" (UniqueName: 
\"kubernetes.io/projected/c12895ad-7a70-45e4-90ac-100694d94ca5-kube-api-access-sqkkx\") on node \"crc\" DevicePath \"\"" Mar 19 09:41:36 crc kubenswrapper[4835]: I0319 09:41:36.669149 4835 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c12895ad-7a70-45e4-90ac-100694d94ca5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:41:36 crc kubenswrapper[4835]: I0319 09:41:36.669162 4835 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c12895ad-7a70-45e4-90ac-100694d94ca5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 09:41:36 crc kubenswrapper[4835]: I0319 09:41:36.669172 4835 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c12895ad-7a70-45e4-90ac-100694d94ca5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:41:37 crc kubenswrapper[4835]: I0319 09:41:37.140118 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d449f8d68-6snn9_c12895ad-7a70-45e4-90ac-100694d94ca5/console/0.log" Mar 19 09:41:37 crc kubenswrapper[4835]: I0319 09:41:37.140196 4835 generic.go:334] "Generic (PLEG): container finished" podID="c12895ad-7a70-45e4-90ac-100694d94ca5" containerID="5b5d79c0b8fa080fcd1af690f6da33fa9dfc0b14e3a9dfe7b4777bf9a321c43c" exitCode=2 Mar 19 09:41:37 crc kubenswrapper[4835]: I0319 09:41:37.140232 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d449f8d68-6snn9" event={"ID":"c12895ad-7a70-45e4-90ac-100694d94ca5","Type":"ContainerDied","Data":"5b5d79c0b8fa080fcd1af690f6da33fa9dfc0b14e3a9dfe7b4777bf9a321c43c"} Mar 19 09:41:37 crc kubenswrapper[4835]: I0319 09:41:37.140260 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d449f8d68-6snn9" 
event={"ID":"c12895ad-7a70-45e4-90ac-100694d94ca5","Type":"ContainerDied","Data":"bfd9ffda07031c82e4818102ba2c3fefa02e1621f3309346fc6825640d3e4521"} Mar 19 09:41:37 crc kubenswrapper[4835]: I0319 09:41:37.140269 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d449f8d68-6snn9" Mar 19 09:41:37 crc kubenswrapper[4835]: I0319 09:41:37.140279 4835 scope.go:117] "RemoveContainer" containerID="5b5d79c0b8fa080fcd1af690f6da33fa9dfc0b14e3a9dfe7b4777bf9a321c43c" Mar 19 09:41:37 crc kubenswrapper[4835]: I0319 09:41:37.164321 4835 scope.go:117] "RemoveContainer" containerID="5b5d79c0b8fa080fcd1af690f6da33fa9dfc0b14e3a9dfe7b4777bf9a321c43c" Mar 19 09:41:37 crc kubenswrapper[4835]: E0319 09:41:37.164579 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b5d79c0b8fa080fcd1af690f6da33fa9dfc0b14e3a9dfe7b4777bf9a321c43c\": container with ID starting with 5b5d79c0b8fa080fcd1af690f6da33fa9dfc0b14e3a9dfe7b4777bf9a321c43c not found: ID does not exist" containerID="5b5d79c0b8fa080fcd1af690f6da33fa9dfc0b14e3a9dfe7b4777bf9a321c43c" Mar 19 09:41:37 crc kubenswrapper[4835]: I0319 09:41:37.164613 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b5d79c0b8fa080fcd1af690f6da33fa9dfc0b14e3a9dfe7b4777bf9a321c43c"} err="failed to get container status \"5b5d79c0b8fa080fcd1af690f6da33fa9dfc0b14e3a9dfe7b4777bf9a321c43c\": rpc error: code = NotFound desc = could not find container \"5b5d79c0b8fa080fcd1af690f6da33fa9dfc0b14e3a9dfe7b4777bf9a321c43c\": container with ID starting with 5b5d79c0b8fa080fcd1af690f6da33fa9dfc0b14e3a9dfe7b4777bf9a321c43c not found: ID does not exist" Mar 19 09:41:37 crc kubenswrapper[4835]: I0319 09:41:37.184424 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7d449f8d68-6snn9"] Mar 19 09:41:37 crc kubenswrapper[4835]: I0319 09:41:37.205508 4835 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7d449f8d68-6snn9"] Mar 19 09:41:38 crc kubenswrapper[4835]: I0319 09:41:38.415346 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c12895ad-7a70-45e4-90ac-100694d94ca5" path="/var/lib/kubelet/pods/c12895ad-7a70-45e4-90ac-100694d94ca5/volumes" Mar 19 09:41:38 crc kubenswrapper[4835]: I0319 09:41:38.463490 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r"] Mar 19 09:41:38 crc kubenswrapper[4835]: E0319 09:41:38.463991 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c12895ad-7a70-45e4-90ac-100694d94ca5" containerName="console" Mar 19 09:41:38 crc kubenswrapper[4835]: I0319 09:41:38.464021 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c12895ad-7a70-45e4-90ac-100694d94ca5" containerName="console" Mar 19 09:41:38 crc kubenswrapper[4835]: I0319 09:41:38.464266 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="c12895ad-7a70-45e4-90ac-100694d94ca5" containerName="console" Mar 19 09:41:38 crc kubenswrapper[4835]: I0319 09:41:38.466004 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r" Mar 19 09:41:38 crc kubenswrapper[4835]: I0319 09:41:38.468048 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 19 09:41:38 crc kubenswrapper[4835]: I0319 09:41:38.492110 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r"] Mar 19 09:41:38 crc kubenswrapper[4835]: I0319 09:41:38.495562 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6sqf\" (UniqueName: \"kubernetes.io/projected/f580e4fd-993c-4b3a-a95f-7b1e66ca13ac-kube-api-access-b6sqf\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r\" (UID: \"f580e4fd-993c-4b3a-a95f-7b1e66ca13ac\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r" Mar 19 09:41:38 crc kubenswrapper[4835]: I0319 09:41:38.495678 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f580e4fd-993c-4b3a-a95f-7b1e66ca13ac-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r\" (UID: \"f580e4fd-993c-4b3a-a95f-7b1e66ca13ac\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r" Mar 19 09:41:38 crc kubenswrapper[4835]: I0319 09:41:38.496914 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f580e4fd-993c-4b3a-a95f-7b1e66ca13ac-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r\" (UID: \"f580e4fd-993c-4b3a-a95f-7b1e66ca13ac\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r" Mar 19 09:41:38 crc kubenswrapper[4835]: 
I0319 09:41:38.598323 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f580e4fd-993c-4b3a-a95f-7b1e66ca13ac-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r\" (UID: \"f580e4fd-993c-4b3a-a95f-7b1e66ca13ac\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r" Mar 19 09:41:38 crc kubenswrapper[4835]: I0319 09:41:38.598446 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6sqf\" (UniqueName: \"kubernetes.io/projected/f580e4fd-993c-4b3a-a95f-7b1e66ca13ac-kube-api-access-b6sqf\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r\" (UID: \"f580e4fd-993c-4b3a-a95f-7b1e66ca13ac\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r" Mar 19 09:41:38 crc kubenswrapper[4835]: I0319 09:41:38.598492 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f580e4fd-993c-4b3a-a95f-7b1e66ca13ac-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r\" (UID: \"f580e4fd-993c-4b3a-a95f-7b1e66ca13ac\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r" Mar 19 09:41:38 crc kubenswrapper[4835]: I0319 09:41:38.599105 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f580e4fd-993c-4b3a-a95f-7b1e66ca13ac-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r\" (UID: \"f580e4fd-993c-4b3a-a95f-7b1e66ca13ac\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r" Mar 19 09:41:38 crc kubenswrapper[4835]: I0319 09:41:38.599313 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/f580e4fd-993c-4b3a-a95f-7b1e66ca13ac-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r\" (UID: \"f580e4fd-993c-4b3a-a95f-7b1e66ca13ac\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r" Mar 19 09:41:38 crc kubenswrapper[4835]: I0319 09:41:38.620512 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6sqf\" (UniqueName: \"kubernetes.io/projected/f580e4fd-993c-4b3a-a95f-7b1e66ca13ac-kube-api-access-b6sqf\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r\" (UID: \"f580e4fd-993c-4b3a-a95f-7b1e66ca13ac\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r" Mar 19 09:41:38 crc kubenswrapper[4835]: I0319 09:41:38.785185 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r" Mar 19 09:41:39 crc kubenswrapper[4835]: I0319 09:41:39.247710 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r"] Mar 19 09:41:40 crc kubenswrapper[4835]: I0319 09:41:40.165510 4835 generic.go:334] "Generic (PLEG): container finished" podID="f580e4fd-993c-4b3a-a95f-7b1e66ca13ac" containerID="a81c1627fb03e00ca5b0a6e51eb217754bf562fcb52ee4f6d60a23faff6935ca" exitCode=0 Mar 19 09:41:40 crc kubenswrapper[4835]: I0319 09:41:40.165572 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r" event={"ID":"f580e4fd-993c-4b3a-a95f-7b1e66ca13ac","Type":"ContainerDied","Data":"a81c1627fb03e00ca5b0a6e51eb217754bf562fcb52ee4f6d60a23faff6935ca"} Mar 19 09:41:40 crc kubenswrapper[4835]: I0319 09:41:40.165874 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r" event={"ID":"f580e4fd-993c-4b3a-a95f-7b1e66ca13ac","Type":"ContainerStarted","Data":"6ea53262fc19fb327ce6f4d44e875c8781eee860cf5063f9f41e01d1ae1cb9f2"} Mar 19 09:41:40 crc kubenswrapper[4835]: I0319 09:41:40.167550 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 09:41:42 crc kubenswrapper[4835]: I0319 09:41:42.180382 4835 generic.go:334] "Generic (PLEG): container finished" podID="f580e4fd-993c-4b3a-a95f-7b1e66ca13ac" containerID="2efc14232b65f0bda365b0882d6abc34f74f6307825c82ff59d177947409dd04" exitCode=0 Mar 19 09:41:42 crc kubenswrapper[4835]: I0319 09:41:42.181030 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r" event={"ID":"f580e4fd-993c-4b3a-a95f-7b1e66ca13ac","Type":"ContainerDied","Data":"2efc14232b65f0bda365b0882d6abc34f74f6307825c82ff59d177947409dd04"} Mar 19 09:41:43 crc kubenswrapper[4835]: I0319 09:41:43.192560 4835 generic.go:334] "Generic (PLEG): container finished" podID="f580e4fd-993c-4b3a-a95f-7b1e66ca13ac" containerID="764c38bcf1ed316a1cf089f77964cb200b91a788e39283236631f10e7b17bd44" exitCode=0 Mar 19 09:41:43 crc kubenswrapper[4835]: I0319 09:41:43.192802 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r" event={"ID":"f580e4fd-993c-4b3a-a95f-7b1e66ca13ac","Type":"ContainerDied","Data":"764c38bcf1ed316a1cf089f77964cb200b91a788e39283236631f10e7b17bd44"} Mar 19 09:41:44 crc kubenswrapper[4835]: I0319 09:41:44.511995 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r" Mar 19 09:41:44 crc kubenswrapper[4835]: I0319 09:41:44.607189 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f580e4fd-993c-4b3a-a95f-7b1e66ca13ac-util\") pod \"f580e4fd-993c-4b3a-a95f-7b1e66ca13ac\" (UID: \"f580e4fd-993c-4b3a-a95f-7b1e66ca13ac\") " Mar 19 09:41:44 crc kubenswrapper[4835]: I0319 09:41:44.607267 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6sqf\" (UniqueName: \"kubernetes.io/projected/f580e4fd-993c-4b3a-a95f-7b1e66ca13ac-kube-api-access-b6sqf\") pod \"f580e4fd-993c-4b3a-a95f-7b1e66ca13ac\" (UID: \"f580e4fd-993c-4b3a-a95f-7b1e66ca13ac\") " Mar 19 09:41:44 crc kubenswrapper[4835]: I0319 09:41:44.607375 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f580e4fd-993c-4b3a-a95f-7b1e66ca13ac-bundle\") pod \"f580e4fd-993c-4b3a-a95f-7b1e66ca13ac\" (UID: \"f580e4fd-993c-4b3a-a95f-7b1e66ca13ac\") " Mar 19 09:41:44 crc kubenswrapper[4835]: I0319 09:41:44.608203 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f580e4fd-993c-4b3a-a95f-7b1e66ca13ac-bundle" (OuterVolumeSpecName: "bundle") pod "f580e4fd-993c-4b3a-a95f-7b1e66ca13ac" (UID: "f580e4fd-993c-4b3a-a95f-7b1e66ca13ac"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:41:44 crc kubenswrapper[4835]: I0319 09:41:44.613363 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f580e4fd-993c-4b3a-a95f-7b1e66ca13ac-kube-api-access-b6sqf" (OuterVolumeSpecName: "kube-api-access-b6sqf") pod "f580e4fd-993c-4b3a-a95f-7b1e66ca13ac" (UID: "f580e4fd-993c-4b3a-a95f-7b1e66ca13ac"). InnerVolumeSpecName "kube-api-access-b6sqf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:41:44 crc kubenswrapper[4835]: I0319 09:41:44.621019 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f580e4fd-993c-4b3a-a95f-7b1e66ca13ac-util" (OuterVolumeSpecName: "util") pod "f580e4fd-993c-4b3a-a95f-7b1e66ca13ac" (UID: "f580e4fd-993c-4b3a-a95f-7b1e66ca13ac"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:41:44 crc kubenswrapper[4835]: I0319 09:41:44.709248 4835 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f580e4fd-993c-4b3a-a95f-7b1e66ca13ac-util\") on node \"crc\" DevicePath \"\"" Mar 19 09:41:44 crc kubenswrapper[4835]: I0319 09:41:44.709293 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6sqf\" (UniqueName: \"kubernetes.io/projected/f580e4fd-993c-4b3a-a95f-7b1e66ca13ac-kube-api-access-b6sqf\") on node \"crc\" DevicePath \"\"" Mar 19 09:41:44 crc kubenswrapper[4835]: I0319 09:41:44.709313 4835 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f580e4fd-993c-4b3a-a95f-7b1e66ca13ac-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:41:45 crc kubenswrapper[4835]: I0319 09:41:45.211708 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r" event={"ID":"f580e4fd-993c-4b3a-a95f-7b1e66ca13ac","Type":"ContainerDied","Data":"6ea53262fc19fb327ce6f4d44e875c8781eee860cf5063f9f41e01d1ae1cb9f2"} Mar 19 09:41:45 crc kubenswrapper[4835]: I0319 09:41:45.211766 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ea53262fc19fb327ce6f4d44e875c8781eee860cf5063f9f41e01d1ae1cb9f2" Mar 19 09:41:45 crc kubenswrapper[4835]: I0319 09:41:45.211889 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r" Mar 19 09:41:52 crc kubenswrapper[4835]: I0319 09:41:52.315715 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-85c8677745-k75vc"] Mar 19 09:41:52 crc kubenswrapper[4835]: E0319 09:41:52.316423 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f580e4fd-993c-4b3a-a95f-7b1e66ca13ac" containerName="extract" Mar 19 09:41:52 crc kubenswrapper[4835]: I0319 09:41:52.316436 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f580e4fd-993c-4b3a-a95f-7b1e66ca13ac" containerName="extract" Mar 19 09:41:52 crc kubenswrapper[4835]: E0319 09:41:52.316449 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f580e4fd-993c-4b3a-a95f-7b1e66ca13ac" containerName="util" Mar 19 09:41:52 crc kubenswrapper[4835]: I0319 09:41:52.316455 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f580e4fd-993c-4b3a-a95f-7b1e66ca13ac" containerName="util" Mar 19 09:41:52 crc kubenswrapper[4835]: E0319 09:41:52.316467 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f580e4fd-993c-4b3a-a95f-7b1e66ca13ac" containerName="pull" Mar 19 09:41:52 crc kubenswrapper[4835]: I0319 09:41:52.316474 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f580e4fd-993c-4b3a-a95f-7b1e66ca13ac" containerName="pull" Mar 19 09:41:52 crc kubenswrapper[4835]: I0319 09:41:52.316615 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="f580e4fd-993c-4b3a-a95f-7b1e66ca13ac" containerName="extract" Mar 19 09:41:52 crc kubenswrapper[4835]: I0319 09:41:52.317100 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-85c8677745-k75vc" Mar 19 09:41:52 crc kubenswrapper[4835]: I0319 09:41:52.327147 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 19 09:41:52 crc kubenswrapper[4835]: I0319 09:41:52.327342 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 19 09:41:52 crc kubenswrapper[4835]: I0319 09:41:52.327430 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 19 09:41:52 crc kubenswrapper[4835]: I0319 09:41:52.327481 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 19 09:41:52 crc kubenswrapper[4835]: I0319 09:41:52.327593 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-jqfj9" Mar 19 09:41:52 crc kubenswrapper[4835]: I0319 09:41:52.341676 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-85c8677745-k75vc"] Mar 19 09:41:52 crc kubenswrapper[4835]: I0319 09:41:52.380903 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94rf8\" (UniqueName: \"kubernetes.io/projected/2a19bff3-1be0-44d7-b625-df3e46afa290-kube-api-access-94rf8\") pod \"metallb-operator-controller-manager-85c8677745-k75vc\" (UID: \"2a19bff3-1be0-44d7-b625-df3e46afa290\") " pod="metallb-system/metallb-operator-controller-manager-85c8677745-k75vc" Mar 19 09:41:52 crc kubenswrapper[4835]: I0319 09:41:52.380944 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2a19bff3-1be0-44d7-b625-df3e46afa290-apiservice-cert\") pod 
\"metallb-operator-controller-manager-85c8677745-k75vc\" (UID: \"2a19bff3-1be0-44d7-b625-df3e46afa290\") " pod="metallb-system/metallb-operator-controller-manager-85c8677745-k75vc" Mar 19 09:41:52 crc kubenswrapper[4835]: I0319 09:41:52.381028 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2a19bff3-1be0-44d7-b625-df3e46afa290-webhook-cert\") pod \"metallb-operator-controller-manager-85c8677745-k75vc\" (UID: \"2a19bff3-1be0-44d7-b625-df3e46afa290\") " pod="metallb-system/metallb-operator-controller-manager-85c8677745-k75vc" Mar 19 09:41:52 crc kubenswrapper[4835]: I0319 09:41:52.482164 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2a19bff3-1be0-44d7-b625-df3e46afa290-webhook-cert\") pod \"metallb-operator-controller-manager-85c8677745-k75vc\" (UID: \"2a19bff3-1be0-44d7-b625-df3e46afa290\") " pod="metallb-system/metallb-operator-controller-manager-85c8677745-k75vc" Mar 19 09:41:52 crc kubenswrapper[4835]: I0319 09:41:52.482319 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94rf8\" (UniqueName: \"kubernetes.io/projected/2a19bff3-1be0-44d7-b625-df3e46afa290-kube-api-access-94rf8\") pod \"metallb-operator-controller-manager-85c8677745-k75vc\" (UID: \"2a19bff3-1be0-44d7-b625-df3e46afa290\") " pod="metallb-system/metallb-operator-controller-manager-85c8677745-k75vc" Mar 19 09:41:52 crc kubenswrapper[4835]: I0319 09:41:52.482342 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2a19bff3-1be0-44d7-b625-df3e46afa290-apiservice-cert\") pod \"metallb-operator-controller-manager-85c8677745-k75vc\" (UID: \"2a19bff3-1be0-44d7-b625-df3e46afa290\") " pod="metallb-system/metallb-operator-controller-manager-85c8677745-k75vc" Mar 19 09:41:52 crc 
kubenswrapper[4835]: I0319 09:41:52.494862 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2a19bff3-1be0-44d7-b625-df3e46afa290-apiservice-cert\") pod \"metallb-operator-controller-manager-85c8677745-k75vc\" (UID: \"2a19bff3-1be0-44d7-b625-df3e46afa290\") " pod="metallb-system/metallb-operator-controller-manager-85c8677745-k75vc" Mar 19 09:41:52 crc kubenswrapper[4835]: I0319 09:41:52.504470 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2a19bff3-1be0-44d7-b625-df3e46afa290-webhook-cert\") pod \"metallb-operator-controller-manager-85c8677745-k75vc\" (UID: \"2a19bff3-1be0-44d7-b625-df3e46afa290\") " pod="metallb-system/metallb-operator-controller-manager-85c8677745-k75vc" Mar 19 09:41:52 crc kubenswrapper[4835]: I0319 09:41:52.509923 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94rf8\" (UniqueName: \"kubernetes.io/projected/2a19bff3-1be0-44d7-b625-df3e46afa290-kube-api-access-94rf8\") pod \"metallb-operator-controller-manager-85c8677745-k75vc\" (UID: \"2a19bff3-1be0-44d7-b625-df3e46afa290\") " pod="metallb-system/metallb-operator-controller-manager-85c8677745-k75vc" Mar 19 09:41:52 crc kubenswrapper[4835]: I0319 09:41:52.613375 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6889f84cf4-bbfgf"] Mar 19 09:41:52 crc kubenswrapper[4835]: I0319 09:41:52.614382 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6889f84cf4-bbfgf" Mar 19 09:41:52 crc kubenswrapper[4835]: I0319 09:41:52.622040 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 19 09:41:52 crc kubenswrapper[4835]: I0319 09:41:52.622051 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 19 09:41:52 crc kubenswrapper[4835]: I0319 09:41:52.622356 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-sfdqp" Mar 19 09:41:52 crc kubenswrapper[4835]: I0319 09:41:52.638881 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-85c8677745-k75vc" Mar 19 09:41:52 crc kubenswrapper[4835]: I0319 09:41:52.645858 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6889f84cf4-bbfgf"] Mar 19 09:41:52 crc kubenswrapper[4835]: I0319 09:41:52.688056 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/817b88ed-bcd4-4702-bd72-7d04de779c86-apiservice-cert\") pod \"metallb-operator-webhook-server-6889f84cf4-bbfgf\" (UID: \"817b88ed-bcd4-4702-bd72-7d04de779c86\") " pod="metallb-system/metallb-operator-webhook-server-6889f84cf4-bbfgf" Mar 19 09:41:52 crc kubenswrapper[4835]: I0319 09:41:52.688360 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s764r\" (UniqueName: \"kubernetes.io/projected/817b88ed-bcd4-4702-bd72-7d04de779c86-kube-api-access-s764r\") pod \"metallb-operator-webhook-server-6889f84cf4-bbfgf\" (UID: \"817b88ed-bcd4-4702-bd72-7d04de779c86\") " pod="metallb-system/metallb-operator-webhook-server-6889f84cf4-bbfgf" Mar 19 09:41:52 crc kubenswrapper[4835]: 
I0319 09:41:52.689135 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/817b88ed-bcd4-4702-bd72-7d04de779c86-webhook-cert\") pod \"metallb-operator-webhook-server-6889f84cf4-bbfgf\" (UID: \"817b88ed-bcd4-4702-bd72-7d04de779c86\") " pod="metallb-system/metallb-operator-webhook-server-6889f84cf4-bbfgf" Mar 19 09:41:52 crc kubenswrapper[4835]: I0319 09:41:52.790223 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/817b88ed-bcd4-4702-bd72-7d04de779c86-apiservice-cert\") pod \"metallb-operator-webhook-server-6889f84cf4-bbfgf\" (UID: \"817b88ed-bcd4-4702-bd72-7d04de779c86\") " pod="metallb-system/metallb-operator-webhook-server-6889f84cf4-bbfgf" Mar 19 09:41:52 crc kubenswrapper[4835]: I0319 09:41:52.790296 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s764r\" (UniqueName: \"kubernetes.io/projected/817b88ed-bcd4-4702-bd72-7d04de779c86-kube-api-access-s764r\") pod \"metallb-operator-webhook-server-6889f84cf4-bbfgf\" (UID: \"817b88ed-bcd4-4702-bd72-7d04de779c86\") " pod="metallb-system/metallb-operator-webhook-server-6889f84cf4-bbfgf" Mar 19 09:41:52 crc kubenswrapper[4835]: I0319 09:41:52.790371 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/817b88ed-bcd4-4702-bd72-7d04de779c86-webhook-cert\") pod \"metallb-operator-webhook-server-6889f84cf4-bbfgf\" (UID: \"817b88ed-bcd4-4702-bd72-7d04de779c86\") " pod="metallb-system/metallb-operator-webhook-server-6889f84cf4-bbfgf" Mar 19 09:41:52 crc kubenswrapper[4835]: I0319 09:41:52.794924 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/817b88ed-bcd4-4702-bd72-7d04de779c86-apiservice-cert\") pod 
\"metallb-operator-webhook-server-6889f84cf4-bbfgf\" (UID: \"817b88ed-bcd4-4702-bd72-7d04de779c86\") " pod="metallb-system/metallb-operator-webhook-server-6889f84cf4-bbfgf" Mar 19 09:41:52 crc kubenswrapper[4835]: I0319 09:41:52.797674 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/817b88ed-bcd4-4702-bd72-7d04de779c86-webhook-cert\") pod \"metallb-operator-webhook-server-6889f84cf4-bbfgf\" (UID: \"817b88ed-bcd4-4702-bd72-7d04de779c86\") " pod="metallb-system/metallb-operator-webhook-server-6889f84cf4-bbfgf" Mar 19 09:41:52 crc kubenswrapper[4835]: I0319 09:41:52.821182 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s764r\" (UniqueName: \"kubernetes.io/projected/817b88ed-bcd4-4702-bd72-7d04de779c86-kube-api-access-s764r\") pod \"metallb-operator-webhook-server-6889f84cf4-bbfgf\" (UID: \"817b88ed-bcd4-4702-bd72-7d04de779c86\") " pod="metallb-system/metallb-operator-webhook-server-6889f84cf4-bbfgf" Mar 19 09:41:52 crc kubenswrapper[4835]: I0319 09:41:52.934279 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6889f84cf4-bbfgf" Mar 19 09:41:53 crc kubenswrapper[4835]: I0319 09:41:53.175312 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-85c8677745-k75vc"] Mar 19 09:41:53 crc kubenswrapper[4835]: I0319 09:41:53.278692 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-85c8677745-k75vc" event={"ID":"2a19bff3-1be0-44d7-b625-df3e46afa290","Type":"ContainerStarted","Data":"0337dfe9864d0635c26891fd50111c510c081a5f71f4ae9c4163e07c207e5ae7"} Mar 19 09:41:53 crc kubenswrapper[4835]: I0319 09:41:53.459637 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6889f84cf4-bbfgf"] Mar 19 09:41:53 crc kubenswrapper[4835]: W0319 09:41:53.462163 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod817b88ed_bcd4_4702_bd72_7d04de779c86.slice/crio-0740cabce1dab73fecf7ef9f14c4a5ce1239bb095e59ae0c37c37a11de9d873e WatchSource:0}: Error finding container 0740cabce1dab73fecf7ef9f14c4a5ce1239bb095e59ae0c37c37a11de9d873e: Status 404 returned error can't find the container with id 0740cabce1dab73fecf7ef9f14c4a5ce1239bb095e59ae0c37c37a11de9d873e Mar 19 09:41:54 crc kubenswrapper[4835]: I0319 09:41:54.286548 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6889f84cf4-bbfgf" event={"ID":"817b88ed-bcd4-4702-bd72-7d04de779c86","Type":"ContainerStarted","Data":"0740cabce1dab73fecf7ef9f14c4a5ce1239bb095e59ae0c37c37a11de9d873e"} Mar 19 09:41:57 crc kubenswrapper[4835]: I0319 09:41:57.305661 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-85c8677745-k75vc" 
event={"ID":"2a19bff3-1be0-44d7-b625-df3e46afa290","Type":"ContainerStarted","Data":"f8155537eafb005fd36b6887c9322385516342656805537f0f09c15d5982a5cf"} Mar 19 09:41:57 crc kubenswrapper[4835]: I0319 09:41:57.307444 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-85c8677745-k75vc" Mar 19 09:41:59 crc kubenswrapper[4835]: I0319 09:41:59.337552 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6889f84cf4-bbfgf" event={"ID":"817b88ed-bcd4-4702-bd72-7d04de779c86","Type":"ContainerStarted","Data":"fe4f6e08b22834e89ad25898df6beb0fbd7488400ffad6d20ec190b0bc72562d"} Mar 19 09:41:59 crc kubenswrapper[4835]: I0319 09:41:59.338196 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6889f84cf4-bbfgf" Mar 19 09:41:59 crc kubenswrapper[4835]: I0319 09:41:59.367830 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-85c8677745-k75vc" podStartSLOduration=4.072626571 podStartE2EDuration="7.367813611s" podCreationTimestamp="2026-03-19 09:41:52 +0000 UTC" firstStartedPulling="2026-03-19 09:41:53.186888473 +0000 UTC m=+1168.035487060" lastFinishedPulling="2026-03-19 09:41:56.482075513 +0000 UTC m=+1171.330674100" observedRunningTime="2026-03-19 09:41:57.335908564 +0000 UTC m=+1172.184507151" watchObservedRunningTime="2026-03-19 09:41:59.367813611 +0000 UTC m=+1174.216412198" Mar 19 09:41:59 crc kubenswrapper[4835]: I0319 09:41:59.370819 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6889f84cf4-bbfgf" podStartSLOduration=2.463785551 podStartE2EDuration="7.370806471s" podCreationTimestamp="2026-03-19 09:41:52 +0000 UTC" firstStartedPulling="2026-03-19 09:41:53.465083961 +0000 UTC m=+1168.313682548" lastFinishedPulling="2026-03-19 
09:41:58.372104881 +0000 UTC m=+1173.220703468" observedRunningTime="2026-03-19 09:41:59.365144349 +0000 UTC m=+1174.213742936" watchObservedRunningTime="2026-03-19 09:41:59.370806471 +0000 UTC m=+1174.219405048" Mar 19 09:42:00 crc kubenswrapper[4835]: I0319 09:42:00.217726 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565222-bxrtd"] Mar 19 09:42:00 crc kubenswrapper[4835]: I0319 09:42:00.218921 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565222-bxrtd" Mar 19 09:42:00 crc kubenswrapper[4835]: I0319 09:42:00.221290 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 09:42:00 crc kubenswrapper[4835]: I0319 09:42:00.221505 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 09:42:00 crc kubenswrapper[4835]: I0319 09:42:00.221507 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 09:42:00 crc kubenswrapper[4835]: I0319 09:42:00.226364 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565222-bxrtd"] Mar 19 09:42:00 crc kubenswrapper[4835]: I0319 09:42:00.329273 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkrwl\" (UniqueName: \"kubernetes.io/projected/151e4145-cf02-4148-ad6e-a7cebba6e4f0-kube-api-access-lkrwl\") pod \"auto-csr-approver-29565222-bxrtd\" (UID: \"151e4145-cf02-4148-ad6e-a7cebba6e4f0\") " pod="openshift-infra/auto-csr-approver-29565222-bxrtd" Mar 19 09:42:00 crc kubenswrapper[4835]: I0319 09:42:00.431794 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkrwl\" (UniqueName: \"kubernetes.io/projected/151e4145-cf02-4148-ad6e-a7cebba6e4f0-kube-api-access-lkrwl\") pod 
\"auto-csr-approver-29565222-bxrtd\" (UID: \"151e4145-cf02-4148-ad6e-a7cebba6e4f0\") " pod="openshift-infra/auto-csr-approver-29565222-bxrtd" Mar 19 09:42:00 crc kubenswrapper[4835]: I0319 09:42:00.469595 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkrwl\" (UniqueName: \"kubernetes.io/projected/151e4145-cf02-4148-ad6e-a7cebba6e4f0-kube-api-access-lkrwl\") pod \"auto-csr-approver-29565222-bxrtd\" (UID: \"151e4145-cf02-4148-ad6e-a7cebba6e4f0\") " pod="openshift-infra/auto-csr-approver-29565222-bxrtd" Mar 19 09:42:00 crc kubenswrapper[4835]: I0319 09:42:00.555371 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565222-bxrtd" Mar 19 09:42:01 crc kubenswrapper[4835]: I0319 09:42:01.022852 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565222-bxrtd"] Mar 19 09:42:01 crc kubenswrapper[4835]: I0319 09:42:01.350930 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565222-bxrtd" event={"ID":"151e4145-cf02-4148-ad6e-a7cebba6e4f0","Type":"ContainerStarted","Data":"b06a3d61c64884260210c340aa6987963fe024b07d2b44d613be986dd4a58eb2"} Mar 19 09:42:03 crc kubenswrapper[4835]: I0319 09:42:03.366143 4835 generic.go:334] "Generic (PLEG): container finished" podID="151e4145-cf02-4148-ad6e-a7cebba6e4f0" containerID="2e4dbacc1b3de03cda9ae2dac8484d6d3e2da91b2b19ea75d9f2dfb06e36203e" exitCode=0 Mar 19 09:42:03 crc kubenswrapper[4835]: I0319 09:42:03.366700 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565222-bxrtd" event={"ID":"151e4145-cf02-4148-ad6e-a7cebba6e4f0","Type":"ContainerDied","Data":"2e4dbacc1b3de03cda9ae2dac8484d6d3e2da91b2b19ea75d9f2dfb06e36203e"} Mar 19 09:42:04 crc kubenswrapper[4835]: I0319 09:42:04.812572 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565222-bxrtd" Mar 19 09:42:04 crc kubenswrapper[4835]: I0319 09:42:04.907264 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkrwl\" (UniqueName: \"kubernetes.io/projected/151e4145-cf02-4148-ad6e-a7cebba6e4f0-kube-api-access-lkrwl\") pod \"151e4145-cf02-4148-ad6e-a7cebba6e4f0\" (UID: \"151e4145-cf02-4148-ad6e-a7cebba6e4f0\") " Mar 19 09:42:04 crc kubenswrapper[4835]: I0319 09:42:04.913999 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/151e4145-cf02-4148-ad6e-a7cebba6e4f0-kube-api-access-lkrwl" (OuterVolumeSpecName: "kube-api-access-lkrwl") pod "151e4145-cf02-4148-ad6e-a7cebba6e4f0" (UID: "151e4145-cf02-4148-ad6e-a7cebba6e4f0"). InnerVolumeSpecName "kube-api-access-lkrwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:42:05 crc kubenswrapper[4835]: I0319 09:42:05.009257 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkrwl\" (UniqueName: \"kubernetes.io/projected/151e4145-cf02-4148-ad6e-a7cebba6e4f0-kube-api-access-lkrwl\") on node \"crc\" DevicePath \"\"" Mar 19 09:42:05 crc kubenswrapper[4835]: I0319 09:42:05.381198 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565222-bxrtd" event={"ID":"151e4145-cf02-4148-ad6e-a7cebba6e4f0","Type":"ContainerDied","Data":"b06a3d61c64884260210c340aa6987963fe024b07d2b44d613be986dd4a58eb2"} Mar 19 09:42:05 crc kubenswrapper[4835]: I0319 09:42:05.381538 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b06a3d61c64884260210c340aa6987963fe024b07d2b44d613be986dd4a58eb2" Mar 19 09:42:05 crc kubenswrapper[4835]: I0319 09:42:05.381240 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565222-bxrtd" Mar 19 09:42:05 crc kubenswrapper[4835]: I0319 09:42:05.869975 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565216-kbj6t"] Mar 19 09:42:05 crc kubenswrapper[4835]: I0319 09:42:05.884529 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565216-kbj6t"] Mar 19 09:42:06 crc kubenswrapper[4835]: I0319 09:42:06.413846 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0fee259-324e-4077-a06f-2186e9f3d832" path="/var/lib/kubelet/pods/b0fee259-324e-4077-a06f-2186e9f3d832/volumes" Mar 19 09:42:06 crc kubenswrapper[4835]: I0319 09:42:06.422691 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 09:42:06 crc kubenswrapper[4835]: I0319 09:42:06.422810 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 09:42:12 crc kubenswrapper[4835]: I0319 09:42:12.938824 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6889f84cf4-bbfgf" Mar 19 09:42:27 crc kubenswrapper[4835]: I0319 09:42:27.495694 4835 scope.go:117] "RemoveContainer" containerID="dff147af2c4d281fdd971dd1c755fe6a25fc191dac207d6980b0aafc1c7c7eec" Mar 19 09:42:32 crc kubenswrapper[4835]: I0319 09:42:32.641723 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/metallb-operator-controller-manager-85c8677745-k75vc" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.541931 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-44rxf"] Mar 19 09:42:33 crc kubenswrapper[4835]: E0319 09:42:33.542656 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="151e4145-cf02-4148-ad6e-a7cebba6e4f0" containerName="oc" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.542680 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="151e4145-cf02-4148-ad6e-a7cebba6e4f0" containerName="oc" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.542886 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="151e4145-cf02-4148-ad6e-a7cebba6e4f0" containerName="oc" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.546078 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-44rxf" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.547967 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-6bjnf" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.548254 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.548374 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.553193 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-g4bvm"] Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.554726 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-g4bvm" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.557435 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.560700 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-g4bvm"] Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.617721 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/8173c77e-48ec-44fc-9be7-67381528f78a-metrics\") pod \"frr-k8s-44rxf\" (UID: \"8173c77e-48ec-44fc-9be7-67381528f78a\") " pod="metallb-system/frr-k8s-44rxf" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.617800 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/8173c77e-48ec-44fc-9be7-67381528f78a-frr-conf\") pod \"frr-k8s-44rxf\" (UID: \"8173c77e-48ec-44fc-9be7-67381528f78a\") " pod="metallb-system/frr-k8s-44rxf" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.617863 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/8173c77e-48ec-44fc-9be7-67381528f78a-reloader\") pod \"frr-k8s-44rxf\" (UID: \"8173c77e-48ec-44fc-9be7-67381528f78a\") " pod="metallb-system/frr-k8s-44rxf" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.617891 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk7lv\" (UniqueName: \"kubernetes.io/projected/9b388096-29e9-4547-b43a-ec3a7935572b-kube-api-access-sk7lv\") pod \"frr-k8s-webhook-server-bcc4b6f68-g4bvm\" (UID: \"9b388096-29e9-4547-b43a-ec3a7935572b\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-g4bvm" 
Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.617922 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8173c77e-48ec-44fc-9be7-67381528f78a-metrics-certs\") pod \"frr-k8s-44rxf\" (UID: \"8173c77e-48ec-44fc-9be7-67381528f78a\") " pod="metallb-system/frr-k8s-44rxf" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.617967 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/8173c77e-48ec-44fc-9be7-67381528f78a-frr-startup\") pod \"frr-k8s-44rxf\" (UID: \"8173c77e-48ec-44fc-9be7-67381528f78a\") " pod="metallb-system/frr-k8s-44rxf" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.617988 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9b388096-29e9-4547-b43a-ec3a7935572b-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-g4bvm\" (UID: \"9b388096-29e9-4547-b43a-ec3a7935572b\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-g4bvm" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.618018 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/8173c77e-48ec-44fc-9be7-67381528f78a-frr-sockets\") pod \"frr-k8s-44rxf\" (UID: \"8173c77e-48ec-44fc-9be7-67381528f78a\") " pod="metallb-system/frr-k8s-44rxf" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.618046 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h28bf\" (UniqueName: \"kubernetes.io/projected/8173c77e-48ec-44fc-9be7-67381528f78a-kube-api-access-h28bf\") pod \"frr-k8s-44rxf\" (UID: \"8173c77e-48ec-44fc-9be7-67381528f78a\") " pod="metallb-system/frr-k8s-44rxf" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 
09:42:33.632103 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-zfhzl"] Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.633617 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-zfhzl" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.636437 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-d7txm" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.636622 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.636904 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.637096 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.645421 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-47sfq"] Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.657430 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-47sfq" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.684134 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.707794 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-47sfq"] Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.720584 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/8173c77e-48ec-44fc-9be7-67381528f78a-frr-sockets\") pod \"frr-k8s-44rxf\" (UID: \"8173c77e-48ec-44fc-9be7-67381528f78a\") " pod="metallb-system/frr-k8s-44rxf" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.720654 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h28bf\" (UniqueName: \"kubernetes.io/projected/8173c77e-48ec-44fc-9be7-67381528f78a-kube-api-access-h28bf\") pod \"frr-k8s-44rxf\" (UID: \"8173c77e-48ec-44fc-9be7-67381528f78a\") " pod="metallb-system/frr-k8s-44rxf" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.720695 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d3ab7f2-b252-4ef7-8a11-5af68bf23fa6-cert\") pod \"controller-7bb4cc7c98-47sfq\" (UID: \"3d3ab7f2-b252-4ef7-8a11-5af68bf23fa6\") " pod="metallb-system/controller-7bb4cc7c98-47sfq" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.720769 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/8173c77e-48ec-44fc-9be7-67381528f78a-metrics\") pod \"frr-k8s-44rxf\" (UID: \"8173c77e-48ec-44fc-9be7-67381528f78a\") " pod="metallb-system/frr-k8s-44rxf" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.720806 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3f751b70-2b62-41da-b71c-ce7039840e3e-memberlist\") pod \"speaker-zfhzl\" (UID: \"3f751b70-2b62-41da-b71c-ce7039840e3e\") " pod="metallb-system/speaker-zfhzl" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.720827 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/8173c77e-48ec-44fc-9be7-67381528f78a-frr-conf\") pod \"frr-k8s-44rxf\" (UID: \"8173c77e-48ec-44fc-9be7-67381528f78a\") " pod="metallb-system/frr-k8s-44rxf" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.720881 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4pfw\" (UniqueName: \"kubernetes.io/projected/3d3ab7f2-b252-4ef7-8a11-5af68bf23fa6-kube-api-access-g4pfw\") pod \"controller-7bb4cc7c98-47sfq\" (UID: \"3d3ab7f2-b252-4ef7-8a11-5af68bf23fa6\") " pod="metallb-system/controller-7bb4cc7c98-47sfq" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.720917 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/8173c77e-48ec-44fc-9be7-67381528f78a-reloader\") pod \"frr-k8s-44rxf\" (UID: \"8173c77e-48ec-44fc-9be7-67381528f78a\") " pod="metallb-system/frr-k8s-44rxf" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.720946 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk7lv\" (UniqueName: \"kubernetes.io/projected/9b388096-29e9-4547-b43a-ec3a7935572b-kube-api-access-sk7lv\") pod \"frr-k8s-webhook-server-bcc4b6f68-g4bvm\" (UID: \"9b388096-29e9-4547-b43a-ec3a7935572b\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-g4bvm" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.720973 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d3ab7f2-b252-4ef7-8a11-5af68bf23fa6-metrics-certs\") pod \"controller-7bb4cc7c98-47sfq\" (UID: \"3d3ab7f2-b252-4ef7-8a11-5af68bf23fa6\") " pod="metallb-system/controller-7bb4cc7c98-47sfq" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.721011 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8173c77e-48ec-44fc-9be7-67381528f78a-metrics-certs\") pod \"frr-k8s-44rxf\" (UID: \"8173c77e-48ec-44fc-9be7-67381528f78a\") " pod="metallb-system/frr-k8s-44rxf" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.721040 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3f751b70-2b62-41da-b71c-ce7039840e3e-metallb-excludel2\") pod \"speaker-zfhzl\" (UID: \"3f751b70-2b62-41da-b71c-ce7039840e3e\") " pod="metallb-system/speaker-zfhzl" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.721066 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f751b70-2b62-41da-b71c-ce7039840e3e-metrics-certs\") pod \"speaker-zfhzl\" (UID: \"3f751b70-2b62-41da-b71c-ce7039840e3e\") " pod="metallb-system/speaker-zfhzl" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.721095 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/8173c77e-48ec-44fc-9be7-67381528f78a-frr-startup\") pod \"frr-k8s-44rxf\" (UID: \"8173c77e-48ec-44fc-9be7-67381528f78a\") " pod="metallb-system/frr-k8s-44rxf" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.721116 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9b388096-29e9-4547-b43a-ec3a7935572b-cert\") pod 
\"frr-k8s-webhook-server-bcc4b6f68-g4bvm\" (UID: \"9b388096-29e9-4547-b43a-ec3a7935572b\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-g4bvm" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.721141 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmwvf\" (UniqueName: \"kubernetes.io/projected/3f751b70-2b62-41da-b71c-ce7039840e3e-kube-api-access-zmwvf\") pod \"speaker-zfhzl\" (UID: \"3f751b70-2b62-41da-b71c-ce7039840e3e\") " pod="metallb-system/speaker-zfhzl" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.722030 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/8173c77e-48ec-44fc-9be7-67381528f78a-frr-sockets\") pod \"frr-k8s-44rxf\" (UID: \"8173c77e-48ec-44fc-9be7-67381528f78a\") " pod="metallb-system/frr-k8s-44rxf" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.722526 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/8173c77e-48ec-44fc-9be7-67381528f78a-metrics\") pod \"frr-k8s-44rxf\" (UID: \"8173c77e-48ec-44fc-9be7-67381528f78a\") " pod="metallb-system/frr-k8s-44rxf" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.722759 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/8173c77e-48ec-44fc-9be7-67381528f78a-frr-conf\") pod \"frr-k8s-44rxf\" (UID: \"8173c77e-48ec-44fc-9be7-67381528f78a\") " pod="metallb-system/frr-k8s-44rxf" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.722999 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/8173c77e-48ec-44fc-9be7-67381528f78a-reloader\") pod \"frr-k8s-44rxf\" (UID: \"8173c77e-48ec-44fc-9be7-67381528f78a\") " pod="metallb-system/frr-k8s-44rxf" Mar 19 09:42:33 crc kubenswrapper[4835]: E0319 09:42:33.723286 4835 
secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Mar 19 09:42:33 crc kubenswrapper[4835]: E0319 09:42:33.723340 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8173c77e-48ec-44fc-9be7-67381528f78a-metrics-certs podName:8173c77e-48ec-44fc-9be7-67381528f78a nodeName:}" failed. No retries permitted until 2026-03-19 09:42:34.223321977 +0000 UTC m=+1209.071920574 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8173c77e-48ec-44fc-9be7-67381528f78a-metrics-certs") pod "frr-k8s-44rxf" (UID: "8173c77e-48ec-44fc-9be7-67381528f78a") : secret "frr-k8s-certs-secret" not found Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.724151 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/8173c77e-48ec-44fc-9be7-67381528f78a-frr-startup\") pod \"frr-k8s-44rxf\" (UID: \"8173c77e-48ec-44fc-9be7-67381528f78a\") " pod="metallb-system/frr-k8s-44rxf" Mar 19 09:42:33 crc kubenswrapper[4835]: E0319 09:42:33.724227 4835 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 19 09:42:33 crc kubenswrapper[4835]: E0319 09:42:33.724263 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b388096-29e9-4547-b43a-ec3a7935572b-cert podName:9b388096-29e9-4547-b43a-ec3a7935572b nodeName:}" failed. No retries permitted until 2026-03-19 09:42:34.224253181 +0000 UTC m=+1209.072851768 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9b388096-29e9-4547-b43a-ec3a7935572b-cert") pod "frr-k8s-webhook-server-bcc4b6f68-g4bvm" (UID: "9b388096-29e9-4547-b43a-ec3a7935572b") : secret "frr-k8s-webhook-server-cert" not found Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.759788 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h28bf\" (UniqueName: \"kubernetes.io/projected/8173c77e-48ec-44fc-9be7-67381528f78a-kube-api-access-h28bf\") pod \"frr-k8s-44rxf\" (UID: \"8173c77e-48ec-44fc-9be7-67381528f78a\") " pod="metallb-system/frr-k8s-44rxf" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.779580 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk7lv\" (UniqueName: \"kubernetes.io/projected/9b388096-29e9-4547-b43a-ec3a7935572b-kube-api-access-sk7lv\") pod \"frr-k8s-webhook-server-bcc4b6f68-g4bvm\" (UID: \"9b388096-29e9-4547-b43a-ec3a7935572b\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-g4bvm" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.822755 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmwvf\" (UniqueName: \"kubernetes.io/projected/3f751b70-2b62-41da-b71c-ce7039840e3e-kube-api-access-zmwvf\") pod \"speaker-zfhzl\" (UID: \"3f751b70-2b62-41da-b71c-ce7039840e3e\") " pod="metallb-system/speaker-zfhzl" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.822835 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d3ab7f2-b252-4ef7-8a11-5af68bf23fa6-cert\") pod \"controller-7bb4cc7c98-47sfq\" (UID: \"3d3ab7f2-b252-4ef7-8a11-5af68bf23fa6\") " pod="metallb-system/controller-7bb4cc7c98-47sfq" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.822921 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/3f751b70-2b62-41da-b71c-ce7039840e3e-memberlist\") pod \"speaker-zfhzl\" (UID: \"3f751b70-2b62-41da-b71c-ce7039840e3e\") " pod="metallb-system/speaker-zfhzl" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.822969 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4pfw\" (UniqueName: \"kubernetes.io/projected/3d3ab7f2-b252-4ef7-8a11-5af68bf23fa6-kube-api-access-g4pfw\") pod \"controller-7bb4cc7c98-47sfq\" (UID: \"3d3ab7f2-b252-4ef7-8a11-5af68bf23fa6\") " pod="metallb-system/controller-7bb4cc7c98-47sfq" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.823013 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d3ab7f2-b252-4ef7-8a11-5af68bf23fa6-metrics-certs\") pod \"controller-7bb4cc7c98-47sfq\" (UID: \"3d3ab7f2-b252-4ef7-8a11-5af68bf23fa6\") " pod="metallb-system/controller-7bb4cc7c98-47sfq" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.823053 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3f751b70-2b62-41da-b71c-ce7039840e3e-metallb-excludel2\") pod \"speaker-zfhzl\" (UID: \"3f751b70-2b62-41da-b71c-ce7039840e3e\") " pod="metallb-system/speaker-zfhzl" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.823083 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f751b70-2b62-41da-b71c-ce7039840e3e-metrics-certs\") pod \"speaker-zfhzl\" (UID: \"3f751b70-2b62-41da-b71c-ce7039840e3e\") " pod="metallb-system/speaker-zfhzl" Mar 19 09:42:33 crc kubenswrapper[4835]: E0319 09:42:33.823244 4835 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 19 09:42:33 crc kubenswrapper[4835]: E0319 09:42:33.823309 4835 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/3f751b70-2b62-41da-b71c-ce7039840e3e-metrics-certs podName:3f751b70-2b62-41da-b71c-ce7039840e3e nodeName:}" failed. No retries permitted until 2026-03-19 09:42:34.323291931 +0000 UTC m=+1209.171890518 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3f751b70-2b62-41da-b71c-ce7039840e3e-metrics-certs") pod "speaker-zfhzl" (UID: "3f751b70-2b62-41da-b71c-ce7039840e3e") : secret "speaker-certs-secret" not found Mar 19 09:42:33 crc kubenswrapper[4835]: E0319 09:42:33.823469 4835 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 19 09:42:33 crc kubenswrapper[4835]: E0319 09:42:33.823529 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f751b70-2b62-41da-b71c-ce7039840e3e-memberlist podName:3f751b70-2b62-41da-b71c-ce7039840e3e nodeName:}" failed. No retries permitted until 2026-03-19 09:42:34.323511717 +0000 UTC m=+1209.172110394 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/3f751b70-2b62-41da-b71c-ce7039840e3e-memberlist") pod "speaker-zfhzl" (UID: "3f751b70-2b62-41da-b71c-ce7039840e3e") : secret "metallb-memberlist" not found Mar 19 09:42:33 crc kubenswrapper[4835]: E0319 09:42:33.823555 4835 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Mar 19 09:42:33 crc kubenswrapper[4835]: E0319 09:42:33.823589 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d3ab7f2-b252-4ef7-8a11-5af68bf23fa6-metrics-certs podName:3d3ab7f2-b252-4ef7-8a11-5af68bf23fa6 nodeName:}" failed. No retries permitted until 2026-03-19 09:42:34.323577858 +0000 UTC m=+1209.172176545 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3d3ab7f2-b252-4ef7-8a11-5af68bf23fa6-metrics-certs") pod "controller-7bb4cc7c98-47sfq" (UID: "3d3ab7f2-b252-4ef7-8a11-5af68bf23fa6") : secret "controller-certs-secret" not found Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.824309 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3f751b70-2b62-41da-b71c-ce7039840e3e-metallb-excludel2\") pod \"speaker-zfhzl\" (UID: \"3f751b70-2b62-41da-b71c-ce7039840e3e\") " pod="metallb-system/speaker-zfhzl" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.826014 4835 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.839788 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d3ab7f2-b252-4ef7-8a11-5af68bf23fa6-cert\") pod \"controller-7bb4cc7c98-47sfq\" (UID: \"3d3ab7f2-b252-4ef7-8a11-5af68bf23fa6\") " pod="metallb-system/controller-7bb4cc7c98-47sfq" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.855979 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4pfw\" (UniqueName: \"kubernetes.io/projected/3d3ab7f2-b252-4ef7-8a11-5af68bf23fa6-kube-api-access-g4pfw\") pod \"controller-7bb4cc7c98-47sfq\" (UID: \"3d3ab7f2-b252-4ef7-8a11-5af68bf23fa6\") " pod="metallb-system/controller-7bb4cc7c98-47sfq" Mar 19 09:42:33 crc kubenswrapper[4835]: I0319 09:42:33.856522 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmwvf\" (UniqueName: \"kubernetes.io/projected/3f751b70-2b62-41da-b71c-ce7039840e3e-kube-api-access-zmwvf\") pod \"speaker-zfhzl\" (UID: \"3f751b70-2b62-41da-b71c-ce7039840e3e\") " pod="metallb-system/speaker-zfhzl" Mar 19 09:42:34 crc kubenswrapper[4835]: I0319 09:42:34.231125 
4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9b388096-29e9-4547-b43a-ec3a7935572b-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-g4bvm\" (UID: \"9b388096-29e9-4547-b43a-ec3a7935572b\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-g4bvm" Mar 19 09:42:34 crc kubenswrapper[4835]: I0319 09:42:34.231976 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8173c77e-48ec-44fc-9be7-67381528f78a-metrics-certs\") pod \"frr-k8s-44rxf\" (UID: \"8173c77e-48ec-44fc-9be7-67381528f78a\") " pod="metallb-system/frr-k8s-44rxf" Mar 19 09:42:34 crc kubenswrapper[4835]: I0319 09:42:34.234665 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9b388096-29e9-4547-b43a-ec3a7935572b-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-g4bvm\" (UID: \"9b388096-29e9-4547-b43a-ec3a7935572b\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-g4bvm" Mar 19 09:42:34 crc kubenswrapper[4835]: I0319 09:42:34.235064 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8173c77e-48ec-44fc-9be7-67381528f78a-metrics-certs\") pod \"frr-k8s-44rxf\" (UID: \"8173c77e-48ec-44fc-9be7-67381528f78a\") " pod="metallb-system/frr-k8s-44rxf" Mar 19 09:42:34 crc kubenswrapper[4835]: I0319 09:42:34.333265 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3f751b70-2b62-41da-b71c-ce7039840e3e-memberlist\") pod \"speaker-zfhzl\" (UID: \"3f751b70-2b62-41da-b71c-ce7039840e3e\") " pod="metallb-system/speaker-zfhzl" Mar 19 09:42:34 crc kubenswrapper[4835]: I0319 09:42:34.333356 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/3d3ab7f2-b252-4ef7-8a11-5af68bf23fa6-metrics-certs\") pod \"controller-7bb4cc7c98-47sfq\" (UID: \"3d3ab7f2-b252-4ef7-8a11-5af68bf23fa6\") " pod="metallb-system/controller-7bb4cc7c98-47sfq" Mar 19 09:42:34 crc kubenswrapper[4835]: E0319 09:42:34.333461 4835 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 19 09:42:34 crc kubenswrapper[4835]: I0319 09:42:34.333512 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f751b70-2b62-41da-b71c-ce7039840e3e-metrics-certs\") pod \"speaker-zfhzl\" (UID: \"3f751b70-2b62-41da-b71c-ce7039840e3e\") " pod="metallb-system/speaker-zfhzl" Mar 19 09:42:34 crc kubenswrapper[4835]: E0319 09:42:34.333535 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f751b70-2b62-41da-b71c-ce7039840e3e-memberlist podName:3f751b70-2b62-41da-b71c-ce7039840e3e nodeName:}" failed. No retries permitted until 2026-03-19 09:42:35.333516198 +0000 UTC m=+1210.182114785 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/3f751b70-2b62-41da-b71c-ce7039840e3e-memberlist") pod "speaker-zfhzl" (UID: "3f751b70-2b62-41da-b71c-ce7039840e3e") : secret "metallb-memberlist" not found Mar 19 09:42:34 crc kubenswrapper[4835]: I0319 09:42:34.336376 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d3ab7f2-b252-4ef7-8a11-5af68bf23fa6-metrics-certs\") pod \"controller-7bb4cc7c98-47sfq\" (UID: \"3d3ab7f2-b252-4ef7-8a11-5af68bf23fa6\") " pod="metallb-system/controller-7bb4cc7c98-47sfq" Mar 19 09:42:34 crc kubenswrapper[4835]: I0319 09:42:34.336535 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f751b70-2b62-41da-b71c-ce7039840e3e-metrics-certs\") pod \"speaker-zfhzl\" (UID: \"3f751b70-2b62-41da-b71c-ce7039840e3e\") " pod="metallb-system/speaker-zfhzl" Mar 19 09:42:34 crc kubenswrapper[4835]: I0319 09:42:34.473765 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-44rxf" Mar 19 09:42:34 crc kubenswrapper[4835]: I0319 09:42:34.484160 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-g4bvm" Mar 19 09:42:34 crc kubenswrapper[4835]: I0319 09:42:34.588790 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-47sfq" Mar 19 09:42:35 crc kubenswrapper[4835]: I0319 09:42:35.020767 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-44rxf" event={"ID":"8173c77e-48ec-44fc-9be7-67381528f78a","Type":"ContainerStarted","Data":"f6d835eaaa6a1147314936bb49e3ba701f1a249da391c6a2bac9fc9297dd25b4"} Mar 19 09:42:35 crc kubenswrapper[4835]: W0319 09:42:35.042913 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b388096_29e9_4547_b43a_ec3a7935572b.slice/crio-7daa28313c2cb18eaf7a9cbb4951e4c2a451cf9d36884b49c189c0e4f4379e50 WatchSource:0}: Error finding container 7daa28313c2cb18eaf7a9cbb4951e4c2a451cf9d36884b49c189c0e4f4379e50: Status 404 returned error can't find the container with id 7daa28313c2cb18eaf7a9cbb4951e4c2a451cf9d36884b49c189c0e4f4379e50 Mar 19 09:42:35 crc kubenswrapper[4835]: I0319 09:42:35.051567 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-g4bvm"] Mar 19 09:42:35 crc kubenswrapper[4835]: I0319 09:42:35.103256 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-47sfq"] Mar 19 09:42:35 crc kubenswrapper[4835]: W0319 09:42:35.110920 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d3ab7f2_b252_4ef7_8a11_5af68bf23fa6.slice/crio-a4beb7af00ad8c49215c64bf9daf5de98af8042d4b98fd25bb8cf7b6d9dd818e WatchSource:0}: Error finding container a4beb7af00ad8c49215c64bf9daf5de98af8042d4b98fd25bb8cf7b6d9dd818e: Status 404 returned error can't find the container with id a4beb7af00ad8c49215c64bf9daf5de98af8042d4b98fd25bb8cf7b6d9dd818e Mar 19 09:42:35 crc kubenswrapper[4835]: I0319 09:42:35.363057 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/3f751b70-2b62-41da-b71c-ce7039840e3e-memberlist\") pod \"speaker-zfhzl\" (UID: \"3f751b70-2b62-41da-b71c-ce7039840e3e\") " pod="metallb-system/speaker-zfhzl" Mar 19 09:42:35 crc kubenswrapper[4835]: E0319 09:42:35.363271 4835 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 19 09:42:35 crc kubenswrapper[4835]: E0319 09:42:35.363361 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f751b70-2b62-41da-b71c-ce7039840e3e-memberlist podName:3f751b70-2b62-41da-b71c-ce7039840e3e nodeName:}" failed. No retries permitted until 2026-03-19 09:42:37.363340033 +0000 UTC m=+1212.211938640 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/3f751b70-2b62-41da-b71c-ce7039840e3e-memberlist") pod "speaker-zfhzl" (UID: "3f751b70-2b62-41da-b71c-ce7039840e3e") : secret "metallb-memberlist" not found Mar 19 09:42:36 crc kubenswrapper[4835]: I0319 09:42:36.028828 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-g4bvm" event={"ID":"9b388096-29e9-4547-b43a-ec3a7935572b","Type":"ContainerStarted","Data":"7daa28313c2cb18eaf7a9cbb4951e4c2a451cf9d36884b49c189c0e4f4379e50"} Mar 19 09:42:36 crc kubenswrapper[4835]: I0319 09:42:36.031409 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-47sfq" event={"ID":"3d3ab7f2-b252-4ef7-8a11-5af68bf23fa6","Type":"ContainerStarted","Data":"3f75125d5d63b8b5b7dc25723d6a32088829b5318c6966d1d1833497ca98094d"} Mar 19 09:42:36 crc kubenswrapper[4835]: I0319 09:42:36.031452 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-47sfq" event={"ID":"3d3ab7f2-b252-4ef7-8a11-5af68bf23fa6","Type":"ContainerStarted","Data":"880ef1fcd5f37f9adb7482d398f395277ded12a185347866d30983d2f5fa0876"} Mar 19 09:42:36 crc kubenswrapper[4835]: I0319 
09:42:36.031463 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-47sfq" event={"ID":"3d3ab7f2-b252-4ef7-8a11-5af68bf23fa6","Type":"ContainerStarted","Data":"a4beb7af00ad8c49215c64bf9daf5de98af8042d4b98fd25bb8cf7b6d9dd818e"} Mar 19 09:42:36 crc kubenswrapper[4835]: I0319 09:42:36.031499 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-47sfq" Mar 19 09:42:36 crc kubenswrapper[4835]: I0319 09:42:36.421607 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 09:42:36 crc kubenswrapper[4835]: I0319 09:42:36.421980 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 09:42:36 crc kubenswrapper[4835]: I0319 09:42:36.429499 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-47sfq" podStartSLOduration=3.429484624 podStartE2EDuration="3.429484624s" podCreationTimestamp="2026-03-19 09:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:42:36.063983842 +0000 UTC m=+1210.912582439" watchObservedRunningTime="2026-03-19 09:42:36.429484624 +0000 UTC m=+1211.278083201" Mar 19 09:42:37 crc kubenswrapper[4835]: I0319 09:42:37.395845 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/3f751b70-2b62-41da-b71c-ce7039840e3e-memberlist\") pod \"speaker-zfhzl\" (UID: \"3f751b70-2b62-41da-b71c-ce7039840e3e\") " pod="metallb-system/speaker-zfhzl" Mar 19 09:42:37 crc kubenswrapper[4835]: I0319 09:42:37.404338 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3f751b70-2b62-41da-b71c-ce7039840e3e-memberlist\") pod \"speaker-zfhzl\" (UID: \"3f751b70-2b62-41da-b71c-ce7039840e3e\") " pod="metallb-system/speaker-zfhzl" Mar 19 09:42:37 crc kubenswrapper[4835]: I0319 09:42:37.554354 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-zfhzl" Mar 19 09:42:38 crc kubenswrapper[4835]: I0319 09:42:38.049889 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zfhzl" event={"ID":"3f751b70-2b62-41da-b71c-ce7039840e3e","Type":"ContainerStarted","Data":"02b71afa104a61ebdb0ad57a9c5673d7d6ca7bc8dfe732ff6362578808eb3fdc"} Mar 19 09:42:38 crc kubenswrapper[4835]: I0319 09:42:38.050268 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zfhzl" event={"ID":"3f751b70-2b62-41da-b71c-ce7039840e3e","Type":"ContainerStarted","Data":"0be597a3604f07c769c4797cc38b46b9bc85c3328697400a16aae499272b72de"} Mar 19 09:42:39 crc kubenswrapper[4835]: I0319 09:42:39.063594 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zfhzl" event={"ID":"3f751b70-2b62-41da-b71c-ce7039840e3e","Type":"ContainerStarted","Data":"6ddf32aac838b0344bad557e0185d6acc9a38d92568bb276d3931ba051f639c6"} Mar 19 09:42:39 crc kubenswrapper[4835]: I0319 09:42:39.063769 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-zfhzl" Mar 19 09:42:39 crc kubenswrapper[4835]: I0319 09:42:39.085126 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-zfhzl" podStartSLOduration=6.085108745 
podStartE2EDuration="6.085108745s" podCreationTimestamp="2026-03-19 09:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:42:39.079932416 +0000 UTC m=+1213.928531003" watchObservedRunningTime="2026-03-19 09:42:39.085108745 +0000 UTC m=+1213.933707332" Mar 19 09:42:43 crc kubenswrapper[4835]: I0319 09:42:43.095825 4835 generic.go:334] "Generic (PLEG): container finished" podID="8173c77e-48ec-44fc-9be7-67381528f78a" containerID="aed81f17091cd9b5080e71ce936500a0d2f8a8946c0c230a0d8fc2e5bbb6327b" exitCode=0 Mar 19 09:42:43 crc kubenswrapper[4835]: I0319 09:42:43.095913 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-44rxf" event={"ID":"8173c77e-48ec-44fc-9be7-67381528f78a","Type":"ContainerDied","Data":"aed81f17091cd9b5080e71ce936500a0d2f8a8946c0c230a0d8fc2e5bbb6327b"} Mar 19 09:42:43 crc kubenswrapper[4835]: I0319 09:42:43.099605 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-g4bvm" event={"ID":"9b388096-29e9-4547-b43a-ec3a7935572b","Type":"ContainerStarted","Data":"1e0844d87c0ed3709d5ed5382474a85031288904d9517376a1f701a37ef59c93"} Mar 19 09:42:43 crc kubenswrapper[4835]: I0319 09:42:43.099940 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-g4bvm" Mar 19 09:42:43 crc kubenswrapper[4835]: I0319 09:42:43.147955 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-g4bvm" podStartSLOduration=3.026726422 podStartE2EDuration="10.147936633s" podCreationTimestamp="2026-03-19 09:42:33 +0000 UTC" firstStartedPulling="2026-03-19 09:42:35.046286062 +0000 UTC m=+1209.894884649" lastFinishedPulling="2026-03-19 09:42:42.167496273 +0000 UTC m=+1217.016094860" observedRunningTime="2026-03-19 09:42:43.145645061 +0000 UTC m=+1217.994243678" 
watchObservedRunningTime="2026-03-19 09:42:43.147936633 +0000 UTC m=+1217.996535230" Mar 19 09:42:44 crc kubenswrapper[4835]: I0319 09:42:44.111168 4835 generic.go:334] "Generic (PLEG): container finished" podID="8173c77e-48ec-44fc-9be7-67381528f78a" containerID="e932344388bef5e6b2f66e5e56ce1cef3af09c1ff3aed6b3ada2af918e0bc7e8" exitCode=0 Mar 19 09:42:44 crc kubenswrapper[4835]: I0319 09:42:44.111261 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-44rxf" event={"ID":"8173c77e-48ec-44fc-9be7-67381528f78a","Type":"ContainerDied","Data":"e932344388bef5e6b2f66e5e56ce1cef3af09c1ff3aed6b3ada2af918e0bc7e8"} Mar 19 09:42:45 crc kubenswrapper[4835]: I0319 09:42:45.121934 4835 generic.go:334] "Generic (PLEG): container finished" podID="8173c77e-48ec-44fc-9be7-67381528f78a" containerID="2b3b29911d662d0419e3071b87d85de4bd4e4d3c4fc1e94b62bcb77b448e58d9" exitCode=0 Mar 19 09:42:45 crc kubenswrapper[4835]: I0319 09:42:45.122009 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-44rxf" event={"ID":"8173c77e-48ec-44fc-9be7-67381528f78a","Type":"ContainerDied","Data":"2b3b29911d662d0419e3071b87d85de4bd4e4d3c4fc1e94b62bcb77b448e58d9"} Mar 19 09:42:46 crc kubenswrapper[4835]: I0319 09:42:46.132641 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-44rxf" event={"ID":"8173c77e-48ec-44fc-9be7-67381528f78a","Type":"ContainerStarted","Data":"048df4394a7bd870241b4cfbb34b7a5157bd6d901b6de5ee305297b2fc07280a"} Mar 19 09:42:46 crc kubenswrapper[4835]: I0319 09:42:46.133290 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-44rxf" event={"ID":"8173c77e-48ec-44fc-9be7-67381528f78a","Type":"ContainerStarted","Data":"2670324746b8a418232c8a70ba4a46337ae797081cfcda2a26c9811ff9b3b6ba"} Mar 19 09:42:46 crc kubenswrapper[4835]: I0319 09:42:46.133309 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-44rxf" 
event={"ID":"8173c77e-48ec-44fc-9be7-67381528f78a","Type":"ContainerStarted","Data":"8b1c5153b72afbc84ef61d23ab3213c05aa5f50555c8f6ab7ab6289373a08e7f"} Mar 19 09:42:46 crc kubenswrapper[4835]: I0319 09:42:46.133324 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-44rxf" event={"ID":"8173c77e-48ec-44fc-9be7-67381528f78a","Type":"ContainerStarted","Data":"ae03816819220c6edb9f8e7928abc6ed18f2726bdbc02970897abc5dcd4c3ba3"} Mar 19 09:42:46 crc kubenswrapper[4835]: I0319 09:42:46.133335 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-44rxf" event={"ID":"8173c77e-48ec-44fc-9be7-67381528f78a","Type":"ContainerStarted","Data":"950062dd692baeb4800174478a0712b6b269d28daa10f23a95333aa34f7da980"} Mar 19 09:42:47 crc kubenswrapper[4835]: I0319 09:42:47.143374 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-44rxf" event={"ID":"8173c77e-48ec-44fc-9be7-67381528f78a","Type":"ContainerStarted","Data":"acd3d19b39412e21364eaf879f5f92d89837bd7ebdc3258ae8893b2233d1ec60"} Mar 19 09:42:47 crc kubenswrapper[4835]: I0319 09:42:47.143869 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-44rxf" Mar 19 09:42:47 crc kubenswrapper[4835]: I0319 09:42:47.171981 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-44rxf" podStartSLOduration=6.67304598 podStartE2EDuration="14.171963231s" podCreationTimestamp="2026-03-19 09:42:33 +0000 UTC" firstStartedPulling="2026-03-19 09:42:34.659901519 +0000 UTC m=+1209.508500106" lastFinishedPulling="2026-03-19 09:42:42.15881877 +0000 UTC m=+1217.007417357" observedRunningTime="2026-03-19 09:42:47.171341634 +0000 UTC m=+1222.019940291" watchObservedRunningTime="2026-03-19 09:42:47.171963231 +0000 UTC m=+1222.020561818" Mar 19 09:42:47 crc kubenswrapper[4835]: I0319 09:42:47.558620 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/speaker-zfhzl" Mar 19 09:42:49 crc kubenswrapper[4835]: I0319 09:42:49.474547 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-44rxf" Mar 19 09:42:49 crc kubenswrapper[4835]: I0319 09:42:49.524718 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-44rxf" Mar 19 09:42:50 crc kubenswrapper[4835]: I0319 09:42:50.517241 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-9rjkz"] Mar 19 09:42:50 crc kubenswrapper[4835]: I0319 09:42:50.519231 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-9rjkz" Mar 19 09:42:50 crc kubenswrapper[4835]: I0319 09:42:50.521889 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-nwg22" Mar 19 09:42:50 crc kubenswrapper[4835]: I0319 09:42:50.523795 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 19 09:42:50 crc kubenswrapper[4835]: I0319 09:42:50.536487 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7fll\" (UniqueName: \"kubernetes.io/projected/d0de5d5a-7770-412f-848c-e9f60f8c7307-kube-api-access-p7fll\") pod \"openstack-operator-index-9rjkz\" (UID: \"d0de5d5a-7770-412f-848c-e9f60f8c7307\") " pod="openstack-operators/openstack-operator-index-9rjkz" Mar 19 09:42:50 crc kubenswrapper[4835]: I0319 09:42:50.538853 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 19 09:42:50 crc kubenswrapper[4835]: I0319 09:42:50.567490 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-9rjkz"] Mar 19 09:42:50 crc kubenswrapper[4835]: I0319 09:42:50.638377 4835 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7fll\" (UniqueName: \"kubernetes.io/projected/d0de5d5a-7770-412f-848c-e9f60f8c7307-kube-api-access-p7fll\") pod \"openstack-operator-index-9rjkz\" (UID: \"d0de5d5a-7770-412f-848c-e9f60f8c7307\") " pod="openstack-operators/openstack-operator-index-9rjkz" Mar 19 09:42:50 crc kubenswrapper[4835]: I0319 09:42:50.658801 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7fll\" (UniqueName: \"kubernetes.io/projected/d0de5d5a-7770-412f-848c-e9f60f8c7307-kube-api-access-p7fll\") pod \"openstack-operator-index-9rjkz\" (UID: \"d0de5d5a-7770-412f-848c-e9f60f8c7307\") " pod="openstack-operators/openstack-operator-index-9rjkz" Mar 19 09:42:50 crc kubenswrapper[4835]: I0319 09:42:50.840676 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-9rjkz" Mar 19 09:42:51 crc kubenswrapper[4835]: I0319 09:42:51.266374 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-9rjkz"] Mar 19 09:42:51 crc kubenswrapper[4835]: W0319 09:42:51.273429 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0de5d5a_7770_412f_848c_e9f60f8c7307.slice/crio-7f223a54e048e4b5b5f12e3139f6609ab2d7535aea153f46ac3345cd16c4f859 WatchSource:0}: Error finding container 7f223a54e048e4b5b5f12e3139f6609ab2d7535aea153f46ac3345cd16c4f859: Status 404 returned error can't find the container with id 7f223a54e048e4b5b5f12e3139f6609ab2d7535aea153f46ac3345cd16c4f859 Mar 19 09:42:52 crc kubenswrapper[4835]: I0319 09:42:52.193554 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9rjkz" event={"ID":"d0de5d5a-7770-412f-848c-e9f60f8c7307","Type":"ContainerStarted","Data":"7f223a54e048e4b5b5f12e3139f6609ab2d7535aea153f46ac3345cd16c4f859"} Mar 
19 09:42:53 crc kubenswrapper[4835]: I0319 09:42:53.089460 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-9rjkz"] Mar 19 09:42:53 crc kubenswrapper[4835]: I0319 09:42:53.504609 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-7pjhg"] Mar 19 09:42:53 crc kubenswrapper[4835]: I0319 09:42:53.505694 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7pjhg" Mar 19 09:42:53 crc kubenswrapper[4835]: I0319 09:42:53.521152 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7pjhg"] Mar 19 09:42:53 crc kubenswrapper[4835]: I0319 09:42:53.601834 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbk48\" (UniqueName: \"kubernetes.io/projected/d95e8e04-5d17-45e5-aeb8-d6857bfc9d01-kube-api-access-mbk48\") pod \"openstack-operator-index-7pjhg\" (UID: \"d95e8e04-5d17-45e5-aeb8-d6857bfc9d01\") " pod="openstack-operators/openstack-operator-index-7pjhg" Mar 19 09:42:53 crc kubenswrapper[4835]: I0319 09:42:53.703384 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbk48\" (UniqueName: \"kubernetes.io/projected/d95e8e04-5d17-45e5-aeb8-d6857bfc9d01-kube-api-access-mbk48\") pod \"openstack-operator-index-7pjhg\" (UID: \"d95e8e04-5d17-45e5-aeb8-d6857bfc9d01\") " pod="openstack-operators/openstack-operator-index-7pjhg" Mar 19 09:42:53 crc kubenswrapper[4835]: I0319 09:42:53.721290 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbk48\" (UniqueName: \"kubernetes.io/projected/d95e8e04-5d17-45e5-aeb8-d6857bfc9d01-kube-api-access-mbk48\") pod \"openstack-operator-index-7pjhg\" (UID: \"d95e8e04-5d17-45e5-aeb8-d6857bfc9d01\") " pod="openstack-operators/openstack-operator-index-7pjhg" Mar 19 09:42:53 crc 
kubenswrapper[4835]: I0319 09:42:53.830719 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7pjhg" Mar 19 09:42:54 crc kubenswrapper[4835]: I0319 09:42:54.489762 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-g4bvm" Mar 19 09:42:54 crc kubenswrapper[4835]: I0319 09:42:54.592263 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-47sfq" Mar 19 09:42:55 crc kubenswrapper[4835]: I0319 09:42:55.469588 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7pjhg"] Mar 19 09:42:55 crc kubenswrapper[4835]: W0319 09:42:55.470628 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd95e8e04_5d17_45e5_aeb8_d6857bfc9d01.slice/crio-aa77644bd1577b8c11203f6fec3549e58039680ab345e5364e830e0aef5f7a4d WatchSource:0}: Error finding container aa77644bd1577b8c11203f6fec3549e58039680ab345e5364e830e0aef5f7a4d: Status 404 returned error can't find the container with id aa77644bd1577b8c11203f6fec3549e58039680ab345e5364e830e0aef5f7a4d Mar 19 09:42:56 crc kubenswrapper[4835]: I0319 09:42:56.227041 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9rjkz" event={"ID":"d0de5d5a-7770-412f-848c-e9f60f8c7307","Type":"ContainerStarted","Data":"faf654d6387bc985c68d8ffb05030cb75d402a0c220be5845f6524821b7619ee"} Mar 19 09:42:56 crc kubenswrapper[4835]: I0319 09:42:56.227087 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-9rjkz" podUID="d0de5d5a-7770-412f-848c-e9f60f8c7307" containerName="registry-server" containerID="cri-o://faf654d6387bc985c68d8ffb05030cb75d402a0c220be5845f6524821b7619ee" gracePeriod=2 Mar 19 09:42:56 crc 
kubenswrapper[4835]: I0319 09:42:56.229317 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7pjhg" event={"ID":"d95e8e04-5d17-45e5-aeb8-d6857bfc9d01","Type":"ContainerStarted","Data":"d7893d110d6e01606a62e0ec1eb6f02d747b64208ade1d58054cc369010d4d85"} Mar 19 09:42:56 crc kubenswrapper[4835]: I0319 09:42:56.229366 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7pjhg" event={"ID":"d95e8e04-5d17-45e5-aeb8-d6857bfc9d01","Type":"ContainerStarted","Data":"aa77644bd1577b8c11203f6fec3549e58039680ab345e5364e830e0aef5f7a4d"} Mar 19 09:42:56 crc kubenswrapper[4835]: I0319 09:42:56.248475 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-9rjkz" podStartSLOduration=2.465654512 podStartE2EDuration="6.248453792s" podCreationTimestamp="2026-03-19 09:42:50 +0000 UTC" firstStartedPulling="2026-03-19 09:42:51.278728109 +0000 UTC m=+1226.127326696" lastFinishedPulling="2026-03-19 09:42:55.061527389 +0000 UTC m=+1229.910125976" observedRunningTime="2026-03-19 09:42:56.245934015 +0000 UTC m=+1231.094532602" watchObservedRunningTime="2026-03-19 09:42:56.248453792 +0000 UTC m=+1231.097052379" Mar 19 09:42:57 crc kubenswrapper[4835]: I0319 09:42:57.124896 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-9rjkz" Mar 19 09:42:57 crc kubenswrapper[4835]: I0319 09:42:57.151109 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-7pjhg" podStartSLOduration=4.004603071 podStartE2EDuration="4.151087414s" podCreationTimestamp="2026-03-19 09:42:53 +0000 UTC" firstStartedPulling="2026-03-19 09:42:55.474616518 +0000 UTC m=+1230.323215105" lastFinishedPulling="2026-03-19 09:42:55.621100861 +0000 UTC m=+1230.469699448" observedRunningTime="2026-03-19 09:42:56.259575181 +0000 UTC m=+1231.108173798" watchObservedRunningTime="2026-03-19 09:42:57.151087414 +0000 UTC m=+1231.999686011" Mar 19 09:42:57 crc kubenswrapper[4835]: I0319 09:42:57.237529 4835 generic.go:334] "Generic (PLEG): container finished" podID="d0de5d5a-7770-412f-848c-e9f60f8c7307" containerID="faf654d6387bc985c68d8ffb05030cb75d402a0c220be5845f6524821b7619ee" exitCode=0 Mar 19 09:42:57 crc kubenswrapper[4835]: I0319 09:42:57.237581 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-9rjkz" Mar 19 09:42:57 crc kubenswrapper[4835]: I0319 09:42:57.237618 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9rjkz" event={"ID":"d0de5d5a-7770-412f-848c-e9f60f8c7307","Type":"ContainerDied","Data":"faf654d6387bc985c68d8ffb05030cb75d402a0c220be5845f6524821b7619ee"} Mar 19 09:42:57 crc kubenswrapper[4835]: I0319 09:42:57.237647 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9rjkz" event={"ID":"d0de5d5a-7770-412f-848c-e9f60f8c7307","Type":"ContainerDied","Data":"7f223a54e048e4b5b5f12e3139f6609ab2d7535aea153f46ac3345cd16c4f859"} Mar 19 09:42:57 crc kubenswrapper[4835]: I0319 09:42:57.237668 4835 scope.go:117] "RemoveContainer" containerID="faf654d6387bc985c68d8ffb05030cb75d402a0c220be5845f6524821b7619ee" Mar 19 09:42:57 crc kubenswrapper[4835]: I0319 09:42:57.257577 4835 scope.go:117] "RemoveContainer" containerID="faf654d6387bc985c68d8ffb05030cb75d402a0c220be5845f6524821b7619ee" Mar 19 09:42:57 crc kubenswrapper[4835]: E0319 09:42:57.258047 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"faf654d6387bc985c68d8ffb05030cb75d402a0c220be5845f6524821b7619ee\": container with ID starting with faf654d6387bc985c68d8ffb05030cb75d402a0c220be5845f6524821b7619ee not found: ID does not exist" containerID="faf654d6387bc985c68d8ffb05030cb75d402a0c220be5845f6524821b7619ee" Mar 19 09:42:57 crc kubenswrapper[4835]: I0319 09:42:57.258139 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faf654d6387bc985c68d8ffb05030cb75d402a0c220be5845f6524821b7619ee"} err="failed to get container status \"faf654d6387bc985c68d8ffb05030cb75d402a0c220be5845f6524821b7619ee\": rpc error: code = NotFound desc = could not find container 
\"faf654d6387bc985c68d8ffb05030cb75d402a0c220be5845f6524821b7619ee\": container with ID starting with faf654d6387bc985c68d8ffb05030cb75d402a0c220be5845f6524821b7619ee not found: ID does not exist" Mar 19 09:42:57 crc kubenswrapper[4835]: I0319 09:42:57.262024 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7fll\" (UniqueName: \"kubernetes.io/projected/d0de5d5a-7770-412f-848c-e9f60f8c7307-kube-api-access-p7fll\") pod \"d0de5d5a-7770-412f-848c-e9f60f8c7307\" (UID: \"d0de5d5a-7770-412f-848c-e9f60f8c7307\") " Mar 19 09:42:57 crc kubenswrapper[4835]: I0319 09:42:57.267031 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0de5d5a-7770-412f-848c-e9f60f8c7307-kube-api-access-p7fll" (OuterVolumeSpecName: "kube-api-access-p7fll") pod "d0de5d5a-7770-412f-848c-e9f60f8c7307" (UID: "d0de5d5a-7770-412f-848c-e9f60f8c7307"). InnerVolumeSpecName "kube-api-access-p7fll". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:42:57 crc kubenswrapper[4835]: I0319 09:42:57.364171 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7fll\" (UniqueName: \"kubernetes.io/projected/d0de5d5a-7770-412f-848c-e9f60f8c7307-kube-api-access-p7fll\") on node \"crc\" DevicePath \"\"" Mar 19 09:42:57 crc kubenswrapper[4835]: I0319 09:42:57.572482 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-9rjkz"] Mar 19 09:42:57 crc kubenswrapper[4835]: I0319 09:42:57.578943 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-9rjkz"] Mar 19 09:42:58 crc kubenswrapper[4835]: I0319 09:42:58.418961 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0de5d5a-7770-412f-848c-e9f60f8c7307" path="/var/lib/kubelet/pods/d0de5d5a-7770-412f-848c-e9f60f8c7307/volumes" Mar 19 09:43:03 crc kubenswrapper[4835]: I0319 09:43:03.831733 4835 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-7pjhg" Mar 19 09:43:03 crc kubenswrapper[4835]: I0319 09:43:03.832232 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-7pjhg" Mar 19 09:43:03 crc kubenswrapper[4835]: I0319 09:43:03.867662 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-7pjhg" Mar 19 09:43:04 crc kubenswrapper[4835]: I0319 09:43:04.363151 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-7pjhg" Mar 19 09:43:04 crc kubenswrapper[4835]: I0319 09:43:04.477226 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-44rxf" Mar 19 09:43:06 crc kubenswrapper[4835]: I0319 09:43:06.422094 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 09:43:06 crc kubenswrapper[4835]: I0319 09:43:06.422382 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 09:43:06 crc kubenswrapper[4835]: I0319 09:43:06.422416 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" Mar 19 09:43:06 crc kubenswrapper[4835]: I0319 09:43:06.423204 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"0972a6e7a053fa2e9ebcd3097c3b2861379219ae12b073a6c01213db93a74f6f"} pod="openshift-machine-config-operator/machine-config-daemon-bk84k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 09:43:06 crc kubenswrapper[4835]: I0319 09:43:06.423270 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" containerID="cri-o://0972a6e7a053fa2e9ebcd3097c3b2861379219ae12b073a6c01213db93a74f6f" gracePeriod=600 Mar 19 09:43:07 crc kubenswrapper[4835]: I0319 09:43:07.348443 4835 generic.go:334] "Generic (PLEG): container finished" podID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerID="0972a6e7a053fa2e9ebcd3097c3b2861379219ae12b073a6c01213db93a74f6f" exitCode=0 Mar 19 09:43:07 crc kubenswrapper[4835]: I0319 09:43:07.348552 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerDied","Data":"0972a6e7a053fa2e9ebcd3097c3b2861379219ae12b073a6c01213db93a74f6f"} Mar 19 09:43:07 crc kubenswrapper[4835]: I0319 09:43:07.349391 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerStarted","Data":"22d4a3df3b8da0d3a7089e100b70f5eb6f45762cc59f432226a8ced697a6a8d9"} Mar 19 09:43:07 crc kubenswrapper[4835]: I0319 09:43:07.349429 4835 scope.go:117] "RemoveContainer" containerID="ad7bf0e681d0b5c56ea72d8f084e643a38eb7b2896b2748995289d8ab657401b" Mar 19 09:43:10 crc kubenswrapper[4835]: I0319 09:43:10.320686 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6"] Mar 19 09:43:10 crc 
kubenswrapper[4835]: E0319 09:43:10.321347 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0de5d5a-7770-412f-848c-e9f60f8c7307" containerName="registry-server" Mar 19 09:43:10 crc kubenswrapper[4835]: I0319 09:43:10.321362 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0de5d5a-7770-412f-848c-e9f60f8c7307" containerName="registry-server" Mar 19 09:43:10 crc kubenswrapper[4835]: I0319 09:43:10.321492 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0de5d5a-7770-412f-848c-e9f60f8c7307" containerName="registry-server" Mar 19 09:43:10 crc kubenswrapper[4835]: I0319 09:43:10.322559 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6" Mar 19 09:43:10 crc kubenswrapper[4835]: I0319 09:43:10.324531 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-q7bjz" Mar 19 09:43:10 crc kubenswrapper[4835]: I0319 09:43:10.332992 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6"] Mar 19 09:43:10 crc kubenswrapper[4835]: I0319 09:43:10.426882 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlzm7\" (UniqueName: \"kubernetes.io/projected/a8c35d20-da98-4fdc-9e0b-1d3dea603055-kube-api-access-qlzm7\") pod \"b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6\" (UID: \"a8c35d20-da98-4fdc-9e0b-1d3dea603055\") " pod="openstack-operators/b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6" Mar 19 09:43:10 crc kubenswrapper[4835]: I0319 09:43:10.427030 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8c35d20-da98-4fdc-9e0b-1d3dea603055-util\") pod 
\"b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6\" (UID: \"a8c35d20-da98-4fdc-9e0b-1d3dea603055\") " pod="openstack-operators/b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6" Mar 19 09:43:10 crc kubenswrapper[4835]: I0319 09:43:10.427164 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8c35d20-da98-4fdc-9e0b-1d3dea603055-bundle\") pod \"b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6\" (UID: \"a8c35d20-da98-4fdc-9e0b-1d3dea603055\") " pod="openstack-operators/b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6" Mar 19 09:43:10 crc kubenswrapper[4835]: I0319 09:43:10.528573 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8c35d20-da98-4fdc-9e0b-1d3dea603055-util\") pod \"b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6\" (UID: \"a8c35d20-da98-4fdc-9e0b-1d3dea603055\") " pod="openstack-operators/b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6" Mar 19 09:43:10 crc kubenswrapper[4835]: I0319 09:43:10.528655 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8c35d20-da98-4fdc-9e0b-1d3dea603055-bundle\") pod \"b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6\" (UID: \"a8c35d20-da98-4fdc-9e0b-1d3dea603055\") " pod="openstack-operators/b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6" Mar 19 09:43:10 crc kubenswrapper[4835]: I0319 09:43:10.528776 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlzm7\" (UniqueName: \"kubernetes.io/projected/a8c35d20-da98-4fdc-9e0b-1d3dea603055-kube-api-access-qlzm7\") pod \"b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6\" (UID: \"a8c35d20-da98-4fdc-9e0b-1d3dea603055\") " 
pod="openstack-operators/b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6" Mar 19 09:43:10 crc kubenswrapper[4835]: I0319 09:43:10.529858 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8c35d20-da98-4fdc-9e0b-1d3dea603055-util\") pod \"b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6\" (UID: \"a8c35d20-da98-4fdc-9e0b-1d3dea603055\") " pod="openstack-operators/b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6" Mar 19 09:43:10 crc kubenswrapper[4835]: I0319 09:43:10.529884 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8c35d20-da98-4fdc-9e0b-1d3dea603055-bundle\") pod \"b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6\" (UID: \"a8c35d20-da98-4fdc-9e0b-1d3dea603055\") " pod="openstack-operators/b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6" Mar 19 09:43:10 crc kubenswrapper[4835]: I0319 09:43:10.548396 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlzm7\" (UniqueName: \"kubernetes.io/projected/a8c35d20-da98-4fdc-9e0b-1d3dea603055-kube-api-access-qlzm7\") pod \"b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6\" (UID: \"a8c35d20-da98-4fdc-9e0b-1d3dea603055\") " pod="openstack-operators/b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6" Mar 19 09:43:10 crc kubenswrapper[4835]: I0319 09:43:10.688147 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6" Mar 19 09:43:11 crc kubenswrapper[4835]: I0319 09:43:11.115617 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6"] Mar 19 09:43:11 crc kubenswrapper[4835]: I0319 09:43:11.387291 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6" event={"ID":"a8c35d20-da98-4fdc-9e0b-1d3dea603055","Type":"ContainerStarted","Data":"12efcc57245a5ebb0a49f2b43d346d52583cb3090913aead19a50e5cf28dac3e"} Mar 19 09:43:11 crc kubenswrapper[4835]: I0319 09:43:11.387622 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6" event={"ID":"a8c35d20-da98-4fdc-9e0b-1d3dea603055","Type":"ContainerStarted","Data":"93e85082dec8205dec74d9f63629fab1f6bf25cccefba43d2ecd199ce4fa9b8d"} Mar 19 09:43:12 crc kubenswrapper[4835]: I0319 09:43:12.400292 4835 generic.go:334] "Generic (PLEG): container finished" podID="a8c35d20-da98-4fdc-9e0b-1d3dea603055" containerID="12efcc57245a5ebb0a49f2b43d346d52583cb3090913aead19a50e5cf28dac3e" exitCode=0 Mar 19 09:43:12 crc kubenswrapper[4835]: I0319 09:43:12.400524 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6" event={"ID":"a8c35d20-da98-4fdc-9e0b-1d3dea603055","Type":"ContainerDied","Data":"12efcc57245a5ebb0a49f2b43d346d52583cb3090913aead19a50e5cf28dac3e"} Mar 19 09:43:13 crc kubenswrapper[4835]: I0319 09:43:13.413594 4835 generic.go:334] "Generic (PLEG): container finished" podID="a8c35d20-da98-4fdc-9e0b-1d3dea603055" containerID="9639e9715aeefcb0ddce524c0d926a9657ddd10af724101a1370afbb6c6fc0ab" exitCode=0 Mar 19 09:43:13 crc kubenswrapper[4835]: I0319 09:43:13.413728 4835 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6" event={"ID":"a8c35d20-da98-4fdc-9e0b-1d3dea603055","Type":"ContainerDied","Data":"9639e9715aeefcb0ddce524c0d926a9657ddd10af724101a1370afbb6c6fc0ab"} Mar 19 09:43:14 crc kubenswrapper[4835]: I0319 09:43:14.423633 4835 generic.go:334] "Generic (PLEG): container finished" podID="a8c35d20-da98-4fdc-9e0b-1d3dea603055" containerID="8b7ff52ae0a1cda97e5fe35480dd26e0e10b9991ba37812ea07738a5492bc314" exitCode=0 Mar 19 09:43:14 crc kubenswrapper[4835]: I0319 09:43:14.423691 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6" event={"ID":"a8c35d20-da98-4fdc-9e0b-1d3dea603055","Type":"ContainerDied","Data":"8b7ff52ae0a1cda97e5fe35480dd26e0e10b9991ba37812ea07738a5492bc314"} Mar 19 09:43:15 crc kubenswrapper[4835]: I0319 09:43:15.807512 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6" Mar 19 09:43:15 crc kubenswrapper[4835]: I0319 09:43:15.922473 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8c35d20-da98-4fdc-9e0b-1d3dea603055-bundle\") pod \"a8c35d20-da98-4fdc-9e0b-1d3dea603055\" (UID: \"a8c35d20-da98-4fdc-9e0b-1d3dea603055\") " Mar 19 09:43:15 crc kubenswrapper[4835]: I0319 09:43:15.922930 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlzm7\" (UniqueName: \"kubernetes.io/projected/a8c35d20-da98-4fdc-9e0b-1d3dea603055-kube-api-access-qlzm7\") pod \"a8c35d20-da98-4fdc-9e0b-1d3dea603055\" (UID: \"a8c35d20-da98-4fdc-9e0b-1d3dea603055\") " Mar 19 09:43:15 crc kubenswrapper[4835]: I0319 09:43:15.923038 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8c35d20-da98-4fdc-9e0b-1d3dea603055-util\") pod \"a8c35d20-da98-4fdc-9e0b-1d3dea603055\" (UID: \"a8c35d20-da98-4fdc-9e0b-1d3dea603055\") " Mar 19 09:43:15 crc kubenswrapper[4835]: I0319 09:43:15.923920 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8c35d20-da98-4fdc-9e0b-1d3dea603055-bundle" (OuterVolumeSpecName: "bundle") pod "a8c35d20-da98-4fdc-9e0b-1d3dea603055" (UID: "a8c35d20-da98-4fdc-9e0b-1d3dea603055"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:43:15 crc kubenswrapper[4835]: I0319 09:43:15.932985 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8c35d20-da98-4fdc-9e0b-1d3dea603055-kube-api-access-qlzm7" (OuterVolumeSpecName: "kube-api-access-qlzm7") pod "a8c35d20-da98-4fdc-9e0b-1d3dea603055" (UID: "a8c35d20-da98-4fdc-9e0b-1d3dea603055"). InnerVolumeSpecName "kube-api-access-qlzm7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:43:15 crc kubenswrapper[4835]: I0319 09:43:15.940329 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8c35d20-da98-4fdc-9e0b-1d3dea603055-util" (OuterVolumeSpecName: "util") pod "a8c35d20-da98-4fdc-9e0b-1d3dea603055" (UID: "a8c35d20-da98-4fdc-9e0b-1d3dea603055"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:43:16 crc kubenswrapper[4835]: I0319 09:43:16.024815 4835 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8c35d20-da98-4fdc-9e0b-1d3dea603055-util\") on node \"crc\" DevicePath \"\"" Mar 19 09:43:16 crc kubenswrapper[4835]: I0319 09:43:16.024862 4835 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8c35d20-da98-4fdc-9e0b-1d3dea603055-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:43:16 crc kubenswrapper[4835]: I0319 09:43:16.024881 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlzm7\" (UniqueName: \"kubernetes.io/projected/a8c35d20-da98-4fdc-9e0b-1d3dea603055-kube-api-access-qlzm7\") on node \"crc\" DevicePath \"\"" Mar 19 09:43:16 crc kubenswrapper[4835]: I0319 09:43:16.451568 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6" event={"ID":"a8c35d20-da98-4fdc-9e0b-1d3dea603055","Type":"ContainerDied","Data":"93e85082dec8205dec74d9f63629fab1f6bf25cccefba43d2ecd199ce4fa9b8d"} Mar 19 09:43:16 crc kubenswrapper[4835]: I0319 09:43:16.451772 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93e85082dec8205dec74d9f63629fab1f6bf25cccefba43d2ecd199ce4fa9b8d" Mar 19 09:43:16 crc kubenswrapper[4835]: I0319 09:43:16.451635 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6" Mar 19 09:43:22 crc kubenswrapper[4835]: I0319 09:43:22.589089 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6674476dff-54wn4"] Mar 19 09:43:22 crc kubenswrapper[4835]: E0319 09:43:22.589856 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8c35d20-da98-4fdc-9e0b-1d3dea603055" containerName="util" Mar 19 09:43:22 crc kubenswrapper[4835]: I0319 09:43:22.589867 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8c35d20-da98-4fdc-9e0b-1d3dea603055" containerName="util" Mar 19 09:43:22 crc kubenswrapper[4835]: E0319 09:43:22.589886 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8c35d20-da98-4fdc-9e0b-1d3dea603055" containerName="pull" Mar 19 09:43:22 crc kubenswrapper[4835]: I0319 09:43:22.589892 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8c35d20-da98-4fdc-9e0b-1d3dea603055" containerName="pull" Mar 19 09:43:22 crc kubenswrapper[4835]: E0319 09:43:22.589906 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8c35d20-da98-4fdc-9e0b-1d3dea603055" containerName="extract" Mar 19 09:43:22 crc kubenswrapper[4835]: I0319 09:43:22.589913 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8c35d20-da98-4fdc-9e0b-1d3dea603055" containerName="extract" Mar 19 09:43:22 crc kubenswrapper[4835]: I0319 09:43:22.590074 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8c35d20-da98-4fdc-9e0b-1d3dea603055" containerName="extract" Mar 19 09:43:22 crc kubenswrapper[4835]: I0319 09:43:22.590595 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6674476dff-54wn4" Mar 19 09:43:22 crc kubenswrapper[4835]: I0319 09:43:22.593347 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-jd2v4" Mar 19 09:43:22 crc kubenswrapper[4835]: I0319 09:43:22.626966 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6674476dff-54wn4"] Mar 19 09:43:22 crc kubenswrapper[4835]: I0319 09:43:22.655491 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdc8x\" (UniqueName: \"kubernetes.io/projected/2021ffd4-3c7f-477a-8816-c9b382d640c7-kube-api-access-kdc8x\") pod \"openstack-operator-controller-init-6674476dff-54wn4\" (UID: \"2021ffd4-3c7f-477a-8816-c9b382d640c7\") " pod="openstack-operators/openstack-operator-controller-init-6674476dff-54wn4" Mar 19 09:43:22 crc kubenswrapper[4835]: I0319 09:43:22.756200 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdc8x\" (UniqueName: \"kubernetes.io/projected/2021ffd4-3c7f-477a-8816-c9b382d640c7-kube-api-access-kdc8x\") pod \"openstack-operator-controller-init-6674476dff-54wn4\" (UID: \"2021ffd4-3c7f-477a-8816-c9b382d640c7\") " pod="openstack-operators/openstack-operator-controller-init-6674476dff-54wn4" Mar 19 09:43:22 crc kubenswrapper[4835]: I0319 09:43:22.779421 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdc8x\" (UniqueName: \"kubernetes.io/projected/2021ffd4-3c7f-477a-8816-c9b382d640c7-kube-api-access-kdc8x\") pod \"openstack-operator-controller-init-6674476dff-54wn4\" (UID: \"2021ffd4-3c7f-477a-8816-c9b382d640c7\") " pod="openstack-operators/openstack-operator-controller-init-6674476dff-54wn4" Mar 19 09:43:22 crc kubenswrapper[4835]: I0319 09:43:22.909722 4835 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6674476dff-54wn4" Mar 19 09:43:23 crc kubenswrapper[4835]: I0319 09:43:23.418770 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6674476dff-54wn4"] Mar 19 09:43:23 crc kubenswrapper[4835]: I0319 09:43:23.543317 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6674476dff-54wn4" event={"ID":"2021ffd4-3c7f-477a-8816-c9b382d640c7","Type":"ContainerStarted","Data":"78607e102154e43a0cda24858f8b7b70583be76f177737fbb41abaa8e5ca7588"} Mar 19 09:43:27 crc kubenswrapper[4835]: I0319 09:43:27.597149 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6674476dff-54wn4" event={"ID":"2021ffd4-3c7f-477a-8816-c9b382d640c7","Type":"ContainerStarted","Data":"979bb4f3856fc6bb0ec55a058f4e39f26f93267184523224e57fd0d91ae6cfe5"} Mar 19 09:43:27 crc kubenswrapper[4835]: I0319 09:43:27.598624 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6674476dff-54wn4" Mar 19 09:43:32 crc kubenswrapper[4835]: I0319 09:43:32.913153 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6674476dff-54wn4" Mar 19 09:43:32 crc kubenswrapper[4835]: I0319 09:43:32.970316 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6674476dff-54wn4" podStartSLOduration=7.182046955 podStartE2EDuration="10.970284273s" podCreationTimestamp="2026-03-19 09:43:22 +0000 UTC" firstStartedPulling="2026-03-19 09:43:23.454879797 +0000 UTC m=+1258.303478384" lastFinishedPulling="2026-03-19 09:43:27.243117115 +0000 UTC m=+1262.091715702" observedRunningTime="2026-03-19 09:43:27.635839368 +0000 UTC 
m=+1262.484438005" watchObservedRunningTime="2026-03-19 09:43:32.970284273 +0000 UTC m=+1267.818882900" Mar 19 09:43:52 crc kubenswrapper[4835]: I0319 09:43:52.810275 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5cfd84c587-t9dbc"] Mar 19 09:43:52 crc kubenswrapper[4835]: I0319 09:43:52.812111 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-t9dbc" Mar 19 09:43:52 crc kubenswrapper[4835]: I0319 09:43:52.814022 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-6lv5g" Mar 19 09:43:52 crc kubenswrapper[4835]: I0319 09:43:52.828465 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5cfd84c587-t9dbc"] Mar 19 09:43:52 crc kubenswrapper[4835]: I0319 09:43:52.857828 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6d77645966-dhr9f"] Mar 19 09:43:52 crc kubenswrapper[4835]: I0319 09:43:52.859118 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-dhr9f" Mar 19 09:43:52 crc kubenswrapper[4835]: I0319 09:43:52.861283 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-j5ztl" Mar 19 09:43:52 crc kubenswrapper[4835]: I0319 09:43:52.867866 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbspl\" (UniqueName: \"kubernetes.io/projected/e01500c9-ffdf-42df-9017-26b813efed0e-kube-api-access-cbspl\") pod \"barbican-operator-controller-manager-5cfd84c587-t9dbc\" (UID: \"e01500c9-ffdf-42df-9017-26b813efed0e\") " pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-t9dbc" Mar 19 09:43:52 crc kubenswrapper[4835]: I0319 09:43:52.876658 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6cc65c69fc-bzhnq"] Mar 19 09:43:52 crc kubenswrapper[4835]: I0319 09:43:52.877887 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-bzhnq" Mar 19 09:43:52 crc kubenswrapper[4835]: I0319 09:43:52.886921 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-2g2tk" Mar 19 09:43:52 crc kubenswrapper[4835]: I0319 09:43:52.896814 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6d77645966-dhr9f"] Mar 19 09:43:52 crc kubenswrapper[4835]: I0319 09:43:52.921879 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6cc65c69fc-bzhnq"] Mar 19 09:43:52 crc kubenswrapper[4835]: I0319 09:43:52.943649 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-7d559dcdbd-9rtgc"] Mar 19 09:43:52 crc kubenswrapper[4835]: I0319 09:43:52.944990 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-9rtgc" Mar 19 09:43:52 crc kubenswrapper[4835]: I0319 09:43:52.949506 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-87btj" Mar 19 09:43:52 crc kubenswrapper[4835]: I0319 09:43:52.969912 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft2rb\" (UniqueName: \"kubernetes.io/projected/9eeda748-4895-4cbc-be03-d1edabf69758-kube-api-access-ft2rb\") pod \"glance-operator-controller-manager-7d559dcdbd-9rtgc\" (UID: \"9eeda748-4895-4cbc-be03-d1edabf69758\") " pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-9rtgc" Mar 19 09:43:52 crc kubenswrapper[4835]: I0319 09:43:52.970016 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbspl\" (UniqueName: \"kubernetes.io/projected/e01500c9-ffdf-42df-9017-26b813efed0e-kube-api-access-cbspl\") pod \"barbican-operator-controller-manager-5cfd84c587-t9dbc\" (UID: \"e01500c9-ffdf-42df-9017-26b813efed0e\") " pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-t9dbc" Mar 19 09:43:52 crc kubenswrapper[4835]: I0319 09:43:52.970046 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd2tv\" (UniqueName: \"kubernetes.io/projected/5db66af1-4929-4163-b788-0320b0323a79-kube-api-access-wd2tv\") pod \"designate-operator-controller-manager-6cc65c69fc-bzhnq\" (UID: \"5db66af1-4929-4163-b788-0320b0323a79\") " pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-bzhnq" Mar 19 09:43:52 crc kubenswrapper[4835]: I0319 09:43:52.970088 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5hk6\" (UniqueName: 
\"kubernetes.io/projected/0b55921b-5660-44b9-abe9-09639766068c-kube-api-access-h5hk6\") pod \"cinder-operator-controller-manager-6d77645966-dhr9f\" (UID: \"0b55921b-5660-44b9-abe9-09639766068c\") " pod="openstack-operators/cinder-operator-controller-manager-6d77645966-dhr9f" Mar 19 09:43:52 crc kubenswrapper[4835]: I0319 09:43:52.981823 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-66dd9d474d-bjnww"] Mar 19 09:43:52 crc kubenswrapper[4835]: I0319 09:43:52.983220 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-bjnww" Mar 19 09:43:52 crc kubenswrapper[4835]: I0319 09:43:52.986783 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-zm6sp" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.007063 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7d559dcdbd-9rtgc"] Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.042808 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-64dc66d669-msxkb"] Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.043502 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbspl\" (UniqueName: \"kubernetes.io/projected/e01500c9-ffdf-42df-9017-26b813efed0e-kube-api-access-cbspl\") pod \"barbican-operator-controller-manager-5cfd84c587-t9dbc\" (UID: \"e01500c9-ffdf-42df-9017-26b813efed0e\") " pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-t9dbc" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.044045 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-msxkb" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.054528 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-rw7mj" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.078498 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztr45\" (UniqueName: \"kubernetes.io/projected/47fe8883-3cea-4985-8327-4ad721d4e128-kube-api-access-ztr45\") pod \"heat-operator-controller-manager-66dd9d474d-bjnww\" (UID: \"47fe8883-3cea-4985-8327-4ad721d4e128\") " pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-bjnww" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.078848 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft2rb\" (UniqueName: \"kubernetes.io/projected/9eeda748-4895-4cbc-be03-d1edabf69758-kube-api-access-ft2rb\") pod \"glance-operator-controller-manager-7d559dcdbd-9rtgc\" (UID: \"9eeda748-4895-4cbc-be03-d1edabf69758\") " pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-9rtgc" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.079034 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd2tv\" (UniqueName: \"kubernetes.io/projected/5db66af1-4929-4163-b788-0320b0323a79-kube-api-access-wd2tv\") pod \"designate-operator-controller-manager-6cc65c69fc-bzhnq\" (UID: \"5db66af1-4929-4163-b788-0320b0323a79\") " pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-bzhnq" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.079151 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brjbw\" (UniqueName: \"kubernetes.io/projected/6dabbeda-38a8-4289-92b6-075d7c20f958-kube-api-access-brjbw\") 
pod \"horizon-operator-controller-manager-64dc66d669-msxkb\" (UID: \"6dabbeda-38a8-4289-92b6-075d7c20f958\") " pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-msxkb" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.079264 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5hk6\" (UniqueName: \"kubernetes.io/projected/0b55921b-5660-44b9-abe9-09639766068c-kube-api-access-h5hk6\") pod \"cinder-operator-controller-manager-6d77645966-dhr9f\" (UID: \"0b55921b-5660-44b9-abe9-09639766068c\") " pod="openstack-operators/cinder-operator-controller-manager-6d77645966-dhr9f" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.100490 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-64dc66d669-msxkb"] Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.109298 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5595c7d6ff-zmwm8"] Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.110690 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-zmwm8" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.133764 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-t9dbc" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.134110 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.134582 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-hk9bw" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.149593 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6b77b7676d-mgjjt"] Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.185328 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-76b87776c9-79r7z"] Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.208963 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztr45\" (UniqueName: \"kubernetes.io/projected/47fe8883-3cea-4985-8327-4ad721d4e128-kube-api-access-ztr45\") pod \"heat-operator-controller-manager-66dd9d474d-bjnww\" (UID: \"47fe8883-3cea-4985-8327-4ad721d4e128\") " pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-bjnww" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.209187 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brjbw\" (UniqueName: \"kubernetes.io/projected/6dabbeda-38a8-4289-92b6-075d7c20f958-kube-api-access-brjbw\") pod \"horizon-operator-controller-manager-64dc66d669-msxkb\" (UID: \"6dabbeda-38a8-4289-92b6-075d7c20f958\") " pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-msxkb" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.216912 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5hk6\" (UniqueName: 
\"kubernetes.io/projected/0b55921b-5660-44b9-abe9-09639766068c-kube-api-access-h5hk6\") pod \"cinder-operator-controller-manager-6d77645966-dhr9f\" (UID: \"0b55921b-5660-44b9-abe9-09639766068c\") " pod="openstack-operators/cinder-operator-controller-manager-6d77645966-dhr9f" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.217527 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd2tv\" (UniqueName: \"kubernetes.io/projected/5db66af1-4929-4163-b788-0320b0323a79-kube-api-access-wd2tv\") pod \"designate-operator-controller-manager-6cc65c69fc-bzhnq\" (UID: \"5db66af1-4929-4163-b788-0320b0323a79\") " pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-bzhnq" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.231979 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-mgjjt" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.259425 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft2rb\" (UniqueName: \"kubernetes.io/projected/9eeda748-4895-4cbc-be03-d1edabf69758-kube-api-access-ft2rb\") pod \"glance-operator-controller-manager-7d559dcdbd-9rtgc\" (UID: \"9eeda748-4895-4cbc-be03-d1edabf69758\") " pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-9rtgc" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.292424 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-9rtgc" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.293915 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-rznmt" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.294763 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5595c7d6ff-zmwm8"] Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.294845 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-79r7z" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.302244 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brjbw\" (UniqueName: \"kubernetes.io/projected/6dabbeda-38a8-4289-92b6-075d7c20f958-kube-api-access-brjbw\") pod \"horizon-operator-controller-manager-64dc66d669-msxkb\" (UID: \"6dabbeda-38a8-4289-92b6-075d7c20f958\") " pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-msxkb" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.303031 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-dhr9f" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.303416 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6b77b7676d-mgjjt"] Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.310829 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ea1b1ba-826f-4abe-9c56-caedc3a178f9-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-zmwm8\" (UID: \"5ea1b1ba-826f-4abe-9c56-caedc3a178f9\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-zmwm8" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.311351 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5brxh\" (UniqueName: \"kubernetes.io/projected/aea377c2-d271-45c9-a574-0d1fe89caac5-kube-api-access-5brxh\") pod \"ironic-operator-controller-manager-6b77b7676d-mgjjt\" (UID: \"aea377c2-d271-45c9-a574-0d1fe89caac5\") " pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-mgjjt" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.311401 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wppl\" (UniqueName: \"kubernetes.io/projected/5ea1b1ba-826f-4abe-9c56-caedc3a178f9-kube-api-access-2wppl\") pod \"infra-operator-controller-manager-5595c7d6ff-zmwm8\" (UID: \"5ea1b1ba-826f-4abe-9c56-caedc3a178f9\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-zmwm8" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.317079 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-7mw7l" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.355502 4835 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-66dd9d474d-bjnww"] Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.383036 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-76b87776c9-79r7z"] Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.386328 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztr45\" (UniqueName: \"kubernetes.io/projected/47fe8883-3cea-4985-8327-4ad721d4e128-kube-api-access-ztr45\") pod \"heat-operator-controller-manager-66dd9d474d-bjnww\" (UID: \"47fe8883-3cea-4985-8327-4ad721d4e128\") " pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-bjnww" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.388702 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-msxkb" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.400821 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-fbf7bbb96-v6mc6"] Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.402291 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-v6mc6" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.409761 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-km4jv" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.426512 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m6ds\" (UniqueName: \"kubernetes.io/projected/0f2be57b-1212-4dc2-b772-3317bcd69aac-kube-api-access-2m6ds\") pod \"keystone-operator-controller-manager-76b87776c9-79r7z\" (UID: \"0f2be57b-1212-4dc2-b772-3317bcd69aac\") " pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-79r7z" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.426564 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5brxh\" (UniqueName: \"kubernetes.io/projected/aea377c2-d271-45c9-a574-0d1fe89caac5-kube-api-access-5brxh\") pod \"ironic-operator-controller-manager-6b77b7676d-mgjjt\" (UID: \"aea377c2-d271-45c9-a574-0d1fe89caac5\") " pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-mgjjt" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.426590 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wppl\" (UniqueName: \"kubernetes.io/projected/5ea1b1ba-826f-4abe-9c56-caedc3a178f9-kube-api-access-2wppl\") pod \"infra-operator-controller-manager-5595c7d6ff-zmwm8\" (UID: \"5ea1b1ba-826f-4abe-9c56-caedc3a178f9\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-zmwm8" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.426625 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ea1b1ba-826f-4abe-9c56-caedc3a178f9-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-zmwm8\" 
(UID: \"5ea1b1ba-826f-4abe-9c56-caedc3a178f9\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-zmwm8" Mar 19 09:43:53 crc kubenswrapper[4835]: E0319 09:43:53.426829 4835 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 09:43:53 crc kubenswrapper[4835]: E0319 09:43:53.426882 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ea1b1ba-826f-4abe-9c56-caedc3a178f9-cert podName:5ea1b1ba-826f-4abe-9c56-caedc3a178f9 nodeName:}" failed. No retries permitted until 2026-03-19 09:43:53.926858866 +0000 UTC m=+1288.775457453 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ea1b1ba-826f-4abe-9c56-caedc3a178f9-cert") pod "infra-operator-controller-manager-5595c7d6ff-zmwm8" (UID: "5ea1b1ba-826f-4abe-9c56-caedc3a178f9") : secret "infra-operator-webhook-server-cert" not found Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.450802 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6744dd545c-4nqvq"] Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.452264 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-4nqvq" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.457289 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-d79pg"] Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.459679 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-d79pg" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.464738 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-d79pg"] Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.470207 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-d4clz" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.470926 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6744dd545c-4nqvq"] Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.474905 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-zmljv" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.487323 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5brxh\" (UniqueName: \"kubernetes.io/projected/aea377c2-d271-45c9-a574-0d1fe89caac5-kube-api-access-5brxh\") pod \"ironic-operator-controller-manager-6b77b7676d-mgjjt\" (UID: \"aea377c2-d271-45c9-a574-0d1fe89caac5\") " pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-mgjjt" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.498303 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-fbf7bbb96-v6mc6"] Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.504031 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-bzhnq" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.505981 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wppl\" (UniqueName: \"kubernetes.io/projected/5ea1b1ba-826f-4abe-9c56-caedc3a178f9-kube-api-access-2wppl\") pod \"infra-operator-controller-manager-5595c7d6ff-zmwm8\" (UID: \"5ea1b1ba-826f-4abe-9c56-caedc3a178f9\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-zmwm8" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.528569 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5dgp\" (UniqueName: \"kubernetes.io/projected/e86ab409-7643-4e6f-9129-6f90e5b6bf1c-kube-api-access-l5dgp\") pod \"mariadb-operator-controller-manager-6f5b7bcd4-d79pg\" (UID: \"e86ab409-7643-4e6f-9129-6f90e5b6bf1c\") " pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-d79pg" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.547401 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk64p\" (UniqueName: \"kubernetes.io/projected/2e429fe7-b053-43e9-a640-f14af7094e62-kube-api-access-hk64p\") pod \"neutron-operator-controller-manager-6744dd545c-4nqvq\" (UID: \"2e429fe7-b053-43e9-a640-f14af7094e62\") " pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-4nqvq" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.547444 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m6ds\" (UniqueName: \"kubernetes.io/projected/0f2be57b-1212-4dc2-b772-3317bcd69aac-kube-api-access-2m6ds\") pod \"keystone-operator-controller-manager-76b87776c9-79r7z\" (UID: \"0f2be57b-1212-4dc2-b772-3317bcd69aac\") " pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-79r7z" Mar 19 09:43:53 crc 
kubenswrapper[4835]: I0319 09:43:53.547495 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm944\" (UniqueName: \"kubernetes.io/projected/a73c4956-ad63-40d9-907e-599172ac3771-kube-api-access-dm944\") pod \"manila-operator-controller-manager-fbf7bbb96-v6mc6\" (UID: \"a73c4956-ad63-40d9-907e-599172ac3771\") " pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-v6mc6" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.543524 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-bc5c78db9-wvb5n"] Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.548509 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-wvb5n" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.551874 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-7mj55" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.563182 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-bc5c78db9-wvb5n"] Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.611996 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m6ds\" (UniqueName: \"kubernetes.io/projected/0f2be57b-1212-4dc2-b772-3317bcd69aac-kube-api-access-2m6ds\") pod \"keystone-operator-controller-manager-76b87776c9-79r7z\" (UID: \"0f2be57b-1212-4dc2-b772-3317bcd69aac\") " pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-79r7z" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.619953 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-bjnww" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.627452 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-56f74467c6-v6bxs"] Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.629225 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-v6bxs" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.637722 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-flghl" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.650610 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5dgp\" (UniqueName: \"kubernetes.io/projected/e86ab409-7643-4e6f-9129-6f90e5b6bf1c-kube-api-access-l5dgp\") pod \"mariadb-operator-controller-manager-6f5b7bcd4-d79pg\" (UID: \"e86ab409-7643-4e6f-9129-6f90e5b6bf1c\") " pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-d79pg" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.650725 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk64p\" (UniqueName: \"kubernetes.io/projected/2e429fe7-b053-43e9-a640-f14af7094e62-kube-api-access-hk64p\") pod \"neutron-operator-controller-manager-6744dd545c-4nqvq\" (UID: \"2e429fe7-b053-43e9-a640-f14af7094e62\") " pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-4nqvq" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.650779 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm944\" (UniqueName: \"kubernetes.io/projected/a73c4956-ad63-40d9-907e-599172ac3771-kube-api-access-dm944\") pod \"manila-operator-controller-manager-fbf7bbb96-v6mc6\" (UID: 
\"a73c4956-ad63-40d9-907e-599172ac3771\") " pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-v6mc6" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.650836 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28h9j\" (UniqueName: \"kubernetes.io/projected/2173f655-b411-4dfe-8247-59ac02942c25-kube-api-access-28h9j\") pod \"nova-operator-controller-manager-bc5c78db9-wvb5n\" (UID: \"2173f655-b411-4dfe-8247-59ac02942c25\") " pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-wvb5n" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.674396 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-mgjjt" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.705520 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-79r7z" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.707027 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-56f74467c6-v6bxs"] Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.710863 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk64p\" (UniqueName: \"kubernetes.io/projected/2e429fe7-b053-43e9-a640-f14af7094e62-kube-api-access-hk64p\") pod \"neutron-operator-controller-manager-6744dd545c-4nqvq\" (UID: \"2e429fe7-b053-43e9-a640-f14af7094e62\") " pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-4nqvq" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.711360 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5dgp\" (UniqueName: \"kubernetes.io/projected/e86ab409-7643-4e6f-9129-6f90e5b6bf1c-kube-api-access-l5dgp\") pod 
\"mariadb-operator-controller-manager-6f5b7bcd4-d79pg\" (UID: \"e86ab409-7643-4e6f-9129-6f90e5b6bf1c\") " pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-d79pg" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.717474 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm944\" (UniqueName: \"kubernetes.io/projected/a73c4956-ad63-40d9-907e-599172ac3771-kube-api-access-dm944\") pod \"manila-operator-controller-manager-fbf7bbb96-v6mc6\" (UID: \"a73c4956-ad63-40d9-907e-599172ac3771\") " pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-v6mc6" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.739108 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-846c4cdcb7-7b7zt"] Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.741200 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-7b7zt" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.749968 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-659fb58c6b-vq45h"] Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.750139 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-fzn7q" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.751514 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-vq45h" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.753241 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-rd4cv" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.753422 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28h9j\" (UniqueName: \"kubernetes.io/projected/2173f655-b411-4dfe-8247-59ac02942c25-kube-api-access-28h9j\") pod \"nova-operator-controller-manager-bc5c78db9-wvb5n\" (UID: \"2173f655-b411-4dfe-8247-59ac02942c25\") " pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-wvb5n" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.753592 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh22d\" (UniqueName: \"kubernetes.io/projected/02720ef5-c00c-48a4-af96-4d82f28bf051-kube-api-access-dh22d\") pod \"octavia-operator-controller-manager-56f74467c6-v6bxs\" (UID: \"02720ef5-c00c-48a4-af96-4d82f28bf051\") " pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-v6bxs" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.765653 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-v6mc6" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.771291 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-846c4cdcb7-7b7zt"] Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.784585 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-575wd"] Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.791768 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28h9j\" (UniqueName: \"kubernetes.io/projected/2173f655-b411-4dfe-8247-59ac02942c25-kube-api-access-28h9j\") pod \"nova-operator-controller-manager-bc5c78db9-wvb5n\" (UID: \"2173f655-b411-4dfe-8247-59ac02942c25\") " pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-wvb5n" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.794972 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-575wd" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.799088 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-8kqp7" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.799268 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.818756 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-4nqvq" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.828378 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-867f54bc44-sgjfn"] Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.829764 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-sgjfn" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.834886 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-v2tnd" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.854736 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcnsc\" (UniqueName: \"kubernetes.io/projected/dd4067de-94b9-4d91-bdf4-e3e3af91b76f-kube-api-access-bcnsc\") pod \"placement-operator-controller-manager-659fb58c6b-vq45h\" (UID: \"dd4067de-94b9-4d91-bdf4-e3e3af91b76f\") " pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-vq45h" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.854814 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/017e5e75-5952-4240-8349-d16c367d4bed-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-575wd\" (UID: \"017e5e75-5952-4240-8349-d16c367d4bed\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-575wd" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.854884 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x5c2\" (UniqueName: \"kubernetes.io/projected/017e5e75-5952-4240-8349-d16c367d4bed-kube-api-access-2x5c2\") pod 
\"openstack-baremetal-operator-controller-manager-c5677dc5d-575wd\" (UID: \"017e5e75-5952-4240-8349-d16c367d4bed\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-575wd" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.854935 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jccwd\" (UniqueName: \"kubernetes.io/projected/74161dd4-df25-4f1b-8924-c9086688463d-kube-api-access-jccwd\") pod \"ovn-operator-controller-manager-846c4cdcb7-7b7zt\" (UID: \"74161dd4-df25-4f1b-8924-c9086688463d\") " pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-7b7zt" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.855092 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh22d\" (UniqueName: \"kubernetes.io/projected/02720ef5-c00c-48a4-af96-4d82f28bf051-kube-api-access-dh22d\") pod \"octavia-operator-controller-manager-56f74467c6-v6bxs\" (UID: \"02720ef5-c00c-48a4-af96-4d82f28bf051\") " pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-v6bxs" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.855500 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-659fb58c6b-vq45h"] Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.870140 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-d79pg" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.881818 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-867f54bc44-sgjfn"] Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.887053 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-575wd"] Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.904752 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh22d\" (UniqueName: \"kubernetes.io/projected/02720ef5-c00c-48a4-af96-4d82f28bf051-kube-api-access-dh22d\") pod \"octavia-operator-controller-manager-56f74467c6-v6bxs\" (UID: \"02720ef5-c00c-48a4-af96-4d82f28bf051\") " pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-v6bxs" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.904819 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-8467ccb4c8-s94g8"] Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.906317 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-s94g8" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.910060 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-wvb5n" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.916234 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-gpmcm" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.921559 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-57b8dbd499-gz2nz"] Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.922671 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-57b8dbd499-gz2nz" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.933322 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-hqnq8" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.933487 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-8467ccb4c8-s94g8"] Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.938150 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-57b8dbd499-gz2nz"] Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.947659 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-74d6f7b5c-64vf7"] Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.950531 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-64vf7" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.954217 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-dbndd" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.958576 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8s56\" (UniqueName: \"kubernetes.io/projected/637282e5-c56d-49ea-ac96-f299fb3661f2-kube-api-access-b8s56\") pod \"swift-operator-controller-manager-867f54bc44-sgjfn\" (UID: \"637282e5-c56d-49ea-ac96-f299fb3661f2\") " pod="openstack-operators/swift-operator-controller-manager-867f54bc44-sgjfn" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.958645 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcnsc\" (UniqueName: \"kubernetes.io/projected/dd4067de-94b9-4d91-bdf4-e3e3af91b76f-kube-api-access-bcnsc\") pod \"placement-operator-controller-manager-659fb58c6b-vq45h\" (UID: \"dd4067de-94b9-4d91-bdf4-e3e3af91b76f\") " pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-vq45h" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.958687 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/017e5e75-5952-4240-8349-d16c367d4bed-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-575wd\" (UID: \"017e5e75-5952-4240-8349-d16c367d4bed\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-575wd" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.958769 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x5c2\" (UniqueName: \"kubernetes.io/projected/017e5e75-5952-4240-8349-d16c367d4bed-kube-api-access-2x5c2\") pod 
\"openstack-baremetal-operator-controller-manager-c5677dc5d-575wd\" (UID: \"017e5e75-5952-4240-8349-d16c367d4bed\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-575wd" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.958798 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ea1b1ba-826f-4abe-9c56-caedc3a178f9-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-zmwm8\" (UID: \"5ea1b1ba-826f-4abe-9c56-caedc3a178f9\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-zmwm8" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.958824 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jccwd\" (UniqueName: \"kubernetes.io/projected/74161dd4-df25-4f1b-8924-c9086688463d-kube-api-access-jccwd\") pod \"ovn-operator-controller-manager-846c4cdcb7-7b7zt\" (UID: \"74161dd4-df25-4f1b-8924-c9086688463d\") " pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-7b7zt" Mar 19 09:43:53 crc kubenswrapper[4835]: E0319 09:43:53.960713 4835 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 09:43:53 crc kubenswrapper[4835]: E0319 09:43:53.960821 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/017e5e75-5952-4240-8349-d16c367d4bed-cert podName:017e5e75-5952-4240-8349-d16c367d4bed nodeName:}" failed. No retries permitted until 2026-03-19 09:43:54.46078635 +0000 UTC m=+1289.309384937 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/017e5e75-5952-4240-8349-d16c367d4bed-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-575wd" (UID: "017e5e75-5952-4240-8349-d16c367d4bed") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 09:43:53 crc kubenswrapper[4835]: E0319 09:43:53.961209 4835 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 09:43:53 crc kubenswrapper[4835]: E0319 09:43:53.961242 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ea1b1ba-826f-4abe-9c56-caedc3a178f9-cert podName:5ea1b1ba-826f-4abe-9c56-caedc3a178f9 nodeName:}" failed. No retries permitted until 2026-03-19 09:43:54.961234592 +0000 UTC m=+1289.809833179 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ea1b1ba-826f-4abe-9c56-caedc3a178f9-cert") pod "infra-operator-controller-manager-5595c7d6ff-zmwm8" (UID: "5ea1b1ba-826f-4abe-9c56-caedc3a178f9") : secret "infra-operator-webhook-server-cert" not found Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.967572 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-74d6f7b5c-64vf7"] Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.970958 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-v6bxs" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.981241 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jccwd\" (UniqueName: \"kubernetes.io/projected/74161dd4-df25-4f1b-8924-c9086688463d-kube-api-access-jccwd\") pod \"ovn-operator-controller-manager-846c4cdcb7-7b7zt\" (UID: \"74161dd4-df25-4f1b-8924-c9086688463d\") " pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-7b7zt" Mar 19 09:43:53 crc kubenswrapper[4835]: I0319 09:43:53.992337 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x5c2\" (UniqueName: \"kubernetes.io/projected/017e5e75-5952-4240-8349-d16c367d4bed-kube-api-access-2x5c2\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-575wd\" (UID: \"017e5e75-5952-4240-8349-d16c367d4bed\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-575wd" Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.005361 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcnsc\" (UniqueName: \"kubernetes.io/projected/dd4067de-94b9-4d91-bdf4-e3e3af91b76f-kube-api-access-bcnsc\") pod \"placement-operator-controller-manager-659fb58c6b-vq45h\" (UID: \"dd4067de-94b9-4d91-bdf4-e3e3af91b76f\") " pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-vq45h" Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.048890 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-68f9d5b675-fbgr4"] Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.050221 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-68f9d5b675-fbgr4" Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.054483 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-84x79" Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.054671 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.054783 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.057427 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-68f9d5b675-fbgr4"] Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.060869 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4zdl\" (UniqueName: \"kubernetes.io/projected/a59c67ae-bcf7-404f-ad32-233f38450f65-kube-api-access-s4zdl\") pod \"watcher-operator-controller-manager-74d6f7b5c-64vf7\" (UID: \"a59c67ae-bcf7-404f-ad32-233f38450f65\") " pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-64vf7" Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.060952 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnfhk\" (UniqueName: \"kubernetes.io/projected/65b9e1e9-a183-4d37-a281-593752da6125-kube-api-access-qnfhk\") pod \"telemetry-operator-controller-manager-57b8dbd499-gz2nz\" (UID: \"65b9e1e9-a183-4d37-a281-593752da6125\") " pod="openstack-operators/telemetry-operator-controller-manager-57b8dbd499-gz2nz" Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.061069 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-5hzks\" (UniqueName: \"kubernetes.io/projected/91c2a119-b77e-4cf1-9be3-779c47d4643b-kube-api-access-5hzks\") pod \"test-operator-controller-manager-8467ccb4c8-s94g8\" (UID: \"91c2a119-b77e-4cf1-9be3-779c47d4643b\") " pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-s94g8" Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.061394 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8s56\" (UniqueName: \"kubernetes.io/projected/637282e5-c56d-49ea-ac96-f299fb3661f2-kube-api-access-b8s56\") pod \"swift-operator-controller-manager-867f54bc44-sgjfn\" (UID: \"637282e5-c56d-49ea-ac96-f299fb3661f2\") " pod="openstack-operators/swift-operator-controller-manager-867f54bc44-sgjfn" Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.074760 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z9wxq"] Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.075981 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z9wxq" Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.077514 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-t5p26" Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.084656 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-7b7zt" Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.086237 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8s56\" (UniqueName: \"kubernetes.io/projected/637282e5-c56d-49ea-ac96-f299fb3661f2-kube-api-access-b8s56\") pod \"swift-operator-controller-manager-867f54bc44-sgjfn\" (UID: \"637282e5-c56d-49ea-ac96-f299fb3661f2\") " pod="openstack-operators/swift-operator-controller-manager-867f54bc44-sgjfn" Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.094457 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z9wxq"] Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.117844 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-vq45h" Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.162676 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnfhk\" (UniqueName: \"kubernetes.io/projected/65b9e1e9-a183-4d37-a281-593752da6125-kube-api-access-qnfhk\") pod \"telemetry-operator-controller-manager-57b8dbd499-gz2nz\" (UID: \"65b9e1e9-a183-4d37-a281-593752da6125\") " pod="openstack-operators/telemetry-operator-controller-manager-57b8dbd499-gz2nz" Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.163026 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-webhook-certs\") pod \"openstack-operator-controller-manager-68f9d5b675-fbgr4\" (UID: \"8a05d0dc-7092-45aa-b869-34a23cb0a1f9\") " pod="openstack-operators/openstack-operator-controller-manager-68f9d5b675-fbgr4" Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.163058 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqx9m\" (UniqueName: \"kubernetes.io/projected/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-kube-api-access-pqx9m\") pod \"openstack-operator-controller-manager-68f9d5b675-fbgr4\" (UID: \"8a05d0dc-7092-45aa-b869-34a23cb0a1f9\") " pod="openstack-operators/openstack-operator-controller-manager-68f9d5b675-fbgr4" Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.163131 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hzks\" (UniqueName: \"kubernetes.io/projected/91c2a119-b77e-4cf1-9be3-779c47d4643b-kube-api-access-5hzks\") pod \"test-operator-controller-manager-8467ccb4c8-s94g8\" (UID: \"91c2a119-b77e-4cf1-9be3-779c47d4643b\") " pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-s94g8" Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.163162 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-metrics-certs\") pod \"openstack-operator-controller-manager-68f9d5b675-fbgr4\" (UID: \"8a05d0dc-7092-45aa-b869-34a23cb0a1f9\") " pod="openstack-operators/openstack-operator-controller-manager-68f9d5b675-fbgr4" Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.163205 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k6cd\" (UniqueName: \"kubernetes.io/projected/5ab66f1b-4361-44ac-a885-93bbb8cd51d5-kube-api-access-9k6cd\") pod \"rabbitmq-cluster-operator-manager-668c99d594-z9wxq\" (UID: \"5ab66f1b-4361-44ac-a885-93bbb8cd51d5\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z9wxq" Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.163229 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4zdl\" (UniqueName: 
\"kubernetes.io/projected/a59c67ae-bcf7-404f-ad32-233f38450f65-kube-api-access-s4zdl\") pod \"watcher-operator-controller-manager-74d6f7b5c-64vf7\" (UID: \"a59c67ae-bcf7-404f-ad32-233f38450f65\") " pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-64vf7" Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.170302 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-sgjfn" Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.184393 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4zdl\" (UniqueName: \"kubernetes.io/projected/a59c67ae-bcf7-404f-ad32-233f38450f65-kube-api-access-s4zdl\") pod \"watcher-operator-controller-manager-74d6f7b5c-64vf7\" (UID: \"a59c67ae-bcf7-404f-ad32-233f38450f65\") " pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-64vf7" Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.187189 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hzks\" (UniqueName: \"kubernetes.io/projected/91c2a119-b77e-4cf1-9be3-779c47d4643b-kube-api-access-5hzks\") pod \"test-operator-controller-manager-8467ccb4c8-s94g8\" (UID: \"91c2a119-b77e-4cf1-9be3-779c47d4643b\") " pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-s94g8" Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.196888 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnfhk\" (UniqueName: \"kubernetes.io/projected/65b9e1e9-a183-4d37-a281-593752da6125-kube-api-access-qnfhk\") pod \"telemetry-operator-controller-manager-57b8dbd499-gz2nz\" (UID: \"65b9e1e9-a183-4d37-a281-593752da6125\") " pod="openstack-operators/telemetry-operator-controller-manager-57b8dbd499-gz2nz" Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.250250 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-s94g8" Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.265036 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-metrics-certs\") pod \"openstack-operator-controller-manager-68f9d5b675-fbgr4\" (UID: \"8a05d0dc-7092-45aa-b869-34a23cb0a1f9\") " pod="openstack-operators/openstack-operator-controller-manager-68f9d5b675-fbgr4" Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.265093 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k6cd\" (UniqueName: \"kubernetes.io/projected/5ab66f1b-4361-44ac-a885-93bbb8cd51d5-kube-api-access-9k6cd\") pod \"rabbitmq-cluster-operator-manager-668c99d594-z9wxq\" (UID: \"5ab66f1b-4361-44ac-a885-93bbb8cd51d5\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z9wxq" Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.265176 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-webhook-certs\") pod \"openstack-operator-controller-manager-68f9d5b675-fbgr4\" (UID: \"8a05d0dc-7092-45aa-b869-34a23cb0a1f9\") " pod="openstack-operators/openstack-operator-controller-manager-68f9d5b675-fbgr4" Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.265203 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqx9m\" (UniqueName: \"kubernetes.io/projected/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-kube-api-access-pqx9m\") pod \"openstack-operator-controller-manager-68f9d5b675-fbgr4\" (UID: \"8a05d0dc-7092-45aa-b869-34a23cb0a1f9\") " pod="openstack-operators/openstack-operator-controller-manager-68f9d5b675-fbgr4" Mar 19 09:43:54 crc kubenswrapper[4835]: E0319 09:43:54.265588 4835 secret.go:188] 
Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 09:43:54 crc kubenswrapper[4835]: E0319 09:43:54.265626 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-metrics-certs podName:8a05d0dc-7092-45aa-b869-34a23cb0a1f9 nodeName:}" failed. No retries permitted until 2026-03-19 09:43:54.765612333 +0000 UTC m=+1289.614210920 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-metrics-certs") pod "openstack-operator-controller-manager-68f9d5b675-fbgr4" (UID: "8a05d0dc-7092-45aa-b869-34a23cb0a1f9") : secret "metrics-server-cert" not found Mar 19 09:43:54 crc kubenswrapper[4835]: E0319 09:43:54.265976 4835 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 09:43:54 crc kubenswrapper[4835]: E0319 09:43:54.266004 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-webhook-certs podName:8a05d0dc-7092-45aa-b869-34a23cb0a1f9 nodeName:}" failed. No retries permitted until 2026-03-19 09:43:54.765996794 +0000 UTC m=+1289.614595381 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-webhook-certs") pod "openstack-operator-controller-manager-68f9d5b675-fbgr4" (UID: "8a05d0dc-7092-45aa-b869-34a23cb0a1f9") : secret "webhook-server-cert" not found Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.285869 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k6cd\" (UniqueName: \"kubernetes.io/projected/5ab66f1b-4361-44ac-a885-93bbb8cd51d5-kube-api-access-9k6cd\") pod \"rabbitmq-cluster-operator-manager-668c99d594-z9wxq\" (UID: \"5ab66f1b-4361-44ac-a885-93bbb8cd51d5\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z9wxq" Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.287896 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqx9m\" (UniqueName: \"kubernetes.io/projected/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-kube-api-access-pqx9m\") pod \"openstack-operator-controller-manager-68f9d5b675-fbgr4\" (UID: \"8a05d0dc-7092-45aa-b869-34a23cb0a1f9\") " pod="openstack-operators/openstack-operator-controller-manager-68f9d5b675-fbgr4" Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.288668 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-64vf7" Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.314779 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z9wxq" Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.326014 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-57b8dbd499-gz2nz" Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.399874 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6d77645966-dhr9f"] Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.420130 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5cfd84c587-t9dbc"] Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.438827 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7d559dcdbd-9rtgc"] Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.472732 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/017e5e75-5952-4240-8349-d16c367d4bed-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-575wd\" (UID: \"017e5e75-5952-4240-8349-d16c367d4bed\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-575wd" Mar 19 09:43:54 crc kubenswrapper[4835]: E0319 09:43:54.473723 4835 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 09:43:54 crc kubenswrapper[4835]: E0319 09:43:54.473805 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/017e5e75-5952-4240-8349-d16c367d4bed-cert podName:017e5e75-5952-4240-8349-d16c367d4bed nodeName:}" failed. No retries permitted until 2026-03-19 09:43:55.473785731 +0000 UTC m=+1290.322384308 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/017e5e75-5952-4240-8349-d16c367d4bed-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-575wd" (UID: "017e5e75-5952-4240-8349-d16c367d4bed") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.626846 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-64dc66d669-msxkb"] Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.780629 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-webhook-certs\") pod \"openstack-operator-controller-manager-68f9d5b675-fbgr4\" (UID: \"8a05d0dc-7092-45aa-b869-34a23cb0a1f9\") " pod="openstack-operators/openstack-operator-controller-manager-68f9d5b675-fbgr4" Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.780724 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-metrics-certs\") pod \"openstack-operator-controller-manager-68f9d5b675-fbgr4\" (UID: \"8a05d0dc-7092-45aa-b869-34a23cb0a1f9\") " pod="openstack-operators/openstack-operator-controller-manager-68f9d5b675-fbgr4" Mar 19 09:43:54 crc kubenswrapper[4835]: E0319 09:43:54.780866 4835 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 09:43:54 crc kubenswrapper[4835]: E0319 09:43:54.780924 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-metrics-certs podName:8a05d0dc-7092-45aa-b869-34a23cb0a1f9 nodeName:}" failed. No retries permitted until 2026-03-19 09:43:55.780908517 +0000 UTC m=+1290.629507104 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-metrics-certs") pod "openstack-operator-controller-manager-68f9d5b675-fbgr4" (UID: "8a05d0dc-7092-45aa-b869-34a23cb0a1f9") : secret "metrics-server-cert" not found Mar 19 09:43:54 crc kubenswrapper[4835]: E0319 09:43:54.780980 4835 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 09:43:54 crc kubenswrapper[4835]: E0319 09:43:54.781051 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-webhook-certs podName:8a05d0dc-7092-45aa-b869-34a23cb0a1f9 nodeName:}" failed. No retries permitted until 2026-03-19 09:43:55.78103281 +0000 UTC m=+1290.629631397 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-webhook-certs") pod "openstack-operator-controller-manager-68f9d5b675-fbgr4" (UID: "8a05d0dc-7092-45aa-b869-34a23cb0a1f9") : secret "webhook-server-cert" not found Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.781204 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6cc65c69fc-bzhnq"] Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.850621 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-bzhnq" event={"ID":"5db66af1-4929-4163-b788-0320b0323a79","Type":"ContainerStarted","Data":"bb44905c642e04eb86fb5165e66c4194d467c45f9ec99cb5705b12fb18cc3858"} Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.854041 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-t9dbc" 
event={"ID":"e01500c9-ffdf-42df-9017-26b813efed0e","Type":"ContainerStarted","Data":"52cd5b35ba22c5f1a49f3d8fbcf3cfd3defa8daa1ff52954ea2b1136f916bb62"} Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.855582 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-msxkb" event={"ID":"6dabbeda-38a8-4289-92b6-075d7c20f958","Type":"ContainerStarted","Data":"650ac99e0bd47ee4345026057d8e53d30454dfa1d3b1c344da37449be5caff61"} Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.856636 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-9rtgc" event={"ID":"9eeda748-4895-4cbc-be03-d1edabf69758","Type":"ContainerStarted","Data":"f2ee8d5707516d59411183b7be29e09364cac962a8bfd41af5a08f54ead1a4b4"} Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.860381 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-dhr9f" event={"ID":"0b55921b-5660-44b9-abe9-09639766068c","Type":"ContainerStarted","Data":"bf701bf44cc67b0e7c64b5fa0c9a6912a3add1478163c6100e99f0e3b1936a51"} Mar 19 09:43:54 crc kubenswrapper[4835]: I0319 09:43:54.984472 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ea1b1ba-826f-4abe-9c56-caedc3a178f9-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-zmwm8\" (UID: \"5ea1b1ba-826f-4abe-9c56-caedc3a178f9\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-zmwm8" Mar 19 09:43:54 crc kubenswrapper[4835]: E0319 09:43:54.984797 4835 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 09:43:54 crc kubenswrapper[4835]: E0319 09:43:54.984870 4835 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5ea1b1ba-826f-4abe-9c56-caedc3a178f9-cert podName:5ea1b1ba-826f-4abe-9c56-caedc3a178f9 nodeName:}" failed. No retries permitted until 2026-03-19 09:43:56.984846321 +0000 UTC m=+1291.833444918 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ea1b1ba-826f-4abe-9c56-caedc3a178f9-cert") pod "infra-operator-controller-manager-5595c7d6ff-zmwm8" (UID: "5ea1b1ba-826f-4abe-9c56-caedc3a178f9") : secret "infra-operator-webhook-server-cert" not found Mar 19 09:43:55 crc kubenswrapper[4835]: I0319 09:43:55.096689 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6b77b7676d-mgjjt"] Mar 19 09:43:55 crc kubenswrapper[4835]: W0319 09:43:55.109220 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaea377c2_d271_45c9_a574_0d1fe89caac5.slice/crio-7722a67acc83734052aa45887521e186cde264ab5b3100c946ebfa3d7b0ceb6e WatchSource:0}: Error finding container 7722a67acc83734052aa45887521e186cde264ab5b3100c946ebfa3d7b0ceb6e: Status 404 returned error can't find the container with id 7722a67acc83734052aa45887521e186cde264ab5b3100c946ebfa3d7b0ceb6e Mar 19 09:43:55 crc kubenswrapper[4835]: I0319 09:43:55.116580 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-bc5c78db9-wvb5n"] Mar 19 09:43:55 crc kubenswrapper[4835]: W0319 09:43:55.117011 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda73c4956_ad63_40d9_907e_599172ac3771.slice/crio-4ddeddeaf3fa4dfc903408f9db086b78a2535220966b75faa97abd57e86d3b9e WatchSource:0}: Error finding container 4ddeddeaf3fa4dfc903408f9db086b78a2535220966b75faa97abd57e86d3b9e: Status 404 returned error can't find the container with id 
4ddeddeaf3fa4dfc903408f9db086b78a2535220966b75faa97abd57e86d3b9e Mar 19 09:43:55 crc kubenswrapper[4835]: W0319 09:43:55.123894 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47fe8883_3cea_4985_8327_4ad721d4e128.slice/crio-ef0e28c5efdbce7578351dd6d7686e73eade07a4986182a356f0ec992db86aef WatchSource:0}: Error finding container ef0e28c5efdbce7578351dd6d7686e73eade07a4986182a356f0ec992db86aef: Status 404 returned error can't find the container with id ef0e28c5efdbce7578351dd6d7686e73eade07a4986182a356f0ec992db86aef Mar 19 09:43:55 crc kubenswrapper[4835]: I0319 09:43:55.125993 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-66dd9d474d-bjnww"] Mar 19 09:43:55 crc kubenswrapper[4835]: W0319 09:43:55.135395 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f2be57b_1212_4dc2_b772_3317bcd69aac.slice/crio-2071739ad6856494a720a4c2856f7489bf01ef1ae34c3c3577cc7f1a60460c0f WatchSource:0}: Error finding container 2071739ad6856494a720a4c2856f7489bf01ef1ae34c3c3577cc7f1a60460c0f: Status 404 returned error can't find the container with id 2071739ad6856494a720a4c2856f7489bf01ef1ae34c3c3577cc7f1a60460c0f Mar 19 09:43:55 crc kubenswrapper[4835]: I0319 09:43:55.143289 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-fbf7bbb96-v6mc6"] Mar 19 09:43:55 crc kubenswrapper[4835]: I0319 09:43:55.151122 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-76b87776c9-79r7z"] Mar 19 09:43:55 crc kubenswrapper[4835]: I0319 09:43:55.493853 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/017e5e75-5952-4240-8349-d16c367d4bed-cert\") pod 
\"openstack-baremetal-operator-controller-manager-c5677dc5d-575wd\" (UID: \"017e5e75-5952-4240-8349-d16c367d4bed\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-575wd" Mar 19 09:43:55 crc kubenswrapper[4835]: E0319 09:43:55.494061 4835 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 09:43:55 crc kubenswrapper[4835]: E0319 09:43:55.494145 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/017e5e75-5952-4240-8349-d16c367d4bed-cert podName:017e5e75-5952-4240-8349-d16c367d4bed nodeName:}" failed. No retries permitted until 2026-03-19 09:43:57.494122123 +0000 UTC m=+1292.342720710 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/017e5e75-5952-4240-8349-d16c367d4bed-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-575wd" (UID: "017e5e75-5952-4240-8349-d16c367d4bed") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 09:43:55 crc kubenswrapper[4835]: I0319 09:43:55.703196 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-867f54bc44-sgjfn"] Mar 19 09:43:55 crc kubenswrapper[4835]: I0319 09:43:55.738239 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6744dd545c-4nqvq"] Mar 19 09:43:55 crc kubenswrapper[4835]: I0319 09:43:55.800651 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-webhook-certs\") pod \"openstack-operator-controller-manager-68f9d5b675-fbgr4\" (UID: \"8a05d0dc-7092-45aa-b869-34a23cb0a1f9\") " pod="openstack-operators/openstack-operator-controller-manager-68f9d5b675-fbgr4" Mar 19 09:43:55 crc 
kubenswrapper[4835]: I0319 09:43:55.800793 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-metrics-certs\") pod \"openstack-operator-controller-manager-68f9d5b675-fbgr4\" (UID: \"8a05d0dc-7092-45aa-b869-34a23cb0a1f9\") " pod="openstack-operators/openstack-operator-controller-manager-68f9d5b675-fbgr4" Mar 19 09:43:55 crc kubenswrapper[4835]: E0319 09:43:55.801044 4835 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 09:43:55 crc kubenswrapper[4835]: E0319 09:43:55.801163 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-webhook-certs podName:8a05d0dc-7092-45aa-b869-34a23cb0a1f9 nodeName:}" failed. No retries permitted until 2026-03-19 09:43:57.801134085 +0000 UTC m=+1292.649732702 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-webhook-certs") pod "openstack-operator-controller-manager-68f9d5b675-fbgr4" (UID: "8a05d0dc-7092-45aa-b869-34a23cb0a1f9") : secret "webhook-server-cert" not found Mar 19 09:43:55 crc kubenswrapper[4835]: E0319 09:43:55.801075 4835 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 09:43:55 crc kubenswrapper[4835]: E0319 09:43:55.801247 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-metrics-certs podName:8a05d0dc-7092-45aa-b869-34a23cb0a1f9 nodeName:}" failed. No retries permitted until 2026-03-19 09:43:57.801230187 +0000 UTC m=+1292.649828774 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-metrics-certs") pod "openstack-operator-controller-manager-68f9d5b675-fbgr4" (UID: "8a05d0dc-7092-45aa-b869-34a23cb0a1f9") : secret "metrics-server-cert" not found Mar 19 09:43:55 crc kubenswrapper[4835]: I0319 09:43:55.879540 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-mgjjt" event={"ID":"aea377c2-d271-45c9-a574-0d1fe89caac5","Type":"ContainerStarted","Data":"7722a67acc83734052aa45887521e186cde264ab5b3100c946ebfa3d7b0ceb6e"} Mar 19 09:43:55 crc kubenswrapper[4835]: I0319 09:43:55.881986 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-wvb5n" event={"ID":"2173f655-b411-4dfe-8247-59ac02942c25","Type":"ContainerStarted","Data":"a77a9c894ab3691d8985d3d7a1e3dc7357a3420a4aa41a2a8dc33d97ae281d78"} Mar 19 09:43:55 crc kubenswrapper[4835]: I0319 09:43:55.886142 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-79r7z" event={"ID":"0f2be57b-1212-4dc2-b772-3317bcd69aac","Type":"ContainerStarted","Data":"2071739ad6856494a720a4c2856f7489bf01ef1ae34c3c3577cc7f1a60460c0f"} Mar 19 09:43:55 crc kubenswrapper[4835]: I0319 09:43:55.888834 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-4nqvq" event={"ID":"2e429fe7-b053-43e9-a640-f14af7094e62","Type":"ContainerStarted","Data":"4aee79c9452471c9c0f1e75180730c2e2a81adf196449e9676e1121e21915d42"} Mar 19 09:43:55 crc kubenswrapper[4835]: I0319 09:43:55.890428 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-56f74467c6-v6bxs"] Mar 19 09:43:55 crc kubenswrapper[4835]: I0319 09:43:55.891304 4835 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-bjnww" event={"ID":"47fe8883-3cea-4985-8327-4ad721d4e128","Type":"ContainerStarted","Data":"ef0e28c5efdbce7578351dd6d7686e73eade07a4986182a356f0ec992db86aef"} Mar 19 09:43:55 crc kubenswrapper[4835]: I0319 09:43:55.893787 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-v6mc6" event={"ID":"a73c4956-ad63-40d9-907e-599172ac3771","Type":"ContainerStarted","Data":"4ddeddeaf3fa4dfc903408f9db086b78a2535220966b75faa97abd57e86d3b9e"} Mar 19 09:43:55 crc kubenswrapper[4835]: I0319 09:43:55.897650 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-sgjfn" event={"ID":"637282e5-c56d-49ea-ac96-f299fb3661f2","Type":"ContainerStarted","Data":"6ab974605119fbfc4134c9f7dc4ba38adb6fb670a8530c022875a573475d7cc3"} Mar 19 09:43:55 crc kubenswrapper[4835]: I0319 09:43:55.898676 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-74d6f7b5c-64vf7"] Mar 19 09:43:55 crc kubenswrapper[4835]: I0319 09:43:55.911421 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-8467ccb4c8-s94g8"] Mar 19 09:43:55 crc kubenswrapper[4835]: I0319 09:43:55.917540 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-d79pg"] Mar 19 09:43:55 crc kubenswrapper[4835]: W0319 09:43:55.937284 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode86ab409_7643_4e6f_9129_6f90e5b6bf1c.slice/crio-3377eeec82652625b414d2481a67a3b8325cc31ef9b72cbc815b836408d76769 WatchSource:0}: Error finding container 3377eeec82652625b414d2481a67a3b8325cc31ef9b72cbc815b836408d76769: Status 404 returned error can't find the container with id 
3377eeec82652625b414d2481a67a3b8325cc31ef9b72cbc815b836408d76769 Mar 19 09:43:55 crc kubenswrapper[4835]: I0319 09:43:55.942810 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z9wxq"] Mar 19 09:43:55 crc kubenswrapper[4835]: W0319 09:43:55.946345 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda59c67ae_bcf7_404f_ad32_233f38450f65.slice/crio-941674d40edb75870633b9a7f1a4186544f31a54e19d7e303047b9d8efb0802a WatchSource:0}: Error finding container 941674d40edb75870633b9a7f1a4186544f31a54e19d7e303047b9d8efb0802a: Status 404 returned error can't find the container with id 941674d40edb75870633b9a7f1a4186544f31a54e19d7e303047b9d8efb0802a Mar 19 09:43:55 crc kubenswrapper[4835]: W0319 09:43:55.948790 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ab66f1b_4361_44ac_a885_93bbb8cd51d5.slice/crio-058b3a3d65fefe13ab6b700f275f0d6ebf8678b238f3d28e8db8444b305cae38 WatchSource:0}: Error finding container 058b3a3d65fefe13ab6b700f275f0d6ebf8678b238f3d28e8db8444b305cae38: Status 404 returned error can't find the container with id 058b3a3d65fefe13ab6b700f275f0d6ebf8678b238f3d28e8db8444b305cae38 Mar 19 09:43:55 crc kubenswrapper[4835]: W0319 09:43:55.949362 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74161dd4_df25_4f1b_8924_c9086688463d.slice/crio-af4961403515d2a587f5ccd918986331f968c44b0709f0e67c11e79b2700f3f9 WatchSource:0}: Error finding container af4961403515d2a587f5ccd918986331f968c44b0709f0e67c11e79b2700f3f9: Status 404 returned error can't find the container with id af4961403515d2a587f5ccd918986331f968c44b0709f0e67c11e79b2700f3f9 Mar 19 09:43:55 crc kubenswrapper[4835]: W0319 09:43:55.950230 4835 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65b9e1e9_a183_4d37_a281_593752da6125.slice/crio-9c26b29e1fdf253cd742ad4a22fd50b4ae8194b24d847edfdf3976a51632dd67 WatchSource:0}: Error finding container 9c26b29e1fdf253cd742ad4a22fd50b4ae8194b24d847edfdf3976a51632dd67: Status 404 returned error can't find the container with id 9c26b29e1fdf253cd742ad4a22fd50b4ae8194b24d847edfdf3976a51632dd67 Mar 19 09:43:55 crc kubenswrapper[4835]: W0319 09:43:55.952595 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd4067de_94b9_4d91_bdf4_e3e3af91b76f.slice/crio-117f55c84897a85f9719b5dc2295419e6ba6f653aae31e7ebbb182d01245ec83 WatchSource:0}: Error finding container 117f55c84897a85f9719b5dc2295419e6ba6f653aae31e7ebbb182d01245ec83: Status 404 returned error can't find the container with id 117f55c84897a85f9719b5dc2295419e6ba6f653aae31e7ebbb182d01245ec83 Mar 19 09:43:55 crc kubenswrapper[4835]: E0319 09:43:55.954776 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.182:5001/openstack-k8s-operators/telemetry-operator:6bc83971c00395e861fa1e85887e6426f7be6267,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qnfhk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-57b8dbd499-gz2nz_openstack-operators(65b9e1e9-a183-4d37-a281-593752da6125): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 09:43:55 crc kubenswrapper[4835]: E0319 09:43:55.956164 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-57b8dbd499-gz2nz" podUID="65b9e1e9-a183-4d37-a281-593752da6125" Mar 19 09:43:55 crc 
kubenswrapper[4835]: E0319 09:43:55.956383 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:6c837f09c0f3246b28931fcd0758f667ca596999558d025e06fc7b7611edec1a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bcnsc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-659fb58c6b-vq45h_openstack-operators(dd4067de-94b9-4d91-bdf4-e3e3af91b76f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 09:43:55 crc kubenswrapper[4835]: E0319 09:43:55.956897 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:05a6fd95f5a1472c74e40b3efe58ff423cc2a00e745eea6dea19f622ef2c0953,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s4zdl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-74d6f7b5c-64vf7_openstack-operators(a59c67ae-bcf7-404f-ad32-233f38450f65): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 09:43:55 crc kubenswrapper[4835]: I0319 09:43:55.957191 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-659fb58c6b-vq45h"] Mar 19 09:43:55 crc kubenswrapper[4835]: E0319 09:43:55.958167 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-64vf7" podUID="a59c67ae-bcf7-404f-ad32-233f38450f65" Mar 19 09:43:55 crc kubenswrapper[4835]: E0319 09:43:55.958204 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-vq45h" podUID="dd4067de-94b9-4d91-bdf4-e3e3af91b76f" Mar 19 09:43:55 crc kubenswrapper[4835]: I0319 09:43:55.972255 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-846c4cdcb7-7b7zt"] Mar 19 09:43:55 crc kubenswrapper[4835]: I0319 09:43:55.983935 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-57b8dbd499-gz2nz"] Mar 19 09:43:56 crc kubenswrapper[4835]: I0319 09:43:56.917005 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-vq45h" event={"ID":"dd4067de-94b9-4d91-bdf4-e3e3af91b76f","Type":"ContainerStarted","Data":"117f55c84897a85f9719b5dc2295419e6ba6f653aae31e7ebbb182d01245ec83"} Mar 19 09:43:56 crc kubenswrapper[4835]: I0319 09:43:56.918764 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-v6bxs" event={"ID":"02720ef5-c00c-48a4-af96-4d82f28bf051","Type":"ContainerStarted","Data":"52fb1f372cce43921564cb5b7b6891588471eff5f1e5138b78569858906e51c2"} Mar 19 09:43:56 crc kubenswrapper[4835]: E0319 09:43:56.920582 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:6c837f09c0f3246b28931fcd0758f667ca596999558d025e06fc7b7611edec1a\\\"\"" 
pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-vq45h" podUID="dd4067de-94b9-4d91-bdf4-e3e3af91b76f" Mar 19 09:43:56 crc kubenswrapper[4835]: I0319 09:43:56.920616 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-57b8dbd499-gz2nz" event={"ID":"65b9e1e9-a183-4d37-a281-593752da6125","Type":"ContainerStarted","Data":"9c26b29e1fdf253cd742ad4a22fd50b4ae8194b24d847edfdf3976a51632dd67"} Mar 19 09:43:56 crc kubenswrapper[4835]: E0319 09:43:56.926972 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.182:5001/openstack-k8s-operators/telemetry-operator:6bc83971c00395e861fa1e85887e6426f7be6267\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-57b8dbd499-gz2nz" podUID="65b9e1e9-a183-4d37-a281-593752da6125" Mar 19 09:43:56 crc kubenswrapper[4835]: I0319 09:43:56.929027 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-7b7zt" event={"ID":"74161dd4-df25-4f1b-8924-c9086688463d","Type":"ContainerStarted","Data":"af4961403515d2a587f5ccd918986331f968c44b0709f0e67c11e79b2700f3f9"} Mar 19 09:43:56 crc kubenswrapper[4835]: I0319 09:43:56.932373 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-d79pg" event={"ID":"e86ab409-7643-4e6f-9129-6f90e5b6bf1c","Type":"ContainerStarted","Data":"3377eeec82652625b414d2481a67a3b8325cc31ef9b72cbc815b836408d76769"} Mar 19 09:43:56 crc kubenswrapper[4835]: I0319 09:43:56.943258 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z9wxq" event={"ID":"5ab66f1b-4361-44ac-a885-93bbb8cd51d5","Type":"ContainerStarted","Data":"058b3a3d65fefe13ab6b700f275f0d6ebf8678b238f3d28e8db8444b305cae38"} Mar 19 09:43:56 crc 
kubenswrapper[4835]: I0319 09:43:56.960480 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-s94g8" event={"ID":"91c2a119-b77e-4cf1-9be3-779c47d4643b","Type":"ContainerStarted","Data":"689101dd88a057ece3e4999641a43a5364fcb266dd13db2b2d9a58107f223679"} Mar 19 09:43:56 crc kubenswrapper[4835]: I0319 09:43:56.965937 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-64vf7" event={"ID":"a59c67ae-bcf7-404f-ad32-233f38450f65","Type":"ContainerStarted","Data":"941674d40edb75870633b9a7f1a4186544f31a54e19d7e303047b9d8efb0802a"} Mar 19 09:43:56 crc kubenswrapper[4835]: E0319 09:43:56.968118 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:05a6fd95f5a1472c74e40b3efe58ff423cc2a00e745eea6dea19f622ef2c0953\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-64vf7" podUID="a59c67ae-bcf7-404f-ad32-233f38450f65" Mar 19 09:43:57 crc kubenswrapper[4835]: E0319 09:43:57.032380 4835 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 09:43:57 crc kubenswrapper[4835]: E0319 09:43:57.032475 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ea1b1ba-826f-4abe-9c56-caedc3a178f9-cert podName:5ea1b1ba-826f-4abe-9c56-caedc3a178f9 nodeName:}" failed. No retries permitted until 2026-03-19 09:44:01.032454889 +0000 UTC m=+1295.881053476 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ea1b1ba-826f-4abe-9c56-caedc3a178f9-cert") pod "infra-operator-controller-manager-5595c7d6ff-zmwm8" (UID: "5ea1b1ba-826f-4abe-9c56-caedc3a178f9") : secret "infra-operator-webhook-server-cert" not found Mar 19 09:43:57 crc kubenswrapper[4835]: I0319 09:43:57.032144 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ea1b1ba-826f-4abe-9c56-caedc3a178f9-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-zmwm8\" (UID: \"5ea1b1ba-826f-4abe-9c56-caedc3a178f9\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-zmwm8" Mar 19 09:43:57 crc kubenswrapper[4835]: I0319 09:43:57.548527 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/017e5e75-5952-4240-8349-d16c367d4bed-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-575wd\" (UID: \"017e5e75-5952-4240-8349-d16c367d4bed\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-575wd" Mar 19 09:43:57 crc kubenswrapper[4835]: E0319 09:43:57.549288 4835 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 09:43:57 crc kubenswrapper[4835]: E0319 09:43:57.549364 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/017e5e75-5952-4240-8349-d16c367d4bed-cert podName:017e5e75-5952-4240-8349-d16c367d4bed nodeName:}" failed. No retries permitted until 2026-03-19 09:44:01.549342246 +0000 UTC m=+1296.397940833 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/017e5e75-5952-4240-8349-d16c367d4bed-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-575wd" (UID: "017e5e75-5952-4240-8349-d16c367d4bed") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 09:43:57 crc kubenswrapper[4835]: I0319 09:43:57.853774 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-metrics-certs\") pod \"openstack-operator-controller-manager-68f9d5b675-fbgr4\" (UID: \"8a05d0dc-7092-45aa-b869-34a23cb0a1f9\") " pod="openstack-operators/openstack-operator-controller-manager-68f9d5b675-fbgr4" Mar 19 09:43:57 crc kubenswrapper[4835]: I0319 09:43:57.853890 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-webhook-certs\") pod \"openstack-operator-controller-manager-68f9d5b675-fbgr4\" (UID: \"8a05d0dc-7092-45aa-b869-34a23cb0a1f9\") " pod="openstack-operators/openstack-operator-controller-manager-68f9d5b675-fbgr4" Mar 19 09:43:57 crc kubenswrapper[4835]: E0319 09:43:57.854861 4835 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 09:43:57 crc kubenswrapper[4835]: E0319 09:43:57.854911 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-webhook-certs podName:8a05d0dc-7092-45aa-b869-34a23cb0a1f9 nodeName:}" failed. No retries permitted until 2026-03-19 09:44:01.854897918 +0000 UTC m=+1296.703496505 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-webhook-certs") pod "openstack-operator-controller-manager-68f9d5b675-fbgr4" (UID: "8a05d0dc-7092-45aa-b869-34a23cb0a1f9") : secret "webhook-server-cert" not found Mar 19 09:43:57 crc kubenswrapper[4835]: E0319 09:43:57.855196 4835 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 09:43:57 crc kubenswrapper[4835]: E0319 09:43:57.855218 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-metrics-certs podName:8a05d0dc-7092-45aa-b869-34a23cb0a1f9 nodeName:}" failed. No retries permitted until 2026-03-19 09:44:01.855211007 +0000 UTC m=+1296.703809594 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-metrics-certs") pod "openstack-operator-controller-manager-68f9d5b675-fbgr4" (UID: "8a05d0dc-7092-45aa-b869-34a23cb0a1f9") : secret "metrics-server-cert" not found Mar 19 09:43:57 crc kubenswrapper[4835]: E0319 09:43:57.976877 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:6c837f09c0f3246b28931fcd0758f667ca596999558d025e06fc7b7611edec1a\\\"\"" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-vq45h" podUID="dd4067de-94b9-4d91-bdf4-e3e3af91b76f" Mar 19 09:43:57 crc kubenswrapper[4835]: E0319 09:43:57.977956 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.182:5001/openstack-k8s-operators/telemetry-operator:6bc83971c00395e861fa1e85887e6426f7be6267\\\"\"" 
pod="openstack-operators/telemetry-operator-controller-manager-57b8dbd499-gz2nz" podUID="65b9e1e9-a183-4d37-a281-593752da6125" Mar 19 09:43:57 crc kubenswrapper[4835]: E0319 09:43:57.978016 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:05a6fd95f5a1472c74e40b3efe58ff423cc2a00e745eea6dea19f622ef2c0953\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-64vf7" podUID="a59c67ae-bcf7-404f-ad32-233f38450f65" Mar 19 09:44:00 crc kubenswrapper[4835]: I0319 09:44:00.140857 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565224-v6z65"] Mar 19 09:44:00 crc kubenswrapper[4835]: I0319 09:44:00.143545 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565224-v6z65" Mar 19 09:44:00 crc kubenswrapper[4835]: I0319 09:44:00.147284 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 09:44:00 crc kubenswrapper[4835]: I0319 09:44:00.147603 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 09:44:00 crc kubenswrapper[4835]: I0319 09:44:00.147647 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 09:44:00 crc kubenswrapper[4835]: I0319 09:44:00.150246 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565224-v6z65"] Mar 19 09:44:00 crc kubenswrapper[4835]: I0319 09:44:00.199323 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbf25\" (UniqueName: \"kubernetes.io/projected/9e9b9bfc-6170-489b-a1b6-702e2904705b-kube-api-access-vbf25\") pod \"auto-csr-approver-29565224-v6z65\" 
(UID: \"9e9b9bfc-6170-489b-a1b6-702e2904705b\") " pod="openshift-infra/auto-csr-approver-29565224-v6z65" Mar 19 09:44:00 crc kubenswrapper[4835]: I0319 09:44:00.301646 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbf25\" (UniqueName: \"kubernetes.io/projected/9e9b9bfc-6170-489b-a1b6-702e2904705b-kube-api-access-vbf25\") pod \"auto-csr-approver-29565224-v6z65\" (UID: \"9e9b9bfc-6170-489b-a1b6-702e2904705b\") " pod="openshift-infra/auto-csr-approver-29565224-v6z65" Mar 19 09:44:00 crc kubenswrapper[4835]: I0319 09:44:00.333982 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbf25\" (UniqueName: \"kubernetes.io/projected/9e9b9bfc-6170-489b-a1b6-702e2904705b-kube-api-access-vbf25\") pod \"auto-csr-approver-29565224-v6z65\" (UID: \"9e9b9bfc-6170-489b-a1b6-702e2904705b\") " pod="openshift-infra/auto-csr-approver-29565224-v6z65" Mar 19 09:44:00 crc kubenswrapper[4835]: I0319 09:44:00.495437 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565224-v6z65" Mar 19 09:44:01 crc kubenswrapper[4835]: I0319 09:44:01.120016 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ea1b1ba-826f-4abe-9c56-caedc3a178f9-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-zmwm8\" (UID: \"5ea1b1ba-826f-4abe-9c56-caedc3a178f9\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-zmwm8" Mar 19 09:44:01 crc kubenswrapper[4835]: E0319 09:44:01.120369 4835 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 09:44:01 crc kubenswrapper[4835]: E0319 09:44:01.120509 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ea1b1ba-826f-4abe-9c56-caedc3a178f9-cert podName:5ea1b1ba-826f-4abe-9c56-caedc3a178f9 nodeName:}" failed. No retries permitted until 2026-03-19 09:44:09.120472274 +0000 UTC m=+1303.969070901 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ea1b1ba-826f-4abe-9c56-caedc3a178f9-cert") pod "infra-operator-controller-manager-5595c7d6ff-zmwm8" (UID: "5ea1b1ba-826f-4abe-9c56-caedc3a178f9") : secret "infra-operator-webhook-server-cert" not found Mar 19 09:44:01 crc kubenswrapper[4835]: I0319 09:44:01.628200 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/017e5e75-5952-4240-8349-d16c367d4bed-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-575wd\" (UID: \"017e5e75-5952-4240-8349-d16c367d4bed\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-575wd" Mar 19 09:44:01 crc kubenswrapper[4835]: E0319 09:44:01.628417 4835 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 09:44:01 crc kubenswrapper[4835]: E0319 09:44:01.628807 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/017e5e75-5952-4240-8349-d16c367d4bed-cert podName:017e5e75-5952-4240-8349-d16c367d4bed nodeName:}" failed. No retries permitted until 2026-03-19 09:44:09.628778509 +0000 UTC m=+1304.477377086 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/017e5e75-5952-4240-8349-d16c367d4bed-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-575wd" (UID: "017e5e75-5952-4240-8349-d16c367d4bed") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 09:44:01 crc kubenswrapper[4835]: I0319 09:44:01.933823 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-webhook-certs\") pod \"openstack-operator-controller-manager-68f9d5b675-fbgr4\" (UID: \"8a05d0dc-7092-45aa-b869-34a23cb0a1f9\") " pod="openstack-operators/openstack-operator-controller-manager-68f9d5b675-fbgr4" Mar 19 09:44:01 crc kubenswrapper[4835]: E0319 09:44:01.934049 4835 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 09:44:01 crc kubenswrapper[4835]: I0319 09:44:01.934059 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-metrics-certs\") pod \"openstack-operator-controller-manager-68f9d5b675-fbgr4\" (UID: \"8a05d0dc-7092-45aa-b869-34a23cb0a1f9\") " pod="openstack-operators/openstack-operator-controller-manager-68f9d5b675-fbgr4" Mar 19 09:44:01 crc kubenswrapper[4835]: E0319 09:44:01.934145 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-webhook-certs podName:8a05d0dc-7092-45aa-b869-34a23cb0a1f9 nodeName:}" failed. No retries permitted until 2026-03-19 09:44:09.934121006 +0000 UTC m=+1304.782719603 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-webhook-certs") pod "openstack-operator-controller-manager-68f9d5b675-fbgr4" (UID: "8a05d0dc-7092-45aa-b869-34a23cb0a1f9") : secret "webhook-server-cert" not found Mar 19 09:44:01 crc kubenswrapper[4835]: E0319 09:44:01.934196 4835 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 09:44:01 crc kubenswrapper[4835]: E0319 09:44:01.934334 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-metrics-certs podName:8a05d0dc-7092-45aa-b869-34a23cb0a1f9 nodeName:}" failed. No retries permitted until 2026-03-19 09:44:09.934316052 +0000 UTC m=+1304.782914649 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-metrics-certs") pod "openstack-operator-controller-manager-68f9d5b675-fbgr4" (UID: "8a05d0dc-7092-45aa-b869-34a23cb0a1f9") : secret "metrics-server-cert" not found Mar 19 09:44:08 crc kubenswrapper[4835]: E0319 09:44:08.325030 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:56a4ec82efbed56683a95dd80854da49106f82b909ce3cb1eab9eaffe0e30552" Mar 19 09:44:08 crc kubenswrapper[4835]: E0319 09:44:08.325829 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:56a4ec82efbed56683a95dd80854da49106f82b909ce3cb1eab9eaffe0e30552,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-28h9j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-bc5c78db9-wvb5n_openstack-operators(2173f655-b411-4dfe-8247-59ac02942c25): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 09:44:08 crc kubenswrapper[4835]: E0319 09:44:08.326937 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-wvb5n" podUID="2173f655-b411-4dfe-8247-59ac02942c25" Mar 19 09:44:09 crc kubenswrapper[4835]: E0319 09:44:09.076651 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:56a4ec82efbed56683a95dd80854da49106f82b909ce3cb1eab9eaffe0e30552\\\"\"" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-wvb5n" podUID="2173f655-b411-4dfe-8247-59ac02942c25" Mar 19 09:44:09 crc kubenswrapper[4835]: I0319 09:44:09.179293 4835 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ea1b1ba-826f-4abe-9c56-caedc3a178f9-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-zmwm8\" (UID: \"5ea1b1ba-826f-4abe-9c56-caedc3a178f9\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-zmwm8" Mar 19 09:44:09 crc kubenswrapper[4835]: E0319 09:44:09.179454 4835 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 09:44:09 crc kubenswrapper[4835]: E0319 09:44:09.179684 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ea1b1ba-826f-4abe-9c56-caedc3a178f9-cert podName:5ea1b1ba-826f-4abe-9c56-caedc3a178f9 nodeName:}" failed. No retries permitted until 2026-03-19 09:44:25.179669947 +0000 UTC m=+1320.028268534 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ea1b1ba-826f-4abe-9c56-caedc3a178f9-cert") pod "infra-operator-controller-manager-5595c7d6ff-zmwm8" (UID: "5ea1b1ba-826f-4abe-9c56-caedc3a178f9") : secret "infra-operator-webhook-server-cert" not found Mar 19 09:44:09 crc kubenswrapper[4835]: I0319 09:44:09.688217 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/017e5e75-5952-4240-8349-d16c367d4bed-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-575wd\" (UID: \"017e5e75-5952-4240-8349-d16c367d4bed\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-575wd" Mar 19 09:44:09 crc kubenswrapper[4835]: I0319 09:44:09.698117 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/017e5e75-5952-4240-8349-d16c367d4bed-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-575wd\" (UID: \"017e5e75-5952-4240-8349-d16c367d4bed\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-575wd" Mar 19 09:44:09 crc kubenswrapper[4835]: I0319 09:44:09.750254 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-575wd" Mar 19 09:44:09 crc kubenswrapper[4835]: I0319 09:44:09.996552 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-webhook-certs\") pod \"openstack-operator-controller-manager-68f9d5b675-fbgr4\" (UID: \"8a05d0dc-7092-45aa-b869-34a23cb0a1f9\") " pod="openstack-operators/openstack-operator-controller-manager-68f9d5b675-fbgr4" Mar 19 09:44:09 crc kubenswrapper[4835]: I0319 09:44:09.996695 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-metrics-certs\") pod \"openstack-operator-controller-manager-68f9d5b675-fbgr4\" (UID: \"8a05d0dc-7092-45aa-b869-34a23cb0a1f9\") " pod="openstack-operators/openstack-operator-controller-manager-68f9d5b675-fbgr4" Mar 19 09:44:10 crc kubenswrapper[4835]: I0319 09:44:10.000306 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-webhook-certs\") pod \"openstack-operator-controller-manager-68f9d5b675-fbgr4\" (UID: \"8a05d0dc-7092-45aa-b869-34a23cb0a1f9\") " pod="openstack-operators/openstack-operator-controller-manager-68f9d5b675-fbgr4" Mar 19 09:44:10 crc kubenswrapper[4835]: I0319 09:44:10.023197 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a05d0dc-7092-45aa-b869-34a23cb0a1f9-metrics-certs\") pod \"openstack-operator-controller-manager-68f9d5b675-fbgr4\" (UID: 
\"8a05d0dc-7092-45aa-b869-34a23cb0a1f9\") " pod="openstack-operators/openstack-operator-controller-manager-68f9d5b675-fbgr4" Mar 19 09:44:10 crc kubenswrapper[4835]: I0319 09:44:10.204983 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-68f9d5b675-fbgr4" Mar 19 09:44:10 crc kubenswrapper[4835]: E0319 09:44:10.383469 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:e0de6d1ce11f966d1fe774e78fea18cec82c4b859b012a7c6eb4a49d4fcbd258" Mar 19 09:44:10 crc kubenswrapper[4835]: E0319 09:44:10.383795 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:e0de6d1ce11f966d1fe774e78fea18cec82c4b859b012a7c6eb4a49d4fcbd258,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2m6ds,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-76b87776c9-79r7z_openstack-operators(0f2be57b-1212-4dc2-b772-3317bcd69aac): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 09:44:10 crc kubenswrapper[4835]: E0319 09:44:10.385032 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-79r7z" podUID="0f2be57b-1212-4dc2-b772-3317bcd69aac" Mar 19 09:44:11 crc kubenswrapper[4835]: E0319 09:44:11.091828 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:e0de6d1ce11f966d1fe774e78fea18cec82c4b859b012a7c6eb4a49d4fcbd258\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-79r7z" podUID="0f2be57b-1212-4dc2-b772-3317bcd69aac" Mar 19 09:44:11 crc kubenswrapper[4835]: E0319 09:44:11.194836 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:d90f208c9239ed14cf538cf9784a6ad021fa0c86d0a4b6ae4ccd5ec851daf27a" Mar 19 09:44:11 crc kubenswrapper[4835]: E0319 09:44:11.195040 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:d90f208c9239ed14cf538cf9784a6ad021fa0c86d0a4b6ae4ccd5ec851daf27a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ft2rb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-7d559dcdbd-9rtgc_openstack-operators(9eeda748-4895-4cbc-be03-d1edabf69758): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 09:44:11 crc kubenswrapper[4835]: E0319 09:44:11.196415 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-9rtgc" podUID="9eeda748-4895-4cbc-be03-d1edabf69758" Mar 19 09:44:12 crc kubenswrapper[4835]: E0319 09:44:12.103813 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/glance-operator@sha256:d90f208c9239ed14cf538cf9784a6ad021fa0c86d0a4b6ae4ccd5ec851daf27a\\\"\"" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-9rtgc" podUID="9eeda748-4895-4cbc-be03-d1edabf69758" Mar 19 09:44:14 crc kubenswrapper[4835]: E0319 09:44:14.847671 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:d35772c19f96660d35a618d95b92d934e75b8473ca52eea5e62c144a69d68ac1" Mar 19 09:44:14 crc kubenswrapper[4835]: E0319 09:44:14.848511 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:d35772c19f96660d35a618d95b92d934e75b8473ca52eea5e62c144a69d68ac1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5brxh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6b77b7676d-mgjjt_openstack-operators(aea377c2-d271-45c9-a574-0d1fe89caac5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 09:44:14 crc kubenswrapper[4835]: E0319 09:44:14.849855 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-mgjjt" podUID="aea377c2-d271-45c9-a574-0d1fe89caac5" Mar 19 09:44:15 crc kubenswrapper[4835]: E0319 09:44:15.359568 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:d35772c19f96660d35a618d95b92d934e75b8473ca52eea5e62c144a69d68ac1\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-mgjjt" podUID="aea377c2-d271-45c9-a574-0d1fe89caac5" Mar 19 09:44:15 crc kubenswrapper[4835]: E0319 09:44:15.501472 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:0e0d48e3ca53577e20c81a87f0be6b3254c0b8418e3b446b68c8b5849af7213e" Mar 19 09:44:15 crc kubenswrapper[4835]: E0319 09:44:15.501660 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:0e0d48e3ca53577e20c81a87f0be6b3254c0b8418e3b446b68c8b5849af7213e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b8s56,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-867f54bc44-sgjfn_openstack-operators(637282e5-c56d-49ea-ac96-f299fb3661f2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 09:44:15 crc kubenswrapper[4835]: E0319 09:44:15.502999 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-sgjfn" podUID="637282e5-c56d-49ea-ac96-f299fb3661f2" Mar 19 09:44:16 crc kubenswrapper[4835]: E0319 09:44:16.368364 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/swift-operator@sha256:0e0d48e3ca53577e20c81a87f0be6b3254c0b8418e3b446b68c8b5849af7213e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-sgjfn" podUID="637282e5-c56d-49ea-ac96-f299fb3661f2" Mar 19 09:44:16 crc kubenswrapper[4835]: E0319 09:44:16.617570 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:58c7a088376a952170371a8faf830a4d5586ac3b38d2aaaaf36842a606d9e396" Mar 19 09:44:16 crc kubenswrapper[4835]: E0319 09:44:16.617791 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:58c7a088376a952170371a8faf830a4d5586ac3b38d2aaaaf36842a606d9e396,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h5hk6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-6d77645966-dhr9f_openstack-operators(0b55921b-5660-44b9-abe9-09639766068c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 09:44:16 crc kubenswrapper[4835]: E0319 09:44:16.618987 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-dhr9f" podUID="0b55921b-5660-44b9-abe9-09639766068c" Mar 19 09:44:17 crc kubenswrapper[4835]: E0319 09:44:17.376710 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:58c7a088376a952170371a8faf830a4d5586ac3b38d2aaaaf36842a606d9e396\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-dhr9f" podUID="0b55921b-5660-44b9-abe9-09639766068c" Mar 19 09:44:18 crc kubenswrapper[4835]: E0319 09:44:18.998470 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:f9b2e00617c7f219932ea0d5e2bb795cc4361a335a72743077948d8108695c27" Mar 19 09:44:18 crc kubenswrapper[4835]: E0319 09:44:18.998817 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f9b2e00617c7f219932ea0d5e2bb795cc4361a335a72743077948d8108695c27,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5hzks,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-8467ccb4c8-s94g8_openstack-operators(91c2a119-b77e-4cf1-9be3-779c47d4643b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 09:44:19 crc kubenswrapper[4835]: E0319 09:44:19.000092 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-s94g8" podUID="91c2a119-b77e-4cf1-9be3-779c47d4643b" Mar 19 09:44:19 crc kubenswrapper[4835]: E0319 09:44:19.397314 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/test-operator@sha256:f9b2e00617c7f219932ea0d5e2bb795cc4361a335a72743077948d8108695c27\\\"\"" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-s94g8" podUID="91c2a119-b77e-4cf1-9be3-779c47d4643b" Mar 19 09:44:19 crc kubenswrapper[4835]: E0319 09:44:19.639152 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:d9c086e2bb020a35ce4b7e4943a198808181e8f40785ab588e7904999e82885a" Mar 19 09:44:19 crc kubenswrapper[4835]: E0319 09:44:19.639370 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:d9c086e2bb020a35ce4b7e4943a198808181e8f40785ab588e7904999e82885a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dm944,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-fbf7bbb96-v6mc6_openstack-operators(a73c4956-ad63-40d9-907e-599172ac3771): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 09:44:19 crc kubenswrapper[4835]: E0319 09:44:19.640851 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-v6mc6" podUID="a73c4956-ad63-40d9-907e-599172ac3771" Mar 19 09:44:20 crc kubenswrapper[4835]: E0319 09:44:20.403303 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/manila-operator@sha256:d9c086e2bb020a35ce4b7e4943a198808181e8f40785ab588e7904999e82885a\\\"\"" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-v6mc6" podUID="a73c4956-ad63-40d9-907e-599172ac3771" Mar 19 09:44:21 crc kubenswrapper[4835]: E0319 09:44:21.108613 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:8fc146e6a8704846a36a440a636cd36bec5563abcb5f138b651e2522f0b57702" Mar 19 09:44:21 crc kubenswrapper[4835]: E0319 09:44:21.109008 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:8fc146e6a8704846a36a440a636cd36bec5563abcb5f138b651e2522f0b57702,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jccwd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-846c4cdcb7-7b7zt_openstack-operators(74161dd4-df25-4f1b-8924-c9086688463d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 09:44:21 crc kubenswrapper[4835]: E0319 09:44:21.110322 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-7b7zt" podUID="74161dd4-df25-4f1b-8924-c9086688463d" Mar 19 09:44:21 crc kubenswrapper[4835]: E0319 09:44:21.410928 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:8fc146e6a8704846a36a440a636cd36bec5563abcb5f138b651e2522f0b57702\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-7b7zt" podUID="74161dd4-df25-4f1b-8924-c9086688463d" Mar 19 09:44:21 crc kubenswrapper[4835]: E0319 09:44:21.818188 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:45611eb2b721d1e59ac25f7308fb063e561e8dd81a5824ec5d3952eb066b63f6" Mar 19 09:44:21 crc kubenswrapper[4835]: E0319 09:44:21.818703 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:45611eb2b721d1e59ac25f7308fb063e561e8dd81a5824ec5d3952eb066b63f6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hk64p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-6744dd545c-4nqvq_openstack-operators(2e429fe7-b053-43e9-a640-f14af7094e62): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 09:44:21 crc kubenswrapper[4835]: E0319 09:44:21.820025 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-4nqvq" podUID="2e429fe7-b053-43e9-a640-f14af7094e62" Mar 19 09:44:22 crc kubenswrapper[4835]: E0319 09:44:22.433312 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:45611eb2b721d1e59ac25f7308fb063e561e8dd81a5824ec5d3952eb066b63f6\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-4nqvq" podUID="2e429fe7-b053-43e9-a640-f14af7094e62" Mar 19 09:44:25 crc kubenswrapper[4835]: I0319 09:44:25.228002 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ea1b1ba-826f-4abe-9c56-caedc3a178f9-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-zmwm8\" (UID: \"5ea1b1ba-826f-4abe-9c56-caedc3a178f9\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-zmwm8" Mar 19 09:44:25 crc kubenswrapper[4835]: I0319 09:44:25.239147 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ea1b1ba-826f-4abe-9c56-caedc3a178f9-cert\") pod \"infra-operator-controller-manager-5595c7d6ff-zmwm8\" (UID: \"5ea1b1ba-826f-4abe-9c56-caedc3a178f9\") " pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-zmwm8" Mar 19 09:44:25 crc kubenswrapper[4835]: E0319 09:44:25.365805 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:8cff216ce54922d6d182d9f5dcd0d6bc51d6560e808319c7e20487ee7b6474d1" Mar 19 09:44:25 crc kubenswrapper[4835]: E0319 09:44:25.366480 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:8cff216ce54922d6d182d9f5dcd0d6bc51d6560e808319c7e20487ee7b6474d1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ztr45,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-66dd9d474d-bjnww_openstack-operators(47fe8883-3cea-4985-8327-4ad721d4e128): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 09:44:25 crc kubenswrapper[4835]: E0319 09:44:25.367786 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-bjnww" podUID="47fe8883-3cea-4985-8327-4ad721d4e128" Mar 19 09:44:25 crc kubenswrapper[4835]: E0319 09:44:25.467998 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:8cff216ce54922d6d182d9f5dcd0d6bc51d6560e808319c7e20487ee7b6474d1\\\"\"" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-bjnww" podUID="47fe8883-3cea-4985-8327-4ad721d4e128" Mar 19 09:44:25 crc kubenswrapper[4835]: I0319 09:44:25.531455 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-zmwm8" Mar 19 09:44:27 crc kubenswrapper[4835]: E0319 09:44:27.759604 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:05a6fd95f5a1472c74e40b3efe58ff423cc2a00e745eea6dea19f622ef2c0953" Mar 19 09:44:27 crc kubenswrapper[4835]: E0319 09:44:27.761514 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:05a6fd95f5a1472c74e40b3efe58ff423cc2a00e745eea6dea19f622ef2c0953,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s4zdl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-74d6f7b5c-64vf7_openstack-operators(a59c67ae-bcf7-404f-ad32-233f38450f65): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 09:44:27 crc kubenswrapper[4835]: E0319 09:44:27.762920 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-64vf7" podUID="a59c67ae-bcf7-404f-ad32-233f38450f65" Mar 19 09:44:28 crc kubenswrapper[4835]: E0319 09:44:28.484434 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Mar 19 09:44:28 crc kubenswrapper[4835]: E0319 09:44:28.484920 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9k6cd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-z9wxq_openstack-operators(5ab66f1b-4361-44ac-a885-93bbb8cd51d5): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 09:44:28 crc kubenswrapper[4835]: E0319 09:44:28.486981 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z9wxq" podUID="5ab66f1b-4361-44ac-a885-93bbb8cd51d5" Mar 19 09:44:28 crc kubenswrapper[4835]: I0319 09:44:28.930407 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-575wd"] Mar 19 09:44:28 crc kubenswrapper[4835]: I0319 09:44:28.998331 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565224-v6z65"] Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.183864 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-68f9d5b675-fbgr4"] Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.215295 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5595c7d6ff-zmwm8"] Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.500399 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-68f9d5b675-fbgr4" event={"ID":"8a05d0dc-7092-45aa-b869-34a23cb0a1f9","Type":"ContainerStarted","Data":"16731b0a0fdeed1d5106d0fefbb3c4b2f0a9fa7ebf3c24fb0ff3edf19632f21d"} Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.500451 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-68f9d5b675-fbgr4" event={"ID":"8a05d0dc-7092-45aa-b869-34a23cb0a1f9","Type":"ContainerStarted","Data":"f56cfb0d31dc7686a364f88baa90b1b4a9d38fb7b8ffdbeb1b0ea7a5185b4916"} Mar 19 09:44:29 crc 
kubenswrapper[4835]: I0319 09:44:29.501825 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-68f9d5b675-fbgr4" Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.506675 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-mgjjt" event={"ID":"aea377c2-d271-45c9-a574-0d1fe89caac5","Type":"ContainerStarted","Data":"fdf008e19be6d17f3e099d10d222ca861670ce7f4a5453dbee36dc463e90e54e"} Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.506926 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-mgjjt" Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.507553 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-zmwm8" event={"ID":"5ea1b1ba-826f-4abe-9c56-caedc3a178f9","Type":"ContainerStarted","Data":"ef2dd20f0303ad08e38ecc50a5b501c7c90ef0d54d7dc74a02bb003dbde3a3e4"} Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.509316 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-57b8dbd499-gz2nz" event={"ID":"65b9e1e9-a183-4d37-a281-593752da6125","Type":"ContainerStarted","Data":"babe87d07b1334c85ab2c7fb17d06175cb07ab1a3db69b5a801efcd3f718adea"} Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.510050 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-57b8dbd499-gz2nz" Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.513844 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-d79pg" 
event={"ID":"e86ab409-7643-4e6f-9129-6f90e5b6bf1c","Type":"ContainerStarted","Data":"fcfcc6c8e3a98c192076129884709f9f2c5bc5b8a48cad0e16e3540c175fb091"} Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.514374 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-d79pg" Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.517197 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-t9dbc" event={"ID":"e01500c9-ffdf-42df-9017-26b813efed0e","Type":"ContainerStarted","Data":"1a22dca53136ef072e21016dc18dcde618df3121535ce83d70026b29e340a5cc"} Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.517694 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-t9dbc" Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.525983 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-9rtgc" event={"ID":"9eeda748-4895-4cbc-be03-d1edabf69758","Type":"ContainerStarted","Data":"1033308292ba8a1017219168fa2a64e2d2f7c26be02db8c98151afa264d60681"} Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.526429 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-9rtgc" Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.531728 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565224-v6z65" event={"ID":"9e9b9bfc-6170-489b-a1b6-702e2904705b","Type":"ContainerStarted","Data":"52ab8353b52c7c8ecac17a590b775f8b5d52c082b9f0bf246cd4e666009f31fc"} Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.532874 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-575wd" event={"ID":"017e5e75-5952-4240-8349-d16c367d4bed","Type":"ContainerStarted","Data":"6f79e7884e51cdddc20ad15440f3270cecf2d713113cb55c41c8080767ce41c0"} Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.534924 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-vq45h" event={"ID":"dd4067de-94b9-4d91-bdf4-e3e3af91b76f","Type":"ContainerStarted","Data":"a980a9268411dc2e96ccd72574d6f2d1f58e717211e75bb14ef2e300d69d03d8"} Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.535610 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-vq45h" Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.544758 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-v6bxs" event={"ID":"02720ef5-c00c-48a4-af96-4d82f28bf051","Type":"ContainerStarted","Data":"551d7a69f920ceb812cb59bf0afd3ae2dee3dad852d4f14fe5f6f6ec04acbaf8"} Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.545462 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-v6bxs" Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.550203 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-msxkb" event={"ID":"6dabbeda-38a8-4289-92b6-075d7c20f958","Type":"ContainerStarted","Data":"b672bdc9e7e1600732aae7beaba4ce820773cefd624e33e9706a71ea3ee64731"} Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.550872 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-msxkb" Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.552417 4835 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-bzhnq" event={"ID":"5db66af1-4929-4163-b788-0320b0323a79","Type":"ContainerStarted","Data":"c96210b6afd2f53d17c55ee5dc1a9924b832bac0cd12250428b536a1788c0f4d"} Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.552933 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-bzhnq" Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.554354 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-wvb5n" event={"ID":"2173f655-b411-4dfe-8247-59ac02942c25","Type":"ContainerStarted","Data":"dc28eea2e9fd4f8640a61fb4c8244f122e759b21c45b1c23b29c1431335fb2ce"} Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.554823 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-wvb5n" Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.556304 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-79r7z" event={"ID":"0f2be57b-1212-4dc2-b772-3317bcd69aac","Type":"ContainerStarted","Data":"515afc610d5f88677d2ff58c2ed75a356b8127da39f5e54f7227fc41b8291910"} Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.556681 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-79r7z" Mar 19 09:44:29 crc kubenswrapper[4835]: E0319 09:44:29.557248 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z9wxq" podUID="5ab66f1b-4361-44ac-a885-93bbb8cd51d5" Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.634026 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-68f9d5b675-fbgr4" podStartSLOduration=36.634009239 podStartE2EDuration="36.634009239s" podCreationTimestamp="2026-03-19 09:43:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:44:29.631680837 +0000 UTC m=+1324.480279424" watchObservedRunningTime="2026-03-19 09:44:29.634009239 +0000 UTC m=+1324.482607826" Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.734071 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-d79pg" podStartSLOduration=4.920326835 podStartE2EDuration="36.734053105s" podCreationTimestamp="2026-03-19 09:43:53 +0000 UTC" firstStartedPulling="2026-03-19 09:43:55.940311691 +0000 UTC m=+1290.788910278" lastFinishedPulling="2026-03-19 09:44:27.754037951 +0000 UTC m=+1322.602636548" observedRunningTime="2026-03-19 09:44:29.688755859 +0000 UTC m=+1324.537354446" watchObservedRunningTime="2026-03-19 09:44:29.734053105 +0000 UTC m=+1324.582651692" Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.771279 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-bzhnq" podStartSLOduration=11.442024947 podStartE2EDuration="37.771262994s" podCreationTimestamp="2026-03-19 09:43:52 +0000 UTC" firstStartedPulling="2026-03-19 09:43:54.786196849 +0000 UTC m=+1289.634795436" lastFinishedPulling="2026-03-19 09:44:21.115434846 +0000 UTC m=+1315.964033483" observedRunningTime="2026-03-19 09:44:29.76141029 +0000 UTC m=+1324.610008887" 
watchObservedRunningTime="2026-03-19 09:44:29.771262994 +0000 UTC m=+1324.619861581" Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.772429 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-msxkb" podStartSLOduration=11.28791634 podStartE2EDuration="37.772421665s" podCreationTimestamp="2026-03-19 09:43:52 +0000 UTC" firstStartedPulling="2026-03-19 09:43:54.630544 +0000 UTC m=+1289.479142587" lastFinishedPulling="2026-03-19 09:44:21.115049295 +0000 UTC m=+1315.963647912" observedRunningTime="2026-03-19 09:44:29.734574229 +0000 UTC m=+1324.583172816" watchObservedRunningTime="2026-03-19 09:44:29.772421665 +0000 UTC m=+1324.621020252" Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.794174 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-wvb5n" podStartSLOduration=3.349924266 podStartE2EDuration="36.794158258s" podCreationTimestamp="2026-03-19 09:43:53 +0000 UTC" firstStartedPulling="2026-03-19 09:43:55.109706293 +0000 UTC m=+1289.958304880" lastFinishedPulling="2026-03-19 09:44:28.553940285 +0000 UTC m=+1323.402538872" observedRunningTime="2026-03-19 09:44:29.792468894 +0000 UTC m=+1324.641067481" watchObservedRunningTime="2026-03-19 09:44:29.794158258 +0000 UTC m=+1324.642756845" Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.872575 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-mgjjt" podStartSLOduration=4.362444373 podStartE2EDuration="37.872555013s" podCreationTimestamp="2026-03-19 09:43:52 +0000 UTC" firstStartedPulling="2026-03-19 09:43:55.113559996 +0000 UTC m=+1289.962158583" lastFinishedPulling="2026-03-19 09:44:28.623670626 +0000 UTC m=+1323.472269223" observedRunningTime="2026-03-19 09:44:29.839648839 +0000 UTC m=+1324.688247426" 
watchObservedRunningTime="2026-03-19 09:44:29.872555013 +0000 UTC m=+1324.721153600" Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.876440 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-79r7z" podStartSLOduration=3.46256571 podStartE2EDuration="36.876424747s" podCreationTimestamp="2026-03-19 09:43:53 +0000 UTC" firstStartedPulling="2026-03-19 09:43:55.138262639 +0000 UTC m=+1289.986861216" lastFinishedPulling="2026-03-19 09:44:28.552121666 +0000 UTC m=+1323.400720253" observedRunningTime="2026-03-19 09:44:29.869496291 +0000 UTC m=+1324.718094888" watchObservedRunningTime="2026-03-19 09:44:29.876424747 +0000 UTC m=+1324.725023334" Mar 19 09:44:29 crc kubenswrapper[4835]: I0319 09:44:29.976281 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-57b8dbd499-gz2nz" podStartSLOduration=4.453505493 podStartE2EDuration="36.976258667s" podCreationTimestamp="2026-03-19 09:43:53 +0000 UTC" firstStartedPulling="2026-03-19 09:43:55.954648376 +0000 UTC m=+1290.803246963" lastFinishedPulling="2026-03-19 09:44:28.47740154 +0000 UTC m=+1323.326000137" observedRunningTime="2026-03-19 09:44:29.940423225 +0000 UTC m=+1324.789021812" watchObservedRunningTime="2026-03-19 09:44:29.976258667 +0000 UTC m=+1324.824857264" Mar 19 09:44:30 crc kubenswrapper[4835]: I0319 09:44:30.014667 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-vq45h" podStartSLOduration=4.416823738 podStartE2EDuration="37.014648038s" podCreationTimestamp="2026-03-19 09:43:53 +0000 UTC" firstStartedPulling="2026-03-19 09:43:55.956247738 +0000 UTC m=+1290.804846335" lastFinishedPulling="2026-03-19 09:44:28.554072048 +0000 UTC m=+1323.402670635" observedRunningTime="2026-03-19 09:44:29.97822875 +0000 UTC m=+1324.826827337" 
watchObservedRunningTime="2026-03-19 09:44:30.014648038 +0000 UTC m=+1324.863246625" Mar 19 09:44:30 crc kubenswrapper[4835]: I0319 09:44:30.035436 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-t9dbc" podStartSLOduration=12.871416778 podStartE2EDuration="38.035418545s" podCreationTimestamp="2026-03-19 09:43:52 +0000 UTC" firstStartedPulling="2026-03-19 09:43:54.466599668 +0000 UTC m=+1289.315198255" lastFinishedPulling="2026-03-19 09:44:19.630601435 +0000 UTC m=+1314.479200022" observedRunningTime="2026-03-19 09:44:30.017247088 +0000 UTC m=+1324.865845665" watchObservedRunningTime="2026-03-19 09:44:30.035418545 +0000 UTC m=+1324.884017132" Mar 19 09:44:30 crc kubenswrapper[4835]: I0319 09:44:30.037706 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-9rtgc" podStartSLOduration=3.857740274 podStartE2EDuration="38.037697386s" podCreationTimestamp="2026-03-19 09:43:52 +0000 UTC" firstStartedPulling="2026-03-19 09:43:54.442088151 +0000 UTC m=+1289.290686738" lastFinishedPulling="2026-03-19 09:44:28.622045263 +0000 UTC m=+1323.470643850" observedRunningTime="2026-03-19 09:44:30.034673765 +0000 UTC m=+1324.883272352" watchObservedRunningTime="2026-03-19 09:44:30.037697386 +0000 UTC m=+1324.886295973" Mar 19 09:44:30 crc kubenswrapper[4835]: I0319 09:44:30.075702 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-v6bxs" podStartSLOduration=7.659021007 podStartE2EDuration="37.075685767s" podCreationTimestamp="2026-03-19 09:43:53 +0000 UTC" firstStartedPulling="2026-03-19 09:43:55.938595415 +0000 UTC m=+1290.787194002" lastFinishedPulling="2026-03-19 09:44:25.355260135 +0000 UTC m=+1320.203858762" observedRunningTime="2026-03-19 09:44:30.072925182 +0000 UTC m=+1324.921523759" 
watchObservedRunningTime="2026-03-19 09:44:30.075685767 +0000 UTC m=+1324.924284344" Mar 19 09:44:30 crc kubenswrapper[4835]: I0319 09:44:30.565337 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-sgjfn" event={"ID":"637282e5-c56d-49ea-ac96-f299fb3661f2","Type":"ContainerStarted","Data":"cf1ee957648799c6399f580438d04fb9ba811d580e431afca5152d27852ca593"} Mar 19 09:44:30 crc kubenswrapper[4835]: I0319 09:44:30.583329 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-sgjfn" podStartSLOduration=3.070370232 podStartE2EDuration="37.583309074s" podCreationTimestamp="2026-03-19 09:43:53 +0000 UTC" firstStartedPulling="2026-03-19 09:43:55.745243064 +0000 UTC m=+1290.593841651" lastFinishedPulling="2026-03-19 09:44:30.258181906 +0000 UTC m=+1325.106780493" observedRunningTime="2026-03-19 09:44:30.57795892 +0000 UTC m=+1325.426557517" watchObservedRunningTime="2026-03-19 09:44:30.583309074 +0000 UTC m=+1325.431907661" Mar 19 09:44:31 crc kubenswrapper[4835]: I0319 09:44:31.584555 4835 generic.go:334] "Generic (PLEG): container finished" podID="9e9b9bfc-6170-489b-a1b6-702e2904705b" containerID="0764a3055bee00d259054081e109654843a781a4f83ab67185d4d6c8e3e3f6c3" exitCode=0 Mar 19 09:44:31 crc kubenswrapper[4835]: I0319 09:44:31.586681 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565224-v6z65" event={"ID":"9e9b9bfc-6170-489b-a1b6-702e2904705b","Type":"ContainerDied","Data":"0764a3055bee00d259054081e109654843a781a4f83ab67185d4d6c8e3e3f6c3"} Mar 19 09:44:33 crc kubenswrapper[4835]: I0319 09:44:33.152595 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-t9dbc" Mar 19 09:44:33 crc kubenswrapper[4835]: I0319 09:44:33.287861 4835 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565224-v6z65" Mar 19 09:44:33 crc kubenswrapper[4835]: I0319 09:44:33.391787 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-msxkb" Mar 19 09:44:33 crc kubenswrapper[4835]: I0319 09:44:33.402187 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbf25\" (UniqueName: \"kubernetes.io/projected/9e9b9bfc-6170-489b-a1b6-702e2904705b-kube-api-access-vbf25\") pod \"9e9b9bfc-6170-489b-a1b6-702e2904705b\" (UID: \"9e9b9bfc-6170-489b-a1b6-702e2904705b\") " Mar 19 09:44:33 crc kubenswrapper[4835]: I0319 09:44:33.422766 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e9b9bfc-6170-489b-a1b6-702e2904705b-kube-api-access-vbf25" (OuterVolumeSpecName: "kube-api-access-vbf25") pod "9e9b9bfc-6170-489b-a1b6-702e2904705b" (UID: "9e9b9bfc-6170-489b-a1b6-702e2904705b"). InnerVolumeSpecName "kube-api-access-vbf25". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:44:33 crc kubenswrapper[4835]: I0319 09:44:33.506516 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbf25\" (UniqueName: \"kubernetes.io/projected/9e9b9bfc-6170-489b-a1b6-702e2904705b-kube-api-access-vbf25\") on node \"crc\" DevicePath \"\"" Mar 19 09:44:33 crc kubenswrapper[4835]: I0319 09:44:33.601834 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565224-v6z65" event={"ID":"9e9b9bfc-6170-489b-a1b6-702e2904705b","Type":"ContainerDied","Data":"52ab8353b52c7c8ecac17a590b775f8b5d52c082b9f0bf246cd4e666009f31fc"} Mar 19 09:44:33 crc kubenswrapper[4835]: I0319 09:44:33.601874 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565224-v6z65" Mar 19 09:44:33 crc kubenswrapper[4835]: I0319 09:44:33.601883 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52ab8353b52c7c8ecac17a590b775f8b5d52c082b9f0bf246cd4e666009f31fc" Mar 19 09:44:34 crc kubenswrapper[4835]: I0319 09:44:34.120471 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-vq45h" Mar 19 09:44:34 crc kubenswrapper[4835]: I0319 09:44:34.171460 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-sgjfn" Mar 19 09:44:34 crc kubenswrapper[4835]: I0319 09:44:34.332027 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-57b8dbd499-gz2nz" Mar 19 09:44:34 crc kubenswrapper[4835]: I0319 09:44:34.375926 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565218-5w5qk"] Mar 19 09:44:34 crc kubenswrapper[4835]: I0319 09:44:34.383601 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565218-5w5qk"] Mar 19 09:44:34 crc kubenswrapper[4835]: I0319 09:44:34.422339 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab247fa5-c2c2-4a53-9ab5-a16ef485d1b2" path="/var/lib/kubelet/pods/ab247fa5-c2c2-4a53-9ab5-a16ef485d1b2/volumes" Mar 19 09:44:34 crc kubenswrapper[4835]: I0319 09:44:34.610334 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-4nqvq" event={"ID":"2e429fe7-b053-43e9-a640-f14af7094e62","Type":"ContainerStarted","Data":"d6709cea326e6f63e4449bda6513150621ea2c5bede53a31e005c70b23af7142"} Mar 19 09:44:34 crc kubenswrapper[4835]: I0319 09:44:34.610556 4835 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-4nqvq" Mar 19 09:44:34 crc kubenswrapper[4835]: I0319 09:44:34.611559 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-v6mc6" event={"ID":"a73c4956-ad63-40d9-907e-599172ac3771","Type":"ContainerStarted","Data":"17ad10f3b950a76f708bd259f3a08d183d19048e9412307a6c2f5934307e89e0"} Mar 19 09:44:34 crc kubenswrapper[4835]: I0319 09:44:34.611680 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-v6mc6" Mar 19 09:44:34 crc kubenswrapper[4835]: I0319 09:44:34.613132 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-dhr9f" event={"ID":"0b55921b-5660-44b9-abe9-09639766068c","Type":"ContainerStarted","Data":"59e6c9970ac2194ea4a706da403a7ba18b44e690fac9002cbd19cb0e0872a7de"} Mar 19 09:44:34 crc kubenswrapper[4835]: I0319 09:44:34.613288 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-dhr9f" Mar 19 09:44:34 crc kubenswrapper[4835]: I0319 09:44:34.615271 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-575wd" event={"ID":"017e5e75-5952-4240-8349-d16c367d4bed","Type":"ContainerStarted","Data":"e8fdfb183ab22433c247fccbc92d1721ed9d356b5b086cc36de6b63873d1e6cd"} Mar 19 09:44:34 crc kubenswrapper[4835]: I0319 09:44:34.615336 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-575wd" Mar 19 09:44:34 crc kubenswrapper[4835]: I0319 09:44:34.617018 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-zmwm8" 
event={"ID":"5ea1b1ba-826f-4abe-9c56-caedc3a178f9","Type":"ContainerStarted","Data":"aac0ac859b91fe071f9299203ebcbd60233ee2e76eef22a47b985905cbf3dbb9"} Mar 19 09:44:34 crc kubenswrapper[4835]: I0319 09:44:34.617157 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-zmwm8" Mar 19 09:44:34 crc kubenswrapper[4835]: I0319 09:44:34.629533 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-4nqvq" podStartSLOduration=3.276265949 podStartE2EDuration="41.629516215s" podCreationTimestamp="2026-03-19 09:43:53 +0000 UTC" firstStartedPulling="2026-03-19 09:43:55.75924923 +0000 UTC m=+1290.607847817" lastFinishedPulling="2026-03-19 09:44:34.112499496 +0000 UTC m=+1328.961098083" observedRunningTime="2026-03-19 09:44:34.626166295 +0000 UTC m=+1329.474764892" watchObservedRunningTime="2026-03-19 09:44:34.629516215 +0000 UTC m=+1329.478114802" Mar 19 09:44:34 crc kubenswrapper[4835]: I0319 09:44:34.652658 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-dhr9f" podStartSLOduration=2.958251217 podStartE2EDuration="42.652638716s" podCreationTimestamp="2026-03-19 09:43:52 +0000 UTC" firstStartedPulling="2026-03-19 09:43:54.418424095 +0000 UTC m=+1289.267022682" lastFinishedPulling="2026-03-19 09:44:34.112811604 +0000 UTC m=+1328.961410181" observedRunningTime="2026-03-19 09:44:34.649825311 +0000 UTC m=+1329.498423898" watchObservedRunningTime="2026-03-19 09:44:34.652638716 +0000 UTC m=+1329.501237303" Mar 19 09:44:34 crc kubenswrapper[4835]: I0319 09:44:34.676419 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-575wd" podStartSLOduration=36.535483015 podStartE2EDuration="41.676399164s" 
podCreationTimestamp="2026-03-19 09:43:53 +0000 UTC" firstStartedPulling="2026-03-19 09:44:28.971558576 +0000 UTC m=+1323.820157163" lastFinishedPulling="2026-03-19 09:44:34.112474715 +0000 UTC m=+1328.961073312" observedRunningTime="2026-03-19 09:44:34.671393279 +0000 UTC m=+1329.519991866" watchObservedRunningTime="2026-03-19 09:44:34.676399164 +0000 UTC m=+1329.524997751" Mar 19 09:44:34 crc kubenswrapper[4835]: I0319 09:44:34.706204 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-v6mc6" podStartSLOduration=2.7175149210000002 podStartE2EDuration="41.706181634s" podCreationTimestamp="2026-03-19 09:43:53 +0000 UTC" firstStartedPulling="2026-03-19 09:43:55.122893877 +0000 UTC m=+1289.971492464" lastFinishedPulling="2026-03-19 09:44:34.11156059 +0000 UTC m=+1328.960159177" observedRunningTime="2026-03-19 09:44:34.701394585 +0000 UTC m=+1329.549993212" watchObservedRunningTime="2026-03-19 09:44:34.706181634 +0000 UTC m=+1329.554780221" Mar 19 09:44:34 crc kubenswrapper[4835]: I0319 09:44:34.736966 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-zmwm8" podStartSLOduration=37.858385254 podStartE2EDuration="42.73694924s" podCreationTimestamp="2026-03-19 09:43:52 +0000 UTC" firstStartedPulling="2026-03-19 09:44:29.234942467 +0000 UTC m=+1324.083541054" lastFinishedPulling="2026-03-19 09:44:34.113506453 +0000 UTC m=+1328.962105040" observedRunningTime="2026-03-19 09:44:34.734690599 +0000 UTC m=+1329.583289196" watchObservedRunningTime="2026-03-19 09:44:34.73694924 +0000 UTC m=+1329.585547817" Mar 19 09:44:36 crc kubenswrapper[4835]: I0319 09:44:36.635986 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-7b7zt" 
event={"ID":"74161dd4-df25-4f1b-8924-c9086688463d","Type":"ContainerStarted","Data":"0aebdb6c0df273f2f54019c80d85a5160e2ba0d90a3e1ea31d35b7e5c96a3c44"} Mar 19 09:44:36 crc kubenswrapper[4835]: I0319 09:44:36.636903 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-7b7zt" Mar 19 09:44:36 crc kubenswrapper[4835]: I0319 09:44:36.637765 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-s94g8" event={"ID":"91c2a119-b77e-4cf1-9be3-779c47d4643b","Type":"ContainerStarted","Data":"76581d1a30079a96dba301021727e99c84828ba5ba767d4953b0a8ea4eae8312"} Mar 19 09:44:36 crc kubenswrapper[4835]: I0319 09:44:36.637982 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-s94g8" Mar 19 09:44:36 crc kubenswrapper[4835]: I0319 09:44:36.657359 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-7b7zt" podStartSLOduration=3.437045626 podStartE2EDuration="43.657343403s" podCreationTimestamp="2026-03-19 09:43:53 +0000 UTC" firstStartedPulling="2026-03-19 09:43:55.952282512 +0000 UTC m=+1290.800881099" lastFinishedPulling="2026-03-19 09:44:36.172580289 +0000 UTC m=+1331.021178876" observedRunningTime="2026-03-19 09:44:36.657265491 +0000 UTC m=+1331.505864088" watchObservedRunningTime="2026-03-19 09:44:36.657343403 +0000 UTC m=+1331.505941990" Mar 19 09:44:36 crc kubenswrapper[4835]: I0319 09:44:36.686456 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-s94g8" podStartSLOduration=3.627968731 podStartE2EDuration="43.686429084s" podCreationTimestamp="2026-03-19 09:43:53 +0000 UTC" firstStartedPulling="2026-03-19 09:43:55.945017847 +0000 UTC m=+1290.793616434" 
lastFinishedPulling="2026-03-19 09:44:36.0034782 +0000 UTC m=+1330.852076787" observedRunningTime="2026-03-19 09:44:36.672349596 +0000 UTC m=+1331.520948193" watchObservedRunningTime="2026-03-19 09:44:36.686429084 +0000 UTC m=+1331.535027671" Mar 19 09:44:39 crc kubenswrapper[4835]: I0319 09:44:39.663263 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-bjnww" event={"ID":"47fe8883-3cea-4985-8327-4ad721d4e128","Type":"ContainerStarted","Data":"a096021ad54fefd5e2f6ec829b64fe65e99370e22cb444a589c4d30d413c88a0"} Mar 19 09:44:39 crc kubenswrapper[4835]: I0319 09:44:39.664733 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-bjnww" Mar 19 09:44:39 crc kubenswrapper[4835]: I0319 09:44:39.756913 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-575wd" Mar 19 09:44:39 crc kubenswrapper[4835]: I0319 09:44:39.787815 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-bjnww" podStartSLOduration=3.8456793019999997 podStartE2EDuration="47.78772225s" podCreationTimestamp="2026-03-19 09:43:52 +0000 UTC" firstStartedPulling="2026-03-19 09:43:55.128049016 +0000 UTC m=+1289.976647603" lastFinishedPulling="2026-03-19 09:44:39.070091954 +0000 UTC m=+1333.918690551" observedRunningTime="2026-03-19 09:44:39.682171396 +0000 UTC m=+1334.530769993" watchObservedRunningTime="2026-03-19 09:44:39.78772225 +0000 UTC m=+1334.636320837" Mar 19 09:44:40 crc kubenswrapper[4835]: I0319 09:44:40.211904 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-68f9d5b675-fbgr4" Mar 19 09:44:42 crc kubenswrapper[4835]: E0319 09:44:42.403442 4835 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:05a6fd95f5a1472c74e40b3efe58ff423cc2a00e745eea6dea19f622ef2c0953\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-64vf7" podUID="a59c67ae-bcf7-404f-ad32-233f38450f65" Mar 19 09:44:43 crc kubenswrapper[4835]: I0319 09:44:43.296307 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-9rtgc" Mar 19 09:44:43 crc kubenswrapper[4835]: I0319 09:44:43.306852 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-dhr9f" Mar 19 09:44:43 crc kubenswrapper[4835]: I0319 09:44:43.507552 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-bzhnq" Mar 19 09:44:43 crc kubenswrapper[4835]: I0319 09:44:43.677366 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-mgjjt" Mar 19 09:44:43 crc kubenswrapper[4835]: I0319 09:44:43.710647 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-79r7z" Mar 19 09:44:43 crc kubenswrapper[4835]: I0319 09:44:43.774627 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-v6mc6" Mar 19 09:44:43 crc kubenswrapper[4835]: I0319 09:44:43.822950 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-4nqvq" Mar 19 09:44:43 crc kubenswrapper[4835]: I0319 09:44:43.873463 4835 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-d79pg" Mar 19 09:44:43 crc kubenswrapper[4835]: I0319 09:44:43.923339 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-wvb5n" Mar 19 09:44:43 crc kubenswrapper[4835]: I0319 09:44:43.973949 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-v6bxs" Mar 19 09:44:44 crc kubenswrapper[4835]: I0319 09:44:44.088495 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-7b7zt" Mar 19 09:44:44 crc kubenswrapper[4835]: I0319 09:44:44.174325 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-sgjfn" Mar 19 09:44:44 crc kubenswrapper[4835]: I0319 09:44:44.254325 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-s94g8" Mar 19 09:44:45 crc kubenswrapper[4835]: I0319 09:44:45.540494 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-zmwm8" Mar 19 09:44:45 crc kubenswrapper[4835]: I0319 09:44:45.715813 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z9wxq" event={"ID":"5ab66f1b-4361-44ac-a885-93bbb8cd51d5","Type":"ContainerStarted","Data":"140bebbc38bf618ec43d7fa8f52b8ccbc39bbf53a8c4bb1d846dabe4c56953c4"} Mar 19 09:44:45 crc kubenswrapper[4835]: I0319 09:44:45.737576 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z9wxq" podStartSLOduration=3.883606342 
podStartE2EDuration="52.737556674s" podCreationTimestamp="2026-03-19 09:43:53 +0000 UTC" firstStartedPulling="2026-03-19 09:43:55.951332756 +0000 UTC m=+1290.799931343" lastFinishedPulling="2026-03-19 09:44:44.805283088 +0000 UTC m=+1339.653881675" observedRunningTime="2026-03-19 09:44:45.728420649 +0000 UTC m=+1340.577019266" watchObservedRunningTime="2026-03-19 09:44:45.737556674 +0000 UTC m=+1340.586155251" Mar 19 09:44:53 crc kubenswrapper[4835]: I0319 09:44:53.624934 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-bjnww" Mar 19 09:44:58 crc kubenswrapper[4835]: I0319 09:44:58.862732 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-64vf7" event={"ID":"a59c67ae-bcf7-404f-ad32-233f38450f65","Type":"ContainerStarted","Data":"d921dd15d39ad98ecddd060299ea370f29aa6bf60b14104dd86fa5ca32392313"} Mar 19 09:44:58 crc kubenswrapper[4835]: I0319 09:44:58.863615 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-64vf7" Mar 19 09:44:58 crc kubenswrapper[4835]: I0319 09:44:58.896285 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-64vf7" podStartSLOduration=3.778046419 podStartE2EDuration="1m5.896249465s" podCreationTimestamp="2026-03-19 09:43:53 +0000 UTC" firstStartedPulling="2026-03-19 09:43:55.956784803 +0000 UTC m=+1290.805383390" lastFinishedPulling="2026-03-19 09:44:58.074987849 +0000 UTC m=+1352.923586436" observedRunningTime="2026-03-19 09:44:58.891112747 +0000 UTC m=+1353.739711424" watchObservedRunningTime="2026-03-19 09:44:58.896249465 +0000 UTC m=+1353.744848102" Mar 19 09:45:00 crc kubenswrapper[4835]: I0319 09:45:00.165015 4835 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29565225-jxftp"] Mar 19 09:45:00 crc kubenswrapper[4835]: E0319 09:45:00.165565 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e9b9bfc-6170-489b-a1b6-702e2904705b" containerName="oc" Mar 19 09:45:00 crc kubenswrapper[4835]: I0319 09:45:00.165588 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e9b9bfc-6170-489b-a1b6-702e2904705b" containerName="oc" Mar 19 09:45:00 crc kubenswrapper[4835]: I0319 09:45:00.165897 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e9b9bfc-6170-489b-a1b6-702e2904705b" containerName="oc" Mar 19 09:45:00 crc kubenswrapper[4835]: I0319 09:45:00.166872 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565225-jxftp" Mar 19 09:45:00 crc kubenswrapper[4835]: I0319 09:45:00.169365 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 09:45:00 crc kubenswrapper[4835]: I0319 09:45:00.169423 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 09:45:00 crc kubenswrapper[4835]: I0319 09:45:00.176048 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565225-jxftp"] Mar 19 09:45:00 crc kubenswrapper[4835]: I0319 09:45:00.188082 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/07d4f403-4f7a-408c-bc73-1899c4147d57-secret-volume\") pod \"collect-profiles-29565225-jxftp\" (UID: \"07d4f403-4f7a-408c-bc73-1899c4147d57\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565225-jxftp" Mar 19 09:45:00 crc kubenswrapper[4835]: I0319 09:45:00.188234 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07d4f403-4f7a-408c-bc73-1899c4147d57-config-volume\") pod \"collect-profiles-29565225-jxftp\" (UID: \"07d4f403-4f7a-408c-bc73-1899c4147d57\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565225-jxftp" Mar 19 09:45:00 crc kubenswrapper[4835]: I0319 09:45:00.188310 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2ldg\" (UniqueName: \"kubernetes.io/projected/07d4f403-4f7a-408c-bc73-1899c4147d57-kube-api-access-s2ldg\") pod \"collect-profiles-29565225-jxftp\" (UID: \"07d4f403-4f7a-408c-bc73-1899c4147d57\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565225-jxftp" Mar 19 09:45:00 crc kubenswrapper[4835]: I0319 09:45:00.289953 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/07d4f403-4f7a-408c-bc73-1899c4147d57-secret-volume\") pod \"collect-profiles-29565225-jxftp\" (UID: \"07d4f403-4f7a-408c-bc73-1899c4147d57\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565225-jxftp" Mar 19 09:45:00 crc kubenswrapper[4835]: I0319 09:45:00.290334 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07d4f403-4f7a-408c-bc73-1899c4147d57-config-volume\") pod \"collect-profiles-29565225-jxftp\" (UID: \"07d4f403-4f7a-408c-bc73-1899c4147d57\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565225-jxftp" Mar 19 09:45:00 crc kubenswrapper[4835]: I0319 09:45:00.290485 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2ldg\" (UniqueName: \"kubernetes.io/projected/07d4f403-4f7a-408c-bc73-1899c4147d57-kube-api-access-s2ldg\") pod \"collect-profiles-29565225-jxftp\" (UID: \"07d4f403-4f7a-408c-bc73-1899c4147d57\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29565225-jxftp" Mar 19 09:45:00 crc kubenswrapper[4835]: I0319 09:45:00.291571 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07d4f403-4f7a-408c-bc73-1899c4147d57-config-volume\") pod \"collect-profiles-29565225-jxftp\" (UID: \"07d4f403-4f7a-408c-bc73-1899c4147d57\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565225-jxftp" Mar 19 09:45:00 crc kubenswrapper[4835]: I0319 09:45:00.298280 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/07d4f403-4f7a-408c-bc73-1899c4147d57-secret-volume\") pod \"collect-profiles-29565225-jxftp\" (UID: \"07d4f403-4f7a-408c-bc73-1899c4147d57\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565225-jxftp" Mar 19 09:45:00 crc kubenswrapper[4835]: I0319 09:45:00.308977 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2ldg\" (UniqueName: \"kubernetes.io/projected/07d4f403-4f7a-408c-bc73-1899c4147d57-kube-api-access-s2ldg\") pod \"collect-profiles-29565225-jxftp\" (UID: \"07d4f403-4f7a-408c-bc73-1899c4147d57\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565225-jxftp" Mar 19 09:45:00 crc kubenswrapper[4835]: I0319 09:45:00.505633 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565225-jxftp" Mar 19 09:45:00 crc kubenswrapper[4835]: I0319 09:45:00.952956 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565225-jxftp"] Mar 19 09:45:01 crc kubenswrapper[4835]: I0319 09:45:01.898146 4835 generic.go:334] "Generic (PLEG): container finished" podID="07d4f403-4f7a-408c-bc73-1899c4147d57" containerID="50bfbca42fdf77c3d9cfc2799a3d230aeec3b5d7d4e83d5ff0054eaf3d1d4324" exitCode=0 Mar 19 09:45:01 crc kubenswrapper[4835]: I0319 09:45:01.898219 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565225-jxftp" event={"ID":"07d4f403-4f7a-408c-bc73-1899c4147d57","Type":"ContainerDied","Data":"50bfbca42fdf77c3d9cfc2799a3d230aeec3b5d7d4e83d5ff0054eaf3d1d4324"} Mar 19 09:45:01 crc kubenswrapper[4835]: I0319 09:45:01.899015 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565225-jxftp" event={"ID":"07d4f403-4f7a-408c-bc73-1899c4147d57","Type":"ContainerStarted","Data":"385fb23ec4742ca636918ca4371e5f7991c0e1144ae5aea5176973fd788635a5"} Mar 19 09:45:03 crc kubenswrapper[4835]: I0319 09:45:03.248205 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565225-jxftp" Mar 19 09:45:03 crc kubenswrapper[4835]: I0319 09:45:03.449114 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/07d4f403-4f7a-408c-bc73-1899c4147d57-secret-volume\") pod \"07d4f403-4f7a-408c-bc73-1899c4147d57\" (UID: \"07d4f403-4f7a-408c-bc73-1899c4147d57\") " Mar 19 09:45:03 crc kubenswrapper[4835]: I0319 09:45:03.449193 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2ldg\" (UniqueName: \"kubernetes.io/projected/07d4f403-4f7a-408c-bc73-1899c4147d57-kube-api-access-s2ldg\") pod \"07d4f403-4f7a-408c-bc73-1899c4147d57\" (UID: \"07d4f403-4f7a-408c-bc73-1899c4147d57\") " Mar 19 09:45:03 crc kubenswrapper[4835]: I0319 09:45:03.449263 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07d4f403-4f7a-408c-bc73-1899c4147d57-config-volume\") pod \"07d4f403-4f7a-408c-bc73-1899c4147d57\" (UID: \"07d4f403-4f7a-408c-bc73-1899c4147d57\") " Mar 19 09:45:03 crc kubenswrapper[4835]: I0319 09:45:03.449848 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07d4f403-4f7a-408c-bc73-1899c4147d57-config-volume" (OuterVolumeSpecName: "config-volume") pod "07d4f403-4f7a-408c-bc73-1899c4147d57" (UID: "07d4f403-4f7a-408c-bc73-1899c4147d57"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:45:03 crc kubenswrapper[4835]: I0319 09:45:03.454480 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07d4f403-4f7a-408c-bc73-1899c4147d57-kube-api-access-s2ldg" (OuterVolumeSpecName: "kube-api-access-s2ldg") pod "07d4f403-4f7a-408c-bc73-1899c4147d57" (UID: "07d4f403-4f7a-408c-bc73-1899c4147d57"). 
InnerVolumeSpecName "kube-api-access-s2ldg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:45:03 crc kubenswrapper[4835]: I0319 09:45:03.460480 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07d4f403-4f7a-408c-bc73-1899c4147d57-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "07d4f403-4f7a-408c-bc73-1899c4147d57" (UID: "07d4f403-4f7a-408c-bc73-1899c4147d57"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:45:03 crc kubenswrapper[4835]: I0319 09:45:03.551030 4835 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/07d4f403-4f7a-408c-bc73-1899c4147d57-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 09:45:03 crc kubenswrapper[4835]: I0319 09:45:03.551064 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2ldg\" (UniqueName: \"kubernetes.io/projected/07d4f403-4f7a-408c-bc73-1899c4147d57-kube-api-access-s2ldg\") on node \"crc\" DevicePath \"\"" Mar 19 09:45:03 crc kubenswrapper[4835]: I0319 09:45:03.551074 4835 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07d4f403-4f7a-408c-bc73-1899c4147d57-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 09:45:03 crc kubenswrapper[4835]: I0319 09:45:03.916401 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565225-jxftp" event={"ID":"07d4f403-4f7a-408c-bc73-1899c4147d57","Type":"ContainerDied","Data":"385fb23ec4742ca636918ca4371e5f7991c0e1144ae5aea5176973fd788635a5"} Mar 19 09:45:03 crc kubenswrapper[4835]: I0319 09:45:03.916444 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="385fb23ec4742ca636918ca4371e5f7991c0e1144ae5aea5176973fd788635a5" Mar 19 09:45:03 crc kubenswrapper[4835]: I0319 09:45:03.916465 4835 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565225-jxftp" Mar 19 09:45:04 crc kubenswrapper[4835]: I0319 09:45:04.294221 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-64vf7" Mar 19 09:45:06 crc kubenswrapper[4835]: I0319 09:45:06.422493 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 09:45:06 crc kubenswrapper[4835]: I0319 09:45:06.422578 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 09:45:25 crc kubenswrapper[4835]: I0319 09:45:25.454734 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-v4jqm"] Mar 19 09:45:25 crc kubenswrapper[4835]: E0319 09:45:25.455565 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07d4f403-4f7a-408c-bc73-1899c4147d57" containerName="collect-profiles" Mar 19 09:45:25 crc kubenswrapper[4835]: I0319 09:45:25.455578 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="07d4f403-4f7a-408c-bc73-1899c4147d57" containerName="collect-profiles" Mar 19 09:45:25 crc kubenswrapper[4835]: I0319 09:45:25.455795 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="07d4f403-4f7a-408c-bc73-1899c4147d57" containerName="collect-profiles" Mar 19 09:45:25 crc kubenswrapper[4835]: I0319 09:45:25.456693 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-v4jqm" Mar 19 09:45:25 crc kubenswrapper[4835]: I0319 09:45:25.459075 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 19 09:45:25 crc kubenswrapper[4835]: I0319 09:45:25.459215 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 19 09:45:25 crc kubenswrapper[4835]: I0319 09:45:25.459206 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 19 09:45:25 crc kubenswrapper[4835]: I0319 09:45:25.469981 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-4829b" Mar 19 09:45:25 crc kubenswrapper[4835]: I0319 09:45:25.480093 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-v4jqm"] Mar 19 09:45:25 crc kubenswrapper[4835]: I0319 09:45:25.527505 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-8t84q"] Mar 19 09:45:25 crc kubenswrapper[4835]: I0319 09:45:25.528872 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-8t84q" Mar 19 09:45:25 crc kubenswrapper[4835]: I0319 09:45:25.532213 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 19 09:45:25 crc kubenswrapper[4835]: I0319 09:45:25.549858 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-8t84q"] Mar 19 09:45:25 crc kubenswrapper[4835]: I0319 09:45:25.573470 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcqlh\" (UniqueName: \"kubernetes.io/projected/a02e5221-dee0-4444-a734-2d336cc73119-kube-api-access-zcqlh\") pod \"dnsmasq-dns-675f4bcbfc-v4jqm\" (UID: \"a02e5221-dee0-4444-a734-2d336cc73119\") " pod="openstack/dnsmasq-dns-675f4bcbfc-v4jqm" Mar 19 09:45:25 crc kubenswrapper[4835]: I0319 09:45:25.573586 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a02e5221-dee0-4444-a734-2d336cc73119-config\") pod \"dnsmasq-dns-675f4bcbfc-v4jqm\" (UID: \"a02e5221-dee0-4444-a734-2d336cc73119\") " pod="openstack/dnsmasq-dns-675f4bcbfc-v4jqm" Mar 19 09:45:25 crc kubenswrapper[4835]: I0319 09:45:25.674996 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09eff517-95bc-4a23-932f-d44c82221963-config\") pod \"dnsmasq-dns-78dd6ddcc-8t84q\" (UID: \"09eff517-95bc-4a23-932f-d44c82221963\") " pod="openstack/dnsmasq-dns-78dd6ddcc-8t84q" Mar 19 09:45:25 crc kubenswrapper[4835]: I0319 09:45:25.675089 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09eff517-95bc-4a23-932f-d44c82221963-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-8t84q\" (UID: \"09eff517-95bc-4a23-932f-d44c82221963\") " pod="openstack/dnsmasq-dns-78dd6ddcc-8t84q" Mar 19 09:45:25 
crc kubenswrapper[4835]: I0319 09:45:25.675130 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcqlh\" (UniqueName: \"kubernetes.io/projected/a02e5221-dee0-4444-a734-2d336cc73119-kube-api-access-zcqlh\") pod \"dnsmasq-dns-675f4bcbfc-v4jqm\" (UID: \"a02e5221-dee0-4444-a734-2d336cc73119\") " pod="openstack/dnsmasq-dns-675f4bcbfc-v4jqm" Mar 19 09:45:25 crc kubenswrapper[4835]: I0319 09:45:25.675155 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfsgt\" (UniqueName: \"kubernetes.io/projected/09eff517-95bc-4a23-932f-d44c82221963-kube-api-access-mfsgt\") pod \"dnsmasq-dns-78dd6ddcc-8t84q\" (UID: \"09eff517-95bc-4a23-932f-d44c82221963\") " pod="openstack/dnsmasq-dns-78dd6ddcc-8t84q" Mar 19 09:45:25 crc kubenswrapper[4835]: I0319 09:45:25.675381 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a02e5221-dee0-4444-a734-2d336cc73119-config\") pod \"dnsmasq-dns-675f4bcbfc-v4jqm\" (UID: \"a02e5221-dee0-4444-a734-2d336cc73119\") " pod="openstack/dnsmasq-dns-675f4bcbfc-v4jqm" Mar 19 09:45:25 crc kubenswrapper[4835]: I0319 09:45:25.676279 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a02e5221-dee0-4444-a734-2d336cc73119-config\") pod \"dnsmasq-dns-675f4bcbfc-v4jqm\" (UID: \"a02e5221-dee0-4444-a734-2d336cc73119\") " pod="openstack/dnsmasq-dns-675f4bcbfc-v4jqm" Mar 19 09:45:25 crc kubenswrapper[4835]: I0319 09:45:25.697366 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcqlh\" (UniqueName: \"kubernetes.io/projected/a02e5221-dee0-4444-a734-2d336cc73119-kube-api-access-zcqlh\") pod \"dnsmasq-dns-675f4bcbfc-v4jqm\" (UID: \"a02e5221-dee0-4444-a734-2d336cc73119\") " pod="openstack/dnsmasq-dns-675f4bcbfc-v4jqm" Mar 19 09:45:25 crc kubenswrapper[4835]: I0319 
09:45:25.776655 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-v4jqm" Mar 19 09:45:25 crc kubenswrapper[4835]: I0319 09:45:25.777902 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09eff517-95bc-4a23-932f-d44c82221963-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-8t84q\" (UID: \"09eff517-95bc-4a23-932f-d44c82221963\") " pod="openstack/dnsmasq-dns-78dd6ddcc-8t84q" Mar 19 09:45:25 crc kubenswrapper[4835]: I0319 09:45:25.777948 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfsgt\" (UniqueName: \"kubernetes.io/projected/09eff517-95bc-4a23-932f-d44c82221963-kube-api-access-mfsgt\") pod \"dnsmasq-dns-78dd6ddcc-8t84q\" (UID: \"09eff517-95bc-4a23-932f-d44c82221963\") " pod="openstack/dnsmasq-dns-78dd6ddcc-8t84q" Mar 19 09:45:25 crc kubenswrapper[4835]: I0319 09:45:25.778045 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09eff517-95bc-4a23-932f-d44c82221963-config\") pod \"dnsmasq-dns-78dd6ddcc-8t84q\" (UID: \"09eff517-95bc-4a23-932f-d44c82221963\") " pod="openstack/dnsmasq-dns-78dd6ddcc-8t84q" Mar 19 09:45:25 crc kubenswrapper[4835]: I0319 09:45:25.778949 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09eff517-95bc-4a23-932f-d44c82221963-config\") pod \"dnsmasq-dns-78dd6ddcc-8t84q\" (UID: \"09eff517-95bc-4a23-932f-d44c82221963\") " pod="openstack/dnsmasq-dns-78dd6ddcc-8t84q" Mar 19 09:45:25 crc kubenswrapper[4835]: I0319 09:45:25.778978 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09eff517-95bc-4a23-932f-d44c82221963-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-8t84q\" (UID: \"09eff517-95bc-4a23-932f-d44c82221963\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-8t84q" Mar 19 09:45:25 crc kubenswrapper[4835]: I0319 09:45:25.811034 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfsgt\" (UniqueName: \"kubernetes.io/projected/09eff517-95bc-4a23-932f-d44c82221963-kube-api-access-mfsgt\") pod \"dnsmasq-dns-78dd6ddcc-8t84q\" (UID: \"09eff517-95bc-4a23-932f-d44c82221963\") " pod="openstack/dnsmasq-dns-78dd6ddcc-8t84q" Mar 19 09:45:25 crc kubenswrapper[4835]: I0319 09:45:25.848365 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-8t84q" Mar 19 09:45:26 crc kubenswrapper[4835]: I0319 09:45:26.266960 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-v4jqm"] Mar 19 09:45:26 crc kubenswrapper[4835]: I0319 09:45:26.981025 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-8t84q"] Mar 19 09:45:27 crc kubenswrapper[4835]: I0319 09:45:27.195427 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-8t84q" event={"ID":"09eff517-95bc-4a23-932f-d44c82221963","Type":"ContainerStarted","Data":"1a69809c86f7f3296c610bd3570bed58992f894e26024752e1b6dd601d32e487"} Mar 19 09:45:27 crc kubenswrapper[4835]: I0319 09:45:27.197376 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-v4jqm" event={"ID":"a02e5221-dee0-4444-a734-2d336cc73119","Type":"ContainerStarted","Data":"261b9245aaa37026f1c35e221137c4c1a038863403c2cb331423ac3d10843a05"} Mar 19 09:45:27 crc kubenswrapper[4835]: I0319 09:45:27.620349 4835 scope.go:117] "RemoveContainer" containerID="204778b8018b471746e0832d0c509e15e163f11dd502ece7270d0707d5ce24b9" Mar 19 09:45:28 crc kubenswrapper[4835]: I0319 09:45:28.218130 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-v4jqm"] Mar 19 09:45:28 crc kubenswrapper[4835]: I0319 09:45:28.237310 4835 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-pnmhk"] Mar 19 09:45:28 crc kubenswrapper[4835]: I0319 09:45:28.238814 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-pnmhk" Mar 19 09:45:28 crc kubenswrapper[4835]: I0319 09:45:28.270474 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-pnmhk"] Mar 19 09:45:28 crc kubenswrapper[4835]: I0319 09:45:28.334676 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzr72\" (UniqueName: \"kubernetes.io/projected/9580a501-4d3f-4720-8cd3-67c7b1d5aef1-kube-api-access-bzr72\") pod \"dnsmasq-dns-5ccc8479f9-pnmhk\" (UID: \"9580a501-4d3f-4720-8cd3-67c7b1d5aef1\") " pod="openstack/dnsmasq-dns-5ccc8479f9-pnmhk" Mar 19 09:45:28 crc kubenswrapper[4835]: I0319 09:45:28.334726 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9580a501-4d3f-4720-8cd3-67c7b1d5aef1-config\") pod \"dnsmasq-dns-5ccc8479f9-pnmhk\" (UID: \"9580a501-4d3f-4720-8cd3-67c7b1d5aef1\") " pod="openstack/dnsmasq-dns-5ccc8479f9-pnmhk" Mar 19 09:45:28 crc kubenswrapper[4835]: I0319 09:45:28.334770 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9580a501-4d3f-4720-8cd3-67c7b1d5aef1-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-pnmhk\" (UID: \"9580a501-4d3f-4720-8cd3-67c7b1d5aef1\") " pod="openstack/dnsmasq-dns-5ccc8479f9-pnmhk" Mar 19 09:45:28 crc kubenswrapper[4835]: I0319 09:45:28.436294 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzr72\" (UniqueName: \"kubernetes.io/projected/9580a501-4d3f-4720-8cd3-67c7b1d5aef1-kube-api-access-bzr72\") pod \"dnsmasq-dns-5ccc8479f9-pnmhk\" (UID: 
\"9580a501-4d3f-4720-8cd3-67c7b1d5aef1\") " pod="openstack/dnsmasq-dns-5ccc8479f9-pnmhk" Mar 19 09:45:28 crc kubenswrapper[4835]: I0319 09:45:28.436615 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9580a501-4d3f-4720-8cd3-67c7b1d5aef1-config\") pod \"dnsmasq-dns-5ccc8479f9-pnmhk\" (UID: \"9580a501-4d3f-4720-8cd3-67c7b1d5aef1\") " pod="openstack/dnsmasq-dns-5ccc8479f9-pnmhk" Mar 19 09:45:28 crc kubenswrapper[4835]: I0319 09:45:28.436642 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9580a501-4d3f-4720-8cd3-67c7b1d5aef1-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-pnmhk\" (UID: \"9580a501-4d3f-4720-8cd3-67c7b1d5aef1\") " pod="openstack/dnsmasq-dns-5ccc8479f9-pnmhk" Mar 19 09:45:28 crc kubenswrapper[4835]: I0319 09:45:28.437377 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9580a501-4d3f-4720-8cd3-67c7b1d5aef1-config\") pod \"dnsmasq-dns-5ccc8479f9-pnmhk\" (UID: \"9580a501-4d3f-4720-8cd3-67c7b1d5aef1\") " pod="openstack/dnsmasq-dns-5ccc8479f9-pnmhk" Mar 19 09:45:28 crc kubenswrapper[4835]: I0319 09:45:28.437440 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9580a501-4d3f-4720-8cd3-67c7b1d5aef1-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-pnmhk\" (UID: \"9580a501-4d3f-4720-8cd3-67c7b1d5aef1\") " pod="openstack/dnsmasq-dns-5ccc8479f9-pnmhk" Mar 19 09:45:28 crc kubenswrapper[4835]: I0319 09:45:28.462172 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzr72\" (UniqueName: \"kubernetes.io/projected/9580a501-4d3f-4720-8cd3-67c7b1d5aef1-kube-api-access-bzr72\") pod \"dnsmasq-dns-5ccc8479f9-pnmhk\" (UID: \"9580a501-4d3f-4720-8cd3-67c7b1d5aef1\") " pod="openstack/dnsmasq-dns-5ccc8479f9-pnmhk" Mar 19 09:45:28 crc 
kubenswrapper[4835]: I0319 09:45:28.510386 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-8t84q"] Mar 19 09:45:28 crc kubenswrapper[4835]: I0319 09:45:28.532866 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-q74sf"] Mar 19 09:45:28 crc kubenswrapper[4835]: I0319 09:45:28.540065 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-q74sf" Mar 19 09:45:28 crc kubenswrapper[4835]: I0319 09:45:28.565360 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-q74sf"] Mar 19 09:45:28 crc kubenswrapper[4835]: I0319 09:45:28.567129 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-pnmhk" Mar 19 09:45:28 crc kubenswrapper[4835]: I0319 09:45:28.641681 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6fdd390-927b-4c4e-98ce-41d763a0f236-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-q74sf\" (UID: \"a6fdd390-927b-4c4e-98ce-41d763a0f236\") " pod="openstack/dnsmasq-dns-57d769cc4f-q74sf" Mar 19 09:45:28 crc kubenswrapper[4835]: I0319 09:45:28.641754 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px44t\" (UniqueName: \"kubernetes.io/projected/a6fdd390-927b-4c4e-98ce-41d763a0f236-kube-api-access-px44t\") pod \"dnsmasq-dns-57d769cc4f-q74sf\" (UID: \"a6fdd390-927b-4c4e-98ce-41d763a0f236\") " pod="openstack/dnsmasq-dns-57d769cc4f-q74sf" Mar 19 09:45:28 crc kubenswrapper[4835]: I0319 09:45:28.641788 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6fdd390-927b-4c4e-98ce-41d763a0f236-config\") pod \"dnsmasq-dns-57d769cc4f-q74sf\" (UID: \"a6fdd390-927b-4c4e-98ce-41d763a0f236\") " 
pod="openstack/dnsmasq-dns-57d769cc4f-q74sf" Mar 19 09:45:28 crc kubenswrapper[4835]: I0319 09:45:28.747717 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6fdd390-927b-4c4e-98ce-41d763a0f236-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-q74sf\" (UID: \"a6fdd390-927b-4c4e-98ce-41d763a0f236\") " pod="openstack/dnsmasq-dns-57d769cc4f-q74sf" Mar 19 09:45:28 crc kubenswrapper[4835]: I0319 09:45:28.747796 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px44t\" (UniqueName: \"kubernetes.io/projected/a6fdd390-927b-4c4e-98ce-41d763a0f236-kube-api-access-px44t\") pod \"dnsmasq-dns-57d769cc4f-q74sf\" (UID: \"a6fdd390-927b-4c4e-98ce-41d763a0f236\") " pod="openstack/dnsmasq-dns-57d769cc4f-q74sf" Mar 19 09:45:28 crc kubenswrapper[4835]: I0319 09:45:28.747830 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6fdd390-927b-4c4e-98ce-41d763a0f236-config\") pod \"dnsmasq-dns-57d769cc4f-q74sf\" (UID: \"a6fdd390-927b-4c4e-98ce-41d763a0f236\") " pod="openstack/dnsmasq-dns-57d769cc4f-q74sf" Mar 19 09:45:28 crc kubenswrapper[4835]: I0319 09:45:28.748729 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6fdd390-927b-4c4e-98ce-41d763a0f236-config\") pod \"dnsmasq-dns-57d769cc4f-q74sf\" (UID: \"a6fdd390-927b-4c4e-98ce-41d763a0f236\") " pod="openstack/dnsmasq-dns-57d769cc4f-q74sf" Mar 19 09:45:28 crc kubenswrapper[4835]: I0319 09:45:28.749291 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6fdd390-927b-4c4e-98ce-41d763a0f236-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-q74sf\" (UID: \"a6fdd390-927b-4c4e-98ce-41d763a0f236\") " pod="openstack/dnsmasq-dns-57d769cc4f-q74sf" Mar 19 09:45:28 crc kubenswrapper[4835]: I0319 09:45:28.773848 4835 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px44t\" (UniqueName: \"kubernetes.io/projected/a6fdd390-927b-4c4e-98ce-41d763a0f236-kube-api-access-px44t\") pod \"dnsmasq-dns-57d769cc4f-q74sf\" (UID: \"a6fdd390-927b-4c4e-98ce-41d763a0f236\") " pod="openstack/dnsmasq-dns-57d769cc4f-q74sf" Mar 19 09:45:28 crc kubenswrapper[4835]: I0319 09:45:28.861354 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-q74sf" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.203314 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-pnmhk"] Mar 19 09:45:29 crc kubenswrapper[4835]: W0319 09:45:29.234669 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9580a501_4d3f_4720_8cd3_67c7b1d5aef1.slice/crio-94b1a266818d0c7c557a90900ae8c09ef4c4fea3b116c20c7066d8c1b2e8b28d WatchSource:0}: Error finding container 94b1a266818d0c7c557a90900ae8c09ef4c4fea3b116c20c7066d8c1b2e8b28d: Status 404 returned error can't find the container with id 94b1a266818d0c7c557a90900ae8c09ef4c4fea3b116c20c7066d8c1b2e8b28d Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.357177 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.360440 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.363374 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.363732 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.364176 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-z5r7q" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.364342 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.364792 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.365672 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.365903 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.381693 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.431275 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-q74sf"] Mar 19 09:45:29 crc kubenswrapper[4835]: W0319 09:45:29.441292 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6fdd390_927b_4c4e_98ce_41d763a0f236.slice/crio-a8dc5fcfff263a23e8eeb833bd8a74d2a39999e2e7a246606d12e4aa5140ac54 WatchSource:0}: Error finding container 
a8dc5fcfff263a23e8eeb833bd8a74d2a39999e2e7a246606d12e4aa5140ac54: Status 404 returned error can't find the container with id a8dc5fcfff263a23e8eeb833bd8a74d2a39999e2e7a246606d12e4aa5140ac54 Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.469526 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f773ac48-7b51-427e-9c89-34e515bddabb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.469566 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-73ddda24-4d68-40ee-afda-4c6872e6269f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73ddda24-4d68-40ee-afda-4c6872e6269f\") pod \"rabbitmq-cell1-server-0\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.469877 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f773ac48-7b51-427e-9c89-34e515bddabb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.469980 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqn26\" (UniqueName: \"kubernetes.io/projected/f773ac48-7b51-427e-9c89-34e515bddabb-kube-api-access-lqn26\") pod \"rabbitmq-cell1-server-0\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.470044 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f773ac48-7b51-427e-9c89-34e515bddabb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.470131 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f773ac48-7b51-427e-9c89-34e515bddabb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.470180 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f773ac48-7b51-427e-9c89-34e515bddabb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.470230 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f773ac48-7b51-427e-9c89-34e515bddabb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.470315 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f773ac48-7b51-427e-9c89-34e515bddabb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.470481 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f773ac48-7b51-427e-9c89-34e515bddabb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.470660 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f773ac48-7b51-427e-9c89-34e515bddabb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.572186 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f773ac48-7b51-427e-9c89-34e515bddabb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.572270 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f773ac48-7b51-427e-9c89-34e515bddabb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.572295 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f773ac48-7b51-427e-9c89-34e515bddabb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.572326 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/f773ac48-7b51-427e-9c89-34e515bddabb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.572409 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f773ac48-7b51-427e-9c89-34e515bddabb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.572468 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f773ac48-7b51-427e-9c89-34e515bddabb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.572503 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f773ac48-7b51-427e-9c89-34e515bddabb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.572526 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-73ddda24-4d68-40ee-afda-4c6872e6269f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73ddda24-4d68-40ee-afda-4c6872e6269f\") pod \"rabbitmq-cell1-server-0\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.572564 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/f773ac48-7b51-427e-9c89-34e515bddabb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.572587 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqn26\" (UniqueName: \"kubernetes.io/projected/f773ac48-7b51-427e-9c89-34e515bddabb-kube-api-access-lqn26\") pod \"rabbitmq-cell1-server-0\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.572609 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f773ac48-7b51-427e-9c89-34e515bddabb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.573121 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f773ac48-7b51-427e-9c89-34e515bddabb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.574051 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f773ac48-7b51-427e-9c89-34e515bddabb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.574181 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f773ac48-7b51-427e-9c89-34e515bddabb-server-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.574592 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f773ac48-7b51-427e-9c89-34e515bddabb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.576334 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f773ac48-7b51-427e-9c89-34e515bddabb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.588155 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f773ac48-7b51-427e-9c89-34e515bddabb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.598717 4835 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.598773 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-73ddda24-4d68-40ee-afda-4c6872e6269f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73ddda24-4d68-40ee-afda-4c6872e6269f\") pod \"rabbitmq-cell1-server-0\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/65fb8dd4e3e5bb0348221bd046f190ee81283e3e9c70b8d7f16685b3389c205f/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.604411 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f773ac48-7b51-427e-9c89-34e515bddabb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.606344 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f773ac48-7b51-427e-9c89-34e515bddabb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.607609 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f773ac48-7b51-427e-9c89-34e515bddabb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.621276 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqn26\" (UniqueName: \"kubernetes.io/projected/f773ac48-7b51-427e-9c89-34e515bddabb-kube-api-access-lqn26\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"f773ac48-7b51-427e-9c89-34e515bddabb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.665157 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-73ddda24-4d68-40ee-afda-4c6872e6269f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73ddda24-4d68-40ee-afda-4c6872e6269f\") pod \"rabbitmq-cell1-server-0\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.665613 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.668547 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.677408 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.678506 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.678565 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.678662 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-vjm4s" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.678699 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.678717 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.678844 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 
19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.707382 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.714668 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.731561 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-1"] Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.734518 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.743808 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-2"] Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.759023 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.781257 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/66c76655-cf6d-45e6-904c-147e07a28639-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " pod="openstack/rabbitmq-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.781309 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/66c76655-cf6d-45e6-904c-147e07a28639-server-conf\") pod \"rabbitmq-server-0\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " pod="openstack/rabbitmq-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.781340 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/66c76655-cf6d-45e6-904c-147e07a28639-config-data\") pod \"rabbitmq-server-0\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " pod="openstack/rabbitmq-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.781364 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/66c76655-cf6d-45e6-904c-147e07a28639-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " pod="openstack/rabbitmq-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.781410 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2209a56f-9c2a-45bd-b045-176197bf3bd1-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " pod="openstack/rabbitmq-server-1" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.781451 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2209a56f-9c2a-45bd-b045-176197bf3bd1-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " pod="openstack/rabbitmq-server-1" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.781483 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dg9x\" (UniqueName: \"kubernetes.io/projected/2209a56f-9c2a-45bd-b045-176197bf3bd1-kube-api-access-5dg9x\") pod \"rabbitmq-server-1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " pod="openstack/rabbitmq-server-1" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.781511 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/2209a56f-9c2a-45bd-b045-176197bf3bd1-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " pod="openstack/rabbitmq-server-1" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.781538 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/66c76655-cf6d-45e6-904c-147e07a28639-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " pod="openstack/rabbitmq-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.781560 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2209a56f-9c2a-45bd-b045-176197bf3bd1-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " pod="openstack/rabbitmq-server-1" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.781588 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/66c76655-cf6d-45e6-904c-147e07a28639-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " pod="openstack/rabbitmq-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.781614 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3e634ae5-df02-44ea-a231-6b508dce1a89\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3e634ae5-df02-44ea-a231-6b508dce1a89\") pod \"rabbitmq-server-1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " pod="openstack/rabbitmq-server-1" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.781635 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/2209a56f-9c2a-45bd-b045-176197bf3bd1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " pod="openstack/rabbitmq-server-1" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.781661 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-73f7e027-4ad8-4bb1-9cf5-5475f0263c60\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73f7e027-4ad8-4bb1-9cf5-5475f0263c60\") pod \"rabbitmq-server-0\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " pod="openstack/rabbitmq-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.781690 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/66c76655-cf6d-45e6-904c-147e07a28639-pod-info\") pod \"rabbitmq-server-0\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " pod="openstack/rabbitmq-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.781719 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/66c76655-cf6d-45e6-904c-147e07a28639-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " pod="openstack/rabbitmq-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.781746 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qt75\" (UniqueName: \"kubernetes.io/projected/66c76655-cf6d-45e6-904c-147e07a28639-kube-api-access-6qt75\") pod \"rabbitmq-server-0\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " pod="openstack/rabbitmq-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.781811 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/2209a56f-9c2a-45bd-b045-176197bf3bd1-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " pod="openstack/rabbitmq-server-1" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.781848 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2209a56f-9c2a-45bd-b045-176197bf3bd1-server-conf\") pod \"rabbitmq-server-1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " pod="openstack/rabbitmq-server-1" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.781890 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/66c76655-cf6d-45e6-904c-147e07a28639-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " pod="openstack/rabbitmq-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.781916 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2209a56f-9c2a-45bd-b045-176197bf3bd1-pod-info\") pod \"rabbitmq-server-1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " pod="openstack/rabbitmq-server-1" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.781939 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2209a56f-9c2a-45bd-b045-176197bf3bd1-config-data\") pod \"rabbitmq-server-1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " pod="openstack/rabbitmq-server-1" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.791550 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.817581 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/rabbitmq-server-2"] Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.883043 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2209a56f-9c2a-45bd-b045-176197bf3bd1-server-conf\") pod \"rabbitmq-server-1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " pod="openstack/rabbitmq-server-1" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.883082 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/66c76655-cf6d-45e6-904c-147e07a28639-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " pod="openstack/rabbitmq-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.883105 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2209a56f-9c2a-45bd-b045-176197bf3bd1-pod-info\") pod \"rabbitmq-server-1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " pod="openstack/rabbitmq-server-1" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.883123 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2209a56f-9c2a-45bd-b045-176197bf3bd1-config-data\") pod \"rabbitmq-server-1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " pod="openstack/rabbitmq-server-1" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.883171 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/66c76655-cf6d-45e6-904c-147e07a28639-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " pod="openstack/rabbitmq-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.883187 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/66c76655-cf6d-45e6-904c-147e07a28639-server-conf\") pod \"rabbitmq-server-0\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " pod="openstack/rabbitmq-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.883202 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66c76655-cf6d-45e6-904c-147e07a28639-config-data\") pod \"rabbitmq-server-0\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " pod="openstack/rabbitmq-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.883217 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/66c76655-cf6d-45e6-904c-147e07a28639-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " pod="openstack/rabbitmq-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.883248 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2209a56f-9c2a-45bd-b045-176197bf3bd1-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " pod="openstack/rabbitmq-server-1" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.883278 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2209a56f-9c2a-45bd-b045-176197bf3bd1-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " pod="openstack/rabbitmq-server-1" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.883303 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dg9x\" (UniqueName: \"kubernetes.io/projected/2209a56f-9c2a-45bd-b045-176197bf3bd1-kube-api-access-5dg9x\") pod \"rabbitmq-server-1\" (UID: 
\"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " pod="openstack/rabbitmq-server-1" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.883324 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2209a56f-9c2a-45bd-b045-176197bf3bd1-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " pod="openstack/rabbitmq-server-1" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.883340 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/66c76655-cf6d-45e6-904c-147e07a28639-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " pod="openstack/rabbitmq-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.883360 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2209a56f-9c2a-45bd-b045-176197bf3bd1-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " pod="openstack/rabbitmq-server-1" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.883380 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/66c76655-cf6d-45e6-904c-147e07a28639-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " pod="openstack/rabbitmq-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.883401 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3e634ae5-df02-44ea-a231-6b508dce1a89\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3e634ae5-df02-44ea-a231-6b508dce1a89\") pod \"rabbitmq-server-1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " pod="openstack/rabbitmq-server-1" Mar 19 09:45:29 crc 
kubenswrapper[4835]: I0319 09:45:29.883420 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2209a56f-9c2a-45bd-b045-176197bf3bd1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " pod="openstack/rabbitmq-server-1" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.883441 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-73f7e027-4ad8-4bb1-9cf5-5475f0263c60\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73f7e027-4ad8-4bb1-9cf5-5475f0263c60\") pod \"rabbitmq-server-0\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " pod="openstack/rabbitmq-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.883461 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/66c76655-cf6d-45e6-904c-147e07a28639-pod-info\") pod \"rabbitmq-server-0\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " pod="openstack/rabbitmq-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.883479 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/66c76655-cf6d-45e6-904c-147e07a28639-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " pod="openstack/rabbitmq-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.883497 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qt75\" (UniqueName: \"kubernetes.io/projected/66c76655-cf6d-45e6-904c-147e07a28639-kube-api-access-6qt75\") pod \"rabbitmq-server-0\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " pod="openstack/rabbitmq-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.883534 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2209a56f-9c2a-45bd-b045-176197bf3bd1-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " pod="openstack/rabbitmq-server-1" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.884141 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2209a56f-9c2a-45bd-b045-176197bf3bd1-config-data\") pod \"rabbitmq-server-1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " pod="openstack/rabbitmq-server-1" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.886539 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/66c76655-cf6d-45e6-904c-147e07a28639-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " pod="openstack/rabbitmq-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.887868 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2209a56f-9c2a-45bd-b045-176197bf3bd1-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " pod="openstack/rabbitmq-server-1" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.888538 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/66c76655-cf6d-45e6-904c-147e07a28639-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " pod="openstack/rabbitmq-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.888606 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2209a56f-9c2a-45bd-b045-176197bf3bd1-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-server-1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " pod="openstack/rabbitmq-server-1" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.889851 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2209a56f-9c2a-45bd-b045-176197bf3bd1-server-conf\") pod \"rabbitmq-server-1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " pod="openstack/rabbitmq-server-1" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.891193 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66c76655-cf6d-45e6-904c-147e07a28639-config-data\") pod \"rabbitmq-server-0\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " pod="openstack/rabbitmq-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.892432 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/66c76655-cf6d-45e6-904c-147e07a28639-pod-info\") pod \"rabbitmq-server-0\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " pod="openstack/rabbitmq-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.893751 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2209a56f-9c2a-45bd-b045-176197bf3bd1-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " pod="openstack/rabbitmq-server-1" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.897274 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/66c76655-cf6d-45e6-904c-147e07a28639-server-conf\") pod \"rabbitmq-server-0\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " pod="openstack/rabbitmq-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.897659 4835 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/66c76655-cf6d-45e6-904c-147e07a28639-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " pod="openstack/rabbitmq-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.898234 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2209a56f-9c2a-45bd-b045-176197bf3bd1-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " pod="openstack/rabbitmq-server-1" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.898336 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/66c76655-cf6d-45e6-904c-147e07a28639-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " pod="openstack/rabbitmq-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.898825 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/66c76655-cf6d-45e6-904c-147e07a28639-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " pod="openstack/rabbitmq-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.898971 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/66c76655-cf6d-45e6-904c-147e07a28639-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " pod="openstack/rabbitmq-server-0" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.899744 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2209a56f-9c2a-45bd-b045-176197bf3bd1-pod-info\") pod \"rabbitmq-server-1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " 
pod="openstack/rabbitmq-server-1" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.903526 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2209a56f-9c2a-45bd-b045-176197bf3bd1-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " pod="openstack/rabbitmq-server-1" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.904202 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2209a56f-9c2a-45bd-b045-176197bf3bd1-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " pod="openstack/rabbitmq-server-1" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.911820 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dg9x\" (UniqueName: \"kubernetes.io/projected/2209a56f-9c2a-45bd-b045-176197bf3bd1-kube-api-access-5dg9x\") pod \"rabbitmq-server-1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " pod="openstack/rabbitmq-server-1" Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.917386 4835 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.917435 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3e634ae5-df02-44ea-a231-6b508dce1a89\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3e634ae5-df02-44ea-a231-6b508dce1a89\") pod \"rabbitmq-server-1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/360c44f69d944d51c872b2f6f2ec804148f8978798dde3b91a4a5d8d6e32ad50/globalmount\"" pod="openstack/rabbitmq-server-1"
Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.918115 4835 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.918166 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-73f7e027-4ad8-4bb1-9cf5-5475f0263c60\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73f7e027-4ad8-4bb1-9cf5-5475f0263c60\") pod \"rabbitmq-server-0\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/93d6c24dfb86f6a55f7cf65a0ab47acdafa6a0270fb18fc81a7453795079c631/globalmount\"" pod="openstack/rabbitmq-server-0"
Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.918819 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qt75\" (UniqueName: \"kubernetes.io/projected/66c76655-cf6d-45e6-904c-147e07a28639-kube-api-access-6qt75\") pod \"rabbitmq-server-0\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " pod="openstack/rabbitmq-server-0"
Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.978374 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-73f7e027-4ad8-4bb1-9cf5-5475f0263c60\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73f7e027-4ad8-4bb1-9cf5-5475f0263c60\") pod \"rabbitmq-server-0\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " pod="openstack/rabbitmq-server-0"
Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.985250 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") " pod="openstack/rabbitmq-server-2"
Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.985361 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-server-conf\") pod \"rabbitmq-server-2\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") " pod="openstack/rabbitmq-server-2"
Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.985446 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-pod-info\") pod \"rabbitmq-server-2\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") " pod="openstack/rabbitmq-server-2"
Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.985476 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-config-data\") pod \"rabbitmq-server-2\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") " pod="openstack/rabbitmq-server-2"
Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.985530 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") " pod="openstack/rabbitmq-server-2"
Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.985558 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") " pod="openstack/rabbitmq-server-2"
Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.985621 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkm2t\" (UniqueName: \"kubernetes.io/projected/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-kube-api-access-dkm2t\") pod \"rabbitmq-server-2\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") " pod="openstack/rabbitmq-server-2"
Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.985700 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") " pod="openstack/rabbitmq-server-2"
Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.985765 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") " pod="openstack/rabbitmq-server-2"
Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.988059 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") " pod="openstack/rabbitmq-server-2"
Mar 19 09:45:29 crc kubenswrapper[4835]: I0319 09:45:29.988088 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c771df4b-4448-4dbe-888f-46134d9ec4c6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c771df4b-4448-4dbe-888f-46134d9ec4c6\") pod \"rabbitmq-server-2\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") " pod="openstack/rabbitmq-server-2"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.011167 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3e634ae5-df02-44ea-a231-6b508dce1a89\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3e634ae5-df02-44ea-a231-6b508dce1a89\") pod \"rabbitmq-server-1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " pod="openstack/rabbitmq-server-1"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.035063 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.090806 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") " pod="openstack/rabbitmq-server-2"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.090902 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") " pod="openstack/rabbitmq-server-2"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.091765 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") " pod="openstack/rabbitmq-server-2"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.091796 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkm2t\" (UniqueName: \"kubernetes.io/projected/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-kube-api-access-dkm2t\") pod \"rabbitmq-server-2\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") " pod="openstack/rabbitmq-server-2"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.091938 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") " pod="openstack/rabbitmq-server-2"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.092003 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") " pod="openstack/rabbitmq-server-2"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.092090 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") " pod="openstack/rabbitmq-server-2"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.092116 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c771df4b-4448-4dbe-888f-46134d9ec4c6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c771df4b-4448-4dbe-888f-46134d9ec4c6\") pod \"rabbitmq-server-2\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") " pod="openstack/rabbitmq-server-2"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.092155 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") " pod="openstack/rabbitmq-server-2"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.092222 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-server-conf\") pod \"rabbitmq-server-2\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") " pod="openstack/rabbitmq-server-2"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.092285 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-pod-info\") pod \"rabbitmq-server-2\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") " pod="openstack/rabbitmq-server-2"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.092319 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-config-data\") pod \"rabbitmq-server-2\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") " pod="openstack/rabbitmq-server-2"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.093549 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-config-data\") pod \"rabbitmq-server-2\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") " pod="openstack/rabbitmq-server-2"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.095352 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-server-conf\") pod \"rabbitmq-server-2\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") " pod="openstack/rabbitmq-server-2"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.096764 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") " pod="openstack/rabbitmq-server-2"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.097564 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") " pod="openstack/rabbitmq-server-2"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.098059 4835 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.098091 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c771df4b-4448-4dbe-888f-46134d9ec4c6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c771df4b-4448-4dbe-888f-46134d9ec4c6\") pod \"rabbitmq-server-2\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9f8512cd517073f0d6f4dcbd40cfe670029840b44a1b34da330f087592cacc87/globalmount\"" pod="openstack/rabbitmq-server-2"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.100589 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-pod-info\") pod \"rabbitmq-server-2\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") " pod="openstack/rabbitmq-server-2"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.101314 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") " pod="openstack/rabbitmq-server-2"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.111204 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") " pod="openstack/rabbitmq-server-2"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.115842 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkm2t\" (UniqueName: \"kubernetes.io/projected/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-kube-api-access-dkm2t\") pod \"rabbitmq-server-2\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") " pod="openstack/rabbitmq-server-2"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.116687 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") " pod="openstack/rabbitmq-server-2"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.118653 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.148645 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c771df4b-4448-4dbe-888f-46134d9ec4c6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c771df4b-4448-4dbe-888f-46134d9ec4c6\") pod \"rabbitmq-server-2\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") " pod="openstack/rabbitmq-server-2"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.249725 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-pnmhk" event={"ID":"9580a501-4d3f-4720-8cd3-67c7b1d5aef1","Type":"ContainerStarted","Data":"94b1a266818d0c7c557a90900ae8c09ef4c4fea3b116c20c7066d8c1b2e8b28d"}
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.251537 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-q74sf" event={"ID":"a6fdd390-927b-4c4e-98ce-41d763a0f236","Type":"ContainerStarted","Data":"a8dc5fcfff263a23e8eeb833bd8a74d2a39999e2e7a246606d12e4aa5140ac54"}
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.287463 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 19 09:45:30 crc kubenswrapper[4835]: W0319 09:45:30.324130 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf773ac48_7b51_427e_9c89_34e515bddabb.slice/crio-197e9b70bf6f20819e4cdaeab5b67b51287cb78bf25bb687c5b05873f57c96fa WatchSource:0}: Error finding container 197e9b70bf6f20819e4cdaeab5b67b51287cb78bf25bb687c5b05873f57c96fa: Status 404 returned error can't find the container with id 197e9b70bf6f20819e4cdaeab5b67b51287cb78bf25bb687c5b05873f57c96fa
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.427264 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.632254 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.713112 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.714823 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.716841 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-k2xqr"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.719132 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.720037 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.720329 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.727328 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.740945 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.813814 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-63903607-a122-4d23-88d2-d9ba6db4513b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-63903607-a122-4d23-88d2-d9ba6db4513b\") pod \"openstack-galera-0\" (UID: \"8fd594ac-051b-4142-ba53-e974d9c5daa5\") " pod="openstack/openstack-galera-0"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.813880 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8fd594ac-051b-4142-ba53-e974d9c5daa5-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8fd594ac-051b-4142-ba53-e974d9c5daa5\") " pod="openstack/openstack-galera-0"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.813919 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sbjv\" (UniqueName: \"kubernetes.io/projected/8fd594ac-051b-4142-ba53-e974d9c5daa5-kube-api-access-6sbjv\") pod \"openstack-galera-0\" (UID: \"8fd594ac-051b-4142-ba53-e974d9c5daa5\") " pod="openstack/openstack-galera-0"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.813947 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8fd594ac-051b-4142-ba53-e974d9c5daa5-config-data-default\") pod \"openstack-galera-0\" (UID: \"8fd594ac-051b-4142-ba53-e974d9c5daa5\") " pod="openstack/openstack-galera-0"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.813977 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8fd594ac-051b-4142-ba53-e974d9c5daa5-kolla-config\") pod \"openstack-galera-0\" (UID: \"8fd594ac-051b-4142-ba53-e974d9c5daa5\") " pod="openstack/openstack-galera-0"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.814008 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fd594ac-051b-4142-ba53-e974d9c5daa5-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8fd594ac-051b-4142-ba53-e974d9c5daa5\") " pod="openstack/openstack-galera-0"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.814049 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fd594ac-051b-4142-ba53-e974d9c5daa5-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8fd594ac-051b-4142-ba53-e974d9c5daa5\") " pod="openstack/openstack-galera-0"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.814079 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fd594ac-051b-4142-ba53-e974d9c5daa5-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8fd594ac-051b-4142-ba53-e974d9c5daa5\") " pod="openstack/openstack-galera-0"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.874720 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"]
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.915667 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-63903607-a122-4d23-88d2-d9ba6db4513b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-63903607-a122-4d23-88d2-d9ba6db4513b\") pod \"openstack-galera-0\" (UID: \"8fd594ac-051b-4142-ba53-e974d9c5daa5\") " pod="openstack/openstack-galera-0"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.916169 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8fd594ac-051b-4142-ba53-e974d9c5daa5-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8fd594ac-051b-4142-ba53-e974d9c5daa5\") " pod="openstack/openstack-galera-0"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.916227 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sbjv\" (UniqueName: \"kubernetes.io/projected/8fd594ac-051b-4142-ba53-e974d9c5daa5-kube-api-access-6sbjv\") pod \"openstack-galera-0\" (UID: \"8fd594ac-051b-4142-ba53-e974d9c5daa5\") " pod="openstack/openstack-galera-0"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.916264 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8fd594ac-051b-4142-ba53-e974d9c5daa5-config-data-default\") pod \"openstack-galera-0\" (UID: \"8fd594ac-051b-4142-ba53-e974d9c5daa5\") " pod="openstack/openstack-galera-0"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.916297 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8fd594ac-051b-4142-ba53-e974d9c5daa5-kolla-config\") pod \"openstack-galera-0\" (UID: \"8fd594ac-051b-4142-ba53-e974d9c5daa5\") " pod="openstack/openstack-galera-0"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.916343 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fd594ac-051b-4142-ba53-e974d9c5daa5-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8fd594ac-051b-4142-ba53-e974d9c5daa5\") " pod="openstack/openstack-galera-0"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.916408 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fd594ac-051b-4142-ba53-e974d9c5daa5-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8fd594ac-051b-4142-ba53-e974d9c5daa5\") " pod="openstack/openstack-galera-0"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.916455 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fd594ac-051b-4142-ba53-e974d9c5daa5-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8fd594ac-051b-4142-ba53-e974d9c5daa5\") " pod="openstack/openstack-galera-0"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.917806 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8fd594ac-051b-4142-ba53-e974d9c5daa5-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8fd594ac-051b-4142-ba53-e974d9c5daa5\") " pod="openstack/openstack-galera-0"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.918211 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8fd594ac-051b-4142-ba53-e974d9c5daa5-config-data-default\") pod \"openstack-galera-0\" (UID: \"8fd594ac-051b-4142-ba53-e974d9c5daa5\") " pod="openstack/openstack-galera-0"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.918403 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8fd594ac-051b-4142-ba53-e974d9c5daa5-kolla-config\") pod \"openstack-galera-0\" (UID: \"8fd594ac-051b-4142-ba53-e974d9c5daa5\") " pod="openstack/openstack-galera-0"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.918556 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fd594ac-051b-4142-ba53-e974d9c5daa5-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8fd594ac-051b-4142-ba53-e974d9c5daa5\") " pod="openstack/openstack-galera-0"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.921694 4835 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.921749 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-63903607-a122-4d23-88d2-d9ba6db4513b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-63903607-a122-4d23-88d2-d9ba6db4513b\") pod \"openstack-galera-0\" (UID: \"8fd594ac-051b-4142-ba53-e974d9c5daa5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/901fb620eb5bf0f9a206f7080af29b6d415acd8790ebefb4ac9c6b879f497bfc/globalmount\"" pod="openstack/openstack-galera-0"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.926311 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fd594ac-051b-4142-ba53-e974d9c5daa5-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8fd594ac-051b-4142-ba53-e974d9c5daa5\") " pod="openstack/openstack-galera-0"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.933448 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fd594ac-051b-4142-ba53-e974d9c5daa5-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8fd594ac-051b-4142-ba53-e974d9c5daa5\") " pod="openstack/openstack-galera-0"
Mar 19 09:45:30 crc kubenswrapper[4835]: I0319 09:45:30.952000 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sbjv\" (UniqueName: \"kubernetes.io/projected/8fd594ac-051b-4142-ba53-e974d9c5daa5-kube-api-access-6sbjv\") pod \"openstack-galera-0\" (UID: \"8fd594ac-051b-4142-ba53-e974d9c5daa5\") " pod="openstack/openstack-galera-0"
Mar 19 09:45:31 crc kubenswrapper[4835]: I0319 09:45:31.022943 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-63903607-a122-4d23-88d2-d9ba6db4513b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-63903607-a122-4d23-88d2-d9ba6db4513b\") pod \"openstack-galera-0\" (UID: \"8fd594ac-051b-4142-ba53-e974d9c5daa5\") " pod="openstack/openstack-galera-0"
Mar 19 09:45:31 crc kubenswrapper[4835]: I0319 09:45:31.048357 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Mar 19 09:45:31 crc kubenswrapper[4835]: I0319 09:45:31.287069 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f773ac48-7b51-427e-9c89-34e515bddabb","Type":"ContainerStarted","Data":"197e9b70bf6f20819e4cdaeab5b67b51287cb78bf25bb687c5b05873f57c96fa"}
Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.124409 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.128196 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.131951 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-b7rmw"
Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.132146 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.132174 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.132275 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.132440 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.271206 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3ef1133a-1e94-4684-8ce3-5d61442c19d7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3ef1133a-1e94-4684-8ce3-5d61442c19d7\") pod \"openstack-cell1-galera-0\" (UID: \"fdee895b-c681-4d66-bd0d-08622d97cbea\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.271295 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw5bc\" (UniqueName: \"kubernetes.io/projected/fdee895b-c681-4d66-bd0d-08622d97cbea-kube-api-access-mw5bc\") pod \"openstack-cell1-galera-0\" (UID: \"fdee895b-c681-4d66-bd0d-08622d97cbea\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.271516 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fdee895b-c681-4d66-bd0d-08622d97cbea-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"fdee895b-c681-4d66-bd0d-08622d97cbea\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.271576 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdee895b-c681-4d66-bd0d-08622d97cbea-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"fdee895b-c681-4d66-bd0d-08622d97cbea\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.271629 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fdee895b-c681-4d66-bd0d-08622d97cbea-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"fdee895b-c681-4d66-bd0d-08622d97cbea\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.271792 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdee895b-c681-4d66-bd0d-08622d97cbea-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"fdee895b-c681-4d66-bd0d-08622d97cbea\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.271891 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdee895b-c681-4d66-bd0d-08622d97cbea-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"fdee895b-c681-4d66-bd0d-08622d97cbea\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.271972 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fdee895b-c681-4d66-bd0d-08622d97cbea-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"fdee895b-c681-4d66-bd0d-08622d97cbea\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.373653 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3ef1133a-1e94-4684-8ce3-5d61442c19d7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3ef1133a-1e94-4684-8ce3-5d61442c19d7\") pod \"openstack-cell1-galera-0\" (UID: \"fdee895b-c681-4d66-bd0d-08622d97cbea\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.373724 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw5bc\" (UniqueName: \"kubernetes.io/projected/fdee895b-c681-4d66-bd0d-08622d97cbea-kube-api-access-mw5bc\") pod \"openstack-cell1-galera-0\" (UID: \"fdee895b-c681-4d66-bd0d-08622d97cbea\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.373781 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fdee895b-c681-4d66-bd0d-08622d97cbea-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"fdee895b-c681-4d66-bd0d-08622d97cbea\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.373798 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdee895b-c681-4d66-bd0d-08622d97cbea-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"fdee895b-c681-4d66-bd0d-08622d97cbea\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.373818 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fdee895b-c681-4d66-bd0d-08622d97cbea-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"fdee895b-c681-4d66-bd0d-08622d97cbea\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.373854 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdee895b-c681-4d66-bd0d-08622d97cbea-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"fdee895b-c681-4d66-bd0d-08622d97cbea\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.373916 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdee895b-c681-4d66-bd0d-08622d97cbea-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"fdee895b-c681-4d66-bd0d-08622d97cbea\") " pod="openstack/openstack-cell1-galera-0"
Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.374071 4835 reconciler_common.go:218] "operationExecutor.MountVolume
started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fdee895b-c681-4d66-bd0d-08622d97cbea-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"fdee895b-c681-4d66-bd0d-08622d97cbea\") " pod="openstack/openstack-cell1-galera-0" Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.374865 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fdee895b-c681-4d66-bd0d-08622d97cbea-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"fdee895b-c681-4d66-bd0d-08622d97cbea\") " pod="openstack/openstack-cell1-galera-0" Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.375787 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fdee895b-c681-4d66-bd0d-08622d97cbea-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"fdee895b-c681-4d66-bd0d-08622d97cbea\") " pod="openstack/openstack-cell1-galera-0" Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.376108 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fdee895b-c681-4d66-bd0d-08622d97cbea-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"fdee895b-c681-4d66-bd0d-08622d97cbea\") " pod="openstack/openstack-cell1-galera-0" Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.376122 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdee895b-c681-4d66-bd0d-08622d97cbea-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"fdee895b-c681-4d66-bd0d-08622d97cbea\") " pod="openstack/openstack-cell1-galera-0" Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.376697 4835 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.376729 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3ef1133a-1e94-4684-8ce3-5d61442c19d7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3ef1133a-1e94-4684-8ce3-5d61442c19d7\") pod \"openstack-cell1-galera-0\" (UID: \"fdee895b-c681-4d66-bd0d-08622d97cbea\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/bb4102d896a7409da038b36b506d75c51c4e3531f0e961de74150f03e72873b2/globalmount\"" pod="openstack/openstack-cell1-galera-0" Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.380145 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdee895b-c681-4d66-bd0d-08622d97cbea-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"fdee895b-c681-4d66-bd0d-08622d97cbea\") " pod="openstack/openstack-cell1-galera-0" Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.387959 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw5bc\" (UniqueName: \"kubernetes.io/projected/fdee895b-c681-4d66-bd0d-08622d97cbea-kube-api-access-mw5bc\") pod \"openstack-cell1-galera-0\" (UID: \"fdee895b-c681-4d66-bd0d-08622d97cbea\") " pod="openstack/openstack-cell1-galera-0" Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.393994 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdee895b-c681-4d66-bd0d-08622d97cbea-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"fdee895b-c681-4d66-bd0d-08622d97cbea\") " pod="openstack/openstack-cell1-galera-0" Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.440544 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3ef1133a-1e94-4684-8ce3-5d61442c19d7\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3ef1133a-1e94-4684-8ce3-5d61442c19d7\") pod \"openstack-cell1-galera-0\" (UID: \"fdee895b-c681-4d66-bd0d-08622d97cbea\") " pod="openstack/openstack-cell1-galera-0" Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.462300 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.475780 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.476994 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.482426 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.482612 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-pp725" Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.483144 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.492677 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.583741 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/409c2465-a2ab-4f57-ba21-0854863bc543-combined-ca-bundle\") pod \"memcached-0\" (UID: \"409c2465-a2ab-4f57-ba21-0854863bc543\") " pod="openstack/memcached-0" Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.583861 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/409c2465-a2ab-4f57-ba21-0854863bc543-kolla-config\") pod \"memcached-0\" (UID: \"409c2465-a2ab-4f57-ba21-0854863bc543\") " pod="openstack/memcached-0" Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.583919 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/409c2465-a2ab-4f57-ba21-0854863bc543-config-data\") pod \"memcached-0\" (UID: \"409c2465-a2ab-4f57-ba21-0854863bc543\") " pod="openstack/memcached-0" Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.583995 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t642d\" (UniqueName: \"kubernetes.io/projected/409c2465-a2ab-4f57-ba21-0854863bc543-kube-api-access-t642d\") pod \"memcached-0\" (UID: \"409c2465-a2ab-4f57-ba21-0854863bc543\") " pod="openstack/memcached-0" Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.584026 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/409c2465-a2ab-4f57-ba21-0854863bc543-memcached-tls-certs\") pod \"memcached-0\" (UID: \"409c2465-a2ab-4f57-ba21-0854863bc543\") " pod="openstack/memcached-0" Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.686006 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/409c2465-a2ab-4f57-ba21-0854863bc543-kolla-config\") pod \"memcached-0\" (UID: \"409c2465-a2ab-4f57-ba21-0854863bc543\") " pod="openstack/memcached-0" Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.686094 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/409c2465-a2ab-4f57-ba21-0854863bc543-config-data\") pod \"memcached-0\" (UID: \"409c2465-a2ab-4f57-ba21-0854863bc543\") " 
pod="openstack/memcached-0" Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.686184 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t642d\" (UniqueName: \"kubernetes.io/projected/409c2465-a2ab-4f57-ba21-0854863bc543-kube-api-access-t642d\") pod \"memcached-0\" (UID: \"409c2465-a2ab-4f57-ba21-0854863bc543\") " pod="openstack/memcached-0" Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.686212 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/409c2465-a2ab-4f57-ba21-0854863bc543-memcached-tls-certs\") pod \"memcached-0\" (UID: \"409c2465-a2ab-4f57-ba21-0854863bc543\") " pod="openstack/memcached-0" Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.686247 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/409c2465-a2ab-4f57-ba21-0854863bc543-combined-ca-bundle\") pod \"memcached-0\" (UID: \"409c2465-a2ab-4f57-ba21-0854863bc543\") " pod="openstack/memcached-0" Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.686813 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/409c2465-a2ab-4f57-ba21-0854863bc543-kolla-config\") pod \"memcached-0\" (UID: \"409c2465-a2ab-4f57-ba21-0854863bc543\") " pod="openstack/memcached-0" Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.689660 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/409c2465-a2ab-4f57-ba21-0854863bc543-config-data\") pod \"memcached-0\" (UID: \"409c2465-a2ab-4f57-ba21-0854863bc543\") " pod="openstack/memcached-0" Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.690309 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/409c2465-a2ab-4f57-ba21-0854863bc543-memcached-tls-certs\") pod \"memcached-0\" (UID: \"409c2465-a2ab-4f57-ba21-0854863bc543\") " pod="openstack/memcached-0" Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.690729 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/409c2465-a2ab-4f57-ba21-0854863bc543-combined-ca-bundle\") pod \"memcached-0\" (UID: \"409c2465-a2ab-4f57-ba21-0854863bc543\") " pod="openstack/memcached-0" Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.708305 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t642d\" (UniqueName: \"kubernetes.io/projected/409c2465-a2ab-4f57-ba21-0854863bc543-kube-api-access-t642d\") pod \"memcached-0\" (UID: \"409c2465-a2ab-4f57-ba21-0854863bc543\") " pod="openstack/memcached-0" Mar 19 09:45:32 crc kubenswrapper[4835]: I0319 09:45:32.820425 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 19 09:45:34 crc kubenswrapper[4835]: I0319 09:45:34.890743 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 09:45:34 crc kubenswrapper[4835]: I0319 09:45:34.893332 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 19 09:45:34 crc kubenswrapper[4835]: I0319 09:45:34.899189 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-2z4hg" Mar 19 09:45:34 crc kubenswrapper[4835]: I0319 09:45:34.907722 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 09:45:35 crc kubenswrapper[4835]: I0319 09:45:35.034074 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rl66\" (UniqueName: \"kubernetes.io/projected/4ace457f-965b-4e28-9f09-0c49bc9df9f7-kube-api-access-8rl66\") pod \"kube-state-metrics-0\" (UID: \"4ace457f-965b-4e28-9f09-0c49bc9df9f7\") " pod="openstack/kube-state-metrics-0" Mar 19 09:45:35 crc kubenswrapper[4835]: I0319 09:45:35.135893 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rl66\" (UniqueName: \"kubernetes.io/projected/4ace457f-965b-4e28-9f09-0c49bc9df9f7-kube-api-access-8rl66\") pod \"kube-state-metrics-0\" (UID: \"4ace457f-965b-4e28-9f09-0c49bc9df9f7\") " pod="openstack/kube-state-metrics-0" Mar 19 09:45:35 crc kubenswrapper[4835]: I0319 09:45:35.173964 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rl66\" (UniqueName: \"kubernetes.io/projected/4ace457f-965b-4e28-9f09-0c49bc9df9f7-kube-api-access-8rl66\") pod \"kube-state-metrics-0\" (UID: \"4ace457f-965b-4e28-9f09-0c49bc9df9f7\") " pod="openstack/kube-state-metrics-0" Mar 19 09:45:35 crc kubenswrapper[4835]: I0319 09:45:35.253862 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 19 09:45:35 crc kubenswrapper[4835]: I0319 09:45:35.794272 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-ui-dashboards-7f87b9b85b-pjl7f"] Mar 19 09:45:35 crc kubenswrapper[4835]: I0319 09:45:35.795837 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-pjl7f" Mar 19 09:45:35 crc kubenswrapper[4835]: I0319 09:45:35.799092 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards" Mar 19 09:45:35 crc kubenswrapper[4835]: I0319 09:45:35.804301 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards-sa-dockercfg-t7thd" Mar 19 09:45:35 crc kubenswrapper[4835]: I0319 09:45:35.810008 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-7f87b9b85b-pjl7f"] Mar 19 09:45:35 crc kubenswrapper[4835]: I0319 09:45:35.953363 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txxq6\" (UniqueName: \"kubernetes.io/projected/22c392c5-b6d9-42e8-bf59-122a846c26a4-kube-api-access-txxq6\") pod \"observability-ui-dashboards-7f87b9b85b-pjl7f\" (UID: \"22c392c5-b6d9-42e8-bf59-122a846c26a4\") " pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-pjl7f" Mar 19 09:45:35 crc kubenswrapper[4835]: I0319 09:45:35.953650 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22c392c5-b6d9-42e8-bf59-122a846c26a4-serving-cert\") pod \"observability-ui-dashboards-7f87b9b85b-pjl7f\" (UID: \"22c392c5-b6d9-42e8-bf59-122a846c26a4\") " pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-pjl7f" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 
09:45:36.056090 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txxq6\" (UniqueName: \"kubernetes.io/projected/22c392c5-b6d9-42e8-bf59-122a846c26a4-kube-api-access-txxq6\") pod \"observability-ui-dashboards-7f87b9b85b-pjl7f\" (UID: \"22c392c5-b6d9-42e8-bf59-122a846c26a4\") " pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-pjl7f" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.056168 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22c392c5-b6d9-42e8-bf59-122a846c26a4-serving-cert\") pod \"observability-ui-dashboards-7f87b9b85b-pjl7f\" (UID: \"22c392c5-b6d9-42e8-bf59-122a846c26a4\") " pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-pjl7f" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.062602 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22c392c5-b6d9-42e8-bf59-122a846c26a4-serving-cert\") pod \"observability-ui-dashboards-7f87b9b85b-pjl7f\" (UID: \"22c392c5-b6d9-42e8-bf59-122a846c26a4\") " pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-pjl7f" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.092439 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txxq6\" (UniqueName: \"kubernetes.io/projected/22c392c5-b6d9-42e8-bf59-122a846c26a4-kube-api-access-txxq6\") pod \"observability-ui-dashboards-7f87b9b85b-pjl7f\" (UID: \"22c392c5-b6d9-42e8-bf59-122a846c26a4\") " pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-pjl7f" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.110369 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5d9fcbf4f8-5btdf"] Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.111693 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5d9fcbf4f8-5btdf" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.131175 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-pjl7f" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.143094 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d9fcbf4f8-5btdf"] Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.259416 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d524b5e2-97d1-47cc-8474-113fa8e6016a-console-serving-cert\") pod \"console-5d9fcbf4f8-5btdf\" (UID: \"d524b5e2-97d1-47cc-8474-113fa8e6016a\") " pod="openshift-console/console-5d9fcbf4f8-5btdf" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.259785 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d524b5e2-97d1-47cc-8474-113fa8e6016a-console-config\") pod \"console-5d9fcbf4f8-5btdf\" (UID: \"d524b5e2-97d1-47cc-8474-113fa8e6016a\") " pod="openshift-console/console-5d9fcbf4f8-5btdf" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.259958 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prlsm\" (UniqueName: \"kubernetes.io/projected/d524b5e2-97d1-47cc-8474-113fa8e6016a-kube-api-access-prlsm\") pod \"console-5d9fcbf4f8-5btdf\" (UID: \"d524b5e2-97d1-47cc-8474-113fa8e6016a\") " pod="openshift-console/console-5d9fcbf4f8-5btdf" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.260091 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d524b5e2-97d1-47cc-8474-113fa8e6016a-oauth-serving-cert\") pod 
\"console-5d9fcbf4f8-5btdf\" (UID: \"d524b5e2-97d1-47cc-8474-113fa8e6016a\") " pod="openshift-console/console-5d9fcbf4f8-5btdf" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.260203 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d524b5e2-97d1-47cc-8474-113fa8e6016a-service-ca\") pod \"console-5d9fcbf4f8-5btdf\" (UID: \"d524b5e2-97d1-47cc-8474-113fa8e6016a\") " pod="openshift-console/console-5d9fcbf4f8-5btdf" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.260609 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d524b5e2-97d1-47cc-8474-113fa8e6016a-console-oauth-config\") pod \"console-5d9fcbf4f8-5btdf\" (UID: \"d524b5e2-97d1-47cc-8474-113fa8e6016a\") " pod="openshift-console/console-5d9fcbf4f8-5btdf" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.260772 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d524b5e2-97d1-47cc-8474-113fa8e6016a-trusted-ca-bundle\") pod \"console-5d9fcbf4f8-5btdf\" (UID: \"d524b5e2-97d1-47cc-8474-113fa8e6016a\") " pod="openshift-console/console-5d9fcbf4f8-5btdf" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.343021 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"2209a56f-9c2a-45bd-b045-176197bf3bd1","Type":"ContainerStarted","Data":"56b21f777d30f6496a8ec04108934b2903266ccfaa394a6c66856ef7d70ef37a"} Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.346625 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"66c76655-cf6d-45e6-904c-147e07a28639","Type":"ContainerStarted","Data":"7173915a29578e8dd3adca834b704f5ae380234913e1dc8c4ed78c6dd1c393b9"} Mar 19 09:45:36 crc 
kubenswrapper[4835]: I0319 09:45:36.363335 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d524b5e2-97d1-47cc-8474-113fa8e6016a-console-serving-cert\") pod \"console-5d9fcbf4f8-5btdf\" (UID: \"d524b5e2-97d1-47cc-8474-113fa8e6016a\") " pod="openshift-console/console-5d9fcbf4f8-5btdf" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.363447 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d524b5e2-97d1-47cc-8474-113fa8e6016a-console-config\") pod \"console-5d9fcbf4f8-5btdf\" (UID: \"d524b5e2-97d1-47cc-8474-113fa8e6016a\") " pod="openshift-console/console-5d9fcbf4f8-5btdf" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.363507 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prlsm\" (UniqueName: \"kubernetes.io/projected/d524b5e2-97d1-47cc-8474-113fa8e6016a-kube-api-access-prlsm\") pod \"console-5d9fcbf4f8-5btdf\" (UID: \"d524b5e2-97d1-47cc-8474-113fa8e6016a\") " pod="openshift-console/console-5d9fcbf4f8-5btdf" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.363548 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d524b5e2-97d1-47cc-8474-113fa8e6016a-oauth-serving-cert\") pod \"console-5d9fcbf4f8-5btdf\" (UID: \"d524b5e2-97d1-47cc-8474-113fa8e6016a\") " pod="openshift-console/console-5d9fcbf4f8-5btdf" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.363578 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d524b5e2-97d1-47cc-8474-113fa8e6016a-service-ca\") pod \"console-5d9fcbf4f8-5btdf\" (UID: \"d524b5e2-97d1-47cc-8474-113fa8e6016a\") " pod="openshift-console/console-5d9fcbf4f8-5btdf" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 
09:45:36.363677 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d524b5e2-97d1-47cc-8474-113fa8e6016a-console-oauth-config\") pod \"console-5d9fcbf4f8-5btdf\" (UID: \"d524b5e2-97d1-47cc-8474-113fa8e6016a\") " pod="openshift-console/console-5d9fcbf4f8-5btdf" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.363712 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d524b5e2-97d1-47cc-8474-113fa8e6016a-trusted-ca-bundle\") pod \"console-5d9fcbf4f8-5btdf\" (UID: \"d524b5e2-97d1-47cc-8474-113fa8e6016a\") " pod="openshift-console/console-5d9fcbf4f8-5btdf" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.365036 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d524b5e2-97d1-47cc-8474-113fa8e6016a-oauth-serving-cert\") pod \"console-5d9fcbf4f8-5btdf\" (UID: \"d524b5e2-97d1-47cc-8474-113fa8e6016a\") " pod="openshift-console/console-5d9fcbf4f8-5btdf" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.365076 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d524b5e2-97d1-47cc-8474-113fa8e6016a-trusted-ca-bundle\") pod \"console-5d9fcbf4f8-5btdf\" (UID: \"d524b5e2-97d1-47cc-8474-113fa8e6016a\") " pod="openshift-console/console-5d9fcbf4f8-5btdf" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.365978 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d524b5e2-97d1-47cc-8474-113fa8e6016a-console-config\") pod \"console-5d9fcbf4f8-5btdf\" (UID: \"d524b5e2-97d1-47cc-8474-113fa8e6016a\") " pod="openshift-console/console-5d9fcbf4f8-5btdf" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.366115 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d524b5e2-97d1-47cc-8474-113fa8e6016a-service-ca\") pod \"console-5d9fcbf4f8-5btdf\" (UID: \"d524b5e2-97d1-47cc-8474-113fa8e6016a\") " pod="openshift-console/console-5d9fcbf4f8-5btdf" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.368403 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d524b5e2-97d1-47cc-8474-113fa8e6016a-console-oauth-config\") pod \"console-5d9fcbf4f8-5btdf\" (UID: \"d524b5e2-97d1-47cc-8474-113fa8e6016a\") " pod="openshift-console/console-5d9fcbf4f8-5btdf" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.386032 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prlsm\" (UniqueName: \"kubernetes.io/projected/d524b5e2-97d1-47cc-8474-113fa8e6016a-kube-api-access-prlsm\") pod \"console-5d9fcbf4f8-5btdf\" (UID: \"d524b5e2-97d1-47cc-8474-113fa8e6016a\") " pod="openshift-console/console-5d9fcbf4f8-5btdf" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.386636 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.389732 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.393251 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.395040 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.395179 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.395310 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.395449 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-gn4lt" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.395588 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.395872 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.400504 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.416620 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d524b5e2-97d1-47cc-8474-113fa8e6016a-console-serving-cert\") pod \"console-5d9fcbf4f8-5btdf\" (UID: \"d524b5e2-97d1-47cc-8474-113fa8e6016a\") " pod="openshift-console/console-5d9fcbf4f8-5btdf" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 
09:45:36.438134 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.438195 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.464898 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-config\") pod \"prometheus-metric-storage-0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.464971 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.465000 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-37f6ee03-0306-4d3a-b23c-de70ece0cce7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-37f6ee03-0306-4d3a-b23c-de70ece0cce7\") pod \"prometheus-metric-storage-0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " pod="openstack/prometheus-metric-storage-0" Mar 19 
09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.465042 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.465072 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.465103 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.465141 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.465191 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: 
\"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.465221 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.465246 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfb67\" (UniqueName: \"kubernetes.io/projected/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-kube-api-access-bfb67\") pod \"prometheus-metric-storage-0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.465819 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.479171 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5d9fcbf4f8-5btdf" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.567347 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.567392 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.567420 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfb67\" (UniqueName: \"kubernetes.io/projected/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-kube-api-access-bfb67\") pod \"prometheus-metric-storage-0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.567461 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-config\") pod \"prometheus-metric-storage-0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.567500 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: 
\"kubernetes.io/configmap/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.567535 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-37f6ee03-0306-4d3a-b23c-de70ece0cce7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-37f6ee03-0306-4d3a-b23c-de70ece0cce7\") pod \"prometheus-metric-storage-0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.567566 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.567584 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.567604 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.567643 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"web-config\" (UniqueName: \"kubernetes.io/secret/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.569714 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.571163 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.571479 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.571924 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.572141 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.572958 4835 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.573075 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-37f6ee03-0306-4d3a-b23c-de70ece0cce7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-37f6ee03-0306-4d3a-b23c-de70ece0cce7\") pod \"prometheus-metric-storage-0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ef4b4363ab97a32ad5fd78dd1a4d38afe71a7d4a1238e2cd5a3c5110718b90db/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.573163 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-config\") pod \"prometheus-metric-storage-0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.573766 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.578963 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.599717 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfb67\" (UniqueName: \"kubernetes.io/projected/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-kube-api-access-bfb67\") pod \"prometheus-metric-storage-0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.623316 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-37f6ee03-0306-4d3a-b23c-de70ece0cce7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-37f6ee03-0306-4d3a-b23c-de70ece0cce7\") pod \"prometheus-metric-storage-0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:45:36 crc kubenswrapper[4835]: I0319 09:45:36.771516 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 19 09:45:37 crc kubenswrapper[4835]: I0319 09:45:37.865820 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-nfn4c"] Mar 19 09:45:37 crc kubenswrapper[4835]: I0319 09:45:37.868697 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nfn4c" Mar 19 09:45:37 crc kubenswrapper[4835]: I0319 09:45:37.873388 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 19 09:45:37 crc kubenswrapper[4835]: I0319 09:45:37.873835 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 19 09:45:37 crc kubenswrapper[4835]: I0319 09:45:37.874117 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-vl5dg" Mar 19 09:45:37 crc kubenswrapper[4835]: I0319 09:45:37.875041 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 19 09:45:37 crc kubenswrapper[4835]: I0319 09:45:37.904366 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nfn4c"] Mar 19 09:45:37 crc kubenswrapper[4835]: I0319 09:45:37.936172 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-dht6h"] Mar 19 09:45:37 crc kubenswrapper[4835]: I0319 09:45:37.939321 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-dht6h" Mar 19 09:45:37 crc kubenswrapper[4835]: I0319 09:45:37.978295 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-dht6h"] Mar 19 09:45:37 crc kubenswrapper[4835]: I0319 09:45:37.995294 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a176f0d5-28dd-4c7a-9558-8eaeffdec853-etc-ovs\") pod \"ovn-controller-ovs-dht6h\" (UID: \"a176f0d5-28dd-4c7a-9558-8eaeffdec853\") " pod="openstack/ovn-controller-ovs-dht6h" Mar 19 09:45:37 crc kubenswrapper[4835]: I0319 09:45:37.995567 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc6a1eab-3ebc-49e6-bb19-f4f127b416a6-ovn-controller-tls-certs\") pod \"ovn-controller-nfn4c\" (UID: \"fc6a1eab-3ebc-49e6-bb19-f4f127b416a6\") " pod="openstack/ovn-controller-nfn4c" Mar 19 09:45:37 crc kubenswrapper[4835]: I0319 09:45:37.995698 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fc6a1eab-3ebc-49e6-bb19-f4f127b416a6-var-run\") pod \"ovn-controller-nfn4c\" (UID: \"fc6a1eab-3ebc-49e6-bb19-f4f127b416a6\") " pod="openstack/ovn-controller-nfn4c" Mar 19 09:45:37 crc kubenswrapper[4835]: I0319 09:45:37.995789 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a176f0d5-28dd-4c7a-9558-8eaeffdec853-var-lib\") pod \"ovn-controller-ovs-dht6h\" (UID: \"a176f0d5-28dd-4c7a-9558-8eaeffdec853\") " pod="openstack/ovn-controller-ovs-dht6h" Mar 19 09:45:37 crc kubenswrapper[4835]: I0319 09:45:37.995863 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/a176f0d5-28dd-4c7a-9558-8eaeffdec853-var-run\") pod \"ovn-controller-ovs-dht6h\" (UID: \"a176f0d5-28dd-4c7a-9558-8eaeffdec853\") " pod="openstack/ovn-controller-ovs-dht6h" Mar 19 09:45:37 crc kubenswrapper[4835]: I0319 09:45:37.995957 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fc6a1eab-3ebc-49e6-bb19-f4f127b416a6-var-log-ovn\") pod \"ovn-controller-nfn4c\" (UID: \"fc6a1eab-3ebc-49e6-bb19-f4f127b416a6\") " pod="openstack/ovn-controller-nfn4c" Mar 19 09:45:37 crc kubenswrapper[4835]: I0319 09:45:37.996035 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkntn\" (UniqueName: \"kubernetes.io/projected/fc6a1eab-3ebc-49e6-bb19-f4f127b416a6-kube-api-access-bkntn\") pod \"ovn-controller-nfn4c\" (UID: \"fc6a1eab-3ebc-49e6-bb19-f4f127b416a6\") " pod="openstack/ovn-controller-nfn4c" Mar 19 09:45:37 crc kubenswrapper[4835]: I0319 09:45:37.996122 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fc6a1eab-3ebc-49e6-bb19-f4f127b416a6-var-run-ovn\") pod \"ovn-controller-nfn4c\" (UID: \"fc6a1eab-3ebc-49e6-bb19-f4f127b416a6\") " pod="openstack/ovn-controller-nfn4c" Mar 19 09:45:37 crc kubenswrapper[4835]: I0319 09:45:37.996192 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc6a1eab-3ebc-49e6-bb19-f4f127b416a6-combined-ca-bundle\") pod \"ovn-controller-nfn4c\" (UID: \"fc6a1eab-3ebc-49e6-bb19-f4f127b416a6\") " pod="openstack/ovn-controller-nfn4c" Mar 19 09:45:37 crc kubenswrapper[4835]: I0319 09:45:37.996276 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/a176f0d5-28dd-4c7a-9558-8eaeffdec853-scripts\") pod \"ovn-controller-ovs-dht6h\" (UID: \"a176f0d5-28dd-4c7a-9558-8eaeffdec853\") " pod="openstack/ovn-controller-ovs-dht6h" Mar 19 09:45:37 crc kubenswrapper[4835]: I0319 09:45:37.996459 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a176f0d5-28dd-4c7a-9558-8eaeffdec853-var-log\") pod \"ovn-controller-ovs-dht6h\" (UID: \"a176f0d5-28dd-4c7a-9558-8eaeffdec853\") " pod="openstack/ovn-controller-ovs-dht6h" Mar 19 09:45:37 crc kubenswrapper[4835]: I0319 09:45:37.996660 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc6a1eab-3ebc-49e6-bb19-f4f127b416a6-scripts\") pod \"ovn-controller-nfn4c\" (UID: \"fc6a1eab-3ebc-49e6-bb19-f4f127b416a6\") " pod="openstack/ovn-controller-nfn4c" Mar 19 09:45:37 crc kubenswrapper[4835]: I0319 09:45:37.996786 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhqj2\" (UniqueName: \"kubernetes.io/projected/a176f0d5-28dd-4c7a-9558-8eaeffdec853-kube-api-access-xhqj2\") pod \"ovn-controller-ovs-dht6h\" (UID: \"a176f0d5-28dd-4c7a-9558-8eaeffdec853\") " pod="openstack/ovn-controller-ovs-dht6h" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.027132 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.029041 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.032564 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.032886 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-m28cf" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.032999 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.033098 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.035489 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.038908 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.097920 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/011f3ac4-3c81-4a32-98a6-00522440cad4-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"011f3ac4-3c81-4a32-98a6-00522440cad4\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.097979 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc6a1eab-3ebc-49e6-bb19-f4f127b416a6-ovn-controller-tls-certs\") pod \"ovn-controller-nfn4c\" (UID: \"fc6a1eab-3ebc-49e6-bb19-f4f127b416a6\") " pod="openstack/ovn-controller-nfn4c" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.098002 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-run\" (UniqueName: \"kubernetes.io/host-path/fc6a1eab-3ebc-49e6-bb19-f4f127b416a6-var-run\") pod \"ovn-controller-nfn4c\" (UID: \"fc6a1eab-3ebc-49e6-bb19-f4f127b416a6\") " pod="openstack/ovn-controller-nfn4c" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.098024 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a176f0d5-28dd-4c7a-9558-8eaeffdec853-var-lib\") pod \"ovn-controller-ovs-dht6h\" (UID: \"a176f0d5-28dd-4c7a-9558-8eaeffdec853\") " pod="openstack/ovn-controller-ovs-dht6h" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.098043 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a176f0d5-28dd-4c7a-9558-8eaeffdec853-var-run\") pod \"ovn-controller-ovs-dht6h\" (UID: \"a176f0d5-28dd-4c7a-9558-8eaeffdec853\") " pod="openstack/ovn-controller-ovs-dht6h" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.098062 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fc6a1eab-3ebc-49e6-bb19-f4f127b416a6-var-log-ovn\") pod \"ovn-controller-nfn4c\" (UID: \"fc6a1eab-3ebc-49e6-bb19-f4f127b416a6\") " pod="openstack/ovn-controller-nfn4c" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.098088 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkntn\" (UniqueName: \"kubernetes.io/projected/fc6a1eab-3ebc-49e6-bb19-f4f127b416a6-kube-api-access-bkntn\") pod \"ovn-controller-nfn4c\" (UID: \"fc6a1eab-3ebc-49e6-bb19-f4f127b416a6\") " pod="openstack/ovn-controller-nfn4c" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.098126 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fc6a1eab-3ebc-49e6-bb19-f4f127b416a6-var-run-ovn\") pod \"ovn-controller-nfn4c\" (UID: 
\"fc6a1eab-3ebc-49e6-bb19-f4f127b416a6\") " pod="openstack/ovn-controller-nfn4c" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.098141 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/011f3ac4-3c81-4a32-98a6-00522440cad4-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"011f3ac4-3c81-4a32-98a6-00522440cad4\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.098162 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/011f3ac4-3c81-4a32-98a6-00522440cad4-config\") pod \"ovsdbserver-nb-0\" (UID: \"011f3ac4-3c81-4a32-98a6-00522440cad4\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.098177 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc6a1eab-3ebc-49e6-bb19-f4f127b416a6-combined-ca-bundle\") pod \"ovn-controller-nfn4c\" (UID: \"fc6a1eab-3ebc-49e6-bb19-f4f127b416a6\") " pod="openstack/ovn-controller-nfn4c" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.098194 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-33bcbed3-9293-4712-a8f9-e628f73e05c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-33bcbed3-9293-4712-a8f9-e628f73e05c0\") pod \"ovsdbserver-nb-0\" (UID: \"011f3ac4-3c81-4a32-98a6-00522440cad4\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.098225 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a176f0d5-28dd-4c7a-9558-8eaeffdec853-scripts\") pod \"ovn-controller-ovs-dht6h\" (UID: \"a176f0d5-28dd-4c7a-9558-8eaeffdec853\") " pod="openstack/ovn-controller-ovs-dht6h" 
Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.098242 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/011f3ac4-3c81-4a32-98a6-00522440cad4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"011f3ac4-3c81-4a32-98a6-00522440cad4\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.098265 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a176f0d5-28dd-4c7a-9558-8eaeffdec853-var-log\") pod \"ovn-controller-ovs-dht6h\" (UID: \"a176f0d5-28dd-4c7a-9558-8eaeffdec853\") " pod="openstack/ovn-controller-ovs-dht6h" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.098283 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpqfz\" (UniqueName: \"kubernetes.io/projected/011f3ac4-3c81-4a32-98a6-00522440cad4-kube-api-access-mpqfz\") pod \"ovsdbserver-nb-0\" (UID: \"011f3ac4-3c81-4a32-98a6-00522440cad4\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.098329 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc6a1eab-3ebc-49e6-bb19-f4f127b416a6-scripts\") pod \"ovn-controller-nfn4c\" (UID: \"fc6a1eab-3ebc-49e6-bb19-f4f127b416a6\") " pod="openstack/ovn-controller-nfn4c" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.098347 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhqj2\" (UniqueName: \"kubernetes.io/projected/a176f0d5-28dd-4c7a-9558-8eaeffdec853-kube-api-access-xhqj2\") pod \"ovn-controller-ovs-dht6h\" (UID: \"a176f0d5-28dd-4c7a-9558-8eaeffdec853\") " pod="openstack/ovn-controller-ovs-dht6h" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.098364 4835 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/011f3ac4-3c81-4a32-98a6-00522440cad4-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"011f3ac4-3c81-4a32-98a6-00522440cad4\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.098387 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/011f3ac4-3c81-4a32-98a6-00522440cad4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"011f3ac4-3c81-4a32-98a6-00522440cad4\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.098405 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a176f0d5-28dd-4c7a-9558-8eaeffdec853-etc-ovs\") pod \"ovn-controller-ovs-dht6h\" (UID: \"a176f0d5-28dd-4c7a-9558-8eaeffdec853\") " pod="openstack/ovn-controller-ovs-dht6h" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.098926 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a176f0d5-28dd-4c7a-9558-8eaeffdec853-var-run\") pod \"ovn-controller-ovs-dht6h\" (UID: \"a176f0d5-28dd-4c7a-9558-8eaeffdec853\") " pod="openstack/ovn-controller-ovs-dht6h" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.098930 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a176f0d5-28dd-4c7a-9558-8eaeffdec853-etc-ovs\") pod \"ovn-controller-ovs-dht6h\" (UID: \"a176f0d5-28dd-4c7a-9558-8eaeffdec853\") " pod="openstack/ovn-controller-ovs-dht6h" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.099002 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/fc6a1eab-3ebc-49e6-bb19-f4f127b416a6-var-run\") pod \"ovn-controller-nfn4c\" (UID: \"fc6a1eab-3ebc-49e6-bb19-f4f127b416a6\") " pod="openstack/ovn-controller-nfn4c" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.099219 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a176f0d5-28dd-4c7a-9558-8eaeffdec853-var-lib\") pod \"ovn-controller-ovs-dht6h\" (UID: \"a176f0d5-28dd-4c7a-9558-8eaeffdec853\") " pod="openstack/ovn-controller-ovs-dht6h" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.099430 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fc6a1eab-3ebc-49e6-bb19-f4f127b416a6-var-log-ovn\") pod \"ovn-controller-nfn4c\" (UID: \"fc6a1eab-3ebc-49e6-bb19-f4f127b416a6\") " pod="openstack/ovn-controller-nfn4c" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.099688 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fc6a1eab-3ebc-49e6-bb19-f4f127b416a6-var-run-ovn\") pod \"ovn-controller-nfn4c\" (UID: \"fc6a1eab-3ebc-49e6-bb19-f4f127b416a6\") " pod="openstack/ovn-controller-nfn4c" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.101561 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc6a1eab-3ebc-49e6-bb19-f4f127b416a6-scripts\") pod \"ovn-controller-nfn4c\" (UID: \"fc6a1eab-3ebc-49e6-bb19-f4f127b416a6\") " pod="openstack/ovn-controller-nfn4c" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.101732 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a176f0d5-28dd-4c7a-9558-8eaeffdec853-var-log\") pod \"ovn-controller-ovs-dht6h\" (UID: \"a176f0d5-28dd-4c7a-9558-8eaeffdec853\") " pod="openstack/ovn-controller-ovs-dht6h" Mar 19 09:45:38 crc 
kubenswrapper[4835]: I0319 09:45:38.105171 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a176f0d5-28dd-4c7a-9558-8eaeffdec853-scripts\") pod \"ovn-controller-ovs-dht6h\" (UID: \"a176f0d5-28dd-4c7a-9558-8eaeffdec853\") " pod="openstack/ovn-controller-ovs-dht6h" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.114359 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkntn\" (UniqueName: \"kubernetes.io/projected/fc6a1eab-3ebc-49e6-bb19-f4f127b416a6-kube-api-access-bkntn\") pod \"ovn-controller-nfn4c\" (UID: \"fc6a1eab-3ebc-49e6-bb19-f4f127b416a6\") " pod="openstack/ovn-controller-nfn4c" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.117273 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc6a1eab-3ebc-49e6-bb19-f4f127b416a6-combined-ca-bundle\") pod \"ovn-controller-nfn4c\" (UID: \"fc6a1eab-3ebc-49e6-bb19-f4f127b416a6\") " pod="openstack/ovn-controller-nfn4c" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.117359 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhqj2\" (UniqueName: \"kubernetes.io/projected/a176f0d5-28dd-4c7a-9558-8eaeffdec853-kube-api-access-xhqj2\") pod \"ovn-controller-ovs-dht6h\" (UID: \"a176f0d5-28dd-4c7a-9558-8eaeffdec853\") " pod="openstack/ovn-controller-ovs-dht6h" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.138433 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc6a1eab-3ebc-49e6-bb19-f4f127b416a6-ovn-controller-tls-certs\") pod \"ovn-controller-nfn4c\" (UID: \"fc6a1eab-3ebc-49e6-bb19-f4f127b416a6\") " pod="openstack/ovn-controller-nfn4c" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.199945 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/011f3ac4-3c81-4a32-98a6-00522440cad4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"011f3ac4-3c81-4a32-98a6-00522440cad4\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.200009 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/011f3ac4-3c81-4a32-98a6-00522440cad4-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"011f3ac4-3c81-4a32-98a6-00522440cad4\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.200099 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/011f3ac4-3c81-4a32-98a6-00522440cad4-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"011f3ac4-3c81-4a32-98a6-00522440cad4\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.200124 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/011f3ac4-3c81-4a32-98a6-00522440cad4-config\") pod \"ovsdbserver-nb-0\" (UID: \"011f3ac4-3c81-4a32-98a6-00522440cad4\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.200145 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-33bcbed3-9293-4712-a8f9-e628f73e05c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-33bcbed3-9293-4712-a8f9-e628f73e05c0\") pod \"ovsdbserver-nb-0\" (UID: \"011f3ac4-3c81-4a32-98a6-00522440cad4\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.200181 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/011f3ac4-3c81-4a32-98a6-00522440cad4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"011f3ac4-3c81-4a32-98a6-00522440cad4\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.200206 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpqfz\" (UniqueName: \"kubernetes.io/projected/011f3ac4-3c81-4a32-98a6-00522440cad4-kube-api-access-mpqfz\") pod \"ovsdbserver-nb-0\" (UID: \"011f3ac4-3c81-4a32-98a6-00522440cad4\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.200259 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/011f3ac4-3c81-4a32-98a6-00522440cad4-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"011f3ac4-3c81-4a32-98a6-00522440cad4\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.200408 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nfn4c" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.200718 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/011f3ac4-3c81-4a32-98a6-00522440cad4-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"011f3ac4-3c81-4a32-98a6-00522440cad4\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.201522 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/011f3ac4-3c81-4a32-98a6-00522440cad4-config\") pod \"ovsdbserver-nb-0\" (UID: \"011f3ac4-3c81-4a32-98a6-00522440cad4\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.201709 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/011f3ac4-3c81-4a32-98a6-00522440cad4-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"011f3ac4-3c81-4a32-98a6-00522440cad4\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.203676 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/011f3ac4-3c81-4a32-98a6-00522440cad4-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"011f3ac4-3c81-4a32-98a6-00522440cad4\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.205787 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/011f3ac4-3c81-4a32-98a6-00522440cad4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"011f3ac4-3c81-4a32-98a6-00522440cad4\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.207572 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/011f3ac4-3c81-4a32-98a6-00522440cad4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"011f3ac4-3c81-4a32-98a6-00522440cad4\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.211705 4835 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.211759 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-33bcbed3-9293-4712-a8f9-e628f73e05c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-33bcbed3-9293-4712-a8f9-e628f73e05c0\") pod \"ovsdbserver-nb-0\" (UID: \"011f3ac4-3c81-4a32-98a6-00522440cad4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/789e9d2bcf3bf8831731d6bcb7cb491ba11427bdcc34d3f8df7a2478e21e50c4/globalmount\"" pod="openstack/ovsdbserver-nb-0" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.218870 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpqfz\" (UniqueName: \"kubernetes.io/projected/011f3ac4-3c81-4a32-98a6-00522440cad4-kube-api-access-mpqfz\") pod \"ovsdbserver-nb-0\" (UID: \"011f3ac4-3c81-4a32-98a6-00522440cad4\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.268045 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-dht6h" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.282776 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-33bcbed3-9293-4712-a8f9-e628f73e05c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-33bcbed3-9293-4712-a8f9-e628f73e05c0\") pod \"ovsdbserver-nb-0\" (UID: \"011f3ac4-3c81-4a32-98a6-00522440cad4\") " pod="openstack/ovsdbserver-nb-0" Mar 19 09:45:38 crc kubenswrapper[4835]: I0319 09:45:38.356573 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 19 09:45:41 crc kubenswrapper[4835]: I0319 09:45:41.603391 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 19 09:45:41 crc kubenswrapper[4835]: I0319 09:45:41.615008 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 19 09:45:41 crc kubenswrapper[4835]: I0319 09:45:41.621455 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-dlcgc" Mar 19 09:45:41 crc kubenswrapper[4835]: I0319 09:45:41.621940 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 19 09:45:41 crc kubenswrapper[4835]: I0319 09:45:41.623050 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 19 09:45:41 crc kubenswrapper[4835]: I0319 09:45:41.623729 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 19 09:45:41 crc kubenswrapper[4835]: I0319 09:45:41.635542 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 19 09:45:41 crc kubenswrapper[4835]: I0319 09:45:41.683619 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8a340a11-0445-4522-bd9d-ff96f90a8f16-config\") pod \"ovsdbserver-sb-0\" (UID: \"8a340a11-0445-4522-bd9d-ff96f90a8f16\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:45:41 crc kubenswrapper[4835]: I0319 09:45:41.683819 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8a340a11-0445-4522-bd9d-ff96f90a8f16-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8a340a11-0445-4522-bd9d-ff96f90a8f16\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:45:41 crc kubenswrapper[4835]: I0319 09:45:41.683858 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp6pz\" (UniqueName: \"kubernetes.io/projected/8a340a11-0445-4522-bd9d-ff96f90a8f16-kube-api-access-zp6pz\") pod \"ovsdbserver-sb-0\" (UID: \"8a340a11-0445-4522-bd9d-ff96f90a8f16\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:45:41 crc kubenswrapper[4835]: I0319 09:45:41.683923 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d37e5e84-06d8-4698-bf21-83e5c6ad9afe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d37e5e84-06d8-4698-bf21-83e5c6ad9afe\") pod \"ovsdbserver-sb-0\" (UID: \"8a340a11-0445-4522-bd9d-ff96f90a8f16\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:45:41 crc kubenswrapper[4835]: I0319 09:45:41.684017 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a340a11-0445-4522-bd9d-ff96f90a8f16-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8a340a11-0445-4522-bd9d-ff96f90a8f16\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:45:41 crc kubenswrapper[4835]: I0319 09:45:41.684091 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8a340a11-0445-4522-bd9d-ff96f90a8f16-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8a340a11-0445-4522-bd9d-ff96f90a8f16\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:45:41 crc kubenswrapper[4835]: I0319 09:45:41.685574 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a340a11-0445-4522-bd9d-ff96f90a8f16-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8a340a11-0445-4522-bd9d-ff96f90a8f16\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:45:41 crc kubenswrapper[4835]: I0319 09:45:41.685849 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a340a11-0445-4522-bd9d-ff96f90a8f16-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8a340a11-0445-4522-bd9d-ff96f90a8f16\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:45:41 crc kubenswrapper[4835]: I0319 09:45:41.790399 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8a340a11-0445-4522-bd9d-ff96f90a8f16-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8a340a11-0445-4522-bd9d-ff96f90a8f16\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:45:41 crc kubenswrapper[4835]: I0319 09:45:41.790453 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp6pz\" (UniqueName: \"kubernetes.io/projected/8a340a11-0445-4522-bd9d-ff96f90a8f16-kube-api-access-zp6pz\") pod \"ovsdbserver-sb-0\" (UID: \"8a340a11-0445-4522-bd9d-ff96f90a8f16\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:45:41 crc kubenswrapper[4835]: I0319 09:45:41.790507 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d37e5e84-06d8-4698-bf21-83e5c6ad9afe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d37e5e84-06d8-4698-bf21-83e5c6ad9afe\") pod 
\"ovsdbserver-sb-0\" (UID: \"8a340a11-0445-4522-bd9d-ff96f90a8f16\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:45:41 crc kubenswrapper[4835]: I0319 09:45:41.790574 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a340a11-0445-4522-bd9d-ff96f90a8f16-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8a340a11-0445-4522-bd9d-ff96f90a8f16\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:45:41 crc kubenswrapper[4835]: I0319 09:45:41.790624 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a340a11-0445-4522-bd9d-ff96f90a8f16-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8a340a11-0445-4522-bd9d-ff96f90a8f16\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:45:41 crc kubenswrapper[4835]: I0319 09:45:41.790697 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a340a11-0445-4522-bd9d-ff96f90a8f16-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8a340a11-0445-4522-bd9d-ff96f90a8f16\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:45:41 crc kubenswrapper[4835]: I0319 09:45:41.790739 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a340a11-0445-4522-bd9d-ff96f90a8f16-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8a340a11-0445-4522-bd9d-ff96f90a8f16\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:45:41 crc kubenswrapper[4835]: I0319 09:45:41.790827 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a340a11-0445-4522-bd9d-ff96f90a8f16-config\") pod \"ovsdbserver-sb-0\" (UID: \"8a340a11-0445-4522-bd9d-ff96f90a8f16\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:45:41 crc kubenswrapper[4835]: I0319 09:45:41.791684 
4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a340a11-0445-4522-bd9d-ff96f90a8f16-config\") pod \"ovsdbserver-sb-0\" (UID: \"8a340a11-0445-4522-bd9d-ff96f90a8f16\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:45:41 crc kubenswrapper[4835]: I0319 09:45:41.792896 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8a340a11-0445-4522-bd9d-ff96f90a8f16-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8a340a11-0445-4522-bd9d-ff96f90a8f16\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:45:41 crc kubenswrapper[4835]: I0319 09:45:41.794165 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a340a11-0445-4522-bd9d-ff96f90a8f16-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8a340a11-0445-4522-bd9d-ff96f90a8f16\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:45:41 crc kubenswrapper[4835]: I0319 09:45:41.797406 4835 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 09:45:41 crc kubenswrapper[4835]: I0319 09:45:41.797618 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d37e5e84-06d8-4698-bf21-83e5c6ad9afe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d37e5e84-06d8-4698-bf21-83e5c6ad9afe\") pod \"ovsdbserver-sb-0\" (UID: \"8a340a11-0445-4522-bd9d-ff96f90a8f16\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/37ac33ef62211c7275f7946b659540c89a556082234d6212f9ca1b532072f21e/globalmount\"" pod="openstack/ovsdbserver-sb-0" Mar 19 09:45:41 crc kubenswrapper[4835]: I0319 09:45:41.797490 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a340a11-0445-4522-bd9d-ff96f90a8f16-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8a340a11-0445-4522-bd9d-ff96f90a8f16\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:45:41 crc kubenswrapper[4835]: I0319 09:45:41.800616 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a340a11-0445-4522-bd9d-ff96f90a8f16-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8a340a11-0445-4522-bd9d-ff96f90a8f16\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:45:41 crc kubenswrapper[4835]: I0319 09:45:41.802871 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a340a11-0445-4522-bd9d-ff96f90a8f16-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8a340a11-0445-4522-bd9d-ff96f90a8f16\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:45:41 crc kubenswrapper[4835]: I0319 09:45:41.811582 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp6pz\" (UniqueName: \"kubernetes.io/projected/8a340a11-0445-4522-bd9d-ff96f90a8f16-kube-api-access-zp6pz\") pod \"ovsdbserver-sb-0\" (UID: 
\"8a340a11-0445-4522-bd9d-ff96f90a8f16\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:45:41 crc kubenswrapper[4835]: I0319 09:45:41.829163 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d37e5e84-06d8-4698-bf21-83e5c6ad9afe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d37e5e84-06d8-4698-bf21-83e5c6ad9afe\") pod \"ovsdbserver-sb-0\" (UID: \"8a340a11-0445-4522-bd9d-ff96f90a8f16\") " pod="openstack/ovsdbserver-sb-0" Mar 19 09:45:41 crc kubenswrapper[4835]: I0319 09:45:41.987345 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 19 09:45:46 crc kubenswrapper[4835]: I0319 09:45:46.452374 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185","Type":"ContainerStarted","Data":"c6f9cf4b80284a932ce720794be9f70ef21bf9a5f2e5b9f1f770215c7b530f0c"} Mar 19 09:45:46 crc kubenswrapper[4835]: E0319 09:45:46.747528 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 19 09:45:46 crc kubenswrapper[4835]: E0319 09:45:46.755905 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mfsgt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-8t84q_openstack(09eff517-95bc-4a23-932f-d44c82221963): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 09:45:46 crc kubenswrapper[4835]: E0319 09:45:46.757375 4835 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-8t84q" podUID="09eff517-95bc-4a23-932f-d44c82221963" Mar 19 09:45:46 crc kubenswrapper[4835]: E0319 09:45:46.778416 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 19 09:45:46 crc kubenswrapper[4835]: E0319 09:45:46.778686 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-px44t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-q74sf_openstack(a6fdd390-927b-4c4e-98ce-41d763a0f236): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 09:45:46 crc kubenswrapper[4835]: E0319 09:45:46.780500 4835 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-q74sf" podUID="a6fdd390-927b-4c4e-98ce-41d763a0f236" Mar 19 09:45:46 crc kubenswrapper[4835]: E0319 09:45:46.813157 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 19 09:45:46 crc kubenswrapper[4835]: E0319 09:45:46.813346 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bzr72,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-pnmhk_openstack(9580a501-4d3f-4720-8cd3-67c7b1d5aef1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 09:45:46 crc kubenswrapper[4835]: E0319 09:45:46.814524 4835 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-pnmhk" podUID="9580a501-4d3f-4720-8cd3-67c7b1d5aef1" Mar 19 09:45:46 crc kubenswrapper[4835]: E0319 09:45:46.843850 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 19 09:45:46 crc kubenswrapper[4835]: E0319 09:45:46.843993 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zcqlh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-v4jqm_openstack(a02e5221-dee0-4444-a734-2d336cc73119): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 09:45:46 crc kubenswrapper[4835]: E0319 09:45:46.845473 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-v4jqm" podUID="a02e5221-dee0-4444-a734-2d336cc73119" Mar 19 09:45:47 crc kubenswrapper[4835]: I0319 09:45:47.233138 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 19 09:45:47 crc kubenswrapper[4835]: E0319 09:45:47.463115 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-pnmhk" podUID="9580a501-4d3f-4720-8cd3-67c7b1d5aef1" Mar 19 09:45:47 crc kubenswrapper[4835]: E0319 09:45:47.463541 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-q74sf" podUID="a6fdd390-927b-4c4e-98ce-41d763a0f236" Mar 19 09:45:48 crc kubenswrapper[4835]: I0319 09:45:48.421368 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-8t84q" Mar 19 09:45:48 crc kubenswrapper[4835]: I0319 09:45:48.444856 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-v4jqm" Mar 19 09:45:48 crc kubenswrapper[4835]: I0319 09:45:48.478695 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-v4jqm" event={"ID":"a02e5221-dee0-4444-a734-2d336cc73119","Type":"ContainerDied","Data":"261b9245aaa37026f1c35e221137c4c1a038863403c2cb331423ac3d10843a05"} Mar 19 09:45:48 crc kubenswrapper[4835]: I0319 09:45:48.478699 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-v4jqm" Mar 19 09:45:48 crc kubenswrapper[4835]: I0319 09:45:48.480779 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"fdee895b-c681-4d66-bd0d-08622d97cbea","Type":"ContainerStarted","Data":"c990d665399399b29e2a5d88a5e7a90c756d6752adc1aa22c9a9abc776cd2427"} Mar 19 09:45:48 crc kubenswrapper[4835]: I0319 09:45:48.482737 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-8t84q" event={"ID":"09eff517-95bc-4a23-932f-d44c82221963","Type":"ContainerDied","Data":"1a69809c86f7f3296c610bd3570bed58992f894e26024752e1b6dd601d32e487"} Mar 19 09:45:48 crc kubenswrapper[4835]: I0319 09:45:48.482841 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-8t84q" Mar 19 09:45:48 crc kubenswrapper[4835]: I0319 09:45:48.537329 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfsgt\" (UniqueName: \"kubernetes.io/projected/09eff517-95bc-4a23-932f-d44c82221963-kube-api-access-mfsgt\") pod \"09eff517-95bc-4a23-932f-d44c82221963\" (UID: \"09eff517-95bc-4a23-932f-d44c82221963\") " Mar 19 09:45:48 crc kubenswrapper[4835]: I0319 09:45:48.537423 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09eff517-95bc-4a23-932f-d44c82221963-config\") pod \"09eff517-95bc-4a23-932f-d44c82221963\" (UID: \"09eff517-95bc-4a23-932f-d44c82221963\") " Mar 19 09:45:48 crc kubenswrapper[4835]: I0319 09:45:48.537595 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09eff517-95bc-4a23-932f-d44c82221963-dns-svc\") pod \"09eff517-95bc-4a23-932f-d44c82221963\" (UID: \"09eff517-95bc-4a23-932f-d44c82221963\") " Mar 19 09:45:48 crc kubenswrapper[4835]: I0319 09:45:48.537937 4835 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09eff517-95bc-4a23-932f-d44c82221963-config" (OuterVolumeSpecName: "config") pod "09eff517-95bc-4a23-932f-d44c82221963" (UID: "09eff517-95bc-4a23-932f-d44c82221963"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:45:48 crc kubenswrapper[4835]: I0319 09:45:48.538882 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09eff517-95bc-4a23-932f-d44c82221963-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:45:48 crc kubenswrapper[4835]: I0319 09:45:48.538885 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09eff517-95bc-4a23-932f-d44c82221963-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "09eff517-95bc-4a23-932f-d44c82221963" (UID: "09eff517-95bc-4a23-932f-d44c82221963"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:45:48 crc kubenswrapper[4835]: I0319 09:45:48.541810 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09eff517-95bc-4a23-932f-d44c82221963-kube-api-access-mfsgt" (OuterVolumeSpecName: "kube-api-access-mfsgt") pod "09eff517-95bc-4a23-932f-d44c82221963" (UID: "09eff517-95bc-4a23-932f-d44c82221963"). InnerVolumeSpecName "kube-api-access-mfsgt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:45:48 crc kubenswrapper[4835]: I0319 09:45:48.640530 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcqlh\" (UniqueName: \"kubernetes.io/projected/a02e5221-dee0-4444-a734-2d336cc73119-kube-api-access-zcqlh\") pod \"a02e5221-dee0-4444-a734-2d336cc73119\" (UID: \"a02e5221-dee0-4444-a734-2d336cc73119\") " Mar 19 09:45:48 crc kubenswrapper[4835]: I0319 09:45:48.640999 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a02e5221-dee0-4444-a734-2d336cc73119-config\") pod \"a02e5221-dee0-4444-a734-2d336cc73119\" (UID: \"a02e5221-dee0-4444-a734-2d336cc73119\") " Mar 19 09:45:48 crc kubenswrapper[4835]: I0319 09:45:48.641667 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfsgt\" (UniqueName: \"kubernetes.io/projected/09eff517-95bc-4a23-932f-d44c82221963-kube-api-access-mfsgt\") on node \"crc\" DevicePath \"\"" Mar 19 09:45:48 crc kubenswrapper[4835]: I0319 09:45:48.641695 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09eff517-95bc-4a23-932f-d44c82221963-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 09:45:48 crc kubenswrapper[4835]: I0319 09:45:48.642115 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a02e5221-dee0-4444-a734-2d336cc73119-config" (OuterVolumeSpecName: "config") pod "a02e5221-dee0-4444-a734-2d336cc73119" (UID: "a02e5221-dee0-4444-a734-2d336cc73119"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:45:48 crc kubenswrapper[4835]: I0319 09:45:48.647237 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a02e5221-dee0-4444-a734-2d336cc73119-kube-api-access-zcqlh" (OuterVolumeSpecName: "kube-api-access-zcqlh") pod "a02e5221-dee0-4444-a734-2d336cc73119" (UID: "a02e5221-dee0-4444-a734-2d336cc73119"). InnerVolumeSpecName "kube-api-access-zcqlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:45:48 crc kubenswrapper[4835]: I0319 09:45:48.743167 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcqlh\" (UniqueName: \"kubernetes.io/projected/a02e5221-dee0-4444-a734-2d336cc73119-kube-api-access-zcqlh\") on node \"crc\" DevicePath \"\"" Mar 19 09:45:48 crc kubenswrapper[4835]: I0319 09:45:48.743196 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a02e5221-dee0-4444-a734-2d336cc73119-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:45:48 crc kubenswrapper[4835]: I0319 09:45:48.881434 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-v4jqm"] Mar 19 09:45:48 crc kubenswrapper[4835]: I0319 09:45:48.897930 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-v4jqm"] Mar 19 09:45:48 crc kubenswrapper[4835]: I0319 09:45:48.926361 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-8t84q"] Mar 19 09:45:48 crc kubenswrapper[4835]: I0319 09:45:48.941591 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-8t84q"] Mar 19 09:45:49 crc kubenswrapper[4835]: I0319 09:45:49.581213 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 09:45:49 crc kubenswrapper[4835]: I0319 09:45:49.621291 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operators/observability-ui-dashboards-7f87b9b85b-pjl7f"] Mar 19 09:45:49 crc kubenswrapper[4835]: I0319 09:45:49.630695 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 19 09:45:49 crc kubenswrapper[4835]: I0319 09:45:49.639918 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d9fcbf4f8-5btdf"] Mar 19 09:45:49 crc kubenswrapper[4835]: I0319 09:45:49.648946 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 19 09:45:49 crc kubenswrapper[4835]: I0319 09:45:49.657150 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 19 09:45:49 crc kubenswrapper[4835]: I0319 09:45:49.672133 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nfn4c"] Mar 19 09:45:49 crc kubenswrapper[4835]: W0319 09:45:49.686167 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ace457f_965b_4e28_9f09_0c49bc9df9f7.slice/crio-2d1f7a25dea9c939643054ea34124f3d9d9b0f826c10f18bb82f484810626888 WatchSource:0}: Error finding container 2d1f7a25dea9c939643054ea34124f3d9d9b0f826c10f18bb82f484810626888: Status 404 returned error can't find the container with id 2d1f7a25dea9c939643054ea34124f3d9d9b0f826c10f18bb82f484810626888 Mar 19 09:45:49 crc kubenswrapper[4835]: W0319 09:45:49.698057 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22c392c5_b6d9_42e8_bf59_122a846c26a4.slice/crio-3e041edd0acfc5cb4027cf9c536b2f50fb6b770336abfc5fed27126ddc7c3b57 WatchSource:0}: Error finding container 3e041edd0acfc5cb4027cf9c536b2f50fb6b770336abfc5fed27126ddc7c3b57: Status 404 returned error can't find the container with id 3e041edd0acfc5cb4027cf9c536b2f50fb6b770336abfc5fed27126ddc7c3b57 Mar 19 09:45:49 crc kubenswrapper[4835]: W0319 
09:45:49.699376 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19fa63e6_aa31_4a0c_9eab_876b5e6ec9b0.slice/crio-6a5bb8a32416027a0fa3c73cc9c7415984aa47da26e67bfeb906f3c87fd7d5aa WatchSource:0}: Error finding container 6a5bb8a32416027a0fa3c73cc9c7415984aa47da26e67bfeb906f3c87fd7d5aa: Status 404 returned error can't find the container with id 6a5bb8a32416027a0fa3c73cc9c7415984aa47da26e67bfeb906f3c87fd7d5aa Mar 19 09:45:49 crc kubenswrapper[4835]: W0319 09:45:49.727061 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fd594ac_051b_4142_ba53_e974d9c5daa5.slice/crio-abe361852cc29a0b5c6c74cf90740b47e2cfc7c99806a50d61002df7a8091518 WatchSource:0}: Error finding container abe361852cc29a0b5c6c74cf90740b47e2cfc7c99806a50d61002df7a8091518: Status 404 returned error can't find the container with id abe361852cc29a0b5c6c74cf90740b47e2cfc7c99806a50d61002df7a8091518 Mar 19 09:45:50 crc kubenswrapper[4835]: I0319 09:45:50.412935 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09eff517-95bc-4a23-932f-d44c82221963" path="/var/lib/kubelet/pods/09eff517-95bc-4a23-932f-d44c82221963/volumes" Mar 19 09:45:50 crc kubenswrapper[4835]: I0319 09:45:50.414357 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a02e5221-dee0-4444-a734-2d336cc73119" path="/var/lib/kubelet/pods/a02e5221-dee0-4444-a734-2d336cc73119/volumes" Mar 19 09:45:50 crc kubenswrapper[4835]: I0319 09:45:50.504447 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d9fcbf4f8-5btdf" event={"ID":"d524b5e2-97d1-47cc-8474-113fa8e6016a","Type":"ContainerStarted","Data":"42c7b6a9b2034265d05aa39dac758da7db972ca0f83952c8896dc5235230ce24"} Mar 19 09:45:50 crc kubenswrapper[4835]: I0319 09:45:50.504503 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-5d9fcbf4f8-5btdf" event={"ID":"d524b5e2-97d1-47cc-8474-113fa8e6016a","Type":"ContainerStarted","Data":"ae71eb1b5013cdc81a30348de148eda2137965f89a95d385b46f973f4ea3d141"} Mar 19 09:45:50 crc kubenswrapper[4835]: I0319 09:45:50.505543 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4ace457f-965b-4e28-9f09-0c49bc9df9f7","Type":"ContainerStarted","Data":"2d1f7a25dea9c939643054ea34124f3d9d9b0f826c10f18bb82f484810626888"} Mar 19 09:45:50 crc kubenswrapper[4835]: I0319 09:45:50.506993 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-pjl7f" event={"ID":"22c392c5-b6d9-42e8-bf59-122a846c26a4","Type":"ContainerStarted","Data":"3e041edd0acfc5cb4027cf9c536b2f50fb6b770336abfc5fed27126ddc7c3b57"} Mar 19 09:45:50 crc kubenswrapper[4835]: I0319 09:45:50.508671 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nfn4c" event={"ID":"fc6a1eab-3ebc-49e6-bb19-f4f127b416a6","Type":"ContainerStarted","Data":"3d5eed72a75f77b05de739714a7d7bff454bef22e14de9f39dde5c4c30ff518f"} Mar 19 09:45:50 crc kubenswrapper[4835]: I0319 09:45:50.510223 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185","Type":"ContainerStarted","Data":"a019a87ad20e80e5d27e97a78500bd57162d944fdb1518383ddc556356c45145"} Mar 19 09:45:50 crc kubenswrapper[4835]: I0319 09:45:50.532028 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5d9fcbf4f8-5btdf" podStartSLOduration=14.532007655 podStartE2EDuration="14.532007655s" podCreationTimestamp="2026-03-19 09:45:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:45:50.521548484 +0000 UTC m=+1405.370147091" watchObservedRunningTime="2026-03-19 
09:45:50.532007655 +0000 UTC m=+1405.380606242" Mar 19 09:45:50 crc kubenswrapper[4835]: I0319 09:45:50.557836 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"66c76655-cf6d-45e6-904c-147e07a28639","Type":"ContainerStarted","Data":"20a39ba6498a2092f5c30caf21771baa8e854db4767a37e5233443d564935991"} Mar 19 09:45:50 crc kubenswrapper[4835]: I0319 09:45:50.560436 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0","Type":"ContainerStarted","Data":"6a5bb8a32416027a0fa3c73cc9c7415984aa47da26e67bfeb906f3c87fd7d5aa"} Mar 19 09:45:50 crc kubenswrapper[4835]: I0319 09:45:50.566301 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f773ac48-7b51-427e-9c89-34e515bddabb","Type":"ContainerStarted","Data":"21c03c5c9384d15ca34866e25c6fd7c9610210c0afde02081401ef0199d2e255"} Mar 19 09:45:50 crc kubenswrapper[4835]: I0319 09:45:50.573187 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"409c2465-a2ab-4f57-ba21-0854863bc543","Type":"ContainerStarted","Data":"4ee70fc1ac11134c4aaf586b508364d71ff3306e7d56e625c930169ae8a97ebc"} Mar 19 09:45:50 crc kubenswrapper[4835]: I0319 09:45:50.575334 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"2209a56f-9c2a-45bd-b045-176197bf3bd1","Type":"ContainerStarted","Data":"94261c92f99d04084fcc1707edb079459e5eb7fe3076788c702f51ce2d1838e3"} Mar 19 09:45:50 crc kubenswrapper[4835]: I0319 09:45:50.579121 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8fd594ac-051b-4142-ba53-e974d9c5daa5","Type":"ContainerStarted","Data":"abe361852cc29a0b5c6c74cf90740b47e2cfc7c99806a50d61002df7a8091518"} Mar 19 09:45:51 crc kubenswrapper[4835]: I0319 09:45:51.609304 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovn-controller-ovs-dht6h"] Mar 19 09:45:52 crc kubenswrapper[4835]: I0319 09:45:52.626872 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 19 09:45:53 crc kubenswrapper[4835]: I0319 09:45:53.255282 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 19 09:45:53 crc kubenswrapper[4835]: I0319 09:45:53.612329 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"011f3ac4-3c81-4a32-98a6-00522440cad4","Type":"ContainerStarted","Data":"35fc43b253bd986083ebe85617fe559b2bd8987a2f8e425306a7fc1b25e9c527"} Mar 19 09:45:53 crc kubenswrapper[4835]: I0319 09:45:53.613839 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dht6h" event={"ID":"a176f0d5-28dd-4c7a-9558-8eaeffdec853","Type":"ContainerStarted","Data":"c217fefa33034c9fbbb979db0f1ed5b431d4d0d883f4daeec353141ee36a16ba"} Mar 19 09:45:54 crc kubenswrapper[4835]: I0319 09:45:54.624172 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8a340a11-0445-4522-bd9d-ff96f90a8f16","Type":"ContainerStarted","Data":"cf72a21218c1ba3cc89827a0104111cf4259a9bb27ff8655d65cc84587f644ce"} Mar 19 09:45:56 crc kubenswrapper[4835]: I0319 09:45:56.480176 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5d9fcbf4f8-5btdf" Mar 19 09:45:56 crc kubenswrapper[4835]: I0319 09:45:56.480548 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5d9fcbf4f8-5btdf" Mar 19 09:45:56 crc kubenswrapper[4835]: I0319 09:45:56.491139 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5d9fcbf4f8-5btdf" Mar 19 09:45:56 crc kubenswrapper[4835]: I0319 09:45:56.643046 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console/console-5d9fcbf4f8-5btdf" Mar 19 09:45:56 crc kubenswrapper[4835]: I0319 09:45:56.706728 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5b76d796f4-ht42f"] Mar 19 09:45:58 crc kubenswrapper[4835]: I0319 09:45:58.668075 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-pjl7f" event={"ID":"22c392c5-b6d9-42e8-bf59-122a846c26a4","Type":"ContainerStarted","Data":"e7a0f6bc82dce7872f3bac85626ee55436167fd89bc2100f1ac48e5f9852c96d"} Mar 19 09:45:58 crc kubenswrapper[4835]: I0319 09:45:58.670927 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8fd594ac-051b-4142-ba53-e974d9c5daa5","Type":"ContainerStarted","Data":"5cd5a5f75588088f0f4ef917b50639f1e2c064782cc9920fcd0a706fd9b26233"} Mar 19 09:45:58 crc kubenswrapper[4835]: I0319 09:45:58.695166 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-pjl7f" podStartSLOduration=16.860208251 podStartE2EDuration="23.695143076s" podCreationTimestamp="2026-03-19 09:45:35 +0000 UTC" firstStartedPulling="2026-03-19 09:45:49.702021604 +0000 UTC m=+1404.550620191" lastFinishedPulling="2026-03-19 09:45:56.536956389 +0000 UTC m=+1411.385555016" observedRunningTime="2026-03-19 09:45:58.691818897 +0000 UTC m=+1413.540417514" watchObservedRunningTime="2026-03-19 09:45:58.695143076 +0000 UTC m=+1413.543741663" Mar 19 09:45:59 crc kubenswrapper[4835]: I0319 09:45:59.679730 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nfn4c" event={"ID":"fc6a1eab-3ebc-49e6-bb19-f4f127b416a6","Type":"ContainerStarted","Data":"1843e20324207616d69ba0631200d12327edb3f18509e990d78f018b3403fbe2"} Mar 19 09:45:59 crc kubenswrapper[4835]: I0319 09:45:59.680145 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-nfn4c" Mar 19 
09:45:59 crc kubenswrapper[4835]: I0319 09:45:59.681532 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"409c2465-a2ab-4f57-ba21-0854863bc543","Type":"ContainerStarted","Data":"2dcd67cf3581a497fb43d024d62237a29e2a742cf4c254319be87cf761b4dd71"} Mar 19 09:45:59 crc kubenswrapper[4835]: I0319 09:45:59.681643 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 19 09:45:59 crc kubenswrapper[4835]: I0319 09:45:59.682972 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"011f3ac4-3c81-4a32-98a6-00522440cad4","Type":"ContainerStarted","Data":"5f3325ae69e454947fd957e50a29f2f42076978f64fc8400b12f09fd5ebef039"} Mar 19 09:45:59 crc kubenswrapper[4835]: I0319 09:45:59.684700 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8a340a11-0445-4522-bd9d-ff96f90a8f16","Type":"ContainerStarted","Data":"a4e840a3244d4ae74f2c2b81d43abf1224e5058c6dcbc8464e6100846c4e86cb"} Mar 19 09:45:59 crc kubenswrapper[4835]: I0319 09:45:59.686315 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dht6h" event={"ID":"a176f0d5-28dd-4c7a-9558-8eaeffdec853","Type":"ContainerStarted","Data":"eb3cbcd38dacce2d477809f932ba588a875e80f3e78aed23f6bebed54dabf084"} Mar 19 09:45:59 crc kubenswrapper[4835]: I0319 09:45:59.688091 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4ace457f-965b-4e28-9f09-0c49bc9df9f7","Type":"ContainerStarted","Data":"5f0c4b09db26d1f1d4e3e16d80865c015ec28f5d0a76a0b62971a6f663cee0f2"} Mar 19 09:45:59 crc kubenswrapper[4835]: I0319 09:45:59.688170 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 19 09:45:59 crc kubenswrapper[4835]: I0319 09:45:59.689763 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-cell1-galera-0" event={"ID":"fdee895b-c681-4d66-bd0d-08622d97cbea","Type":"ContainerStarted","Data":"9151a12078517e8c8c02de604569b76a0b254bf2ede5234a6c6798b882f9d710"} Mar 19 09:45:59 crc kubenswrapper[4835]: I0319 09:45:59.708990 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-nfn4c" podStartSLOduration=14.845745423 podStartE2EDuration="22.708971113s" podCreationTimestamp="2026-03-19 09:45:37 +0000 UTC" firstStartedPulling="2026-03-19 09:45:49.723014927 +0000 UTC m=+1404.571613514" lastFinishedPulling="2026-03-19 09:45:57.586240627 +0000 UTC m=+1412.434839204" observedRunningTime="2026-03-19 09:45:59.703852316 +0000 UTC m=+1414.552450913" watchObservedRunningTime="2026-03-19 09:45:59.708971113 +0000 UTC m=+1414.557569700" Mar 19 09:45:59 crc kubenswrapper[4835]: I0319 09:45:59.755522 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=20.205300675 podStartE2EDuration="27.755498042s" podCreationTimestamp="2026-03-19 09:45:32 +0000 UTC" firstStartedPulling="2026-03-19 09:45:49.711755095 +0000 UTC m=+1404.560353682" lastFinishedPulling="2026-03-19 09:45:57.261952452 +0000 UTC m=+1412.110551049" observedRunningTime="2026-03-19 09:45:59.719027762 +0000 UTC m=+1414.567626349" watchObservedRunningTime="2026-03-19 09:45:59.755498042 +0000 UTC m=+1414.604096629" Mar 19 09:45:59 crc kubenswrapper[4835]: I0319 09:45:59.766062 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=16.777220797 podStartE2EDuration="25.766044265s" podCreationTimestamp="2026-03-19 09:45:34 +0000 UTC" firstStartedPulling="2026-03-19 09:45:49.690093783 +0000 UTC m=+1404.538692370" lastFinishedPulling="2026-03-19 09:45:58.678917251 +0000 UTC m=+1413.527515838" observedRunningTime="2026-03-19 09:45:59.764054481 +0000 UTC m=+1414.612653068" watchObservedRunningTime="2026-03-19 
09:45:59.766044265 +0000 UTC m=+1414.614642852" Mar 19 09:46:00 crc kubenswrapper[4835]: I0319 09:46:00.146888 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565226-dh25m"] Mar 19 09:46:00 crc kubenswrapper[4835]: I0319 09:46:00.149398 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565226-dh25m" Mar 19 09:46:00 crc kubenswrapper[4835]: I0319 09:46:00.151514 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 09:46:00 crc kubenswrapper[4835]: I0319 09:46:00.152130 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 09:46:00 crc kubenswrapper[4835]: I0319 09:46:00.153196 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 09:46:00 crc kubenswrapper[4835]: I0319 09:46:00.155070 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29pnk\" (UniqueName: \"kubernetes.io/projected/fb9581e3-765f-47c2-817f-233dd6d968fd-kube-api-access-29pnk\") pod \"auto-csr-approver-29565226-dh25m\" (UID: \"fb9581e3-765f-47c2-817f-233dd6d968fd\") " pod="openshift-infra/auto-csr-approver-29565226-dh25m" Mar 19 09:46:00 crc kubenswrapper[4835]: I0319 09:46:00.157042 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565226-dh25m"] Mar 19 09:46:00 crc kubenswrapper[4835]: I0319 09:46:00.256588 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29pnk\" (UniqueName: \"kubernetes.io/projected/fb9581e3-765f-47c2-817f-233dd6d968fd-kube-api-access-29pnk\") pod \"auto-csr-approver-29565226-dh25m\" (UID: \"fb9581e3-765f-47c2-817f-233dd6d968fd\") " pod="openshift-infra/auto-csr-approver-29565226-dh25m" Mar 19 09:46:00 crc 
kubenswrapper[4835]: I0319 09:46:00.384663 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29pnk\" (UniqueName: \"kubernetes.io/projected/fb9581e3-765f-47c2-817f-233dd6d968fd-kube-api-access-29pnk\") pod \"auto-csr-approver-29565226-dh25m\" (UID: \"fb9581e3-765f-47c2-817f-233dd6d968fd\") " pod="openshift-infra/auto-csr-approver-29565226-dh25m" Mar 19 09:46:00 crc kubenswrapper[4835]: I0319 09:46:00.487521 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565226-dh25m" Mar 19 09:46:00 crc kubenswrapper[4835]: I0319 09:46:00.701063 4835 generic.go:334] "Generic (PLEG): container finished" podID="a6fdd390-927b-4c4e-98ce-41d763a0f236" containerID="f0f69660d1b2a878c1f7b6dc4815c72d4fca557926a8b360428e99ca79cf77b0" exitCode=0 Mar 19 09:46:00 crc kubenswrapper[4835]: I0319 09:46:00.701157 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-q74sf" event={"ID":"a6fdd390-927b-4c4e-98ce-41d763a0f236","Type":"ContainerDied","Data":"f0f69660d1b2a878c1f7b6dc4815c72d4fca557926a8b360428e99ca79cf77b0"} Mar 19 09:46:00 crc kubenswrapper[4835]: I0319 09:46:00.710630 4835 generic.go:334] "Generic (PLEG): container finished" podID="a176f0d5-28dd-4c7a-9558-8eaeffdec853" containerID="eb3cbcd38dacce2d477809f932ba588a875e80f3e78aed23f6bebed54dabf084" exitCode=0 Mar 19 09:46:00 crc kubenswrapper[4835]: I0319 09:46:00.711039 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dht6h" event={"ID":"a176f0d5-28dd-4c7a-9558-8eaeffdec853","Type":"ContainerDied","Data":"eb3cbcd38dacce2d477809f932ba588a875e80f3e78aed23f6bebed54dabf084"} Mar 19 09:46:00 crc kubenswrapper[4835]: I0319 09:46:00.714234 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0","Type":"ContainerStarted","Data":"b02a6b2b42a5b4af63cb1707e7c8f6d5af9a34c6c15e26b41e4e7dbf39d129d5"} Mar 19 09:46:00 crc kubenswrapper[4835]: I0319 09:46:00.721043 4835 generic.go:334] "Generic (PLEG): container finished" podID="9580a501-4d3f-4720-8cd3-67c7b1d5aef1" containerID="a06be27eb97e43107452edc545ffdc0af3738579e1d1cbd71f631db7cf82e2a8" exitCode=0 Mar 19 09:46:00 crc kubenswrapper[4835]: I0319 09:46:00.724947 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-pnmhk" event={"ID":"9580a501-4d3f-4720-8cd3-67c7b1d5aef1","Type":"ContainerDied","Data":"a06be27eb97e43107452edc545ffdc0af3738579e1d1cbd71f631db7cf82e2a8"} Mar 19 09:46:00 crc kubenswrapper[4835]: I0319 09:46:00.996396 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565226-dh25m"] Mar 19 09:46:01 crc kubenswrapper[4835]: W0319 09:46:01.012956 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb9581e3_765f_47c2_817f_233dd6d968fd.slice/crio-788b821ab11b563f0c188f8f34d294bb6860ff7f11398f4faab2f48bc40d4829 WatchSource:0}: Error finding container 788b821ab11b563f0c188f8f34d294bb6860ff7f11398f4faab2f48bc40d4829: Status 404 returned error can't find the container with id 788b821ab11b563f0c188f8f34d294bb6860ff7f11398f4faab2f48bc40d4829 Mar 19 09:46:01 crc kubenswrapper[4835]: I0319 09:46:01.731665 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565226-dh25m" event={"ID":"fb9581e3-765f-47c2-817f-233dd6d968fd","Type":"ContainerStarted","Data":"788b821ab11b563f0c188f8f34d294bb6860ff7f11398f4faab2f48bc40d4829"} Mar 19 09:46:01 crc kubenswrapper[4835]: I0319 09:46:01.735997 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dht6h" 
event={"ID":"a176f0d5-28dd-4c7a-9558-8eaeffdec853","Type":"ContainerStarted","Data":"1ca03394ee82eb80cceafecc2474f2e83da1de9fd824d7da972a89cac9f9d786"} Mar 19 09:46:01 crc kubenswrapper[4835]: I0319 09:46:01.736051 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dht6h" event={"ID":"a176f0d5-28dd-4c7a-9558-8eaeffdec853","Type":"ContainerStarted","Data":"4c9b699223eec7cab251a8705633f8498d587d4860508b74f2899b6a58d2f616"} Mar 19 09:46:01 crc kubenswrapper[4835]: I0319 09:46:01.736205 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-dht6h" Mar 19 09:46:01 crc kubenswrapper[4835]: I0319 09:46:01.736485 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-dht6h" Mar 19 09:46:01 crc kubenswrapper[4835]: I0319 09:46:01.740774 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-pnmhk" event={"ID":"9580a501-4d3f-4720-8cd3-67c7b1d5aef1","Type":"ContainerStarted","Data":"4f80f14b2f2f7212f438776fcc738df9ccbde69b464d7d7640843438fb77775e"} Mar 19 09:46:01 crc kubenswrapper[4835]: I0319 09:46:01.741114 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-pnmhk" Mar 19 09:46:01 crc kubenswrapper[4835]: I0319 09:46:01.742686 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-q74sf" event={"ID":"a6fdd390-927b-4c4e-98ce-41d763a0f236","Type":"ContainerStarted","Data":"5fb4c1604b4d42671aa7399aa5924ff9467a0d6158ced20b22ce09d0cd92d301"} Mar 19 09:46:01 crc kubenswrapper[4835]: I0319 09:46:01.758560 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-dht6h" podStartSLOduration=20.541328772 podStartE2EDuration="24.758542454s" podCreationTimestamp="2026-03-19 09:45:37 +0000 UTC" firstStartedPulling="2026-03-19 09:45:53.363564319 +0000 UTC m=+1408.212162906" 
lastFinishedPulling="2026-03-19 09:45:57.580778001 +0000 UTC m=+1412.429376588" observedRunningTime="2026-03-19 09:46:01.756715885 +0000 UTC m=+1416.605314502" watchObservedRunningTime="2026-03-19 09:46:01.758542454 +0000 UTC m=+1416.607141041" Mar 19 09:46:01 crc kubenswrapper[4835]: I0319 09:46:01.778923 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-pnmhk" podStartSLOduration=3.049387669 podStartE2EDuration="33.778901431s" podCreationTimestamp="2026-03-19 09:45:28 +0000 UTC" firstStartedPulling="2026-03-19 09:45:29.250148107 +0000 UTC m=+1384.098746694" lastFinishedPulling="2026-03-19 09:45:59.979661869 +0000 UTC m=+1414.828260456" observedRunningTime="2026-03-19 09:46:01.772051137 +0000 UTC m=+1416.620649734" watchObservedRunningTime="2026-03-19 09:46:01.778901431 +0000 UTC m=+1416.627500018" Mar 19 09:46:01 crc kubenswrapper[4835]: I0319 09:46:01.800734 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-q74sf" podStartSLOduration=3.299731288 podStartE2EDuration="33.800714816s" podCreationTimestamp="2026-03-19 09:45:28 +0000 UTC" firstStartedPulling="2026-03-19 09:45:29.450003572 +0000 UTC m=+1384.298602159" lastFinishedPulling="2026-03-19 09:45:59.9509871 +0000 UTC m=+1414.799585687" observedRunningTime="2026-03-19 09:46:01.792428674 +0000 UTC m=+1416.641027261" watchObservedRunningTime="2026-03-19 09:46:01.800714816 +0000 UTC m=+1416.649313413" Mar 19 09:46:02 crc kubenswrapper[4835]: I0319 09:46:02.757228 4835 generic.go:334] "Generic (PLEG): container finished" podID="8fd594ac-051b-4142-ba53-e974d9c5daa5" containerID="5cd5a5f75588088f0f4ef917b50639f1e2c064782cc9920fcd0a706fd9b26233" exitCode=0 Mar 19 09:46:02 crc kubenswrapper[4835]: I0319 09:46:02.757395 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"8fd594ac-051b-4142-ba53-e974d9c5daa5","Type":"ContainerDied","Data":"5cd5a5f75588088f0f4ef917b50639f1e2c064782cc9920fcd0a706fd9b26233"} Mar 19 09:46:03 crc kubenswrapper[4835]: I0319 09:46:03.770702 4835 generic.go:334] "Generic (PLEG): container finished" podID="fdee895b-c681-4d66-bd0d-08622d97cbea" containerID="9151a12078517e8c8c02de604569b76a0b254bf2ede5234a6c6798b882f9d710" exitCode=0 Mar 19 09:46:03 crc kubenswrapper[4835]: I0319 09:46:03.772143 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"fdee895b-c681-4d66-bd0d-08622d97cbea","Type":"ContainerDied","Data":"9151a12078517e8c8c02de604569b76a0b254bf2ede5234a6c6798b882f9d710"} Mar 19 09:46:03 crc kubenswrapper[4835]: I0319 09:46:03.783136 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"011f3ac4-3c81-4a32-98a6-00522440cad4","Type":"ContainerStarted","Data":"5ec5e030ed3d2d489803f890b795be33abb2f9f6839858198eeaf7be12c291cf"} Mar 19 09:46:03 crc kubenswrapper[4835]: I0319 09:46:03.786063 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8fd594ac-051b-4142-ba53-e974d9c5daa5","Type":"ContainerStarted","Data":"7670c4f6d27fd821be9f60aff8a356cdffc7bd7e3680b9c24c400da2eb40f0e7"} Mar 19 09:46:03 crc kubenswrapper[4835]: I0319 09:46:03.789153 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565226-dh25m" event={"ID":"fb9581e3-765f-47c2-817f-233dd6d968fd","Type":"ContainerStarted","Data":"6f87370d06faf202422cfcf749c8b5a7c7ebf90d15961bba40f1d378d1b30ec7"} Mar 19 09:46:03 crc kubenswrapper[4835]: I0319 09:46:03.796954 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8a340a11-0445-4522-bd9d-ff96f90a8f16","Type":"ContainerStarted","Data":"75256adcf1edaa08366de71fbecb564258b8ad9d57508ec07b08714a2c1850d4"} Mar 19 09:46:03 crc kubenswrapper[4835]: I0319 
09:46:03.837295 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=29.609920108 podStartE2EDuration="34.837272338s" podCreationTimestamp="2026-03-19 09:45:29 +0000 UTC" firstStartedPulling="2026-03-19 09:45:49.74585177 +0000 UTC m=+1404.594450357" lastFinishedPulling="2026-03-19 09:45:54.973204 +0000 UTC m=+1409.821802587" observedRunningTime="2026-03-19 09:46:03.831069172 +0000 UTC m=+1418.679667779" watchObservedRunningTime="2026-03-19 09:46:03.837272338 +0000 UTC m=+1418.685870935" Mar 19 09:46:03 crc kubenswrapper[4835]: I0319 09:46:03.854883 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565226-dh25m" podStartSLOduration=1.662090576 podStartE2EDuration="3.854862871s" podCreationTimestamp="2026-03-19 09:46:00 +0000 UTC" firstStartedPulling="2026-03-19 09:46:01.016513995 +0000 UTC m=+1415.865112582" lastFinishedPulling="2026-03-19 09:46:03.20928629 +0000 UTC m=+1418.057884877" observedRunningTime="2026-03-19 09:46:03.84700234 +0000 UTC m=+1418.695600957" watchObservedRunningTime="2026-03-19 09:46:03.854862871 +0000 UTC m=+1418.703461458" Mar 19 09:46:03 crc kubenswrapper[4835]: I0319 09:46:03.865219 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-q74sf" Mar 19 09:46:03 crc kubenswrapper[4835]: I0319 09:46:03.889613 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=14.490756309 podStartE2EDuration="23.889593663s" podCreationTimestamp="2026-03-19 09:45:40 +0000 UTC" firstStartedPulling="2026-03-19 09:45:53.775086747 +0000 UTC m=+1408.623685334" lastFinishedPulling="2026-03-19 09:46:03.173924091 +0000 UTC m=+1418.022522688" observedRunningTime="2026-03-19 09:46:03.88206097 +0000 UTC m=+1418.730659557" watchObservedRunningTime="2026-03-19 09:46:03.889593663 +0000 UTC m=+1418.738192250" Mar 19 
09:46:03 crc kubenswrapper[4835]: I0319 09:46:03.899554 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=18.059494862 podStartE2EDuration="27.89953549s" podCreationTimestamp="2026-03-19 09:45:36 +0000 UTC" firstStartedPulling="2026-03-19 09:45:53.363896588 +0000 UTC m=+1408.212495175" lastFinishedPulling="2026-03-19 09:46:03.203937206 +0000 UTC m=+1418.052535803" observedRunningTime="2026-03-19 09:46:03.864874169 +0000 UTC m=+1418.713472756" watchObservedRunningTime="2026-03-19 09:46:03.89953549 +0000 UTC m=+1418.748134107" Mar 19 09:46:04 crc kubenswrapper[4835]: I0319 09:46:04.808540 4835 generic.go:334] "Generic (PLEG): container finished" podID="fb9581e3-765f-47c2-817f-233dd6d968fd" containerID="6f87370d06faf202422cfcf749c8b5a7c7ebf90d15961bba40f1d378d1b30ec7" exitCode=0 Mar 19 09:46:04 crc kubenswrapper[4835]: I0319 09:46:04.808645 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565226-dh25m" event={"ID":"fb9581e3-765f-47c2-817f-233dd6d968fd","Type":"ContainerDied","Data":"6f87370d06faf202422cfcf749c8b5a7c7ebf90d15961bba40f1d378d1b30ec7"} Mar 19 09:46:04 crc kubenswrapper[4835]: I0319 09:46:04.811238 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"fdee895b-c681-4d66-bd0d-08622d97cbea","Type":"ContainerStarted","Data":"5d0ed211b308917edaba5288103874e3bf63598f271c81636f1b1490839e51ec"} Mar 19 09:46:04 crc kubenswrapper[4835]: I0319 09:46:04.855429 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=25.530736534 podStartE2EDuration="33.855410911s" podCreationTimestamp="2026-03-19 09:45:31 +0000 UTC" firstStartedPulling="2026-03-19 09:45:48.211969263 +0000 UTC m=+1403.060567850" lastFinishedPulling="2026-03-19 09:45:56.53664364 +0000 UTC m=+1411.385242227" observedRunningTime="2026-03-19 
09:46:04.848776043 +0000 UTC m=+1419.697374630" watchObservedRunningTime="2026-03-19 09:46:04.855410911 +0000 UTC m=+1419.704009488" Mar 19 09:46:05 crc kubenswrapper[4835]: I0319 09:46:05.258282 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 19 09:46:05 crc kubenswrapper[4835]: I0319 09:46:05.356825 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 19 09:46:05 crc kubenswrapper[4835]: I0319 09:46:05.407308 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 19 09:46:05 crc kubenswrapper[4835]: I0319 09:46:05.825553 4835 generic.go:334] "Generic (PLEG): container finished" podID="19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0" containerID="b02a6b2b42a5b4af63cb1707e7c8f6d5af9a34c6c15e26b41e4e7dbf39d129d5" exitCode=0 Mar 19 09:46:05 crc kubenswrapper[4835]: I0319 09:46:05.825622 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0","Type":"ContainerDied","Data":"b02a6b2b42a5b4af63cb1707e7c8f6d5af9a34c6c15e26b41e4e7dbf39d129d5"} Mar 19 09:46:05 crc kubenswrapper[4835]: I0319 09:46:05.826470 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 19 09:46:05 crc kubenswrapper[4835]: I0319 09:46:05.892769 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 19 09:46:05 crc kubenswrapper[4835]: I0319 09:46:05.988449 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.082707 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.179278 4835 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-q74sf"] Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.179482 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-q74sf" podUID="a6fdd390-927b-4c4e-98ce-41d763a0f236" containerName="dnsmasq-dns" containerID="cri-o://5fb4c1604b4d42671aa7399aa5924ff9467a0d6158ced20b22ce09d0cd92d301" gracePeriod=10 Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.181451 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-q74sf" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.220446 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-ks29k"] Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.227095 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-ks29k" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.230034 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.247338 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-ks29k"] Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.335072 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-nt4ph"] Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.336636 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-nt4ph" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.340158 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.385814 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-nt4ph"] Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.399477 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzntn\" (UniqueName: \"kubernetes.io/projected/77055cf6-3221-4567-8c5d-4b21cf5887bb-kube-api-access-xzntn\") pod \"dnsmasq-dns-7fd796d7df-ks29k\" (UID: \"77055cf6-3221-4567-8c5d-4b21cf5887bb\") " pod="openstack/dnsmasq-dns-7fd796d7df-ks29k" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.399964 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77055cf6-3221-4567-8c5d-4b21cf5887bb-config\") pod \"dnsmasq-dns-7fd796d7df-ks29k\" (UID: \"77055cf6-3221-4567-8c5d-4b21cf5887bb\") " pod="openstack/dnsmasq-dns-7fd796d7df-ks29k" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.400047 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77055cf6-3221-4567-8c5d-4b21cf5887bb-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-ks29k\" (UID: \"77055cf6-3221-4567-8c5d-4b21cf5887bb\") " pod="openstack/dnsmasq-dns-7fd796d7df-ks29k" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.400216 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77055cf6-3221-4567-8c5d-4b21cf5887bb-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-ks29k\" (UID: \"77055cf6-3221-4567-8c5d-4b21cf5887bb\") " 
pod="openstack/dnsmasq-dns-7fd796d7df-ks29k" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.428631 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.428696 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.428755 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.429897 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"22d4a3df3b8da0d3a7089e100b70f5eb6f45762cc59f432226a8ced697a6a8d9"} pod="openshift-machine-config-operator/machine-config-daemon-bk84k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.429962 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" containerID="cri-o://22d4a3df3b8da0d3a7089e100b70f5eb6f45762cc59f432226a8ced697a6a8d9" gracePeriod=600 Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.440380 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565226-dh25m" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.504261 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77055cf6-3221-4567-8c5d-4b21cf5887bb-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-ks29k\" (UID: \"77055cf6-3221-4567-8c5d-4b21cf5887bb\") " pod="openstack/dnsmasq-dns-7fd796d7df-ks29k" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.504361 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77055cf6-3221-4567-8c5d-4b21cf5887bb-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-ks29k\" (UID: \"77055cf6-3221-4567-8c5d-4b21cf5887bb\") " pod="openstack/dnsmasq-dns-7fd796d7df-ks29k" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.504390 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/edbbc028-d9c9-4aae-a3a0-76bead2b6738-ovs-rundir\") pod \"ovn-controller-metrics-nt4ph\" (UID: \"edbbc028-d9c9-4aae-a3a0-76bead2b6738\") " pod="openstack/ovn-controller-metrics-nt4ph" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.504426 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c7v9\" (UniqueName: \"kubernetes.io/projected/edbbc028-d9c9-4aae-a3a0-76bead2b6738-kube-api-access-7c7v9\") pod \"ovn-controller-metrics-nt4ph\" (UID: \"edbbc028-d9c9-4aae-a3a0-76bead2b6738\") " pod="openstack/ovn-controller-metrics-nt4ph" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.504490 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/edbbc028-d9c9-4aae-a3a0-76bead2b6738-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nt4ph\" (UID: 
\"edbbc028-d9c9-4aae-a3a0-76bead2b6738\") " pod="openstack/ovn-controller-metrics-nt4ph" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.504703 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edbbc028-d9c9-4aae-a3a0-76bead2b6738-config\") pod \"ovn-controller-metrics-nt4ph\" (UID: \"edbbc028-d9c9-4aae-a3a0-76bead2b6738\") " pod="openstack/ovn-controller-metrics-nt4ph" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.504729 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/edbbc028-d9c9-4aae-a3a0-76bead2b6738-ovn-rundir\") pod \"ovn-controller-metrics-nt4ph\" (UID: \"edbbc028-d9c9-4aae-a3a0-76bead2b6738\") " pod="openstack/ovn-controller-metrics-nt4ph" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.504801 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzntn\" (UniqueName: \"kubernetes.io/projected/77055cf6-3221-4567-8c5d-4b21cf5887bb-kube-api-access-xzntn\") pod \"dnsmasq-dns-7fd796d7df-ks29k\" (UID: \"77055cf6-3221-4567-8c5d-4b21cf5887bb\") " pod="openstack/dnsmasq-dns-7fd796d7df-ks29k" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.504820 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edbbc028-d9c9-4aae-a3a0-76bead2b6738-combined-ca-bundle\") pod \"ovn-controller-metrics-nt4ph\" (UID: \"edbbc028-d9c9-4aae-a3a0-76bead2b6738\") " pod="openstack/ovn-controller-metrics-nt4ph" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.504868 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77055cf6-3221-4567-8c5d-4b21cf5887bb-config\") pod \"dnsmasq-dns-7fd796d7df-ks29k\" (UID: 
\"77055cf6-3221-4567-8c5d-4b21cf5887bb\") " pod="openstack/dnsmasq-dns-7fd796d7df-ks29k" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.505623 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77055cf6-3221-4567-8c5d-4b21cf5887bb-config\") pod \"dnsmasq-dns-7fd796d7df-ks29k\" (UID: \"77055cf6-3221-4567-8c5d-4b21cf5887bb\") " pod="openstack/dnsmasq-dns-7fd796d7df-ks29k" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.506406 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77055cf6-3221-4567-8c5d-4b21cf5887bb-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-ks29k\" (UID: \"77055cf6-3221-4567-8c5d-4b21cf5887bb\") " pod="openstack/dnsmasq-dns-7fd796d7df-ks29k" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.506747 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77055cf6-3221-4567-8c5d-4b21cf5887bb-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-ks29k\" (UID: \"77055cf6-3221-4567-8c5d-4b21cf5887bb\") " pod="openstack/dnsmasq-dns-7fd796d7df-ks29k" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.549479 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzntn\" (UniqueName: \"kubernetes.io/projected/77055cf6-3221-4567-8c5d-4b21cf5887bb-kube-api-access-xzntn\") pod \"dnsmasq-dns-7fd796d7df-ks29k\" (UID: \"77055cf6-3221-4567-8c5d-4b21cf5887bb\") " pod="openstack/dnsmasq-dns-7fd796d7df-ks29k" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.607056 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29pnk\" (UniqueName: \"kubernetes.io/projected/fb9581e3-765f-47c2-817f-233dd6d968fd-kube-api-access-29pnk\") pod \"fb9581e3-765f-47c2-817f-233dd6d968fd\" (UID: \"fb9581e3-765f-47c2-817f-233dd6d968fd\") " Mar 19 09:46:06 crc 
kubenswrapper[4835]: I0319 09:46:06.607910 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-pnmhk"] Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.608132 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-pnmhk" podUID="9580a501-4d3f-4720-8cd3-67c7b1d5aef1" containerName="dnsmasq-dns" containerID="cri-o://4f80f14b2f2f7212f438776fcc738df9ccbde69b464d7d7640843438fb77775e" gracePeriod=10 Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.608862 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edbbc028-d9c9-4aae-a3a0-76bead2b6738-config\") pod \"ovn-controller-metrics-nt4ph\" (UID: \"edbbc028-d9c9-4aae-a3a0-76bead2b6738\") " pod="openstack/ovn-controller-metrics-nt4ph" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.608948 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/edbbc028-d9c9-4aae-a3a0-76bead2b6738-ovn-rundir\") pod \"ovn-controller-metrics-nt4ph\" (UID: \"edbbc028-d9c9-4aae-a3a0-76bead2b6738\") " pod="openstack/ovn-controller-metrics-nt4ph" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.609011 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edbbc028-d9c9-4aae-a3a0-76bead2b6738-combined-ca-bundle\") pod \"ovn-controller-metrics-nt4ph\" (UID: \"edbbc028-d9c9-4aae-a3a0-76bead2b6738\") " pod="openstack/ovn-controller-metrics-nt4ph" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.609263 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/edbbc028-d9c9-4aae-a3a0-76bead2b6738-ovs-rundir\") pod \"ovn-controller-metrics-nt4ph\" (UID: \"edbbc028-d9c9-4aae-a3a0-76bead2b6738\") " 
pod="openstack/ovn-controller-metrics-nt4ph" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.609316 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c7v9\" (UniqueName: \"kubernetes.io/projected/edbbc028-d9c9-4aae-a3a0-76bead2b6738-kube-api-access-7c7v9\") pod \"ovn-controller-metrics-nt4ph\" (UID: \"edbbc028-d9c9-4aae-a3a0-76bead2b6738\") " pod="openstack/ovn-controller-metrics-nt4ph" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.609422 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/edbbc028-d9c9-4aae-a3a0-76bead2b6738-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nt4ph\" (UID: \"edbbc028-d9c9-4aae-a3a0-76bead2b6738\") " pod="openstack/ovn-controller-metrics-nt4ph" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.610580 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/edbbc028-d9c9-4aae-a3a0-76bead2b6738-ovs-rundir\") pod \"ovn-controller-metrics-nt4ph\" (UID: \"edbbc028-d9c9-4aae-a3a0-76bead2b6738\") " pod="openstack/ovn-controller-metrics-nt4ph" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.611978 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc8479f9-pnmhk" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.612716 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/edbbc028-d9c9-4aae-a3a0-76bead2b6738-ovn-rundir\") pod \"ovn-controller-metrics-nt4ph\" (UID: \"edbbc028-d9c9-4aae-a3a0-76bead2b6738\") " pod="openstack/ovn-controller-metrics-nt4ph" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.619044 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/edbbc028-d9c9-4aae-a3a0-76bead2b6738-config\") pod \"ovn-controller-metrics-nt4ph\" (UID: \"edbbc028-d9c9-4aae-a3a0-76bead2b6738\") " pod="openstack/ovn-controller-metrics-nt4ph" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.624982 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edbbc028-d9c9-4aae-a3a0-76bead2b6738-combined-ca-bundle\") pod \"ovn-controller-metrics-nt4ph\" (UID: \"edbbc028-d9c9-4aae-a3a0-76bead2b6738\") " pod="openstack/ovn-controller-metrics-nt4ph" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.633723 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/edbbc028-d9c9-4aae-a3a0-76bead2b6738-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nt4ph\" (UID: \"edbbc028-d9c9-4aae-a3a0-76bead2b6738\") " pod="openstack/ovn-controller-metrics-nt4ph" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.635108 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb9581e3-765f-47c2-817f-233dd6d968fd-kube-api-access-29pnk" (OuterVolumeSpecName: "kube-api-access-29pnk") pod "fb9581e3-765f-47c2-817f-233dd6d968fd" (UID: "fb9581e3-765f-47c2-817f-233dd6d968fd"). InnerVolumeSpecName "kube-api-access-29pnk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.638924 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c7v9\" (UniqueName: \"kubernetes.io/projected/edbbc028-d9c9-4aae-a3a0-76bead2b6738-kube-api-access-7c7v9\") pod \"ovn-controller-metrics-nt4ph\" (UID: \"edbbc028-d9c9-4aae-a3a0-76bead2b6738\") " pod="openstack/ovn-controller-metrics-nt4ph" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.655178 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-jmqdb"] Mar 19 09:46:06 crc kubenswrapper[4835]: E0319 09:46:06.655661 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb9581e3-765f-47c2-817f-233dd6d968fd" containerName="oc" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.655679 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb9581e3-765f-47c2-817f-233dd6d968fd" containerName="oc" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.655892 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb9581e3-765f-47c2-817f-233dd6d968fd" containerName="oc" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.658528 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-jmqdb" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.664128 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.711308 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29pnk\" (UniqueName: \"kubernetes.io/projected/fb9581e3-765f-47c2-817f-233dd6d968fd-kube-api-access-29pnk\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.717300 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-jmqdb"] Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.738248 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-ks29k" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.788192 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-nt4ph" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.812494 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e1c1800-e0f5-49d6-a8ba-2f6184a6343f-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-jmqdb\" (UID: \"5e1c1800-e0f5-49d6-a8ba-2f6184a6343f\") " pod="openstack/dnsmasq-dns-86db49b7ff-jmqdb" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.812556 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e1c1800-e0f5-49d6-a8ba-2f6184a6343f-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-jmqdb\" (UID: \"5e1c1800-e0f5-49d6-a8ba-2f6184a6343f\") " pod="openstack/dnsmasq-dns-86db49b7ff-jmqdb" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.812592 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e1c1800-e0f5-49d6-a8ba-2f6184a6343f-config\") pod \"dnsmasq-dns-86db49b7ff-jmqdb\" (UID: \"5e1c1800-e0f5-49d6-a8ba-2f6184a6343f\") " pod="openstack/dnsmasq-dns-86db49b7ff-jmqdb" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.812711 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e1c1800-e0f5-49d6-a8ba-2f6184a6343f-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-jmqdb\" (UID: \"5e1c1800-e0f5-49d6-a8ba-2f6184a6343f\") " pod="openstack/dnsmasq-dns-86db49b7ff-jmqdb" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.812777 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcrvl\" (UniqueName: \"kubernetes.io/projected/5e1c1800-e0f5-49d6-a8ba-2f6184a6343f-kube-api-access-jcrvl\") pod \"dnsmasq-dns-86db49b7ff-jmqdb\" (UID: \"5e1c1800-e0f5-49d6-a8ba-2f6184a6343f\") " pod="openstack/dnsmasq-dns-86db49b7ff-jmqdb" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.902943 4835 generic.go:334] "Generic (PLEG): container finished" podID="a6fdd390-927b-4c4e-98ce-41d763a0f236" containerID="5fb4c1604b4d42671aa7399aa5924ff9467a0d6158ced20b22ce09d0cd92d301" exitCode=0 Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.903012 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-q74sf" event={"ID":"a6fdd390-927b-4c4e-98ce-41d763a0f236","Type":"ContainerDied","Data":"5fb4c1604b4d42671aa7399aa5924ff9467a0d6158ced20b22ce09d0cd92d301"} Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.918867 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e1c1800-e0f5-49d6-a8ba-2f6184a6343f-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-jmqdb\" (UID: \"5e1c1800-e0f5-49d6-a8ba-2f6184a6343f\") " 
pod="openstack/dnsmasq-dns-86db49b7ff-jmqdb" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.918918 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcrvl\" (UniqueName: \"kubernetes.io/projected/5e1c1800-e0f5-49d6-a8ba-2f6184a6343f-kube-api-access-jcrvl\") pod \"dnsmasq-dns-86db49b7ff-jmqdb\" (UID: \"5e1c1800-e0f5-49d6-a8ba-2f6184a6343f\") " pod="openstack/dnsmasq-dns-86db49b7ff-jmqdb" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.919002 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e1c1800-e0f5-49d6-a8ba-2f6184a6343f-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-jmqdb\" (UID: \"5e1c1800-e0f5-49d6-a8ba-2f6184a6343f\") " pod="openstack/dnsmasq-dns-86db49b7ff-jmqdb" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.919037 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e1c1800-e0f5-49d6-a8ba-2f6184a6343f-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-jmqdb\" (UID: \"5e1c1800-e0f5-49d6-a8ba-2f6184a6343f\") " pod="openstack/dnsmasq-dns-86db49b7ff-jmqdb" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.919056 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e1c1800-e0f5-49d6-a8ba-2f6184a6343f-config\") pod \"dnsmasq-dns-86db49b7ff-jmqdb\" (UID: \"5e1c1800-e0f5-49d6-a8ba-2f6184a6343f\") " pod="openstack/dnsmasq-dns-86db49b7ff-jmqdb" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.919867 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e1c1800-e0f5-49d6-a8ba-2f6184a6343f-config\") pod \"dnsmasq-dns-86db49b7ff-jmqdb\" (UID: \"5e1c1800-e0f5-49d6-a8ba-2f6184a6343f\") " pod="openstack/dnsmasq-dns-86db49b7ff-jmqdb" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 
09:46:06.920377 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e1c1800-e0f5-49d6-a8ba-2f6184a6343f-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-jmqdb\" (UID: \"5e1c1800-e0f5-49d6-a8ba-2f6184a6343f\") " pod="openstack/dnsmasq-dns-86db49b7ff-jmqdb" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.920927 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e1c1800-e0f5-49d6-a8ba-2f6184a6343f-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-jmqdb\" (UID: \"5e1c1800-e0f5-49d6-a8ba-2f6184a6343f\") " pod="openstack/dnsmasq-dns-86db49b7ff-jmqdb" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.921202 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e1c1800-e0f5-49d6-a8ba-2f6184a6343f-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-jmqdb\" (UID: \"5e1c1800-e0f5-49d6-a8ba-2f6184a6343f\") " pod="openstack/dnsmasq-dns-86db49b7ff-jmqdb" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.930574 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565226-dh25m" event={"ID":"fb9581e3-765f-47c2-817f-233dd6d968fd","Type":"ContainerDied","Data":"788b821ab11b563f0c188f8f34d294bb6860ff7f11398f4faab2f48bc40d4829"} Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.930618 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="788b821ab11b563f0c188f8f34d294bb6860ff7f11398f4faab2f48bc40d4829" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.930699 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565226-dh25m" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.952045 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcrvl\" (UniqueName: \"kubernetes.io/projected/5e1c1800-e0f5-49d6-a8ba-2f6184a6343f-kube-api-access-jcrvl\") pod \"dnsmasq-dns-86db49b7ff-jmqdb\" (UID: \"5e1c1800-e0f5-49d6-a8ba-2f6184a6343f\") " pod="openstack/dnsmasq-dns-86db49b7ff-jmqdb" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.952439 4835 generic.go:334] "Generic (PLEG): container finished" podID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerID="22d4a3df3b8da0d3a7089e100b70f5eb6f45762cc59f432226a8ced697a6a8d9" exitCode=0 Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.952842 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerDied","Data":"22d4a3df3b8da0d3a7089e100b70f5eb6f45762cc59f432226a8ced697a6a8d9"} Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.952908 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerStarted","Data":"d93f2f0fef5a3fe52d6e4aab02e5290ac85405643bc520caaef82b7b23fd8ee3"} Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.952925 4835 scope.go:117] "RemoveContainer" containerID="0972a6e7a053fa2e9ebcd3097c3b2861379219ae12b073a6c01213db93a74f6f" Mar 19 09:46:06 crc kubenswrapper[4835]: I0319 09:46:06.954006 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:06.997383 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-jmqdb" Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.046039 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.329909 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.331762 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.337978 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-72dnq" Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.340213 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.340389 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.353205 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.356816 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.437094 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ebda93b4-fc03-4e5f-a63b-7945275df157-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ebda93b4-fc03-4e5f-a63b-7945275df157\") " pod="openstack/ovn-northd-0" Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.437149 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ebda93b4-fc03-4e5f-a63b-7945275df157-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ebda93b4-fc03-4e5f-a63b-7945275df157\") " pod="openstack/ovn-northd-0" Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.437177 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebda93b4-fc03-4e5f-a63b-7945275df157-scripts\") pod \"ovn-northd-0\" (UID: \"ebda93b4-fc03-4e5f-a63b-7945275df157\") " pod="openstack/ovn-northd-0" Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.437271 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebda93b4-fc03-4e5f-a63b-7945275df157-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ebda93b4-fc03-4e5f-a63b-7945275df157\") " pod="openstack/ovn-northd-0" Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.437514 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebda93b4-fc03-4e5f-a63b-7945275df157-config\") pod \"ovn-northd-0\" (UID: \"ebda93b4-fc03-4e5f-a63b-7945275df157\") " pod="openstack/ovn-northd-0" Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.437600 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebda93b4-fc03-4e5f-a63b-7945275df157-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"ebda93b4-fc03-4e5f-a63b-7945275df157\") " pod="openstack/ovn-northd-0" Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.437658 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfllj\" (UniqueName: \"kubernetes.io/projected/ebda93b4-fc03-4e5f-a63b-7945275df157-kube-api-access-vfllj\") pod \"ovn-northd-0\" (UID: 
\"ebda93b4-fc03-4e5f-a63b-7945275df157\") " pod="openstack/ovn-northd-0" Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.539615 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebda93b4-fc03-4e5f-a63b-7945275df157-config\") pod \"ovn-northd-0\" (UID: \"ebda93b4-fc03-4e5f-a63b-7945275df157\") " pod="openstack/ovn-northd-0" Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.539665 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebda93b4-fc03-4e5f-a63b-7945275df157-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"ebda93b4-fc03-4e5f-a63b-7945275df157\") " pod="openstack/ovn-northd-0" Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.539686 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfllj\" (UniqueName: \"kubernetes.io/projected/ebda93b4-fc03-4e5f-a63b-7945275df157-kube-api-access-vfllj\") pod \"ovn-northd-0\" (UID: \"ebda93b4-fc03-4e5f-a63b-7945275df157\") " pod="openstack/ovn-northd-0" Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.539768 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ebda93b4-fc03-4e5f-a63b-7945275df157-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ebda93b4-fc03-4e5f-a63b-7945275df157\") " pod="openstack/ovn-northd-0" Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.539799 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebda93b4-fc03-4e5f-a63b-7945275df157-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ebda93b4-fc03-4e5f-a63b-7945275df157\") " pod="openstack/ovn-northd-0" Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.539819 4835 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebda93b4-fc03-4e5f-a63b-7945275df157-scripts\") pod \"ovn-northd-0\" (UID: \"ebda93b4-fc03-4e5f-a63b-7945275df157\") " pod="openstack/ovn-northd-0" Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.539906 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebda93b4-fc03-4e5f-a63b-7945275df157-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ebda93b4-fc03-4e5f-a63b-7945275df157\") " pod="openstack/ovn-northd-0" Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.541255 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebda93b4-fc03-4e5f-a63b-7945275df157-config\") pod \"ovn-northd-0\" (UID: \"ebda93b4-fc03-4e5f-a63b-7945275df157\") " pod="openstack/ovn-northd-0" Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.543121 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebda93b4-fc03-4e5f-a63b-7945275df157-scripts\") pod \"ovn-northd-0\" (UID: \"ebda93b4-fc03-4e5f-a63b-7945275df157\") " pod="openstack/ovn-northd-0" Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.544898 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ebda93b4-fc03-4e5f-a63b-7945275df157-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ebda93b4-fc03-4e5f-a63b-7945275df157\") " pod="openstack/ovn-northd-0" Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.550499 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebda93b4-fc03-4e5f-a63b-7945275df157-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"ebda93b4-fc03-4e5f-a63b-7945275df157\") " pod="openstack/ovn-northd-0" Mar 19 09:46:07 crc kubenswrapper[4835]: 
I0319 09:46:07.551187 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebda93b4-fc03-4e5f-a63b-7945275df157-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ebda93b4-fc03-4e5f-a63b-7945275df157\") " pod="openstack/ovn-northd-0" Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.562498 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebda93b4-fc03-4e5f-a63b-7945275df157-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ebda93b4-fc03-4e5f-a63b-7945275df157\") " pod="openstack/ovn-northd-0" Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.571833 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565220-dvxr7"] Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.576659 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfllj\" (UniqueName: \"kubernetes.io/projected/ebda93b4-fc03-4e5f-a63b-7945275df157-kube-api-access-vfllj\") pod \"ovn-northd-0\" (UID: \"ebda93b4-fc03-4e5f-a63b-7945275df157\") " pod="openstack/ovn-northd-0" Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.591099 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565220-dvxr7"] Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.629555 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-nt4ph"] Mar 19 09:46:07 crc kubenswrapper[4835]: W0319 09:46:07.641202 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podedbbc028_d9c9_4aae_a3a0_76bead2b6738.slice/crio-a278866dcba72a41bd0b2e099e1035d0a075e69ed7bb3d5a18097dcfd15a1147 WatchSource:0}: Error finding container a278866dcba72a41bd0b2e099e1035d0a075e69ed7bb3d5a18097dcfd15a1147: Status 404 returned error can't find 
the container with id a278866dcba72a41bd0b2e099e1035d0a075e69ed7bb3d5a18097dcfd15a1147 Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.644460 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-q74sf" Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.683244 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.743142 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px44t\" (UniqueName: \"kubernetes.io/projected/a6fdd390-927b-4c4e-98ce-41d763a0f236-kube-api-access-px44t\") pod \"a6fdd390-927b-4c4e-98ce-41d763a0f236\" (UID: \"a6fdd390-927b-4c4e-98ce-41d763a0f236\") " Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.743236 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6fdd390-927b-4c4e-98ce-41d763a0f236-dns-svc\") pod \"a6fdd390-927b-4c4e-98ce-41d763a0f236\" (UID: \"a6fdd390-927b-4c4e-98ce-41d763a0f236\") " Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.743357 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6fdd390-927b-4c4e-98ce-41d763a0f236-config\") pod \"a6fdd390-927b-4c4e-98ce-41d763a0f236\" (UID: \"a6fdd390-927b-4c4e-98ce-41d763a0f236\") " Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.747171 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6fdd390-927b-4c4e-98ce-41d763a0f236-kube-api-access-px44t" (OuterVolumeSpecName: "kube-api-access-px44t") pod "a6fdd390-927b-4c4e-98ce-41d763a0f236" (UID: "a6fdd390-927b-4c4e-98ce-41d763a0f236"). InnerVolumeSpecName "kube-api-access-px44t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.822166 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.828627 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6fdd390-927b-4c4e-98ce-41d763a0f236-config" (OuterVolumeSpecName: "config") pod "a6fdd390-927b-4c4e-98ce-41d763a0f236" (UID: "a6fdd390-927b-4c4e-98ce-41d763a0f236"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.845808 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6fdd390-927b-4c4e-98ce-41d763a0f236-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.845850 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px44t\" (UniqueName: \"kubernetes.io/projected/a6fdd390-927b-4c4e-98ce-41d763a0f236-kube-api-access-px44t\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.852849 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6fdd390-927b-4c4e-98ce-41d763a0f236-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a6fdd390-927b-4c4e-98ce-41d763a0f236" (UID: "a6fdd390-927b-4c4e-98ce-41d763a0f236"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:07 crc kubenswrapper[4835]: I0319 09:46:07.948367 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6fdd390-927b-4c4e-98ce-41d763a0f236-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:08 crc kubenswrapper[4835]: I0319 09:46:07.997253 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-pnmhk" Mar 19 09:46:08 crc kubenswrapper[4835]: I0319 09:46:08.021304 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-nt4ph" event={"ID":"edbbc028-d9c9-4aae-a3a0-76bead2b6738","Type":"ContainerStarted","Data":"a278866dcba72a41bd0b2e099e1035d0a075e69ed7bb3d5a18097dcfd15a1147"} Mar 19 09:46:08 crc kubenswrapper[4835]: I0319 09:46:08.032351 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-jmqdb"] Mar 19 09:46:08 crc kubenswrapper[4835]: I0319 09:46:08.059187 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9580a501-4d3f-4720-8cd3-67c7b1d5aef1-dns-svc\") pod \"9580a501-4d3f-4720-8cd3-67c7b1d5aef1\" (UID: \"9580a501-4d3f-4720-8cd3-67c7b1d5aef1\") " Mar 19 09:46:08 crc kubenswrapper[4835]: I0319 09:46:08.059356 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzr72\" (UniqueName: \"kubernetes.io/projected/9580a501-4d3f-4720-8cd3-67c7b1d5aef1-kube-api-access-bzr72\") pod \"9580a501-4d3f-4720-8cd3-67c7b1d5aef1\" (UID: \"9580a501-4d3f-4720-8cd3-67c7b1d5aef1\") " Mar 19 09:46:08 crc kubenswrapper[4835]: I0319 09:46:08.059401 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9580a501-4d3f-4720-8cd3-67c7b1d5aef1-config\") pod \"9580a501-4d3f-4720-8cd3-67c7b1d5aef1\" (UID: \"9580a501-4d3f-4720-8cd3-67c7b1d5aef1\") " Mar 19 09:46:08 crc kubenswrapper[4835]: I0319 09:46:08.081078 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-ks29k"] Mar 19 09:46:08 crc kubenswrapper[4835]: I0319 09:46:08.083601 4835 generic.go:334] "Generic (PLEG): container finished" podID="9580a501-4d3f-4720-8cd3-67c7b1d5aef1" 
containerID="4f80f14b2f2f7212f438776fcc738df9ccbde69b464d7d7640843438fb77775e" exitCode=0 Mar 19 09:46:08 crc kubenswrapper[4835]: I0319 09:46:08.083778 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-pnmhk" event={"ID":"9580a501-4d3f-4720-8cd3-67c7b1d5aef1","Type":"ContainerDied","Data":"4f80f14b2f2f7212f438776fcc738df9ccbde69b464d7d7640843438fb77775e"} Mar 19 09:46:08 crc kubenswrapper[4835]: I0319 09:46:08.083811 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-pnmhk" event={"ID":"9580a501-4d3f-4720-8cd3-67c7b1d5aef1","Type":"ContainerDied","Data":"94b1a266818d0c7c557a90900ae8c09ef4c4fea3b116c20c7066d8c1b2e8b28d"} Mar 19 09:46:08 crc kubenswrapper[4835]: I0319 09:46:08.083849 4835 scope.go:117] "RemoveContainer" containerID="4f80f14b2f2f7212f438776fcc738df9ccbde69b464d7d7640843438fb77775e" Mar 19 09:46:08 crc kubenswrapper[4835]: I0319 09:46:08.084009 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-pnmhk" Mar 19 09:46:08 crc kubenswrapper[4835]: I0319 09:46:08.099106 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9580a501-4d3f-4720-8cd3-67c7b1d5aef1-kube-api-access-bzr72" (OuterVolumeSpecName: "kube-api-access-bzr72") pod "9580a501-4d3f-4720-8cd3-67c7b1d5aef1" (UID: "9580a501-4d3f-4720-8cd3-67c7b1d5aef1"). InnerVolumeSpecName "kube-api-access-bzr72". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:46:08 crc kubenswrapper[4835]: W0319 09:46:08.100132 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e1c1800_e0f5_49d6_a8ba_2f6184a6343f.slice/crio-14aba6c821a61d2ef9dda06e86869e1b037f58840d094a3c5440e8f0cacccfce WatchSource:0}: Error finding container 14aba6c821a61d2ef9dda06e86869e1b037f58840d094a3c5440e8f0cacccfce: Status 404 returned error can't find the container with id 14aba6c821a61d2ef9dda06e86869e1b037f58840d094a3c5440e8f0cacccfce Mar 19 09:46:08 crc kubenswrapper[4835]: I0319 09:46:08.108121 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-q74sf" event={"ID":"a6fdd390-927b-4c4e-98ce-41d763a0f236","Type":"ContainerDied","Data":"a8dc5fcfff263a23e8eeb833bd8a74d2a39999e2e7a246606d12e4aa5140ac54"} Mar 19 09:46:08 crc kubenswrapper[4835]: I0319 09:46:08.108283 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-q74sf" Mar 19 09:46:08 crc kubenswrapper[4835]: W0319 09:46:08.109535 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77055cf6_3221_4567_8c5d_4b21cf5887bb.slice/crio-7bf7ff04f56ecae064576065c19d3d273b9c982b35e95aec8a8a7fe734105d6d WatchSource:0}: Error finding container 7bf7ff04f56ecae064576065c19d3d273b9c982b35e95aec8a8a7fe734105d6d: Status 404 returned error can't find the container with id 7bf7ff04f56ecae064576065c19d3d273b9c982b35e95aec8a8a7fe734105d6d Mar 19 09:46:08 crc kubenswrapper[4835]: I0319 09:46:08.162456 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzr72\" (UniqueName: \"kubernetes.io/projected/9580a501-4d3f-4720-8cd3-67c7b1d5aef1-kube-api-access-bzr72\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:08 crc kubenswrapper[4835]: I0319 09:46:08.185165 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-q74sf"] Mar 19 09:46:08 crc kubenswrapper[4835]: I0319 09:46:08.193404 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9580a501-4d3f-4720-8cd3-67c7b1d5aef1-config" (OuterVolumeSpecName: "config") pod "9580a501-4d3f-4720-8cd3-67c7b1d5aef1" (UID: "9580a501-4d3f-4720-8cd3-67c7b1d5aef1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:08 crc kubenswrapper[4835]: I0319 09:46:08.196963 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9580a501-4d3f-4720-8cd3-67c7b1d5aef1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9580a501-4d3f-4720-8cd3-67c7b1d5aef1" (UID: "9580a501-4d3f-4720-8cd3-67c7b1d5aef1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:08 crc kubenswrapper[4835]: I0319 09:46:08.211238 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-q74sf"] Mar 19 09:46:08 crc kubenswrapper[4835]: I0319 09:46:08.265112 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9580a501-4d3f-4720-8cd3-67c7b1d5aef1-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:08 crc kubenswrapper[4835]: I0319 09:46:08.265433 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9580a501-4d3f-4720-8cd3-67c7b1d5aef1-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:08 crc kubenswrapper[4835]: I0319 09:46:08.423015 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="064a6812-d853-4ec4-9c7e-9de71456672b" path="/var/lib/kubelet/pods/064a6812-d853-4ec4-9c7e-9de71456672b/volumes" Mar 19 09:46:08 crc kubenswrapper[4835]: I0319 09:46:08.424542 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6fdd390-927b-4c4e-98ce-41d763a0f236" path="/var/lib/kubelet/pods/a6fdd390-927b-4c4e-98ce-41d763a0f236/volumes" Mar 19 09:46:08 crc kubenswrapper[4835]: I0319 09:46:08.425341 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 19 09:46:08 crc kubenswrapper[4835]: I0319 09:46:08.460533 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-pnmhk"] Mar 19 09:46:08 crc kubenswrapper[4835]: I0319 09:46:08.476008 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-pnmhk"] Mar 19 09:46:09 crc kubenswrapper[4835]: I0319 09:46:09.119474 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-ks29k" 
event={"ID":"77055cf6-3221-4567-8c5d-4b21cf5887bb","Type":"ContainerStarted","Data":"7bf7ff04f56ecae064576065c19d3d273b9c982b35e95aec8a8a7fe734105d6d"} Mar 19 09:46:09 crc kubenswrapper[4835]: I0319 09:46:09.121774 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-jmqdb" event={"ID":"5e1c1800-e0f5-49d6-a8ba-2f6184a6343f","Type":"ContainerStarted","Data":"14aba6c821a61d2ef9dda06e86869e1b037f58840d094a3c5440e8f0cacccfce"} Mar 19 09:46:09 crc kubenswrapper[4835]: I0319 09:46:09.349336 4835 scope.go:117] "RemoveContainer" containerID="a06be27eb97e43107452edc545ffdc0af3738579e1d1cbd71f631db7cf82e2a8" Mar 19 09:46:09 crc kubenswrapper[4835]: I0319 09:46:09.568421 4835 scope.go:117] "RemoveContainer" containerID="4f80f14b2f2f7212f438776fcc738df9ccbde69b464d7d7640843438fb77775e" Mar 19 09:46:09 crc kubenswrapper[4835]: E0319 09:46:09.569251 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f80f14b2f2f7212f438776fcc738df9ccbde69b464d7d7640843438fb77775e\": container with ID starting with 4f80f14b2f2f7212f438776fcc738df9ccbde69b464d7d7640843438fb77775e not found: ID does not exist" containerID="4f80f14b2f2f7212f438776fcc738df9ccbde69b464d7d7640843438fb77775e" Mar 19 09:46:09 crc kubenswrapper[4835]: I0319 09:46:09.569298 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f80f14b2f2f7212f438776fcc738df9ccbde69b464d7d7640843438fb77775e"} err="failed to get container status \"4f80f14b2f2f7212f438776fcc738df9ccbde69b464d7d7640843438fb77775e\": rpc error: code = NotFound desc = could not find container \"4f80f14b2f2f7212f438776fcc738df9ccbde69b464d7d7640843438fb77775e\": container with ID starting with 4f80f14b2f2f7212f438776fcc738df9ccbde69b464d7d7640843438fb77775e not found: ID does not exist" Mar 19 09:46:09 crc kubenswrapper[4835]: I0319 09:46:09.569333 4835 scope.go:117] "RemoveContainer" 
containerID="a06be27eb97e43107452edc545ffdc0af3738579e1d1cbd71f631db7cf82e2a8" Mar 19 09:46:09 crc kubenswrapper[4835]: E0319 09:46:09.569895 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a06be27eb97e43107452edc545ffdc0af3738579e1d1cbd71f631db7cf82e2a8\": container with ID starting with a06be27eb97e43107452edc545ffdc0af3738579e1d1cbd71f631db7cf82e2a8 not found: ID does not exist" containerID="a06be27eb97e43107452edc545ffdc0af3738579e1d1cbd71f631db7cf82e2a8" Mar 19 09:46:09 crc kubenswrapper[4835]: I0319 09:46:09.569932 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a06be27eb97e43107452edc545ffdc0af3738579e1d1cbd71f631db7cf82e2a8"} err="failed to get container status \"a06be27eb97e43107452edc545ffdc0af3738579e1d1cbd71f631db7cf82e2a8\": rpc error: code = NotFound desc = could not find container \"a06be27eb97e43107452edc545ffdc0af3738579e1d1cbd71f631db7cf82e2a8\": container with ID starting with a06be27eb97e43107452edc545ffdc0af3738579e1d1cbd71f631db7cf82e2a8 not found: ID does not exist" Mar 19 09:46:09 crc kubenswrapper[4835]: I0319 09:46:09.569984 4835 scope.go:117] "RemoveContainer" containerID="5fb4c1604b4d42671aa7399aa5924ff9467a0d6158ced20b22ce09d0cd92d301" Mar 19 09:46:09 crc kubenswrapper[4835]: I0319 09:46:09.602966 4835 scope.go:117] "RemoveContainer" containerID="f0f69660d1b2a878c1f7b6dc4815c72d4fca557926a8b360428e99ca79cf77b0" Mar 19 09:46:10 crc kubenswrapper[4835]: I0319 09:46:10.138884 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ebda93b4-fc03-4e5f-a63b-7945275df157","Type":"ContainerStarted","Data":"67df0b8ea003e0d3fcb8a775291c5fd6bc616f219d6742d1501c2a5e133e8e7a"} Mar 19 09:46:10 crc kubenswrapper[4835]: I0319 09:46:10.413716 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9580a501-4d3f-4720-8cd3-67c7b1d5aef1" 
path="/var/lib/kubelet/pods/9580a501-4d3f-4720-8cd3-67c7b1d5aef1/volumes" Mar 19 09:46:11 crc kubenswrapper[4835]: I0319 09:46:11.048851 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 19 09:46:11 crc kubenswrapper[4835]: I0319 09:46:11.049139 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 19 09:46:11 crc kubenswrapper[4835]: I0319 09:46:11.148968 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-nt4ph" event={"ID":"edbbc028-d9c9-4aae-a3a0-76bead2b6738","Type":"ContainerStarted","Data":"2704f45b0e6b6694c37ea8d374993d08dbd44c6e078f3e015fac73a68be02fcd"} Mar 19 09:46:11 crc kubenswrapper[4835]: I0319 09:46:11.181176 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-nt4ph" podStartSLOduration=5.181157113 podStartE2EDuration="5.181157113s" podCreationTimestamp="2026-03-19 09:46:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:46:11.167239273 +0000 UTC m=+1426.015837860" watchObservedRunningTime="2026-03-19 09:46:11.181157113 +0000 UTC m=+1426.029755700" Mar 19 09:46:12 crc kubenswrapper[4835]: I0319 09:46:12.158114 4835 generic.go:334] "Generic (PLEG): container finished" podID="77055cf6-3221-4567-8c5d-4b21cf5887bb" containerID="3535c178c68bbaeefdf14a46f83b52ff96a3d8bb2736a92a7766c11108359be1" exitCode=0 Mar 19 09:46:12 crc kubenswrapper[4835]: I0319 09:46:12.158184 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-ks29k" event={"ID":"77055cf6-3221-4567-8c5d-4b21cf5887bb","Type":"ContainerDied","Data":"3535c178c68bbaeefdf14a46f83b52ff96a3d8bb2736a92a7766c11108359be1"} Mar 19 09:46:12 crc kubenswrapper[4835]: I0319 09:46:12.169996 4835 generic.go:334] "Generic (PLEG): container finished" 
podID="5e1c1800-e0f5-49d6-a8ba-2f6184a6343f" containerID="b0212c22dc9bf67763219da1272c09859d684bdcdd445d85450add27801c17c5" exitCode=0 Mar 19 09:46:12 crc kubenswrapper[4835]: I0319 09:46:12.170805 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-jmqdb" event={"ID":"5e1c1800-e0f5-49d6-a8ba-2f6184a6343f","Type":"ContainerDied","Data":"b0212c22dc9bf67763219da1272c09859d684bdcdd445d85450add27801c17c5"} Mar 19 09:46:12 crc kubenswrapper[4835]: I0319 09:46:12.463894 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 19 09:46:12 crc kubenswrapper[4835]: I0319 09:46:12.464026 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 19 09:46:12 crc kubenswrapper[4835]: E0319 09:46:12.738651 4835 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.129.56.116:49222->38.129.56.116:37913: read tcp 38.129.56.116:49222->38.129.56.116:37913: read: connection reset by peer Mar 19 09:46:13 crc kubenswrapper[4835]: I0319 09:46:13.527103 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 19 09:46:13 crc kubenswrapper[4835]: I0319 09:46:13.599397 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 19 09:46:13 crc kubenswrapper[4835]: I0319 09:46:13.870401 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-2eab-account-create-update-9n5z8"] Mar 19 09:46:13 crc kubenswrapper[4835]: E0319 09:46:13.870952 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6fdd390-927b-4c4e-98ce-41d763a0f236" containerName="init" Mar 19 09:46:13 crc kubenswrapper[4835]: I0319 09:46:13.870976 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6fdd390-927b-4c4e-98ce-41d763a0f236" containerName="init" Mar 19 09:46:13 crc 
kubenswrapper[4835]: E0319 09:46:13.871009 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9580a501-4d3f-4720-8cd3-67c7b1d5aef1" containerName="init" Mar 19 09:46:13 crc kubenswrapper[4835]: I0319 09:46:13.871018 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9580a501-4d3f-4720-8cd3-67c7b1d5aef1" containerName="init" Mar 19 09:46:13 crc kubenswrapper[4835]: E0319 09:46:13.871045 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6fdd390-927b-4c4e-98ce-41d763a0f236" containerName="dnsmasq-dns" Mar 19 09:46:13 crc kubenswrapper[4835]: I0319 09:46:13.871054 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6fdd390-927b-4c4e-98ce-41d763a0f236" containerName="dnsmasq-dns" Mar 19 09:46:13 crc kubenswrapper[4835]: E0319 09:46:13.871068 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9580a501-4d3f-4720-8cd3-67c7b1d5aef1" containerName="dnsmasq-dns" Mar 19 09:46:13 crc kubenswrapper[4835]: I0319 09:46:13.871076 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9580a501-4d3f-4720-8cd3-67c7b1d5aef1" containerName="dnsmasq-dns" Mar 19 09:46:13 crc kubenswrapper[4835]: I0319 09:46:13.871306 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="9580a501-4d3f-4720-8cd3-67c7b1d5aef1" containerName="dnsmasq-dns" Mar 19 09:46:13 crc kubenswrapper[4835]: I0319 09:46:13.871343 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6fdd390-927b-4c4e-98ce-41d763a0f236" containerName="dnsmasq-dns" Mar 19 09:46:13 crc kubenswrapper[4835]: I0319 09:46:13.872230 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-2eab-account-create-update-9n5z8" Mar 19 09:46:13 crc kubenswrapper[4835]: I0319 09:46:13.877836 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 19 09:46:13 crc kubenswrapper[4835]: I0319 09:46:13.882898 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-2eab-account-create-update-9n5z8"] Mar 19 09:46:13 crc kubenswrapper[4835]: I0319 09:46:13.921057 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-5vwdz"] Mar 19 09:46:13 crc kubenswrapper[4835]: I0319 09:46:13.922379 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-5vwdz" Mar 19 09:46:13 crc kubenswrapper[4835]: I0319 09:46:13.930898 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-5vwdz"] Mar 19 09:46:14 crc kubenswrapper[4835]: I0319 09:46:14.018826 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cca9abbb-a41a-4886-9262-f7bd98c0ce48-operator-scripts\") pod \"keystone-db-create-5vwdz\" (UID: \"cca9abbb-a41a-4886-9262-f7bd98c0ce48\") " pod="openstack/keystone-db-create-5vwdz" Mar 19 09:46:14 crc kubenswrapper[4835]: I0319 09:46:14.018907 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88297ff9-78bd-4491-9acd-e0a1f0660b0f-operator-scripts\") pod \"keystone-2eab-account-create-update-9n5z8\" (UID: \"88297ff9-78bd-4491-9acd-e0a1f0660b0f\") " pod="openstack/keystone-2eab-account-create-update-9n5z8" Mar 19 09:46:14 crc kubenswrapper[4835]: I0319 09:46:14.019089 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw26p\" (UniqueName: 
\"kubernetes.io/projected/88297ff9-78bd-4491-9acd-e0a1f0660b0f-kube-api-access-vw26p\") pod \"keystone-2eab-account-create-update-9n5z8\" (UID: \"88297ff9-78bd-4491-9acd-e0a1f0660b0f\") " pod="openstack/keystone-2eab-account-create-update-9n5z8" Mar 19 09:46:14 crc kubenswrapper[4835]: I0319 09:46:14.019433 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6qqv\" (UniqueName: \"kubernetes.io/projected/cca9abbb-a41a-4886-9262-f7bd98c0ce48-kube-api-access-k6qqv\") pod \"keystone-db-create-5vwdz\" (UID: \"cca9abbb-a41a-4886-9262-f7bd98c0ce48\") " pod="openstack/keystone-db-create-5vwdz" Mar 19 09:46:14 crc kubenswrapper[4835]: I0319 09:46:14.032630 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-rv6q7"] Mar 19 09:46:14 crc kubenswrapper[4835]: I0319 09:46:14.034218 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rv6q7" Mar 19 09:46:14 crc kubenswrapper[4835]: I0319 09:46:14.041629 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-rv6q7"] Mar 19 09:46:14 crc kubenswrapper[4835]: I0319 09:46:14.120735 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88297ff9-78bd-4491-9acd-e0a1f0660b0f-operator-scripts\") pod \"keystone-2eab-account-create-update-9n5z8\" (UID: \"88297ff9-78bd-4491-9acd-e0a1f0660b0f\") " pod="openstack/keystone-2eab-account-create-update-9n5z8" Mar 19 09:46:14 crc kubenswrapper[4835]: I0319 09:46:14.121145 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw26p\" (UniqueName: \"kubernetes.io/projected/88297ff9-78bd-4491-9acd-e0a1f0660b0f-kube-api-access-vw26p\") pod \"keystone-2eab-account-create-update-9n5z8\" (UID: \"88297ff9-78bd-4491-9acd-e0a1f0660b0f\") " 
pod="openstack/keystone-2eab-account-create-update-9n5z8" Mar 19 09:46:14 crc kubenswrapper[4835]: I0319 09:46:14.121237 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fe49504-b4bb-41e3-99c5-ce1e8d36296a-operator-scripts\") pod \"placement-db-create-rv6q7\" (UID: \"6fe49504-b4bb-41e3-99c5-ce1e8d36296a\") " pod="openstack/placement-db-create-rv6q7" Mar 19 09:46:14 crc kubenswrapper[4835]: I0319 09:46:14.121297 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44z6n\" (UniqueName: \"kubernetes.io/projected/6fe49504-b4bb-41e3-99c5-ce1e8d36296a-kube-api-access-44z6n\") pod \"placement-db-create-rv6q7\" (UID: \"6fe49504-b4bb-41e3-99c5-ce1e8d36296a\") " pod="openstack/placement-db-create-rv6q7" Mar 19 09:46:14 crc kubenswrapper[4835]: I0319 09:46:14.121335 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6qqv\" (UniqueName: \"kubernetes.io/projected/cca9abbb-a41a-4886-9262-f7bd98c0ce48-kube-api-access-k6qqv\") pod \"keystone-db-create-5vwdz\" (UID: \"cca9abbb-a41a-4886-9262-f7bd98c0ce48\") " pod="openstack/keystone-db-create-5vwdz" Mar 19 09:46:14 crc kubenswrapper[4835]: I0319 09:46:14.121551 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cca9abbb-a41a-4886-9262-f7bd98c0ce48-operator-scripts\") pod \"keystone-db-create-5vwdz\" (UID: \"cca9abbb-a41a-4886-9262-f7bd98c0ce48\") " pod="openstack/keystone-db-create-5vwdz" Mar 19 09:46:14 crc kubenswrapper[4835]: I0319 09:46:14.122090 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88297ff9-78bd-4491-9acd-e0a1f0660b0f-operator-scripts\") pod \"keystone-2eab-account-create-update-9n5z8\" (UID: 
\"88297ff9-78bd-4491-9acd-e0a1f0660b0f\") " pod="openstack/keystone-2eab-account-create-update-9n5z8" Mar 19 09:46:14 crc kubenswrapper[4835]: I0319 09:46:14.122513 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cca9abbb-a41a-4886-9262-f7bd98c0ce48-operator-scripts\") pod \"keystone-db-create-5vwdz\" (UID: \"cca9abbb-a41a-4886-9262-f7bd98c0ce48\") " pod="openstack/keystone-db-create-5vwdz" Mar 19 09:46:14 crc kubenswrapper[4835]: I0319 09:46:14.138892 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7f31-account-create-update-wdjbq"] Mar 19 09:46:14 crc kubenswrapper[4835]: I0319 09:46:14.141373 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7f31-account-create-update-wdjbq" Mar 19 09:46:14 crc kubenswrapper[4835]: I0319 09:46:14.145155 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 19 09:46:14 crc kubenswrapper[4835]: I0319 09:46:14.152416 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7f31-account-create-update-wdjbq"] Mar 19 09:46:14 crc kubenswrapper[4835]: I0319 09:46:14.153078 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw26p\" (UniqueName: \"kubernetes.io/projected/88297ff9-78bd-4491-9acd-e0a1f0660b0f-kube-api-access-vw26p\") pod \"keystone-2eab-account-create-update-9n5z8\" (UID: \"88297ff9-78bd-4491-9acd-e0a1f0660b0f\") " pod="openstack/keystone-2eab-account-create-update-9n5z8" Mar 19 09:46:14 crc kubenswrapper[4835]: I0319 09:46:14.159430 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6qqv\" (UniqueName: \"kubernetes.io/projected/cca9abbb-a41a-4886-9262-f7bd98c0ce48-kube-api-access-k6qqv\") pod \"keystone-db-create-5vwdz\" (UID: \"cca9abbb-a41a-4886-9262-f7bd98c0ce48\") " 
pod="openstack/keystone-db-create-5vwdz" Mar 19 09:46:14 crc kubenswrapper[4835]: I0319 09:46:14.193475 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2eab-account-create-update-9n5z8" Mar 19 09:46:14 crc kubenswrapper[4835]: I0319 09:46:14.224582 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb2cm\" (UniqueName: \"kubernetes.io/projected/395a12a5-d5f4-4917-8d8e-dd9f06fa1780-kube-api-access-nb2cm\") pod \"placement-7f31-account-create-update-wdjbq\" (UID: \"395a12a5-d5f4-4917-8d8e-dd9f06fa1780\") " pod="openstack/placement-7f31-account-create-update-wdjbq" Mar 19 09:46:14 crc kubenswrapper[4835]: I0319 09:46:14.224773 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fe49504-b4bb-41e3-99c5-ce1e8d36296a-operator-scripts\") pod \"placement-db-create-rv6q7\" (UID: \"6fe49504-b4bb-41e3-99c5-ce1e8d36296a\") " pod="openstack/placement-db-create-rv6q7" Mar 19 09:46:14 crc kubenswrapper[4835]: I0319 09:46:14.224942 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44z6n\" (UniqueName: \"kubernetes.io/projected/6fe49504-b4bb-41e3-99c5-ce1e8d36296a-kube-api-access-44z6n\") pod \"placement-db-create-rv6q7\" (UID: \"6fe49504-b4bb-41e3-99c5-ce1e8d36296a\") " pod="openstack/placement-db-create-rv6q7" Mar 19 09:46:14 crc kubenswrapper[4835]: I0319 09:46:14.225010 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/395a12a5-d5f4-4917-8d8e-dd9f06fa1780-operator-scripts\") pod \"placement-7f31-account-create-update-wdjbq\" (UID: \"395a12a5-d5f4-4917-8d8e-dd9f06fa1780\") " pod="openstack/placement-7f31-account-create-update-wdjbq" Mar 19 09:46:14 crc kubenswrapper[4835]: I0319 09:46:14.226378 4835 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fe49504-b4bb-41e3-99c5-ce1e8d36296a-operator-scripts\") pod \"placement-db-create-rv6q7\" (UID: \"6fe49504-b4bb-41e3-99c5-ce1e8d36296a\") " pod="openstack/placement-db-create-rv6q7" Mar 19 09:46:14 crc kubenswrapper[4835]: I0319 09:46:14.242622 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-5vwdz" Mar 19 09:46:14 crc kubenswrapper[4835]: I0319 09:46:14.244351 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44z6n\" (UniqueName: \"kubernetes.io/projected/6fe49504-b4bb-41e3-99c5-ce1e8d36296a-kube-api-access-44z6n\") pod \"placement-db-create-rv6q7\" (UID: \"6fe49504-b4bb-41e3-99c5-ce1e8d36296a\") " pod="openstack/placement-db-create-rv6q7" Mar 19 09:46:14 crc kubenswrapper[4835]: I0319 09:46:14.327603 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb2cm\" (UniqueName: \"kubernetes.io/projected/395a12a5-d5f4-4917-8d8e-dd9f06fa1780-kube-api-access-nb2cm\") pod \"placement-7f31-account-create-update-wdjbq\" (UID: \"395a12a5-d5f4-4917-8d8e-dd9f06fa1780\") " pod="openstack/placement-7f31-account-create-update-wdjbq" Mar 19 09:46:14 crc kubenswrapper[4835]: I0319 09:46:14.327904 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/395a12a5-d5f4-4917-8d8e-dd9f06fa1780-operator-scripts\") pod \"placement-7f31-account-create-update-wdjbq\" (UID: \"395a12a5-d5f4-4917-8d8e-dd9f06fa1780\") " pod="openstack/placement-7f31-account-create-update-wdjbq" Mar 19 09:46:14 crc kubenswrapper[4835]: I0319 09:46:14.328736 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/395a12a5-d5f4-4917-8d8e-dd9f06fa1780-operator-scripts\") pod 
\"placement-7f31-account-create-update-wdjbq\" (UID: \"395a12a5-d5f4-4917-8d8e-dd9f06fa1780\") " pod="openstack/placement-7f31-account-create-update-wdjbq" Mar 19 09:46:14 crc kubenswrapper[4835]: I0319 09:46:14.342600 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb2cm\" (UniqueName: \"kubernetes.io/projected/395a12a5-d5f4-4917-8d8e-dd9f06fa1780-kube-api-access-nb2cm\") pod \"placement-7f31-account-create-update-wdjbq\" (UID: \"395a12a5-d5f4-4917-8d8e-dd9f06fa1780\") " pod="openstack/placement-7f31-account-create-update-wdjbq" Mar 19 09:46:14 crc kubenswrapper[4835]: I0319 09:46:14.361501 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rv6q7" Mar 19 09:46:14 crc kubenswrapper[4835]: I0319 09:46:14.611557 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7f31-account-create-update-wdjbq" Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.144505 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-6vl25"] Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.146092 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-6vl25" Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.169280 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-6vl25"] Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.180846 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-fac5-account-create-update-hhdpx"] Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.182080 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-fac5-account-create-update-hhdpx" Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.184936 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-db-secret" Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.224910 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-fac5-account-create-update-hhdpx"] Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.293404 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44wz4\" (UniqueName: \"kubernetes.io/projected/5f689499-b2f9-47e8-811a-54bbac418778-kube-api-access-44wz4\") pod \"mysqld-exporter-fac5-account-create-update-hhdpx\" (UID: \"5f689499-b2f9-47e8-811a-54bbac418778\") " pod="openstack/mysqld-exporter-fac5-account-create-update-hhdpx" Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.293498 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a305b14-a9cf-45cf-9ff2-8aa068a9ceeb-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-6vl25\" (UID: \"9a305b14-a9cf-45cf-9ff2-8aa068a9ceeb\") " pod="openstack/mysqld-exporter-openstack-db-create-6vl25" Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.293601 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f689499-b2f9-47e8-811a-54bbac418778-operator-scripts\") pod \"mysqld-exporter-fac5-account-create-update-hhdpx\" (UID: \"5f689499-b2f9-47e8-811a-54bbac418778\") " pod="openstack/mysqld-exporter-fac5-account-create-update-hhdpx" Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.293764 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92qsl\" 
(UniqueName: \"kubernetes.io/projected/9a305b14-a9cf-45cf-9ff2-8aa068a9ceeb-kube-api-access-92qsl\") pod \"mysqld-exporter-openstack-db-create-6vl25\" (UID: \"9a305b14-a9cf-45cf-9ff2-8aa068a9ceeb\") " pod="openstack/mysqld-exporter-openstack-db-create-6vl25" Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.385333 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-ks29k"] Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.398423 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a305b14-a9cf-45cf-9ff2-8aa068a9ceeb-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-6vl25\" (UID: \"9a305b14-a9cf-45cf-9ff2-8aa068a9ceeb\") " pod="openstack/mysqld-exporter-openstack-db-create-6vl25" Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.398576 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f689499-b2f9-47e8-811a-54bbac418778-operator-scripts\") pod \"mysqld-exporter-fac5-account-create-update-hhdpx\" (UID: \"5f689499-b2f9-47e8-811a-54bbac418778\") " pod="openstack/mysqld-exporter-fac5-account-create-update-hhdpx" Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.398716 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92qsl\" (UniqueName: \"kubernetes.io/projected/9a305b14-a9cf-45cf-9ff2-8aa068a9ceeb-kube-api-access-92qsl\") pod \"mysqld-exporter-openstack-db-create-6vl25\" (UID: \"9a305b14-a9cf-45cf-9ff2-8aa068a9ceeb\") " pod="openstack/mysqld-exporter-openstack-db-create-6vl25" Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.398815 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44wz4\" (UniqueName: \"kubernetes.io/projected/5f689499-b2f9-47e8-811a-54bbac418778-kube-api-access-44wz4\") pod 
\"mysqld-exporter-fac5-account-create-update-hhdpx\" (UID: \"5f689499-b2f9-47e8-811a-54bbac418778\") " pod="openstack/mysqld-exporter-fac5-account-create-update-hhdpx" Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.400324 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a305b14-a9cf-45cf-9ff2-8aa068a9ceeb-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-6vl25\" (UID: \"9a305b14-a9cf-45cf-9ff2-8aa068a9ceeb\") " pod="openstack/mysqld-exporter-openstack-db-create-6vl25" Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.400935 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f689499-b2f9-47e8-811a-54bbac418778-operator-scripts\") pod \"mysqld-exporter-fac5-account-create-update-hhdpx\" (UID: \"5f689499-b2f9-47e8-811a-54bbac418778\") " pod="openstack/mysqld-exporter-fac5-account-create-update-hhdpx" Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.419317 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-z5zlv"] Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.421900 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-z5zlv" Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.449027 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44wz4\" (UniqueName: \"kubernetes.io/projected/5f689499-b2f9-47e8-811a-54bbac418778-kube-api-access-44wz4\") pod \"mysqld-exporter-fac5-account-create-update-hhdpx\" (UID: \"5f689499-b2f9-47e8-811a-54bbac418778\") " pod="openstack/mysqld-exporter-fac5-account-create-update-hhdpx" Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.462585 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92qsl\" (UniqueName: \"kubernetes.io/projected/9a305b14-a9cf-45cf-9ff2-8aa068a9ceeb-kube-api-access-92qsl\") pod \"mysqld-exporter-openstack-db-create-6vl25\" (UID: \"9a305b14-a9cf-45cf-9ff2-8aa068a9ceeb\") " pod="openstack/mysqld-exporter-openstack-db-create-6vl25" Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.489352 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-z5zlv"] Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.510220 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-fac5-account-create-update-hhdpx" Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.602817 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8825e955-0683-4e69-ae3d-6bddcc9c92e1-config\") pod \"dnsmasq-dns-698758b865-z5zlv\" (UID: \"8825e955-0683-4e69-ae3d-6bddcc9c92e1\") " pod="openstack/dnsmasq-dns-698758b865-z5zlv" Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.602854 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8825e955-0683-4e69-ae3d-6bddcc9c92e1-dns-svc\") pod \"dnsmasq-dns-698758b865-z5zlv\" (UID: \"8825e955-0683-4e69-ae3d-6bddcc9c92e1\") " pod="openstack/dnsmasq-dns-698758b865-z5zlv" Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.602883 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8825e955-0683-4e69-ae3d-6bddcc9c92e1-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-z5zlv\" (UID: \"8825e955-0683-4e69-ae3d-6bddcc9c92e1\") " pod="openstack/dnsmasq-dns-698758b865-z5zlv" Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.602984 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbmsx\" (UniqueName: \"kubernetes.io/projected/8825e955-0683-4e69-ae3d-6bddcc9c92e1-kube-api-access-jbmsx\") pod \"dnsmasq-dns-698758b865-z5zlv\" (UID: \"8825e955-0683-4e69-ae3d-6bddcc9c92e1\") " pod="openstack/dnsmasq-dns-698758b865-z5zlv" Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.603123 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8825e955-0683-4e69-ae3d-6bddcc9c92e1-ovsdbserver-sb\") pod 
\"dnsmasq-dns-698758b865-z5zlv\" (UID: \"8825e955-0683-4e69-ae3d-6bddcc9c92e1\") " pod="openstack/dnsmasq-dns-698758b865-z5zlv" Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.704947 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8825e955-0683-4e69-ae3d-6bddcc9c92e1-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-z5zlv\" (UID: \"8825e955-0683-4e69-ae3d-6bddcc9c92e1\") " pod="openstack/dnsmasq-dns-698758b865-z5zlv" Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.705273 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbmsx\" (UniqueName: \"kubernetes.io/projected/8825e955-0683-4e69-ae3d-6bddcc9c92e1-kube-api-access-jbmsx\") pod \"dnsmasq-dns-698758b865-z5zlv\" (UID: \"8825e955-0683-4e69-ae3d-6bddcc9c92e1\") " pod="openstack/dnsmasq-dns-698758b865-z5zlv" Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.705378 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8825e955-0683-4e69-ae3d-6bddcc9c92e1-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-z5zlv\" (UID: \"8825e955-0683-4e69-ae3d-6bddcc9c92e1\") " pod="openstack/dnsmasq-dns-698758b865-z5zlv" Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.705459 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8825e955-0683-4e69-ae3d-6bddcc9c92e1-config\") pod \"dnsmasq-dns-698758b865-z5zlv\" (UID: \"8825e955-0683-4e69-ae3d-6bddcc9c92e1\") " pod="openstack/dnsmasq-dns-698758b865-z5zlv" Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.705482 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8825e955-0683-4e69-ae3d-6bddcc9c92e1-dns-svc\") pod \"dnsmasq-dns-698758b865-z5zlv\" (UID: 
\"8825e955-0683-4e69-ae3d-6bddcc9c92e1\") " pod="openstack/dnsmasq-dns-698758b865-z5zlv" Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.706102 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8825e955-0683-4e69-ae3d-6bddcc9c92e1-dns-svc\") pod \"dnsmasq-dns-698758b865-z5zlv\" (UID: \"8825e955-0683-4e69-ae3d-6bddcc9c92e1\") " pod="openstack/dnsmasq-dns-698758b865-z5zlv" Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.706274 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8825e955-0683-4e69-ae3d-6bddcc9c92e1-config\") pod \"dnsmasq-dns-698758b865-z5zlv\" (UID: \"8825e955-0683-4e69-ae3d-6bddcc9c92e1\") " pod="openstack/dnsmasq-dns-698758b865-z5zlv" Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.706351 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8825e955-0683-4e69-ae3d-6bddcc9c92e1-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-z5zlv\" (UID: \"8825e955-0683-4e69-ae3d-6bddcc9c92e1\") " pod="openstack/dnsmasq-dns-698758b865-z5zlv" Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.706450 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8825e955-0683-4e69-ae3d-6bddcc9c92e1-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-z5zlv\" (UID: \"8825e955-0683-4e69-ae3d-6bddcc9c92e1\") " pod="openstack/dnsmasq-dns-698758b865-z5zlv" Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.729597 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbmsx\" (UniqueName: \"kubernetes.io/projected/8825e955-0683-4e69-ae3d-6bddcc9c92e1-kube-api-access-jbmsx\") pod \"dnsmasq-dns-698758b865-z5zlv\" (UID: \"8825e955-0683-4e69-ae3d-6bddcc9c92e1\") " pod="openstack/dnsmasq-dns-698758b865-z5zlv" Mar 19 09:46:15 crc 
kubenswrapper[4835]: I0319 09:46:15.762945 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-6vl25" Mar 19 09:46:15 crc kubenswrapper[4835]: I0319 09:46:15.871888 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-z5zlv" Mar 19 09:46:16 crc kubenswrapper[4835]: I0319 09:46:16.013442 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 19 09:46:16 crc kubenswrapper[4835]: I0319 09:46:16.113992 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 19 09:46:16 crc kubenswrapper[4835]: I0319 09:46:16.541729 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 19 09:46:16 crc kubenswrapper[4835]: I0319 09:46:16.550169 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 19 09:46:16 crc kubenswrapper[4835]: I0319 09:46:16.558194 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 19 09:46:16 crc kubenswrapper[4835]: I0319 09:46:16.558381 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 19 09:46:16 crc kubenswrapper[4835]: I0319 09:46:16.558504 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-s8dc5" Mar 19 09:46:16 crc kubenswrapper[4835]: I0319 09:46:16.558624 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 19 09:46:16 crc kubenswrapper[4835]: I0319 09:46:16.606243 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 19 09:46:16 crc kubenswrapper[4835]: I0319 09:46:16.626140 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99792633-55f8-4a37-b7d8-ae770406c69d-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"99792633-55f8-4a37-b7d8-ae770406c69d\") " pod="openstack/swift-storage-0" Mar 19 09:46:16 crc kubenswrapper[4835]: I0319 09:46:16.626209 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/99792633-55f8-4a37-b7d8-ae770406c69d-cache\") pod \"swift-storage-0\" (UID: \"99792633-55f8-4a37-b7d8-ae770406c69d\") " pod="openstack/swift-storage-0" Mar 19 09:46:16 crc kubenswrapper[4835]: I0319 09:46:16.626231 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/99792633-55f8-4a37-b7d8-ae770406c69d-lock\") pod \"swift-storage-0\" (UID: \"99792633-55f8-4a37-b7d8-ae770406c69d\") " pod="openstack/swift-storage-0" Mar 19 09:46:16 crc kubenswrapper[4835]: I0319 09:46:16.626423 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/99792633-55f8-4a37-b7d8-ae770406c69d-etc-swift\") pod \"swift-storage-0\" (UID: \"99792633-55f8-4a37-b7d8-ae770406c69d\") " pod="openstack/swift-storage-0" Mar 19 09:46:16 crc kubenswrapper[4835]: I0319 09:46:16.626556 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkf4n\" (UniqueName: \"kubernetes.io/projected/99792633-55f8-4a37-b7d8-ae770406c69d-kube-api-access-lkf4n\") pod \"swift-storage-0\" (UID: \"99792633-55f8-4a37-b7d8-ae770406c69d\") " pod="openstack/swift-storage-0" Mar 19 09:46:16 crc kubenswrapper[4835]: I0319 09:46:16.626610 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d3179a44-e8c0-42c5-8858-43699d13aed4\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d3179a44-e8c0-42c5-8858-43699d13aed4\") pod \"swift-storage-0\" (UID: \"99792633-55f8-4a37-b7d8-ae770406c69d\") " pod="openstack/swift-storage-0" Mar 19 09:46:16 crc kubenswrapper[4835]: I0319 09:46:16.728677 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/99792633-55f8-4a37-b7d8-ae770406c69d-cache\") pod \"swift-storage-0\" (UID: \"99792633-55f8-4a37-b7d8-ae770406c69d\") " pod="openstack/swift-storage-0" Mar 19 09:46:16 crc kubenswrapper[4835]: I0319 09:46:16.728714 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/99792633-55f8-4a37-b7d8-ae770406c69d-lock\") pod \"swift-storage-0\" (UID: \"99792633-55f8-4a37-b7d8-ae770406c69d\") " pod="openstack/swift-storage-0" Mar 19 09:46:16 crc kubenswrapper[4835]: I0319 09:46:16.728782 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/99792633-55f8-4a37-b7d8-ae770406c69d-etc-swift\") pod \"swift-storage-0\" (UID: \"99792633-55f8-4a37-b7d8-ae770406c69d\") " pod="openstack/swift-storage-0" Mar 19 09:46:16 crc kubenswrapper[4835]: I0319 09:46:16.728822 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkf4n\" (UniqueName: \"kubernetes.io/projected/99792633-55f8-4a37-b7d8-ae770406c69d-kube-api-access-lkf4n\") pod \"swift-storage-0\" (UID: \"99792633-55f8-4a37-b7d8-ae770406c69d\") " pod="openstack/swift-storage-0" Mar 19 09:46:16 crc kubenswrapper[4835]: I0319 09:46:16.728847 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d3179a44-e8c0-42c5-8858-43699d13aed4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d3179a44-e8c0-42c5-8858-43699d13aed4\") pod \"swift-storage-0\" (UID: \"99792633-55f8-4a37-b7d8-ae770406c69d\") " 
pod="openstack/swift-storage-0" Mar 19 09:46:16 crc kubenswrapper[4835]: I0319 09:46:16.728941 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99792633-55f8-4a37-b7d8-ae770406c69d-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"99792633-55f8-4a37-b7d8-ae770406c69d\") " pod="openstack/swift-storage-0" Mar 19 09:46:16 crc kubenswrapper[4835]: E0319 09:46:16.728961 4835 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 09:46:16 crc kubenswrapper[4835]: E0319 09:46:16.728993 4835 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 09:46:16 crc kubenswrapper[4835]: E0319 09:46:16.729043 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/99792633-55f8-4a37-b7d8-ae770406c69d-etc-swift podName:99792633-55f8-4a37-b7d8-ae770406c69d nodeName:}" failed. No retries permitted until 2026-03-19 09:46:17.229024157 +0000 UTC m=+1432.077622744 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/99792633-55f8-4a37-b7d8-ae770406c69d-etc-swift") pod "swift-storage-0" (UID: "99792633-55f8-4a37-b7d8-ae770406c69d") : configmap "swift-ring-files" not found Mar 19 09:46:16 crc kubenswrapper[4835]: I0319 09:46:16.729358 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/99792633-55f8-4a37-b7d8-ae770406c69d-lock\") pod \"swift-storage-0\" (UID: \"99792633-55f8-4a37-b7d8-ae770406c69d\") " pod="openstack/swift-storage-0" Mar 19 09:46:16 crc kubenswrapper[4835]: I0319 09:46:16.729564 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/99792633-55f8-4a37-b7d8-ae770406c69d-cache\") pod \"swift-storage-0\" (UID: \"99792633-55f8-4a37-b7d8-ae770406c69d\") " pod="openstack/swift-storage-0" Mar 19 09:46:16 crc kubenswrapper[4835]: I0319 09:46:16.732784 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99792633-55f8-4a37-b7d8-ae770406c69d-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"99792633-55f8-4a37-b7d8-ae770406c69d\") " pod="openstack/swift-storage-0" Mar 19 09:46:16 crc kubenswrapper[4835]: I0319 09:46:16.733653 4835 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 09:46:16 crc kubenswrapper[4835]: I0319 09:46:16.733695 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d3179a44-e8c0-42c5-8858-43699d13aed4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d3179a44-e8c0-42c5-8858-43699d13aed4\") pod \"swift-storage-0\" (UID: \"99792633-55f8-4a37-b7d8-ae770406c69d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0d9200b61fecf3bce3549b003e39161d0aed5e10812d3d1c5414838f77c006ec/globalmount\"" pod="openstack/swift-storage-0" Mar 19 09:46:16 crc kubenswrapper[4835]: I0319 09:46:16.754752 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkf4n\" (UniqueName: \"kubernetes.io/projected/99792633-55f8-4a37-b7d8-ae770406c69d-kube-api-access-lkf4n\") pod \"swift-storage-0\" (UID: \"99792633-55f8-4a37-b7d8-ae770406c69d\") " pod="openstack/swift-storage-0" Mar 19 09:46:16 crc kubenswrapper[4835]: I0319 09:46:16.772630 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d3179a44-e8c0-42c5-8858-43699d13aed4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d3179a44-e8c0-42c5-8858-43699d13aed4\") pod \"swift-storage-0\" (UID: \"99792633-55f8-4a37-b7d8-ae770406c69d\") " pod="openstack/swift-storage-0" Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.190818 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-j4wd9"] Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.192837 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-j4wd9" Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.195565 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.195845 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.196610 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.210444 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7f31-account-create-update-wdjbq"] Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.230560 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-z5zlv"] Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.239105 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/99792633-55f8-4a37-b7d8-ae770406c69d-etc-swift\") pod \"swift-storage-0\" (UID: \"99792633-55f8-4a37-b7d8-ae770406c69d\") " pod="openstack/swift-storage-0" Mar 19 09:46:17 crc kubenswrapper[4835]: E0319 09:46:17.239321 4835 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 09:46:17 crc kubenswrapper[4835]: E0319 09:46:17.239352 4835 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 09:46:17 crc kubenswrapper[4835]: E0319 09:46:17.239412 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/99792633-55f8-4a37-b7d8-ae770406c69d-etc-swift podName:99792633-55f8-4a37-b7d8-ae770406c69d nodeName:}" failed. 
No retries permitted until 2026-03-19 09:46:18.239394384 +0000 UTC m=+1433.087992971 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/99792633-55f8-4a37-b7d8-ae770406c69d-etc-swift") pod "swift-storage-0" (UID: "99792633-55f8-4a37-b7d8-ae770406c69d") : configmap "swift-ring-files" not found Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.244569 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-j4wd9"] Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.320491 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-jmqdb" event={"ID":"5e1c1800-e0f5-49d6-a8ba-2f6184a6343f","Type":"ContainerStarted","Data":"ff93c2870d7dfa38465801a7407b46d9358ad36934f6833fd3b41cc8dda1d6ab"} Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.321337 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-jmqdb" Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.323018 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-z5zlv" event={"ID":"8825e955-0683-4e69-ae3d-6bddcc9c92e1","Type":"ContainerStarted","Data":"d31d56e413bda3809e531f94b4c607790a8ae6880fd1b1425613eea0a35ea742"} Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.325602 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f31-account-create-update-wdjbq" event={"ID":"395a12a5-d5f4-4917-8d8e-dd9f06fa1780","Type":"ContainerStarted","Data":"cd5691204548e85e9d706cfaf60369dd0d8f7ec3186357fd9516f8bee201c96d"} Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.327592 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ebda93b4-fc03-4e5f-a63b-7945275df157","Type":"ContainerStarted","Data":"707fa4b2e3e289ad3d176e5f4fa47af1e4550d06a2ec08c32c3f39ca5fda1bee"} Mar 19 09:46:17 crc kubenswrapper[4835]: 
I0319 09:46:17.342383 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/53929f33-eb5f-41e4-8845-d6be1087df58-ring-data-devices\") pod \"swift-ring-rebalance-j4wd9\" (UID: \"53929f33-eb5f-41e4-8845-d6be1087df58\") " pod="openstack/swift-ring-rebalance-j4wd9" Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.342501 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/53929f33-eb5f-41e4-8845-d6be1087df58-etc-swift\") pod \"swift-ring-rebalance-j4wd9\" (UID: \"53929f33-eb5f-41e4-8845-d6be1087df58\") " pod="openstack/swift-ring-rebalance-j4wd9" Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.342562 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/53929f33-eb5f-41e4-8845-d6be1087df58-swiftconf\") pod \"swift-ring-rebalance-j4wd9\" (UID: \"53929f33-eb5f-41e4-8845-d6be1087df58\") " pod="openstack/swift-ring-rebalance-j4wd9" Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.342634 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/53929f33-eb5f-41e4-8845-d6be1087df58-dispersionconf\") pod \"swift-ring-rebalance-j4wd9\" (UID: \"53929f33-eb5f-41e4-8845-d6be1087df58\") " pod="openstack/swift-ring-rebalance-j4wd9" Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.342714 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvs6s\" (UniqueName: \"kubernetes.io/projected/53929f33-eb5f-41e4-8845-d6be1087df58-kube-api-access-tvs6s\") pod \"swift-ring-rebalance-j4wd9\" (UID: \"53929f33-eb5f-41e4-8845-d6be1087df58\") " pod="openstack/swift-ring-rebalance-j4wd9" Mar 19 09:46:17 crc 
kubenswrapper[4835]: I0319 09:46:17.342846 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53929f33-eb5f-41e4-8845-d6be1087df58-combined-ca-bundle\") pod \"swift-ring-rebalance-j4wd9\" (UID: \"53929f33-eb5f-41e4-8845-d6be1087df58\") " pod="openstack/swift-ring-rebalance-j4wd9" Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.342880 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53929f33-eb5f-41e4-8845-d6be1087df58-scripts\") pod \"swift-ring-rebalance-j4wd9\" (UID: \"53929f33-eb5f-41e4-8845-d6be1087df58\") " pod="openstack/swift-ring-rebalance-j4wd9" Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.350385 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-ks29k" event={"ID":"77055cf6-3221-4567-8c5d-4b21cf5887bb","Type":"ContainerStarted","Data":"f4d29ae48667ad10e30a7260f35031357aab9dc4b0dab71dac493d8efafd5e82"} Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.350417 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-ks29k" podUID="77055cf6-3221-4567-8c5d-4b21cf5887bb" containerName="dnsmasq-dns" containerID="cri-o://f4d29ae48667ad10e30a7260f35031357aab9dc4b0dab71dac493d8efafd5e82" gracePeriod=10 Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.350613 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-ks29k" Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.357889 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0","Type":"ContainerStarted","Data":"41800ec9c5c6594a3915cc9e48bc8c428444659c43066b4cec9fa243b542a988"} Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 
09:46:17.363550 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-jmqdb" podStartSLOduration=11.363514848 podStartE2EDuration="11.363514848s" podCreationTimestamp="2026-03-19 09:46:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:46:17.339844088 +0000 UTC m=+1432.188442675" watchObservedRunningTime="2026-03-19 09:46:17.363514848 +0000 UTC m=+1432.212113435" Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.390374 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-ks29k" podStartSLOduration=11.390351762 podStartE2EDuration="11.390351762s" podCreationTimestamp="2026-03-19 09:46:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:46:17.375830356 +0000 UTC m=+1432.224428943" watchObservedRunningTime="2026-03-19 09:46:17.390351762 +0000 UTC m=+1432.238950349" Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.445808 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/53929f33-eb5f-41e4-8845-d6be1087df58-ring-data-devices\") pod \"swift-ring-rebalance-j4wd9\" (UID: \"53929f33-eb5f-41e4-8845-d6be1087df58\") " pod="openstack/swift-ring-rebalance-j4wd9" Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.445991 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/53929f33-eb5f-41e4-8845-d6be1087df58-etc-swift\") pod \"swift-ring-rebalance-j4wd9\" (UID: \"53929f33-eb5f-41e4-8845-d6be1087df58\") " pod="openstack/swift-ring-rebalance-j4wd9" Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.446064 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/53929f33-eb5f-41e4-8845-d6be1087df58-swiftconf\") pod \"swift-ring-rebalance-j4wd9\" (UID: \"53929f33-eb5f-41e4-8845-d6be1087df58\") " pod="openstack/swift-ring-rebalance-j4wd9" Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.446131 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/53929f33-eb5f-41e4-8845-d6be1087df58-dispersionconf\") pod \"swift-ring-rebalance-j4wd9\" (UID: \"53929f33-eb5f-41e4-8845-d6be1087df58\") " pod="openstack/swift-ring-rebalance-j4wd9" Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.446212 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvs6s\" (UniqueName: \"kubernetes.io/projected/53929f33-eb5f-41e4-8845-d6be1087df58-kube-api-access-tvs6s\") pod \"swift-ring-rebalance-j4wd9\" (UID: \"53929f33-eb5f-41e4-8845-d6be1087df58\") " pod="openstack/swift-ring-rebalance-j4wd9" Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.446929 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53929f33-eb5f-41e4-8845-d6be1087df58-combined-ca-bundle\") pod \"swift-ring-rebalance-j4wd9\" (UID: \"53929f33-eb5f-41e4-8845-d6be1087df58\") " pod="openstack/swift-ring-rebalance-j4wd9" Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.449281 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53929f33-eb5f-41e4-8845-d6be1087df58-scripts\") pod \"swift-ring-rebalance-j4wd9\" (UID: \"53929f33-eb5f-41e4-8845-d6be1087df58\") " pod="openstack/swift-ring-rebalance-j4wd9" Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.452316 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/53929f33-eb5f-41e4-8845-d6be1087df58-etc-swift\") pod \"swift-ring-rebalance-j4wd9\" (UID: \"53929f33-eb5f-41e4-8845-d6be1087df58\") " pod="openstack/swift-ring-rebalance-j4wd9" Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.454316 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53929f33-eb5f-41e4-8845-d6be1087df58-scripts\") pod \"swift-ring-rebalance-j4wd9\" (UID: \"53929f33-eb5f-41e4-8845-d6be1087df58\") " pod="openstack/swift-ring-rebalance-j4wd9" Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.454941 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/53929f33-eb5f-41e4-8845-d6be1087df58-ring-data-devices\") pod \"swift-ring-rebalance-j4wd9\" (UID: \"53929f33-eb5f-41e4-8845-d6be1087df58\") " pod="openstack/swift-ring-rebalance-j4wd9" Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.457900 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/53929f33-eb5f-41e4-8845-d6be1087df58-dispersionconf\") pod \"swift-ring-rebalance-j4wd9\" (UID: \"53929f33-eb5f-41e4-8845-d6be1087df58\") " pod="openstack/swift-ring-rebalance-j4wd9" Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.459016 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/53929f33-eb5f-41e4-8845-d6be1087df58-swiftconf\") pod \"swift-ring-rebalance-j4wd9\" (UID: \"53929f33-eb5f-41e4-8845-d6be1087df58\") " pod="openstack/swift-ring-rebalance-j4wd9" Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.485810 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvs6s\" (UniqueName: \"kubernetes.io/projected/53929f33-eb5f-41e4-8845-d6be1087df58-kube-api-access-tvs6s\") pod \"swift-ring-rebalance-j4wd9\" (UID: 
\"53929f33-eb5f-41e4-8845-d6be1087df58\") " pod="openstack/swift-ring-rebalance-j4wd9" Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.487675 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53929f33-eb5f-41e4-8845-d6be1087df58-combined-ca-bundle\") pod \"swift-ring-rebalance-j4wd9\" (UID: \"53929f33-eb5f-41e4-8845-d6be1087df58\") " pod="openstack/swift-ring-rebalance-j4wd9" Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.574484 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-j4wd9" Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.600157 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-5vwdz"] Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.631943 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-rv6q7"] Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.670547 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-fac5-account-create-update-hhdpx"] Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.683690 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-2eab-account-create-update-9n5z8"] Mar 19 09:46:17 crc kubenswrapper[4835]: I0319 09:46:17.694353 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-6vl25"] Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.042473 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-2hdqc"] Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.044432 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2hdqc" Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.067380 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-ks29k" Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.082532 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-2hdqc"] Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.165312 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77055cf6-3221-4567-8c5d-4b21cf5887bb-config\") pod \"77055cf6-3221-4567-8c5d-4b21cf5887bb\" (UID: \"77055cf6-3221-4567-8c5d-4b21cf5887bb\") " Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.165466 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzntn\" (UniqueName: \"kubernetes.io/projected/77055cf6-3221-4567-8c5d-4b21cf5887bb-kube-api-access-xzntn\") pod \"77055cf6-3221-4567-8c5d-4b21cf5887bb\" (UID: \"77055cf6-3221-4567-8c5d-4b21cf5887bb\") " Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.165484 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77055cf6-3221-4567-8c5d-4b21cf5887bb-ovsdbserver-nb\") pod \"77055cf6-3221-4567-8c5d-4b21cf5887bb\" (UID: \"77055cf6-3221-4567-8c5d-4b21cf5887bb\") " Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.165823 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77055cf6-3221-4567-8c5d-4b21cf5887bb-dns-svc\") pod \"77055cf6-3221-4567-8c5d-4b21cf5887bb\" (UID: \"77055cf6-3221-4567-8c5d-4b21cf5887bb\") " Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.166219 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmqrf\" (UniqueName: \"kubernetes.io/projected/ab86bcc8-164f-4f89-9d47-e52d3520ea21-kube-api-access-xmqrf\") pod \"glance-db-create-2hdqc\" (UID: \"ab86bcc8-164f-4f89-9d47-e52d3520ea21\") " 
pod="openstack/glance-db-create-2hdqc" Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.166261 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab86bcc8-164f-4f89-9d47-e52d3520ea21-operator-scripts\") pod \"glance-db-create-2hdqc\" (UID: \"ab86bcc8-164f-4f89-9d47-e52d3520ea21\") " pod="openstack/glance-db-create-2hdqc" Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.235067 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77055cf6-3221-4567-8c5d-4b21cf5887bb-kube-api-access-xzntn" (OuterVolumeSpecName: "kube-api-access-xzntn") pod "77055cf6-3221-4567-8c5d-4b21cf5887bb" (UID: "77055cf6-3221-4567-8c5d-4b21cf5887bb"). InnerVolumeSpecName "kube-api-access-xzntn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.245563 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-3383-account-create-update-cv6wv"] Mar 19 09:46:18 crc kubenswrapper[4835]: E0319 09:46:18.246077 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77055cf6-3221-4567-8c5d-4b21cf5887bb" containerName="init" Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.246093 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="77055cf6-3221-4567-8c5d-4b21cf5887bb" containerName="init" Mar 19 09:46:18 crc kubenswrapper[4835]: E0319 09:46:18.246110 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77055cf6-3221-4567-8c5d-4b21cf5887bb" containerName="dnsmasq-dns" Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.246116 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="77055cf6-3221-4567-8c5d-4b21cf5887bb" containerName="dnsmasq-dns" Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.246342 4835 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="77055cf6-3221-4567-8c5d-4b21cf5887bb" containerName="dnsmasq-dns" Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.247266 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3383-account-create-update-cv6wv" Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.251028 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.262967 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-j4wd9"] Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.269258 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/99792633-55f8-4a37-b7d8-ae770406c69d-etc-swift\") pod \"swift-storage-0\" (UID: \"99792633-55f8-4a37-b7d8-ae770406c69d\") " pod="openstack/swift-storage-0" Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.269317 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmqrf\" (UniqueName: \"kubernetes.io/projected/ab86bcc8-164f-4f89-9d47-e52d3520ea21-kube-api-access-xmqrf\") pod \"glance-db-create-2hdqc\" (UID: \"ab86bcc8-164f-4f89-9d47-e52d3520ea21\") " pod="openstack/glance-db-create-2hdqc" Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.269360 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab86bcc8-164f-4f89-9d47-e52d3520ea21-operator-scripts\") pod \"glance-db-create-2hdqc\" (UID: \"ab86bcc8-164f-4f89-9d47-e52d3520ea21\") " pod="openstack/glance-db-create-2hdqc" Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.269521 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzntn\" (UniqueName: \"kubernetes.io/projected/77055cf6-3221-4567-8c5d-4b21cf5887bb-kube-api-access-xzntn\") on node \"crc\" DevicePath \"\"" 
Mar 19 09:46:18 crc kubenswrapper[4835]: E0319 09:46:18.271338 4835 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 09:46:18 crc kubenswrapper[4835]: E0319 09:46:18.271360 4835 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 09:46:18 crc kubenswrapper[4835]: E0319 09:46:18.271423 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/99792633-55f8-4a37-b7d8-ae770406c69d-etc-swift podName:99792633-55f8-4a37-b7d8-ae770406c69d nodeName:}" failed. No retries permitted until 2026-03-19 09:46:20.271407588 +0000 UTC m=+1435.120006175 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/99792633-55f8-4a37-b7d8-ae770406c69d-etc-swift") pod "swift-storage-0" (UID: "99792633-55f8-4a37-b7d8-ae770406c69d") : configmap "swift-ring-files" not found Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.286597 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3383-account-create-update-cv6wv"] Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.290459 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab86bcc8-164f-4f89-9d47-e52d3520ea21-operator-scripts\") pod \"glance-db-create-2hdqc\" (UID: \"ab86bcc8-164f-4f89-9d47-e52d3520ea21\") " pod="openstack/glance-db-create-2hdqc" Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.329526 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmqrf\" (UniqueName: \"kubernetes.io/projected/ab86bcc8-164f-4f89-9d47-e52d3520ea21-kube-api-access-xmqrf\") pod \"glance-db-create-2hdqc\" (UID: \"ab86bcc8-164f-4f89-9d47-e52d3520ea21\") " pod="openstack/glance-db-create-2hdqc" Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 
09:46:18.371013 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4w8s\" (UniqueName: \"kubernetes.io/projected/f6615456-cf93-49d5-b69c-a83dcbab99da-kube-api-access-c4w8s\") pod \"glance-3383-account-create-update-cv6wv\" (UID: \"f6615456-cf93-49d5-b69c-a83dcbab99da\") " pod="openstack/glance-3383-account-create-update-cv6wv" Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.371089 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6615456-cf93-49d5-b69c-a83dcbab99da-operator-scripts\") pod \"glance-3383-account-create-update-cv6wv\" (UID: \"f6615456-cf93-49d5-b69c-a83dcbab99da\") " pod="openstack/glance-3383-account-create-update-cv6wv" Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.371213 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2eab-account-create-update-9n5z8" event={"ID":"88297ff9-78bd-4491-9acd-e0a1f0660b0f","Type":"ContainerStarted","Data":"d34528e835b8627d814da5f37f7910078cedc1a4fa0c16371e914e4a61818f39"} Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.371243 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2eab-account-create-update-9n5z8" event={"ID":"88297ff9-78bd-4491-9acd-e0a1f0660b0f","Type":"ContainerStarted","Data":"4f119d7730c37318269aae70dfa3daf5652d23f9312dfecade5b37a524e7e3bc"} Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.374087 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ebda93b4-fc03-4e5f-a63b-7945275df157","Type":"ContainerStarted","Data":"b5f458cc08bedd0f6428bf131a7d64bed815c3bba687ffc5eded51ccddd5be71"} Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.374731 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 
09:46:18.376207 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-6vl25" event={"ID":"9a305b14-a9cf-45cf-9ff2-8aa068a9ceeb","Type":"ContainerStarted","Data":"9f097192bf1ea52735234add562451c6081b3e254b1c08a825fa920404d8ece8"} Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.378190 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-5vwdz" event={"ID":"cca9abbb-a41a-4886-9262-f7bd98c0ce48","Type":"ContainerStarted","Data":"cce1f0235a06bbfc668811bf6030cae35aa35d26412de5cf1983056727eeb6a7"} Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.379873 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-fac5-account-create-update-hhdpx" event={"ID":"5f689499-b2f9-47e8-811a-54bbac418778","Type":"ContainerStarted","Data":"8fcba6962510385517d19e91e97d69d2ed52f1d073e486913048e80751a7a169"} Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.381196 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rv6q7" event={"ID":"6fe49504-b4bb-41e3-99c5-ce1e8d36296a","Type":"ContainerStarted","Data":"c2694eff6dcaa52ef493f35b06e371ccbac09ea8baf8f6221ebf21a17e10ed86"} Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.385033 4835 generic.go:334] "Generic (PLEG): container finished" podID="77055cf6-3221-4567-8c5d-4b21cf5887bb" containerID="f4d29ae48667ad10e30a7260f35031357aab9dc4b0dab71dac493d8efafd5e82" exitCode=0 Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.385123 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-ks29k" Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.385122 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-ks29k" event={"ID":"77055cf6-3221-4567-8c5d-4b21cf5887bb","Type":"ContainerDied","Data":"f4d29ae48667ad10e30a7260f35031357aab9dc4b0dab71dac493d8efafd5e82"} Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.385200 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-ks29k" event={"ID":"77055cf6-3221-4567-8c5d-4b21cf5887bb","Type":"ContainerDied","Data":"7bf7ff04f56ecae064576065c19d3d273b9c982b35e95aec8a8a7fe734105d6d"} Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.385322 4835 scope.go:117] "RemoveContainer" containerID="f4d29ae48667ad10e30a7260f35031357aab9dc4b0dab71dac493d8efafd5e82" Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.393620 4835 generic.go:334] "Generic (PLEG): container finished" podID="8825e955-0683-4e69-ae3d-6bddcc9c92e1" containerID="9d75b5baa61382fa1fc389d72195aaceae51ebdb619f9d6626acd57a0d20ae70" exitCode=0 Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.394595 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-z5zlv" event={"ID":"8825e955-0683-4e69-ae3d-6bddcc9c92e1","Type":"ContainerDied","Data":"9d75b5baa61382fa1fc389d72195aaceae51ebdb619f9d6626acd57a0d20ae70"} Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.398575 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-2eab-account-create-update-9n5z8" podStartSLOduration=5.398562802 podStartE2EDuration="5.398562802s" podCreationTimestamp="2026-03-19 09:46:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:46:18.393047065 +0000 UTC m=+1433.241645652" watchObservedRunningTime="2026-03-19 09:46:18.398562802 
+0000 UTC m=+1433.247161389" Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.412818 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2hdqc" Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.426314 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=4.359160842 podStartE2EDuration="11.426291951s" podCreationTimestamp="2026-03-19 09:46:07 +0000 UTC" firstStartedPulling="2026-03-19 09:46:09.361057209 +0000 UTC m=+1424.209655796" lastFinishedPulling="2026-03-19 09:46:16.428188318 +0000 UTC m=+1431.276786905" observedRunningTime="2026-03-19 09:46:18.41501024 +0000 UTC m=+1433.263608827" watchObservedRunningTime="2026-03-19 09:46:18.426291951 +0000 UTC m=+1433.274890538" Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.428531 4835 generic.go:334] "Generic (PLEG): container finished" podID="395a12a5-d5f4-4917-8d8e-dd9f06fa1780" containerID="52d7709cd02621ee10aa0637fdf0d5e358afa94bfa5930c5222b5cb8ddc7ff41" exitCode=0 Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.435799 4835 scope.go:117] "RemoveContainer" containerID="3535c178c68bbaeefdf14a46f83b52ff96a3d8bb2736a92a7766c11108359be1" Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.478376 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4w8s\" (UniqueName: \"kubernetes.io/projected/f6615456-cf93-49d5-b69c-a83dcbab99da-kube-api-access-c4w8s\") pod \"glance-3383-account-create-update-cv6wv\" (UID: \"f6615456-cf93-49d5-b69c-a83dcbab99da\") " pod="openstack/glance-3383-account-create-update-cv6wv" Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.479098 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6615456-cf93-49d5-b69c-a83dcbab99da-operator-scripts\") pod \"glance-3383-account-create-update-cv6wv\" (UID: 
\"f6615456-cf93-49d5-b69c-a83dcbab99da\") " pod="openstack/glance-3383-account-create-update-cv6wv" Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.482396 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6615456-cf93-49d5-b69c-a83dcbab99da-operator-scripts\") pod \"glance-3383-account-create-update-cv6wv\" (UID: \"f6615456-cf93-49d5-b69c-a83dcbab99da\") " pod="openstack/glance-3383-account-create-update-cv6wv" Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.500381 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4w8s\" (UniqueName: \"kubernetes.io/projected/f6615456-cf93-49d5-b69c-a83dcbab99da-kube-api-access-c4w8s\") pod \"glance-3383-account-create-update-cv6wv\" (UID: \"f6615456-cf93-49d5-b69c-a83dcbab99da\") " pod="openstack/glance-3383-account-create-update-cv6wv" Mar 19 09:46:18 crc kubenswrapper[4835]: I0319 09:46:18.625867 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3383-account-create-update-cv6wv" Mar 19 09:46:19 crc kubenswrapper[4835]: I0319 09:46:19.005302 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77055cf6-3221-4567-8c5d-4b21cf5887bb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "77055cf6-3221-4567-8c5d-4b21cf5887bb" (UID: "77055cf6-3221-4567-8c5d-4b21cf5887bb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:19 crc kubenswrapper[4835]: I0319 09:46:19.029790 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77055cf6-3221-4567-8c5d-4b21cf5887bb-config" (OuterVolumeSpecName: "config") pod "77055cf6-3221-4567-8c5d-4b21cf5887bb" (UID: "77055cf6-3221-4567-8c5d-4b21cf5887bb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:19 crc kubenswrapper[4835]: I0319 09:46:19.093888 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77055cf6-3221-4567-8c5d-4b21cf5887bb-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:19 crc kubenswrapper[4835]: I0319 09:46:19.093918 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77055cf6-3221-4567-8c5d-4b21cf5887bb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:19 crc kubenswrapper[4835]: I0319 09:46:19.318977 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77055cf6-3221-4567-8c5d-4b21cf5887bb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "77055cf6-3221-4567-8c5d-4b21cf5887bb" (UID: "77055cf6-3221-4567-8c5d-4b21cf5887bb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:19 crc kubenswrapper[4835]: I0319 09:46:19.398647 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77055cf6-3221-4567-8c5d-4b21cf5887bb-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:19 crc kubenswrapper[4835]: I0319 09:46:19.460236 4835 generic.go:334] "Generic (PLEG): container finished" podID="6fe49504-b4bb-41e3-99c5-ce1e8d36296a" containerID="9b6da810c636e1b3400449e71f5105147de8f0243b58b0a004864d1e4be6e552" exitCode=0 Mar 19 09:46:19 crc kubenswrapper[4835]: I0319 09:46:19.462604 4835 generic.go:334] "Generic (PLEG): container finished" podID="9a305b14-a9cf-45cf-9ff2-8aa068a9ceeb" containerID="0bc1ee11ba5c200dec1401debb71c733b19a73833695f48a85abba2718e65391" exitCode=0 Mar 19 09:46:19 crc kubenswrapper[4835]: I0319 09:46:19.465622 4835 generic.go:334] "Generic (PLEG): container finished" podID="cca9abbb-a41a-4886-9262-f7bd98c0ce48" 
containerID="e04602a50138388c4cd67388f10ac76559e131f9d5e67fa5eb8258e1b9270047" exitCode=0 Mar 19 09:46:19 crc kubenswrapper[4835]: I0319 09:46:19.470322 4835 generic.go:334] "Generic (PLEG): container finished" podID="88297ff9-78bd-4491-9acd-e0a1f0660b0f" containerID="d34528e835b8627d814da5f37f7910078cedc1a4fa0c16371e914e4a61818f39" exitCode=0 Mar 19 09:46:19 crc kubenswrapper[4835]: I0319 09:46:19.506639 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-fac5-account-create-update-hhdpx" podStartSLOduration=4.506617121 podStartE2EDuration="4.506617121s" podCreationTimestamp="2026-03-19 09:46:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:46:19.467254803 +0000 UTC m=+1434.315853390" watchObservedRunningTime="2026-03-19 09:46:19.506617121 +0000 UTC m=+1434.355215708" Mar 19 09:46:19 crc kubenswrapper[4835]: E0319 09:46:19.885453 4835 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.483s" Mar 19 09:46:19 crc kubenswrapper[4835]: I0319 09:46:19.885506 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f31-account-create-update-wdjbq" event={"ID":"395a12a5-d5f4-4917-8d8e-dd9f06fa1780","Type":"ContainerDied","Data":"52d7709cd02621ee10aa0637fdf0d5e358afa94bfa5930c5222b5cb8ddc7ff41"} Mar 19 09:46:19 crc kubenswrapper[4835]: I0319 09:46:19.885621 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-j4wd9" event={"ID":"53929f33-eb5f-41e4-8845-d6be1087df58","Type":"ContainerStarted","Data":"2628ed2dc417bd890efcb287f21344c95d30caa1d94923e2d93f1611478cfa0f"} Mar 19 09:46:19 crc kubenswrapper[4835]: I0319 09:46:19.885659 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3383-account-create-update-cv6wv" 
event={"ID":"f6615456-cf93-49d5-b69c-a83dcbab99da","Type":"ContainerStarted","Data":"454bf7e6d24383c38432e732846dd3fb6f1667c15a9d2a10c4df6cf1b4d0a7b3"} Mar 19 09:46:19 crc kubenswrapper[4835]: I0319 09:46:19.885682 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-fac5-account-create-update-hhdpx" event={"ID":"5f689499-b2f9-47e8-811a-54bbac418778","Type":"ContainerStarted","Data":"358d9b1bf5b265895614261070f151bf40471da3cf9b7739a531314637546af1"} Mar 19 09:46:19 crc kubenswrapper[4835]: I0319 09:46:19.886536 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rv6q7" event={"ID":"6fe49504-b4bb-41e3-99c5-ce1e8d36296a","Type":"ContainerDied","Data":"9b6da810c636e1b3400449e71f5105147de8f0243b58b0a004864d1e4be6e552"} Mar 19 09:46:19 crc kubenswrapper[4835]: I0319 09:46:19.886557 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-6vl25" event={"ID":"9a305b14-a9cf-45cf-9ff2-8aa068a9ceeb","Type":"ContainerDied","Data":"0bc1ee11ba5c200dec1401debb71c733b19a73833695f48a85abba2718e65391"} Mar 19 09:46:19 crc kubenswrapper[4835]: I0319 09:46:19.886568 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-5vwdz" event={"ID":"cca9abbb-a41a-4886-9262-f7bd98c0ce48","Type":"ContainerDied","Data":"e04602a50138388c4cd67388f10ac76559e131f9d5e67fa5eb8258e1b9270047"} Mar 19 09:46:19 crc kubenswrapper[4835]: I0319 09:46:19.886585 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-2hdqc"] Mar 19 09:46:19 crc kubenswrapper[4835]: I0319 09:46:19.886689 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2eab-account-create-update-9n5z8" event={"ID":"88297ff9-78bd-4491-9acd-e0a1f0660b0f","Type":"ContainerDied","Data":"d34528e835b8627d814da5f37f7910078cedc1a4fa0c16371e914e4a61818f39"} Mar 19 09:46:19 crc kubenswrapper[4835]: I0319 09:46:19.886701 4835 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2hdqc" event={"ID":"ab86bcc8-164f-4f89-9d47-e52d3520ea21","Type":"ContainerStarted","Data":"81e7e9340814bd374e2c5fae832244c0eca4ac985556fdfa8972464b76b50fa8"} Mar 19 09:46:19 crc kubenswrapper[4835]: I0319 09:46:19.886711 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3383-account-create-update-cv6wv"] Mar 19 09:46:19 crc kubenswrapper[4835]: I0319 09:46:19.886859 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-qwlcw"] Mar 19 09:46:19 crc kubenswrapper[4835]: I0319 09:46:19.889804 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-qwlcw"] Mar 19 09:46:19 crc kubenswrapper[4835]: I0319 09:46:19.889950 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-qwlcw" Mar 19 09:46:19 crc kubenswrapper[4835]: I0319 09:46:19.892153 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 19 09:46:19 crc kubenswrapper[4835]: I0319 09:46:19.935847 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsvr9\" (UniqueName: \"kubernetes.io/projected/a63f5211-b3aa-4ebd-985a-736978f6591a-kube-api-access-rsvr9\") pod \"root-account-create-update-qwlcw\" (UID: \"a63f5211-b3aa-4ebd-985a-736978f6591a\") " pod="openstack/root-account-create-update-qwlcw" Mar 19 09:46:19 crc kubenswrapper[4835]: I0319 09:46:19.936318 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a63f5211-b3aa-4ebd-985a-736978f6591a-operator-scripts\") pod \"root-account-create-update-qwlcw\" (UID: \"a63f5211-b3aa-4ebd-985a-736978f6591a\") " pod="openstack/root-account-create-update-qwlcw" Mar 19 09:46:20 crc kubenswrapper[4835]: I0319 
09:46:20.038236 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsvr9\" (UniqueName: \"kubernetes.io/projected/a63f5211-b3aa-4ebd-985a-736978f6591a-kube-api-access-rsvr9\") pod \"root-account-create-update-qwlcw\" (UID: \"a63f5211-b3aa-4ebd-985a-736978f6591a\") " pod="openstack/root-account-create-update-qwlcw" Mar 19 09:46:20 crc kubenswrapper[4835]: I0319 09:46:20.038338 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a63f5211-b3aa-4ebd-985a-736978f6591a-operator-scripts\") pod \"root-account-create-update-qwlcw\" (UID: \"a63f5211-b3aa-4ebd-985a-736978f6591a\") " pod="openstack/root-account-create-update-qwlcw" Mar 19 09:46:20 crc kubenswrapper[4835]: I0319 09:46:20.039078 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a63f5211-b3aa-4ebd-985a-736978f6591a-operator-scripts\") pod \"root-account-create-update-qwlcw\" (UID: \"a63f5211-b3aa-4ebd-985a-736978f6591a\") " pod="openstack/root-account-create-update-qwlcw" Mar 19 09:46:20 crc kubenswrapper[4835]: I0319 09:46:20.076194 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-ks29k"] Mar 19 09:46:20 crc kubenswrapper[4835]: I0319 09:46:20.086320 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-ks29k"] Mar 19 09:46:20 crc kubenswrapper[4835]: I0319 09:46:20.102646 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsvr9\" (UniqueName: \"kubernetes.io/projected/a63f5211-b3aa-4ebd-985a-736978f6591a-kube-api-access-rsvr9\") pod \"root-account-create-update-qwlcw\" (UID: \"a63f5211-b3aa-4ebd-985a-736978f6591a\") " pod="openstack/root-account-create-update-qwlcw" Mar 19 09:46:20 crc kubenswrapper[4835]: I0319 09:46:20.206715 4835 scope.go:117] "RemoveContainer" 
containerID="f4d29ae48667ad10e30a7260f35031357aab9dc4b0dab71dac493d8efafd5e82" Mar 19 09:46:20 crc kubenswrapper[4835]: E0319 09:46:20.207844 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4d29ae48667ad10e30a7260f35031357aab9dc4b0dab71dac493d8efafd5e82\": container with ID starting with f4d29ae48667ad10e30a7260f35031357aab9dc4b0dab71dac493d8efafd5e82 not found: ID does not exist" containerID="f4d29ae48667ad10e30a7260f35031357aab9dc4b0dab71dac493d8efafd5e82" Mar 19 09:46:20 crc kubenswrapper[4835]: I0319 09:46:20.207897 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4d29ae48667ad10e30a7260f35031357aab9dc4b0dab71dac493d8efafd5e82"} err="failed to get container status \"f4d29ae48667ad10e30a7260f35031357aab9dc4b0dab71dac493d8efafd5e82\": rpc error: code = NotFound desc = could not find container \"f4d29ae48667ad10e30a7260f35031357aab9dc4b0dab71dac493d8efafd5e82\": container with ID starting with f4d29ae48667ad10e30a7260f35031357aab9dc4b0dab71dac493d8efafd5e82 not found: ID does not exist" Mar 19 09:46:20 crc kubenswrapper[4835]: I0319 09:46:20.207928 4835 scope.go:117] "RemoveContainer" containerID="3535c178c68bbaeefdf14a46f83b52ff96a3d8bb2736a92a7766c11108359be1" Mar 19 09:46:20 crc kubenswrapper[4835]: E0319 09:46:20.208259 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3535c178c68bbaeefdf14a46f83b52ff96a3d8bb2736a92a7766c11108359be1\": container with ID starting with 3535c178c68bbaeefdf14a46f83b52ff96a3d8bb2736a92a7766c11108359be1 not found: ID does not exist" containerID="3535c178c68bbaeefdf14a46f83b52ff96a3d8bb2736a92a7766c11108359be1" Mar 19 09:46:20 crc kubenswrapper[4835]: I0319 09:46:20.208291 4835 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3535c178c68bbaeefdf14a46f83b52ff96a3d8bb2736a92a7766c11108359be1"} err="failed to get container status \"3535c178c68bbaeefdf14a46f83b52ff96a3d8bb2736a92a7766c11108359be1\": rpc error: code = NotFound desc = could not find container \"3535c178c68bbaeefdf14a46f83b52ff96a3d8bb2736a92a7766c11108359be1\": container with ID starting with 3535c178c68bbaeefdf14a46f83b52ff96a3d8bb2736a92a7766c11108359be1 not found: ID does not exist" Mar 19 09:46:20 crc kubenswrapper[4835]: I0319 09:46:20.349862 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/99792633-55f8-4a37-b7d8-ae770406c69d-etc-swift\") pod \"swift-storage-0\" (UID: \"99792633-55f8-4a37-b7d8-ae770406c69d\") " pod="openstack/swift-storage-0" Mar 19 09:46:20 crc kubenswrapper[4835]: E0319 09:46:20.350130 4835 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 09:46:20 crc kubenswrapper[4835]: E0319 09:46:20.350163 4835 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 09:46:20 crc kubenswrapper[4835]: E0319 09:46:20.350206 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/99792633-55f8-4a37-b7d8-ae770406c69d-etc-swift podName:99792633-55f8-4a37-b7d8-ae770406c69d nodeName:}" failed. No retries permitted until 2026-03-19 09:46:24.350189608 +0000 UTC m=+1439.198788195 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/99792633-55f8-4a37-b7d8-ae770406c69d-etc-swift") pod "swift-storage-0" (UID: "99792633-55f8-4a37-b7d8-ae770406c69d") : configmap "swift-ring-files" not found Mar 19 09:46:20 crc kubenswrapper[4835]: I0319 09:46:20.359839 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7f31-account-create-update-wdjbq" Mar 19 09:46:20 crc kubenswrapper[4835]: I0319 09:46:20.360735 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-qwlcw" Mar 19 09:46:20 crc kubenswrapper[4835]: I0319 09:46:20.417123 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77055cf6-3221-4567-8c5d-4b21cf5887bb" path="/var/lib/kubelet/pods/77055cf6-3221-4567-8c5d-4b21cf5887bb/volumes" Mar 19 09:46:20 crc kubenswrapper[4835]: I0319 09:46:20.451319 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/395a12a5-d5f4-4917-8d8e-dd9f06fa1780-operator-scripts\") pod \"395a12a5-d5f4-4917-8d8e-dd9f06fa1780\" (UID: \"395a12a5-d5f4-4917-8d8e-dd9f06fa1780\") " Mar 19 09:46:20 crc kubenswrapper[4835]: I0319 09:46:20.451412 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nb2cm\" (UniqueName: \"kubernetes.io/projected/395a12a5-d5f4-4917-8d8e-dd9f06fa1780-kube-api-access-nb2cm\") pod \"395a12a5-d5f4-4917-8d8e-dd9f06fa1780\" (UID: \"395a12a5-d5f4-4917-8d8e-dd9f06fa1780\") " Mar 19 09:46:20 crc kubenswrapper[4835]: I0319 09:46:20.452266 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/395a12a5-d5f4-4917-8d8e-dd9f06fa1780-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "395a12a5-d5f4-4917-8d8e-dd9f06fa1780" (UID: "395a12a5-d5f4-4917-8d8e-dd9f06fa1780"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:20 crc kubenswrapper[4835]: I0319 09:46:20.455311 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/395a12a5-d5f4-4917-8d8e-dd9f06fa1780-kube-api-access-nb2cm" (OuterVolumeSpecName: "kube-api-access-nb2cm") pod "395a12a5-d5f4-4917-8d8e-dd9f06fa1780" (UID: "395a12a5-d5f4-4917-8d8e-dd9f06fa1780"). InnerVolumeSpecName "kube-api-access-nb2cm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:46:20 crc kubenswrapper[4835]: I0319 09:46:20.494240 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7f31-account-create-update-wdjbq" Mar 19 09:46:20 crc kubenswrapper[4835]: I0319 09:46:20.494314 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f31-account-create-update-wdjbq" event={"ID":"395a12a5-d5f4-4917-8d8e-dd9f06fa1780","Type":"ContainerDied","Data":"cd5691204548e85e9d706cfaf60369dd0d8f7ec3186357fd9516f8bee201c96d"} Mar 19 09:46:20 crc kubenswrapper[4835]: I0319 09:46:20.494996 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd5691204548e85e9d706cfaf60369dd0d8f7ec3186357fd9516f8bee201c96d" Mar 19 09:46:20 crc kubenswrapper[4835]: I0319 09:46:20.500149 4835 generic.go:334] "Generic (PLEG): container finished" podID="5f689499-b2f9-47e8-811a-54bbac418778" containerID="358d9b1bf5b265895614261070f151bf40471da3cf9b7739a531314637546af1" exitCode=0 Mar 19 09:46:20 crc kubenswrapper[4835]: I0319 09:46:20.500848 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-fac5-account-create-update-hhdpx" event={"ID":"5f689499-b2f9-47e8-811a-54bbac418778","Type":"ContainerDied","Data":"358d9b1bf5b265895614261070f151bf40471da3cf9b7739a531314637546af1"} Mar 19 09:46:20 crc kubenswrapper[4835]: I0319 09:46:20.554382 4835 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/395a12a5-d5f4-4917-8d8e-dd9f06fa1780-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:20 crc kubenswrapper[4835]: I0319 09:46:20.554415 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nb2cm\" (UniqueName: \"kubernetes.io/projected/395a12a5-d5f4-4917-8d8e-dd9f06fa1780-kube-api-access-nb2cm\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.239710 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-6vl25" Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.269352 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a305b14-a9cf-45cf-9ff2-8aa068a9ceeb-operator-scripts\") pod \"9a305b14-a9cf-45cf-9ff2-8aa068a9ceeb\" (UID: \"9a305b14-a9cf-45cf-9ff2-8aa068a9ceeb\") " Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.269473 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92qsl\" (UniqueName: \"kubernetes.io/projected/9a305b14-a9cf-45cf-9ff2-8aa068a9ceeb-kube-api-access-92qsl\") pod \"9a305b14-a9cf-45cf-9ff2-8aa068a9ceeb\" (UID: \"9a305b14-a9cf-45cf-9ff2-8aa068a9ceeb\") " Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.271216 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a305b14-a9cf-45cf-9ff2-8aa068a9ceeb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9a305b14-a9cf-45cf-9ff2-8aa068a9ceeb" (UID: "9a305b14-a9cf-45cf-9ff2-8aa068a9ceeb"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.285326 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a305b14-a9cf-45cf-9ff2-8aa068a9ceeb-kube-api-access-92qsl" (OuterVolumeSpecName: "kube-api-access-92qsl") pod "9a305b14-a9cf-45cf-9ff2-8aa068a9ceeb" (UID: "9a305b14-a9cf-45cf-9ff2-8aa068a9ceeb"). InnerVolumeSpecName "kube-api-access-92qsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.370712 4835 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a305b14-a9cf-45cf-9ff2-8aa068a9ceeb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.370752 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92qsl\" (UniqueName: \"kubernetes.io/projected/9a305b14-a9cf-45cf-9ff2-8aa068a9ceeb-kube-api-access-92qsl\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.400392 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rv6q7" Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.409968 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-5vwdz" Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.415879 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-qwlcw"] Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.423889 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-2eab-account-create-update-9n5z8" Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.471885 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6qqv\" (UniqueName: \"kubernetes.io/projected/cca9abbb-a41a-4886-9262-f7bd98c0ce48-kube-api-access-k6qqv\") pod \"cca9abbb-a41a-4886-9262-f7bd98c0ce48\" (UID: \"cca9abbb-a41a-4886-9262-f7bd98c0ce48\") " Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.471984 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88297ff9-78bd-4491-9acd-e0a1f0660b0f-operator-scripts\") pod \"88297ff9-78bd-4491-9acd-e0a1f0660b0f\" (UID: \"88297ff9-78bd-4491-9acd-e0a1f0660b0f\") " Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.472041 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cca9abbb-a41a-4886-9262-f7bd98c0ce48-operator-scripts\") pod \"cca9abbb-a41a-4886-9262-f7bd98c0ce48\" (UID: \"cca9abbb-a41a-4886-9262-f7bd98c0ce48\") " Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.472093 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44z6n\" (UniqueName: \"kubernetes.io/projected/6fe49504-b4bb-41e3-99c5-ce1e8d36296a-kube-api-access-44z6n\") pod \"6fe49504-b4bb-41e3-99c5-ce1e8d36296a\" (UID: \"6fe49504-b4bb-41e3-99c5-ce1e8d36296a\") " Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.472167 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fe49504-b4bb-41e3-99c5-ce1e8d36296a-operator-scripts\") pod \"6fe49504-b4bb-41e3-99c5-ce1e8d36296a\" (UID: \"6fe49504-b4bb-41e3-99c5-ce1e8d36296a\") " Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.472270 4835 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-vw26p\" (UniqueName: \"kubernetes.io/projected/88297ff9-78bd-4491-9acd-e0a1f0660b0f-kube-api-access-vw26p\") pod \"88297ff9-78bd-4491-9acd-e0a1f0660b0f\" (UID: \"88297ff9-78bd-4491-9acd-e0a1f0660b0f\") " Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.472391 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88297ff9-78bd-4491-9acd-e0a1f0660b0f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "88297ff9-78bd-4491-9acd-e0a1f0660b0f" (UID: "88297ff9-78bd-4491-9acd-e0a1f0660b0f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.472795 4835 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88297ff9-78bd-4491-9acd-e0a1f0660b0f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.474585 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cca9abbb-a41a-4886-9262-f7bd98c0ce48-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cca9abbb-a41a-4886-9262-f7bd98c0ce48" (UID: "cca9abbb-a41a-4886-9262-f7bd98c0ce48"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.474773 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fe49504-b4bb-41e3-99c5-ce1e8d36296a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6fe49504-b4bb-41e3-99c5-ce1e8d36296a" (UID: "6fe49504-b4bb-41e3-99c5-ce1e8d36296a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.478060 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88297ff9-78bd-4491-9acd-e0a1f0660b0f-kube-api-access-vw26p" (OuterVolumeSpecName: "kube-api-access-vw26p") pod "88297ff9-78bd-4491-9acd-e0a1f0660b0f" (UID: "88297ff9-78bd-4491-9acd-e0a1f0660b0f"). InnerVolumeSpecName "kube-api-access-vw26p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.491329 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cca9abbb-a41a-4886-9262-f7bd98c0ce48-kube-api-access-k6qqv" (OuterVolumeSpecName: "kube-api-access-k6qqv") pod "cca9abbb-a41a-4886-9262-f7bd98c0ce48" (UID: "cca9abbb-a41a-4886-9262-f7bd98c0ce48"). InnerVolumeSpecName "kube-api-access-k6qqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.493051 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fe49504-b4bb-41e3-99c5-ce1e8d36296a-kube-api-access-44z6n" (OuterVolumeSpecName: "kube-api-access-44z6n") pod "6fe49504-b4bb-41e3-99c5-ce1e8d36296a" (UID: "6fe49504-b4bb-41e3-99c5-ce1e8d36296a"). InnerVolumeSpecName "kube-api-access-44z6n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.510285 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-5vwdz" event={"ID":"cca9abbb-a41a-4886-9262-f7bd98c0ce48","Type":"ContainerDied","Data":"cce1f0235a06bbfc668811bf6030cae35aa35d26412de5cf1983056727eeb6a7"} Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.510319 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cce1f0235a06bbfc668811bf6030cae35aa35d26412de5cf1983056727eeb6a7" Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.510376 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-5vwdz" Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.525581 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0","Type":"ContainerStarted","Data":"d4fee2bd8b33cc930d583cbcb1d4d9543ca77ffd9b5b0008f9aa9e42fd846715"} Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.534714 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rv6q7" event={"ID":"6fe49504-b4bb-41e3-99c5-ce1e8d36296a","Type":"ContainerDied","Data":"c2694eff6dcaa52ef493f35b06e371ccbac09ea8baf8f6221ebf21a17e10ed86"} Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.534969 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2694eff6dcaa52ef493f35b06e371ccbac09ea8baf8f6221ebf21a17e10ed86" Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.535150 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-rv6q7" Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.538803 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qwlcw" event={"ID":"a63f5211-b3aa-4ebd-985a-736978f6591a","Type":"ContainerStarted","Data":"b96af53a8c0be24da8f0fa08ac05cd04b38fdecbc49ad1919b924de19de8f996"} Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.544247 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2hdqc" event={"ID":"ab86bcc8-164f-4f89-9d47-e52d3520ea21","Type":"ContainerStarted","Data":"12783958acd0d7509484ec7be8b19d8522f5e847cfb690f3ce9451da93445e93"} Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.548913 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-6vl25" event={"ID":"9a305b14-a9cf-45cf-9ff2-8aa068a9ceeb","Type":"ContainerDied","Data":"9f097192bf1ea52735234add562451c6081b3e254b1c08a825fa920404d8ece8"} Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.548953 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f097192bf1ea52735234add562451c6081b3e254b1c08a825fa920404d8ece8" Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.549027 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-6vl25" Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.568324 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2eab-account-create-update-9n5z8" event={"ID":"88297ff9-78bd-4491-9acd-e0a1f0660b0f","Type":"ContainerDied","Data":"4f119d7730c37318269aae70dfa3daf5652d23f9312dfecade5b37a524e7e3bc"} Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.568372 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f119d7730c37318269aae70dfa3daf5652d23f9312dfecade5b37a524e7e3bc" Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.568429 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2eab-account-create-update-9n5z8" Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.573856 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-z5zlv" event={"ID":"8825e955-0683-4e69-ae3d-6bddcc9c92e1","Type":"ContainerStarted","Data":"53b32e6cbfdb2061cf92d560e23b64c9dfe6812bf0ff8e5fe13da0eb152e91dc"} Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.574032 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-z5zlv" Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.599467 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-2hdqc" podStartSLOduration=3.599439566 podStartE2EDuration="3.599439566s" podCreationTimestamp="2026-03-19 09:46:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:46:21.560730445 +0000 UTC m=+1436.409329042" watchObservedRunningTime="2026-03-19 09:46:21.599439566 +0000 UTC m=+1436.448038153" Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.603893 4835 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-k6qqv\" (UniqueName: \"kubernetes.io/projected/cca9abbb-a41a-4886-9262-f7bd98c0ce48-kube-api-access-k6qqv\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.603918 4835 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cca9abbb-a41a-4886-9262-f7bd98c0ce48-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.603927 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44z6n\" (UniqueName: \"kubernetes.io/projected/6fe49504-b4bb-41e3-99c5-ce1e8d36296a-kube-api-access-44z6n\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.603937 4835 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fe49504-b4bb-41e3-99c5-ce1e8d36296a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.603945 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vw26p\" (UniqueName: \"kubernetes.io/projected/88297ff9-78bd-4491-9acd-e0a1f0660b0f-kube-api-access-vw26p\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.607229 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3383-account-create-update-cv6wv" event={"ID":"f6615456-cf93-49d5-b69c-a83dcbab99da","Type":"ContainerStarted","Data":"77f5e5e525995019516ce9f427feb4956988015516b64086ba33d8b5e0506ffc"} Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.634503 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-z5zlv" podStartSLOduration=6.634483298 podStartE2EDuration="6.634483298s" podCreationTimestamp="2026-03-19 09:46:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-03-19 09:46:21.623993699 +0000 UTC m=+1436.472592306" watchObservedRunningTime="2026-03-19 09:46:21.634483298 +0000 UTC m=+1436.483081875" Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.646989 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-3383-account-create-update-cv6wv" podStartSLOduration=3.646973651 podStartE2EDuration="3.646973651s" podCreationTimestamp="2026-03-19 09:46:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:46:21.645166853 +0000 UTC m=+1436.493765440" watchObservedRunningTime="2026-03-19 09:46:21.646973651 +0000 UTC m=+1436.495572238" Mar 19 09:46:21 crc kubenswrapper[4835]: I0319 09:46:21.761871 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-5b76d796f4-ht42f" podUID="590e8c8f-f969-46f8-9b98-7247dd0b2601" containerName="console" containerID="cri-o://a2b6e706ebbd2b4ed42bbc6ea9b0c06261bd4a4e924f30a3f3193061b774e635" gracePeriod=15 Mar 19 09:46:22 crc kubenswrapper[4835]: I0319 09:46:22.619967 4835 generic.go:334] "Generic (PLEG): container finished" podID="f773ac48-7b51-427e-9c89-34e515bddabb" containerID="21c03c5c9384d15ca34866e25c6fd7c9610210c0afde02081401ef0199d2e255" exitCode=0 Mar 19 09:46:22 crc kubenswrapper[4835]: I0319 09:46:22.620071 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f773ac48-7b51-427e-9c89-34e515bddabb","Type":"ContainerDied","Data":"21c03c5c9384d15ca34866e25c6fd7c9610210c0afde02081401ef0199d2e255"} Mar 19 09:46:22 crc kubenswrapper[4835]: I0319 09:46:22.626269 4835 generic.go:334] "Generic (PLEG): container finished" podID="f6615456-cf93-49d5-b69c-a83dcbab99da" containerID="77f5e5e525995019516ce9f427feb4956988015516b64086ba33d8b5e0506ffc" exitCode=0 Mar 19 09:46:22 crc kubenswrapper[4835]: I0319 09:46:22.626350 
4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3383-account-create-update-cv6wv" event={"ID":"f6615456-cf93-49d5-b69c-a83dcbab99da","Type":"ContainerDied","Data":"77f5e5e525995019516ce9f427feb4956988015516b64086ba33d8b5e0506ffc"} Mar 19 09:46:22 crc kubenswrapper[4835]: I0319 09:46:22.629373 4835 generic.go:334] "Generic (PLEG): container finished" podID="2209a56f-9c2a-45bd-b045-176197bf3bd1" containerID="94261c92f99d04084fcc1707edb079459e5eb7fe3076788c702f51ce2d1838e3" exitCode=0 Mar 19 09:46:22 crc kubenswrapper[4835]: I0319 09:46:22.629430 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"2209a56f-9c2a-45bd-b045-176197bf3bd1","Type":"ContainerDied","Data":"94261c92f99d04084fcc1707edb079459e5eb7fe3076788c702f51ce2d1838e3"} Mar 19 09:46:22 crc kubenswrapper[4835]: I0319 09:46:22.634328 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5b76d796f4-ht42f_590e8c8f-f969-46f8-9b98-7247dd0b2601/console/0.log" Mar 19 09:46:22 crc kubenswrapper[4835]: I0319 09:46:22.634407 4835 generic.go:334] "Generic (PLEG): container finished" podID="590e8c8f-f969-46f8-9b98-7247dd0b2601" containerID="a2b6e706ebbd2b4ed42bbc6ea9b0c06261bd4a4e924f30a3f3193061b774e635" exitCode=2 Mar 19 09:46:22 crc kubenswrapper[4835]: I0319 09:46:22.634501 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b76d796f4-ht42f" event={"ID":"590e8c8f-f969-46f8-9b98-7247dd0b2601","Type":"ContainerDied","Data":"a2b6e706ebbd2b4ed42bbc6ea9b0c06261bd4a4e924f30a3f3193061b774e635"} Mar 19 09:46:22 crc kubenswrapper[4835]: I0319 09:46:22.636941 4835 generic.go:334] "Generic (PLEG): container finished" podID="eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185" containerID="a019a87ad20e80e5d27e97a78500bd57162d944fdb1518383ddc556356c45145" exitCode=0 Mar 19 09:46:22 crc kubenswrapper[4835]: I0319 09:46:22.637035 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-server-2" event={"ID":"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185","Type":"ContainerDied","Data":"a019a87ad20e80e5d27e97a78500bd57162d944fdb1518383ddc556356c45145"} Mar 19 09:46:22 crc kubenswrapper[4835]: I0319 09:46:22.638848 4835 generic.go:334] "Generic (PLEG): container finished" podID="66c76655-cf6d-45e6-904c-147e07a28639" containerID="20a39ba6498a2092f5c30caf21771baa8e854db4767a37e5233443d564935991" exitCode=0 Mar 19 09:46:22 crc kubenswrapper[4835]: I0319 09:46:22.638971 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"66c76655-cf6d-45e6-904c-147e07a28639","Type":"ContainerDied","Data":"20a39ba6498a2092f5c30caf21771baa8e854db4767a37e5233443d564935991"} Mar 19 09:46:22 crc kubenswrapper[4835]: I0319 09:46:22.648513 4835 generic.go:334] "Generic (PLEG): container finished" podID="ab86bcc8-164f-4f89-9d47-e52d3520ea21" containerID="12783958acd0d7509484ec7be8b19d8522f5e847cfb690f3ce9451da93445e93" exitCode=0 Mar 19 09:46:22 crc kubenswrapper[4835]: I0319 09:46:22.648605 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2hdqc" event={"ID":"ab86bcc8-164f-4f89-9d47-e52d3520ea21","Type":"ContainerDied","Data":"12783958acd0d7509484ec7be8b19d8522f5e847cfb690f3ce9451da93445e93"} Mar 19 09:46:23 crc kubenswrapper[4835]: I0319 09:46:23.662442 4835 generic.go:334] "Generic (PLEG): container finished" podID="a63f5211-b3aa-4ebd-985a-736978f6591a" containerID="a9a4096a50aaca1e5dd81597596eb85a6d8c157de3485cff6980053ff8d6a8a0" exitCode=0 Mar 19 09:46:23 crc kubenswrapper[4835]: I0319 09:46:23.662536 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qwlcw" event={"ID":"a63f5211-b3aa-4ebd-985a-736978f6591a","Type":"ContainerDied","Data":"a9a4096a50aaca1e5dd81597596eb85a6d8c157de3485cff6980053ff8d6a8a0"} Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.100017 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-fac5-account-create-update-hhdpx" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.172643 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f689499-b2f9-47e8-811a-54bbac418778-operator-scripts\") pod \"5f689499-b2f9-47e8-811a-54bbac418778\" (UID: \"5f689499-b2f9-47e8-811a-54bbac418778\") " Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.172689 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44wz4\" (UniqueName: \"kubernetes.io/projected/5f689499-b2f9-47e8-811a-54bbac418778-kube-api-access-44wz4\") pod \"5f689499-b2f9-47e8-811a-54bbac418778\" (UID: \"5f689499-b2f9-47e8-811a-54bbac418778\") " Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.200717 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f689499-b2f9-47e8-811a-54bbac418778-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5f689499-b2f9-47e8-811a-54bbac418778" (UID: "5f689499-b2f9-47e8-811a-54bbac418778"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.203138 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f689499-b2f9-47e8-811a-54bbac418778-kube-api-access-44wz4" (OuterVolumeSpecName: "kube-api-access-44wz4") pod "5f689499-b2f9-47e8-811a-54bbac418778" (UID: "5f689499-b2f9-47e8-811a-54bbac418778"). InnerVolumeSpecName "kube-api-access-44wz4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.275070 4835 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f689499-b2f9-47e8-811a-54bbac418778-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.275442 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44wz4\" (UniqueName: \"kubernetes.io/projected/5f689499-b2f9-47e8-811a-54bbac418778-kube-api-access-44wz4\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.367029 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2hdqc" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.377182 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab86bcc8-164f-4f89-9d47-e52d3520ea21-operator-scripts\") pod \"ab86bcc8-164f-4f89-9d47-e52d3520ea21\" (UID: \"ab86bcc8-164f-4f89-9d47-e52d3520ea21\") " Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.377237 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmqrf\" (UniqueName: \"kubernetes.io/projected/ab86bcc8-164f-4f89-9d47-e52d3520ea21-kube-api-access-xmqrf\") pod \"ab86bcc8-164f-4f89-9d47-e52d3520ea21\" (UID: \"ab86bcc8-164f-4f89-9d47-e52d3520ea21\") " Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.377522 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/99792633-55f8-4a37-b7d8-ae770406c69d-etc-swift\") pod \"swift-storage-0\" (UID: \"99792633-55f8-4a37-b7d8-ae770406c69d\") " pod="openstack/swift-storage-0" Mar 19 09:46:24 crc kubenswrapper[4835]: E0319 09:46:24.377783 4835 projected.go:288] Couldn't get configMap 
openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 09:46:24 crc kubenswrapper[4835]: E0319 09:46:24.377803 4835 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 09:46:24 crc kubenswrapper[4835]: E0319 09:46:24.377847 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/99792633-55f8-4a37-b7d8-ae770406c69d-etc-swift podName:99792633-55f8-4a37-b7d8-ae770406c69d nodeName:}" failed. No retries permitted until 2026-03-19 09:46:32.377833821 +0000 UTC m=+1447.226432408 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/99792633-55f8-4a37-b7d8-ae770406c69d-etc-swift") pod "swift-storage-0" (UID: "99792633-55f8-4a37-b7d8-ae770406c69d") : configmap "swift-ring-files" not found Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.377972 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab86bcc8-164f-4f89-9d47-e52d3520ea21-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ab86bcc8-164f-4f89-9d47-e52d3520ea21" (UID: "ab86bcc8-164f-4f89-9d47-e52d3520ea21"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.384924 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab86bcc8-164f-4f89-9d47-e52d3520ea21-kube-api-access-xmqrf" (OuterVolumeSpecName: "kube-api-access-xmqrf") pod "ab86bcc8-164f-4f89-9d47-e52d3520ea21" (UID: "ab86bcc8-164f-4f89-9d47-e52d3520ea21"). InnerVolumeSpecName "kube-api-access-xmqrf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.481273 4835 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab86bcc8-164f-4f89-9d47-e52d3520ea21-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.481309 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmqrf\" (UniqueName: \"kubernetes.io/projected/ab86bcc8-164f-4f89-9d47-e52d3520ea21-kube-api-access-xmqrf\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.564671 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3383-account-create-update-cv6wv" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.584198 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6615456-cf93-49d5-b69c-a83dcbab99da-operator-scripts\") pod \"f6615456-cf93-49d5-b69c-a83dcbab99da\" (UID: \"f6615456-cf93-49d5-b69c-a83dcbab99da\") " Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.584321 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4w8s\" (UniqueName: \"kubernetes.io/projected/f6615456-cf93-49d5-b69c-a83dcbab99da-kube-api-access-c4w8s\") pod \"f6615456-cf93-49d5-b69c-a83dcbab99da\" (UID: \"f6615456-cf93-49d5-b69c-a83dcbab99da\") " Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.585714 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6615456-cf93-49d5-b69c-a83dcbab99da-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f6615456-cf93-49d5-b69c-a83dcbab99da" (UID: "f6615456-cf93-49d5-b69c-a83dcbab99da"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.599036 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6615456-cf93-49d5-b69c-a83dcbab99da-kube-api-access-c4w8s" (OuterVolumeSpecName: "kube-api-access-c4w8s") pod "f6615456-cf93-49d5-b69c-a83dcbab99da" (UID: "f6615456-cf93-49d5-b69c-a83dcbab99da"). InnerVolumeSpecName "kube-api-access-c4w8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.686998 4835 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6615456-cf93-49d5-b69c-a83dcbab99da-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.687024 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4w8s\" (UniqueName: \"kubernetes.io/projected/f6615456-cf93-49d5-b69c-a83dcbab99da-kube-api-access-c4w8s\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.687355 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"2209a56f-9c2a-45bd-b045-176197bf3bd1","Type":"ContainerStarted","Data":"6ff9762632f4fd510b6ef15acf4ab11685aa5d46c48d4951d1c8c4ee529d551a"} Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.688292 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-1" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.690462 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-fac5-account-create-update-hhdpx" event={"ID":"5f689499-b2f9-47e8-811a-54bbac418778","Type":"ContainerDied","Data":"8fcba6962510385517d19e91e97d69d2ed52f1d073e486913048e80751a7a169"} Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.690486 4835 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="8fcba6962510385517d19e91e97d69d2ed52f1d073e486913048e80751a7a169" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.690544 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-fac5-account-create-update-hhdpx" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.697807 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185","Type":"ContainerStarted","Data":"38a3742b4e3223b9426ae4beb4d1a4985646e49408e0f81d4fdca3e372927265"} Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.698353 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-2" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.708485 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"66c76655-cf6d-45e6-904c-147e07a28639","Type":"ContainerStarted","Data":"79939fe5133102acc25008da72ca46d83f8fa4a45ab7e54544c16a1887d6998d"} Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.709382 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.711494 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5b76d796f4-ht42f_590e8c8f-f969-46f8-9b98-7247dd0b2601/console/0.log" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.711578 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5b76d796f4-ht42f" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.717714 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2hdqc" event={"ID":"ab86bcc8-164f-4f89-9d47-e52d3520ea21","Type":"ContainerDied","Data":"81e7e9340814bd374e2c5fae832244c0eca4ac985556fdfa8972464b76b50fa8"} Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.717774 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81e7e9340814bd374e2c5fae832244c0eca4ac985556fdfa8972464b76b50fa8" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.717814 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2hdqc" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.725721 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f773ac48-7b51-427e-9c89-34e515bddabb","Type":"ContainerStarted","Data":"ffb00e2578c5c91c77a01d64a58bb95a80da4d289ffc8ac1338d7ab81af642e6"} Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.726006 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.728227 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-3383-account-create-update-cv6wv" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.729975 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3383-account-create-update-cv6wv" event={"ID":"f6615456-cf93-49d5-b69c-a83dcbab99da","Type":"ContainerDied","Data":"454bf7e6d24383c38432e732846dd3fb6f1667c15a9d2a10c4df6cf1b4d0a7b3"} Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.730026 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="454bf7e6d24383c38432e732846dd3fb6f1667c15a9d2a10c4df6cf1b4d0a7b3" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.735487 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-1" podStartSLOduration=43.936653545 podStartE2EDuration="56.735472622s" podCreationTimestamp="2026-03-19 09:45:28 +0000 UTC" firstStartedPulling="2026-03-19 09:45:35.58877141 +0000 UTC m=+1390.437369997" lastFinishedPulling="2026-03-19 09:45:48.387590487 +0000 UTC m=+1403.236189074" observedRunningTime="2026-03-19 09:46:24.724812849 +0000 UTC m=+1439.573411446" watchObservedRunningTime="2026-03-19 09:46:24.735472622 +0000 UTC m=+1439.584071199" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.759685 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-2" podStartSLOduration=54.262948441 podStartE2EDuration="56.759665406s" podCreationTimestamp="2026-03-19 09:45:28 +0000 UTC" firstStartedPulling="2026-03-19 09:45:45.904768325 +0000 UTC m=+1400.753366922" lastFinishedPulling="2026-03-19 09:45:48.4014853 +0000 UTC m=+1403.250083887" observedRunningTime="2026-03-19 09:46:24.75115146 +0000 UTC m=+1439.599750047" watchObservedRunningTime="2026-03-19 09:46:24.759665406 +0000 UTC m=+1439.608263993" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.812723 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/rabbitmq-server-0" podStartSLOduration=44.036985939 podStartE2EDuration="56.812700798s" podCreationTimestamp="2026-03-19 09:45:28 +0000 UTC" firstStartedPulling="2026-03-19 09:45:35.589886109 +0000 UTC m=+1390.438484696" lastFinishedPulling="2026-03-19 09:45:48.365600968 +0000 UTC m=+1403.214199555" observedRunningTime="2026-03-19 09:46:24.80863148 +0000 UTC m=+1439.657230067" watchObservedRunningTime="2026-03-19 09:46:24.812700798 +0000 UTC m=+1439.661299385" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.892019 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/590e8c8f-f969-46f8-9b98-7247dd0b2601-trusted-ca-bundle\") pod \"590e8c8f-f969-46f8-9b98-7247dd0b2601\" (UID: \"590e8c8f-f969-46f8-9b98-7247dd0b2601\") " Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.893320 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/590e8c8f-f969-46f8-9b98-7247dd0b2601-console-config" (OuterVolumeSpecName: "console-config") pod "590e8c8f-f969-46f8-9b98-7247dd0b2601" (UID: "590e8c8f-f969-46f8-9b98-7247dd0b2601"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.893795 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/590e8c8f-f969-46f8-9b98-7247dd0b2601-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "590e8c8f-f969-46f8-9b98-7247dd0b2601" (UID: "590e8c8f-f969-46f8-9b98-7247dd0b2601"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.894190 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/590e8c8f-f969-46f8-9b98-7247dd0b2601-console-config\") pod \"590e8c8f-f969-46f8-9b98-7247dd0b2601\" (UID: \"590e8c8f-f969-46f8-9b98-7247dd0b2601\") " Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.894273 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/590e8c8f-f969-46f8-9b98-7247dd0b2601-oauth-serving-cert\") pod \"590e8c8f-f969-46f8-9b98-7247dd0b2601\" (UID: \"590e8c8f-f969-46f8-9b98-7247dd0b2601\") " Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.894327 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/590e8c8f-f969-46f8-9b98-7247dd0b2601-console-serving-cert\") pod \"590e8c8f-f969-46f8-9b98-7247dd0b2601\" (UID: \"590e8c8f-f969-46f8-9b98-7247dd0b2601\") " Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.894406 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cw5f7\" (UniqueName: \"kubernetes.io/projected/590e8c8f-f969-46f8-9b98-7247dd0b2601-kube-api-access-cw5f7\") pod \"590e8c8f-f969-46f8-9b98-7247dd0b2601\" (UID: \"590e8c8f-f969-46f8-9b98-7247dd0b2601\") " Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.894472 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/590e8c8f-f969-46f8-9b98-7247dd0b2601-console-oauth-config\") pod \"590e8c8f-f969-46f8-9b98-7247dd0b2601\" (UID: \"590e8c8f-f969-46f8-9b98-7247dd0b2601\") " Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.894569 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/590e8c8f-f969-46f8-9b98-7247dd0b2601-service-ca\") pod \"590e8c8f-f969-46f8-9b98-7247dd0b2601\" (UID: \"590e8c8f-f969-46f8-9b98-7247dd0b2601\") " Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.895152 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/590e8c8f-f969-46f8-9b98-7247dd0b2601-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "590e8c8f-f969-46f8-9b98-7247dd0b2601" (UID: "590e8c8f-f969-46f8-9b98-7247dd0b2601"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.895278 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/590e8c8f-f969-46f8-9b98-7247dd0b2601-service-ca" (OuterVolumeSpecName: "service-ca") pod "590e8c8f-f969-46f8-9b98-7247dd0b2601" (UID: "590e8c8f-f969-46f8-9b98-7247dd0b2601"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.897890 4835 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/590e8c8f-f969-46f8-9b98-7247dd0b2601-console-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.897920 4835 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/590e8c8f-f969-46f8-9b98-7247dd0b2601-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.897934 4835 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/590e8c8f-f969-46f8-9b98-7247dd0b2601-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.897944 4835 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/590e8c8f-f969-46f8-9b98-7247dd0b2601-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.901236 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/590e8c8f-f969-46f8-9b98-7247dd0b2601-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "590e8c8f-f969-46f8-9b98-7247dd0b2601" (UID: "590e8c8f-f969-46f8-9b98-7247dd0b2601"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.910083 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/590e8c8f-f969-46f8-9b98-7247dd0b2601-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "590e8c8f-f969-46f8-9b98-7247dd0b2601" (UID: "590e8c8f-f969-46f8-9b98-7247dd0b2601"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:46:24 crc kubenswrapper[4835]: I0319 09:46:24.914037 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/590e8c8f-f969-46f8-9b98-7247dd0b2601-kube-api-access-cw5f7" (OuterVolumeSpecName: "kube-api-access-cw5f7") pod "590e8c8f-f969-46f8-9b98-7247dd0b2601" (UID: "590e8c8f-f969-46f8-9b98-7247dd0b2601"). InnerVolumeSpecName "kube-api-access-cw5f7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.000076 4835 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/590e8c8f-f969-46f8-9b98-7247dd0b2601-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.000384 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cw5f7\" (UniqueName: \"kubernetes.io/projected/590e8c8f-f969-46f8-9b98-7247dd0b2601-kube-api-access-cw5f7\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.000398 4835 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/590e8c8f-f969-46f8-9b98-7247dd0b2601-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.162048 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=39.13177295 podStartE2EDuration="57.162030228s" podCreationTimestamp="2026-03-19 09:45:28 +0000 UTC" firstStartedPulling="2026-03-19 09:45:30.347205538 +0000 UTC m=+1385.195804115" lastFinishedPulling="2026-03-19 09:45:48.377462806 +0000 UTC m=+1403.226061393" observedRunningTime="2026-03-19 09:46:24.857538792 +0000 UTC m=+1439.706137379" watchObservedRunningTime="2026-03-19 09:46:25.162030228 +0000 UTC m=+1440.010628815" Mar 19 09:46:25 crc 
kubenswrapper[4835]: I0319 09:46:25.292189 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-qwlcw" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.409653 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a63f5211-b3aa-4ebd-985a-736978f6591a-operator-scripts\") pod \"a63f5211-b3aa-4ebd-985a-736978f6591a\" (UID: \"a63f5211-b3aa-4ebd-985a-736978f6591a\") " Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.414728 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a63f5211-b3aa-4ebd-985a-736978f6591a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a63f5211-b3aa-4ebd-985a-736978f6591a" (UID: "a63f5211-b3aa-4ebd-985a-736978f6591a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.416358 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsvr9\" (UniqueName: \"kubernetes.io/projected/a63f5211-b3aa-4ebd-985a-736978f6591a-kube-api-access-rsvr9\") pod \"a63f5211-b3aa-4ebd-985a-736978f6591a\" (UID: \"a63f5211-b3aa-4ebd-985a-736978f6591a\") " Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.421037 4835 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a63f5211-b3aa-4ebd-985a-736978f6591a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.424486 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a63f5211-b3aa-4ebd-985a-736978f6591a-kube-api-access-rsvr9" (OuterVolumeSpecName: "kube-api-access-rsvr9") pod "a63f5211-b3aa-4ebd-985a-736978f6591a" (UID: "a63f5211-b3aa-4ebd-985a-736978f6591a"). 
InnerVolumeSpecName "kube-api-access-rsvr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.441151 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-rh62c"] Mar 19 09:46:25 crc kubenswrapper[4835]: E0319 09:46:25.441660 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cca9abbb-a41a-4886-9262-f7bd98c0ce48" containerName="mariadb-database-create" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.441673 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca9abbb-a41a-4886-9262-f7bd98c0ce48" containerName="mariadb-database-create" Mar 19 09:46:25 crc kubenswrapper[4835]: E0319 09:46:25.441692 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6615456-cf93-49d5-b69c-a83dcbab99da" containerName="mariadb-account-create-update" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.441698 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6615456-cf93-49d5-b69c-a83dcbab99da" containerName="mariadb-account-create-update" Mar 19 09:46:25 crc kubenswrapper[4835]: E0319 09:46:25.441725 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88297ff9-78bd-4491-9acd-e0a1f0660b0f" containerName="mariadb-account-create-update" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.441731 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="88297ff9-78bd-4491-9acd-e0a1f0660b0f" containerName="mariadb-account-create-update" Mar 19 09:46:25 crc kubenswrapper[4835]: E0319 09:46:25.441796 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a305b14-a9cf-45cf-9ff2-8aa068a9ceeb" containerName="mariadb-database-create" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.441803 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a305b14-a9cf-45cf-9ff2-8aa068a9ceeb" containerName="mariadb-database-create" Mar 19 09:46:25 crc kubenswrapper[4835]: 
E0319 09:46:25.441813 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fe49504-b4bb-41e3-99c5-ce1e8d36296a" containerName="mariadb-database-create" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.441819 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe49504-b4bb-41e3-99c5-ce1e8d36296a" containerName="mariadb-database-create" Mar 19 09:46:25 crc kubenswrapper[4835]: E0319 09:46:25.441834 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab86bcc8-164f-4f89-9d47-e52d3520ea21" containerName="mariadb-database-create" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.441841 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab86bcc8-164f-4f89-9d47-e52d3520ea21" containerName="mariadb-database-create" Mar 19 09:46:25 crc kubenswrapper[4835]: E0319 09:46:25.441848 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f689499-b2f9-47e8-811a-54bbac418778" containerName="mariadb-account-create-update" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.441855 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f689499-b2f9-47e8-811a-54bbac418778" containerName="mariadb-account-create-update" Mar 19 09:46:25 crc kubenswrapper[4835]: E0319 09:46:25.441868 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a63f5211-b3aa-4ebd-985a-736978f6591a" containerName="mariadb-account-create-update" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.441875 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a63f5211-b3aa-4ebd-985a-736978f6591a" containerName="mariadb-account-create-update" Mar 19 09:46:25 crc kubenswrapper[4835]: E0319 09:46:25.441888 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="395a12a5-d5f4-4917-8d8e-dd9f06fa1780" containerName="mariadb-account-create-update" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.441893 4835 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="395a12a5-d5f4-4917-8d8e-dd9f06fa1780" containerName="mariadb-account-create-update" Mar 19 09:46:25 crc kubenswrapper[4835]: E0319 09:46:25.441910 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="590e8c8f-f969-46f8-9b98-7247dd0b2601" containerName="console" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.441915 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="590e8c8f-f969-46f8-9b98-7247dd0b2601" containerName="console" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.442099 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="395a12a5-d5f4-4917-8d8e-dd9f06fa1780" containerName="mariadb-account-create-update" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.442115 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab86bcc8-164f-4f89-9d47-e52d3520ea21" containerName="mariadb-database-create" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.442125 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="88297ff9-78bd-4491-9acd-e0a1f0660b0f" containerName="mariadb-account-create-update" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.442135 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a305b14-a9cf-45cf-9ff2-8aa068a9ceeb" containerName="mariadb-database-create" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.442147 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="a63f5211-b3aa-4ebd-985a-736978f6591a" containerName="mariadb-account-create-update" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.442154 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f689499-b2f9-47e8-811a-54bbac418778" containerName="mariadb-account-create-update" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.442161 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="cca9abbb-a41a-4886-9262-f7bd98c0ce48" containerName="mariadb-database-create" Mar 19 09:46:25 crc 
kubenswrapper[4835]: I0319 09:46:25.442168 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6615456-cf93-49d5-b69c-a83dcbab99da" containerName="mariadb-account-create-update" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.442180 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="590e8c8f-f969-46f8-9b98-7247dd0b2601" containerName="console" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.442190 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fe49504-b4bb-41e3-99c5-ce1e8d36296a" containerName="mariadb-database-create" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.442862 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-rh62c" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.481622 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-rh62c"] Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.523206 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsvr9\" (UniqueName: \"kubernetes.io/projected/a63f5211-b3aa-4ebd-985a-736978f6591a-kube-api-access-rsvr9\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.548426 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-c348-account-create-update-qmvxx"] Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.550208 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-c348-account-create-update-qmvxx" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.553661 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-cell1-db-secret" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.561109 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-c348-account-create-update-qmvxx"] Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.625383 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs2g6\" (UniqueName: \"kubernetes.io/projected/8ca5984a-072e-4b36-840c-4a986fcd553e-kube-api-access-hs2g6\") pod \"mysqld-exporter-openstack-cell1-db-create-rh62c\" (UID: \"8ca5984a-072e-4b36-840c-4a986fcd553e\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-rh62c" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.625475 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ca5984a-072e-4b36-840c-4a986fcd553e-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-rh62c\" (UID: \"8ca5984a-072e-4b36-840c-4a986fcd553e\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-rh62c" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.729206 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k7dn\" (UniqueName: \"kubernetes.io/projected/0647dcc1-ea1b-4628-92cd-50e53602f0b7-kube-api-access-7k7dn\") pod \"mysqld-exporter-c348-account-create-update-qmvxx\" (UID: \"0647dcc1-ea1b-4628-92cd-50e53602f0b7\") " pod="openstack/mysqld-exporter-c348-account-create-update-qmvxx" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.730141 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0647dcc1-ea1b-4628-92cd-50e53602f0b7-operator-scripts\") pod \"mysqld-exporter-c348-account-create-update-qmvxx\" (UID: \"0647dcc1-ea1b-4628-92cd-50e53602f0b7\") " pod="openstack/mysqld-exporter-c348-account-create-update-qmvxx" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.730291 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs2g6\" (UniqueName: \"kubernetes.io/projected/8ca5984a-072e-4b36-840c-4a986fcd553e-kube-api-access-hs2g6\") pod \"mysqld-exporter-openstack-cell1-db-create-rh62c\" (UID: \"8ca5984a-072e-4b36-840c-4a986fcd553e\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-rh62c" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.730358 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ca5984a-072e-4b36-840c-4a986fcd553e-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-rh62c\" (UID: \"8ca5984a-072e-4b36-840c-4a986fcd553e\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-rh62c" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.731353 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ca5984a-072e-4b36-840c-4a986fcd553e-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-rh62c\" (UID: \"8ca5984a-072e-4b36-840c-4a986fcd553e\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-rh62c" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.778418 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs2g6\" (UniqueName: \"kubernetes.io/projected/8ca5984a-072e-4b36-840c-4a986fcd553e-kube-api-access-hs2g6\") pod \"mysqld-exporter-openstack-cell1-db-create-rh62c\" (UID: \"8ca5984a-072e-4b36-840c-4a986fcd553e\") " 
pod="openstack/mysqld-exporter-openstack-cell1-db-create-rh62c" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.780093 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5b76d796f4-ht42f_590e8c8f-f969-46f8-9b98-7247dd0b2601/console/0.log" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.780211 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b76d796f4-ht42f" event={"ID":"590e8c8f-f969-46f8-9b98-7247dd0b2601","Type":"ContainerDied","Data":"59dec698dc06d39d8847df9c271311c59797a54403b8b603cce64aeee037b67a"} Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.780335 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b76d796f4-ht42f" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.780453 4835 scope.go:117] "RemoveContainer" containerID="a2b6e706ebbd2b4ed42bbc6ea9b0c06261bd4a4e924f30a3f3193061b774e635" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.796192 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qwlcw" event={"ID":"a63f5211-b3aa-4ebd-985a-736978f6591a","Type":"ContainerDied","Data":"b96af53a8c0be24da8f0fa08ac05cd04b38fdecbc49ad1919b924de19de8f996"} Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.796539 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b96af53a8c0be24da8f0fa08ac05cd04b38fdecbc49ad1919b924de19de8f996" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.796628 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-qwlcw" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.834057 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-j4wd9" event={"ID":"53929f33-eb5f-41e4-8845-d6be1087df58","Type":"ContainerStarted","Data":"0fd03acbfd496e8ddffa973a0e4ed57cba79d144752065f9670e7ea2e42471b1"} Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.837887 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0647dcc1-ea1b-4628-92cd-50e53602f0b7-operator-scripts\") pod \"mysqld-exporter-c348-account-create-update-qmvxx\" (UID: \"0647dcc1-ea1b-4628-92cd-50e53602f0b7\") " pod="openstack/mysqld-exporter-c348-account-create-update-qmvxx" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.838099 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k7dn\" (UniqueName: \"kubernetes.io/projected/0647dcc1-ea1b-4628-92cd-50e53602f0b7-kube-api-access-7k7dn\") pod \"mysqld-exporter-c348-account-create-update-qmvxx\" (UID: \"0647dcc1-ea1b-4628-92cd-50e53602f0b7\") " pod="openstack/mysqld-exporter-c348-account-create-update-qmvxx" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.842169 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0647dcc1-ea1b-4628-92cd-50e53602f0b7-operator-scripts\") pod \"mysqld-exporter-c348-account-create-update-qmvxx\" (UID: \"0647dcc1-ea1b-4628-92cd-50e53602f0b7\") " pod="openstack/mysqld-exporter-c348-account-create-update-qmvxx" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.866819 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k7dn\" (UniqueName: \"kubernetes.io/projected/0647dcc1-ea1b-4628-92cd-50e53602f0b7-kube-api-access-7k7dn\") pod 
\"mysqld-exporter-c348-account-create-update-qmvxx\" (UID: \"0647dcc1-ea1b-4628-92cd-50e53602f0b7\") " pod="openstack/mysqld-exporter-c348-account-create-update-qmvxx" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.874357 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-j4wd9" podStartSLOduration=3.432207073 podStartE2EDuration="8.874343501s" podCreationTimestamp="2026-03-19 09:46:17 +0000 UTC" firstStartedPulling="2026-03-19 09:46:18.435818765 +0000 UTC m=+1433.284417352" lastFinishedPulling="2026-03-19 09:46:23.877955193 +0000 UTC m=+1438.726553780" observedRunningTime="2026-03-19 09:46:25.853971259 +0000 UTC m=+1440.702569846" watchObservedRunningTime="2026-03-19 09:46:25.874343501 +0000 UTC m=+1440.722942088" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.881242 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-c348-account-create-update-qmvxx" Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.884365 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5b76d796f4-ht42f"] Mar 19 09:46:25 crc kubenswrapper[4835]: I0319 09:46:25.896960 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5b76d796f4-ht42f"] Mar 19 09:46:26 crc kubenswrapper[4835]: I0319 09:46:26.064878 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-rh62c" Mar 19 09:46:26 crc kubenswrapper[4835]: I0319 09:46:26.431850 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-cell1-db-secret" Mar 19 09:46:26 crc kubenswrapper[4835]: I0319 09:46:26.438910 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="590e8c8f-f969-46f8-9b98-7247dd0b2601" path="/var/lib/kubelet/pods/590e8c8f-f969-46f8-9b98-7247dd0b2601/volumes" Mar 19 09:46:26 crc kubenswrapper[4835]: I0319 09:46:26.439657 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-c348-account-create-update-qmvxx"] Mar 19 09:46:26 crc kubenswrapper[4835]: I0319 09:46:26.651508 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-rh62c"] Mar 19 09:46:26 crc kubenswrapper[4835]: W0319 09:46:26.660122 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ca5984a_072e_4b36_840c_4a986fcd553e.slice/crio-0b172350b6fcef68131d346680c708b18e72576cc75cc339c7486ad022c18574 WatchSource:0}: Error finding container 0b172350b6fcef68131d346680c708b18e72576cc75cc339c7486ad022c18574: Status 404 returned error can't find the container with id 0b172350b6fcef68131d346680c708b18e72576cc75cc339c7486ad022c18574 Mar 19 09:46:26 crc kubenswrapper[4835]: I0319 09:46:26.843504 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-c348-account-create-update-qmvxx" event={"ID":"0647dcc1-ea1b-4628-92cd-50e53602f0b7","Type":"ContainerStarted","Data":"6040eb1bad378f6d6b138fdb9e2b08cd9b49b0dae2584eb73ba09bf7889f2f54"} Mar 19 09:46:26 crc kubenswrapper[4835]: I0319 09:46:26.843873 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-c348-account-create-update-qmvxx" 
event={"ID":"0647dcc1-ea1b-4628-92cd-50e53602f0b7","Type":"ContainerStarted","Data":"8fe03ce5a21bbbdcf11af841f108a07b22b47d98bf3176e76752daed06e5ba5a"} Mar 19 09:46:26 crc kubenswrapper[4835]: I0319 09:46:26.849450 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-rh62c" event={"ID":"8ca5984a-072e-4b36-840c-4a986fcd553e","Type":"ContainerStarted","Data":"0b172350b6fcef68131d346680c708b18e72576cc75cc339c7486ad022c18574"} Mar 19 09:46:27 crc kubenswrapper[4835]: I0319 09:46:27.000184 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-jmqdb" Mar 19 09:46:27 crc kubenswrapper[4835]: I0319 09:46:27.025975 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-c348-account-create-update-qmvxx" podStartSLOduration=2.025950889 podStartE2EDuration="2.025950889s" podCreationTimestamp="2026-03-19 09:46:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:46:26.868671442 +0000 UTC m=+1441.717270029" watchObservedRunningTime="2026-03-19 09:46:27.025950889 +0000 UTC m=+1441.874549476" Mar 19 09:46:27 crc kubenswrapper[4835]: I0319 09:46:27.760335 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 19 09:46:27 crc kubenswrapper[4835]: I0319 09:46:27.813858 4835 scope.go:117] "RemoveContainer" containerID="a5740e3c8b5d0c8823d092baafda22605046c988cb61715f4229d4a9f01f5a9b" Mar 19 09:46:27 crc kubenswrapper[4835]: I0319 09:46:27.865357 4835 generic.go:334] "Generic (PLEG): container finished" podID="8ca5984a-072e-4b36-840c-4a986fcd553e" containerID="2d435cae20d88b390282bfa5cb87a0cf2f09ac071e9ea94af334dd7710f35551" exitCode=0 Mar 19 09:46:27 crc kubenswrapper[4835]: I0319 09:46:27.865584 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/mysqld-exporter-openstack-cell1-db-create-rh62c" event={"ID":"8ca5984a-072e-4b36-840c-4a986fcd553e","Type":"ContainerDied","Data":"2d435cae20d88b390282bfa5cb87a0cf2f09ac071e9ea94af334dd7710f35551"} Mar 19 09:46:27 crc kubenswrapper[4835]: I0319 09:46:27.870489 4835 generic.go:334] "Generic (PLEG): container finished" podID="0647dcc1-ea1b-4628-92cd-50e53602f0b7" containerID="6040eb1bad378f6d6b138fdb9e2b08cd9b49b0dae2584eb73ba09bf7889f2f54" exitCode=0 Mar 19 09:46:27 crc kubenswrapper[4835]: I0319 09:46:27.870542 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-c348-account-create-update-qmvxx" event={"ID":"0647dcc1-ea1b-4628-92cd-50e53602f0b7","Type":"ContainerDied","Data":"6040eb1bad378f6d6b138fdb9e2b08cd9b49b0dae2584eb73ba09bf7889f2f54"} Mar 19 09:46:28 crc kubenswrapper[4835]: I0319 09:46:28.263899 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-bp5cn"] Mar 19 09:46:28 crc kubenswrapper[4835]: I0319 09:46:28.265165 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-bp5cn" Mar 19 09:46:28 crc kubenswrapper[4835]: I0319 09:46:28.269130 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-2c5rl" Mar 19 09:46:28 crc kubenswrapper[4835]: I0319 09:46:28.269331 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 19 09:46:28 crc kubenswrapper[4835]: I0319 09:46:28.270622 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-bp5cn"] Mar 19 09:46:28 crc kubenswrapper[4835]: I0319 09:46:28.297449 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/441782eb-a245-4484-bf4c-de0c77ca19c2-db-sync-config-data\") pod \"glance-db-sync-bp5cn\" (UID: \"441782eb-a245-4484-bf4c-de0c77ca19c2\") " pod="openstack/glance-db-sync-bp5cn" Mar 19 09:46:28 crc kubenswrapper[4835]: I0319 09:46:28.297557 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn6bs\" (UniqueName: \"kubernetes.io/projected/441782eb-a245-4484-bf4c-de0c77ca19c2-kube-api-access-cn6bs\") pod \"glance-db-sync-bp5cn\" (UID: \"441782eb-a245-4484-bf4c-de0c77ca19c2\") " pod="openstack/glance-db-sync-bp5cn" Mar 19 09:46:28 crc kubenswrapper[4835]: I0319 09:46:28.297713 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/441782eb-a245-4484-bf4c-de0c77ca19c2-config-data\") pod \"glance-db-sync-bp5cn\" (UID: \"441782eb-a245-4484-bf4c-de0c77ca19c2\") " pod="openstack/glance-db-sync-bp5cn" Mar 19 09:46:28 crc kubenswrapper[4835]: I0319 09:46:28.297732 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/441782eb-a245-4484-bf4c-de0c77ca19c2-combined-ca-bundle\") pod \"glance-db-sync-bp5cn\" (UID: \"441782eb-a245-4484-bf4c-de0c77ca19c2\") " pod="openstack/glance-db-sync-bp5cn" Mar 19 09:46:28 crc kubenswrapper[4835]: I0319 09:46:28.399036 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/441782eb-a245-4484-bf4c-de0c77ca19c2-config-data\") pod \"glance-db-sync-bp5cn\" (UID: \"441782eb-a245-4484-bf4c-de0c77ca19c2\") " pod="openstack/glance-db-sync-bp5cn" Mar 19 09:46:28 crc kubenswrapper[4835]: I0319 09:46:28.399083 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/441782eb-a245-4484-bf4c-de0c77ca19c2-combined-ca-bundle\") pod \"glance-db-sync-bp5cn\" (UID: \"441782eb-a245-4484-bf4c-de0c77ca19c2\") " pod="openstack/glance-db-sync-bp5cn" Mar 19 09:46:28 crc kubenswrapper[4835]: I0319 09:46:28.399153 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/441782eb-a245-4484-bf4c-de0c77ca19c2-db-sync-config-data\") pod \"glance-db-sync-bp5cn\" (UID: \"441782eb-a245-4484-bf4c-de0c77ca19c2\") " pod="openstack/glance-db-sync-bp5cn" Mar 19 09:46:28 crc kubenswrapper[4835]: I0319 09:46:28.399227 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cn6bs\" (UniqueName: \"kubernetes.io/projected/441782eb-a245-4484-bf4c-de0c77ca19c2-kube-api-access-cn6bs\") pod \"glance-db-sync-bp5cn\" (UID: \"441782eb-a245-4484-bf4c-de0c77ca19c2\") " pod="openstack/glance-db-sync-bp5cn" Mar 19 09:46:28 crc kubenswrapper[4835]: I0319 09:46:28.406105 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/441782eb-a245-4484-bf4c-de0c77ca19c2-combined-ca-bundle\") pod \"glance-db-sync-bp5cn\" (UID: 
\"441782eb-a245-4484-bf4c-de0c77ca19c2\") " pod="openstack/glance-db-sync-bp5cn" Mar 19 09:46:28 crc kubenswrapper[4835]: I0319 09:46:28.407712 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/441782eb-a245-4484-bf4c-de0c77ca19c2-db-sync-config-data\") pod \"glance-db-sync-bp5cn\" (UID: \"441782eb-a245-4484-bf4c-de0c77ca19c2\") " pod="openstack/glance-db-sync-bp5cn" Mar 19 09:46:28 crc kubenswrapper[4835]: I0319 09:46:28.420486 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn6bs\" (UniqueName: \"kubernetes.io/projected/441782eb-a245-4484-bf4c-de0c77ca19c2-kube-api-access-cn6bs\") pod \"glance-db-sync-bp5cn\" (UID: \"441782eb-a245-4484-bf4c-de0c77ca19c2\") " pod="openstack/glance-db-sync-bp5cn" Mar 19 09:46:28 crc kubenswrapper[4835]: I0319 09:46:28.420493 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/441782eb-a245-4484-bf4c-de0c77ca19c2-config-data\") pod \"glance-db-sync-bp5cn\" (UID: \"441782eb-a245-4484-bf4c-de0c77ca19c2\") " pod="openstack/glance-db-sync-bp5cn" Mar 19 09:46:28 crc kubenswrapper[4835]: I0319 09:46:28.619655 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-bp5cn" Mar 19 09:46:29 crc kubenswrapper[4835]: I0319 09:46:29.511225 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-bp5cn"] Mar 19 09:46:29 crc kubenswrapper[4835]: I0319 09:46:29.672811 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-rh62c" Mar 19 09:46:29 crc kubenswrapper[4835]: I0319 09:46:29.680818 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-c348-account-create-update-qmvxx" Mar 19 09:46:29 crc kubenswrapper[4835]: I0319 09:46:29.727919 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hs2g6\" (UniqueName: \"kubernetes.io/projected/8ca5984a-072e-4b36-840c-4a986fcd553e-kube-api-access-hs2g6\") pod \"8ca5984a-072e-4b36-840c-4a986fcd553e\" (UID: \"8ca5984a-072e-4b36-840c-4a986fcd553e\") " Mar 19 09:46:29 crc kubenswrapper[4835]: I0319 09:46:29.727975 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7k7dn\" (UniqueName: \"kubernetes.io/projected/0647dcc1-ea1b-4628-92cd-50e53602f0b7-kube-api-access-7k7dn\") pod \"0647dcc1-ea1b-4628-92cd-50e53602f0b7\" (UID: \"0647dcc1-ea1b-4628-92cd-50e53602f0b7\") " Mar 19 09:46:29 crc kubenswrapper[4835]: I0319 09:46:29.729182 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ca5984a-072e-4b36-840c-4a986fcd553e-operator-scripts\") pod \"8ca5984a-072e-4b36-840c-4a986fcd553e\" (UID: \"8ca5984a-072e-4b36-840c-4a986fcd553e\") " Mar 19 09:46:29 crc kubenswrapper[4835]: I0319 09:46:29.729865 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0647dcc1-ea1b-4628-92cd-50e53602f0b7-operator-scripts\") pod \"0647dcc1-ea1b-4628-92cd-50e53602f0b7\" (UID: \"0647dcc1-ea1b-4628-92cd-50e53602f0b7\") " Mar 19 09:46:29 crc kubenswrapper[4835]: I0319 09:46:29.730534 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ca5984a-072e-4b36-840c-4a986fcd553e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8ca5984a-072e-4b36-840c-4a986fcd553e" (UID: "8ca5984a-072e-4b36-840c-4a986fcd553e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:29 crc kubenswrapper[4835]: I0319 09:46:29.731303 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0647dcc1-ea1b-4628-92cd-50e53602f0b7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0647dcc1-ea1b-4628-92cd-50e53602f0b7" (UID: "0647dcc1-ea1b-4628-92cd-50e53602f0b7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:29 crc kubenswrapper[4835]: I0319 09:46:29.734950 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0647dcc1-ea1b-4628-92cd-50e53602f0b7-kube-api-access-7k7dn" (OuterVolumeSpecName: "kube-api-access-7k7dn") pod "0647dcc1-ea1b-4628-92cd-50e53602f0b7" (UID: "0647dcc1-ea1b-4628-92cd-50e53602f0b7"). InnerVolumeSpecName "kube-api-access-7k7dn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:46:29 crc kubenswrapper[4835]: I0319 09:46:29.747843 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ca5984a-072e-4b36-840c-4a986fcd553e-kube-api-access-hs2g6" (OuterVolumeSpecName: "kube-api-access-hs2g6") pod "8ca5984a-072e-4b36-840c-4a986fcd553e" (UID: "8ca5984a-072e-4b36-840c-4a986fcd553e"). InnerVolumeSpecName "kube-api-access-hs2g6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:46:29 crc kubenswrapper[4835]: I0319 09:46:29.833270 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hs2g6\" (UniqueName: \"kubernetes.io/projected/8ca5984a-072e-4b36-840c-4a986fcd553e-kube-api-access-hs2g6\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:29 crc kubenswrapper[4835]: I0319 09:46:29.833299 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7k7dn\" (UniqueName: \"kubernetes.io/projected/0647dcc1-ea1b-4628-92cd-50e53602f0b7-kube-api-access-7k7dn\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:29 crc kubenswrapper[4835]: I0319 09:46:29.833310 4835 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ca5984a-072e-4b36-840c-4a986fcd553e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:29 crc kubenswrapper[4835]: I0319 09:46:29.833320 4835 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0647dcc1-ea1b-4628-92cd-50e53602f0b7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:29 crc kubenswrapper[4835]: I0319 09:46:29.897651 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0","Type":"ContainerStarted","Data":"96620919ddf56cff4cea8ce323102f07abcc81e112ae104a504bd558742c5111"} Mar 19 09:46:29 crc kubenswrapper[4835]: I0319 09:46:29.901228 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-c348-account-create-update-qmvxx" Mar 19 09:46:29 crc kubenswrapper[4835]: I0319 09:46:29.901230 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-c348-account-create-update-qmvxx" event={"ID":"0647dcc1-ea1b-4628-92cd-50e53602f0b7","Type":"ContainerDied","Data":"8fe03ce5a21bbbdcf11af841f108a07b22b47d98bf3176e76752daed06e5ba5a"} Mar 19 09:46:29 crc kubenswrapper[4835]: I0319 09:46:29.901369 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fe03ce5a21bbbdcf11af841f108a07b22b47d98bf3176e76752daed06e5ba5a" Mar 19 09:46:29 crc kubenswrapper[4835]: I0319 09:46:29.902595 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-bp5cn" event={"ID":"441782eb-a245-4484-bf4c-de0c77ca19c2","Type":"ContainerStarted","Data":"302d387ff7de73e079277ed6175bc697f1f7cbd29075bbf53c608fa8bdfddb96"} Mar 19 09:46:29 crc kubenswrapper[4835]: I0319 09:46:29.903801 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-rh62c" event={"ID":"8ca5984a-072e-4b36-840c-4a986fcd553e","Type":"ContainerDied","Data":"0b172350b6fcef68131d346680c708b18e72576cc75cc339c7486ad022c18574"} Mar 19 09:46:29 crc kubenswrapper[4835]: I0319 09:46:29.903841 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b172350b6fcef68131d346680c708b18e72576cc75cc339c7486ad022c18574" Mar 19 09:46:29 crc kubenswrapper[4835]: I0319 09:46:29.903976 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-rh62c" Mar 19 09:46:29 crc kubenswrapper[4835]: I0319 09:46:29.933665 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=15.905551465 podStartE2EDuration="54.933123093s" podCreationTimestamp="2026-03-19 09:45:35 +0000 UTC" firstStartedPulling="2026-03-19 09:45:49.702598879 +0000 UTC m=+1404.551197466" lastFinishedPulling="2026-03-19 09:46:28.730170507 +0000 UTC m=+1443.578769094" observedRunningTime="2026-03-19 09:46:29.926246309 +0000 UTC m=+1444.774844906" watchObservedRunningTime="2026-03-19 09:46:29.933123093 +0000 UTC m=+1444.781721670" Mar 19 09:46:30 crc kubenswrapper[4835]: I0319 09:46:30.875010 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-z5zlv" Mar 19 09:46:30 crc kubenswrapper[4835]: I0319 09:46:30.941650 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-jmqdb"] Mar 19 09:46:30 crc kubenswrapper[4835]: I0319 09:46:30.941916 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-jmqdb" podUID="5e1c1800-e0f5-49d6-a8ba-2f6184a6343f" containerName="dnsmasq-dns" containerID="cri-o://ff93c2870d7dfa38465801a7407b46d9358ad36934f6833fd3b41cc8dda1d6ab" gracePeriod=10 Mar 19 09:46:31 crc kubenswrapper[4835]: I0319 09:46:31.160477 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-qwlcw"] Mar 19 09:46:31 crc kubenswrapper[4835]: I0319 09:46:31.169978 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-qwlcw"] Mar 19 09:46:31 crc kubenswrapper[4835]: I0319 09:46:31.578509 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-jmqdb" Mar 19 09:46:31 crc kubenswrapper[4835]: I0319 09:46:31.681015 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e1c1800-e0f5-49d6-a8ba-2f6184a6343f-config\") pod \"5e1c1800-e0f5-49d6-a8ba-2f6184a6343f\" (UID: \"5e1c1800-e0f5-49d6-a8ba-2f6184a6343f\") " Mar 19 09:46:31 crc kubenswrapper[4835]: I0319 09:46:31.681088 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e1c1800-e0f5-49d6-a8ba-2f6184a6343f-ovsdbserver-nb\") pod \"5e1c1800-e0f5-49d6-a8ba-2f6184a6343f\" (UID: \"5e1c1800-e0f5-49d6-a8ba-2f6184a6343f\") " Mar 19 09:46:31 crc kubenswrapper[4835]: I0319 09:46:31.681136 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e1c1800-e0f5-49d6-a8ba-2f6184a6343f-ovsdbserver-sb\") pod \"5e1c1800-e0f5-49d6-a8ba-2f6184a6343f\" (UID: \"5e1c1800-e0f5-49d6-a8ba-2f6184a6343f\") " Mar 19 09:46:31 crc kubenswrapper[4835]: I0319 09:46:31.681172 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e1c1800-e0f5-49d6-a8ba-2f6184a6343f-dns-svc\") pod \"5e1c1800-e0f5-49d6-a8ba-2f6184a6343f\" (UID: \"5e1c1800-e0f5-49d6-a8ba-2f6184a6343f\") " Mar 19 09:46:31 crc kubenswrapper[4835]: I0319 09:46:31.682213 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcrvl\" (UniqueName: \"kubernetes.io/projected/5e1c1800-e0f5-49d6-a8ba-2f6184a6343f-kube-api-access-jcrvl\") pod \"5e1c1800-e0f5-49d6-a8ba-2f6184a6343f\" (UID: \"5e1c1800-e0f5-49d6-a8ba-2f6184a6343f\") " Mar 19 09:46:31 crc kubenswrapper[4835]: I0319 09:46:31.687689 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/5e1c1800-e0f5-49d6-a8ba-2f6184a6343f-kube-api-access-jcrvl" (OuterVolumeSpecName: "kube-api-access-jcrvl") pod "5e1c1800-e0f5-49d6-a8ba-2f6184a6343f" (UID: "5e1c1800-e0f5-49d6-a8ba-2f6184a6343f"). InnerVolumeSpecName "kube-api-access-jcrvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:46:31 crc kubenswrapper[4835]: I0319 09:46:31.743242 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e1c1800-e0f5-49d6-a8ba-2f6184a6343f-config" (OuterVolumeSpecName: "config") pod "5e1c1800-e0f5-49d6-a8ba-2f6184a6343f" (UID: "5e1c1800-e0f5-49d6-a8ba-2f6184a6343f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:31 crc kubenswrapper[4835]: I0319 09:46:31.745080 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e1c1800-e0f5-49d6-a8ba-2f6184a6343f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5e1c1800-e0f5-49d6-a8ba-2f6184a6343f" (UID: "5e1c1800-e0f5-49d6-a8ba-2f6184a6343f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:31 crc kubenswrapper[4835]: I0319 09:46:31.752354 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e1c1800-e0f5-49d6-a8ba-2f6184a6343f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5e1c1800-e0f5-49d6-a8ba-2f6184a6343f" (UID: "5e1c1800-e0f5-49d6-a8ba-2f6184a6343f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:31 crc kubenswrapper[4835]: I0319 09:46:31.770026 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e1c1800-e0f5-49d6-a8ba-2f6184a6343f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5e1c1800-e0f5-49d6-a8ba-2f6184a6343f" (UID: "5e1c1800-e0f5-49d6-a8ba-2f6184a6343f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:31 crc kubenswrapper[4835]: I0319 09:46:31.772336 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:31 crc kubenswrapper[4835]: I0319 09:46:31.785020 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e1c1800-e0f5-49d6-a8ba-2f6184a6343f-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:31 crc kubenswrapper[4835]: I0319 09:46:31.785054 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e1c1800-e0f5-49d6-a8ba-2f6184a6343f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:31 crc kubenswrapper[4835]: I0319 09:46:31.785069 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e1c1800-e0f5-49d6-a8ba-2f6184a6343f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:31 crc kubenswrapper[4835]: I0319 09:46:31.785084 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e1c1800-e0f5-49d6-a8ba-2f6184a6343f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:31 crc kubenswrapper[4835]: I0319 09:46:31.785096 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcrvl\" (UniqueName: \"kubernetes.io/projected/5e1c1800-e0f5-49d6-a8ba-2f6184a6343f-kube-api-access-jcrvl\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:31 crc kubenswrapper[4835]: I0319 09:46:31.936048 4835 generic.go:334] "Generic (PLEG): container finished" podID="5e1c1800-e0f5-49d6-a8ba-2f6184a6343f" containerID="ff93c2870d7dfa38465801a7407b46d9358ad36934f6833fd3b41cc8dda1d6ab" exitCode=0 Mar 19 09:46:31 crc kubenswrapper[4835]: I0319 09:46:31.936117 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-jmqdb" Mar 19 09:46:31 crc kubenswrapper[4835]: I0319 09:46:31.936154 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-jmqdb" event={"ID":"5e1c1800-e0f5-49d6-a8ba-2f6184a6343f","Type":"ContainerDied","Data":"ff93c2870d7dfa38465801a7407b46d9358ad36934f6833fd3b41cc8dda1d6ab"} Mar 19 09:46:31 crc kubenswrapper[4835]: I0319 09:46:31.936448 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-jmqdb" event={"ID":"5e1c1800-e0f5-49d6-a8ba-2f6184a6343f","Type":"ContainerDied","Data":"14aba6c821a61d2ef9dda06e86869e1b037f58840d094a3c5440e8f0cacccfce"} Mar 19 09:46:31 crc kubenswrapper[4835]: I0319 09:46:31.936484 4835 scope.go:117] "RemoveContainer" containerID="ff93c2870d7dfa38465801a7407b46d9358ad36934f6833fd3b41cc8dda1d6ab" Mar 19 09:46:31 crc kubenswrapper[4835]: I0319 09:46:31.968533 4835 scope.go:117] "RemoveContainer" containerID="b0212c22dc9bf67763219da1272c09859d684bdcdd445d85450add27801c17c5" Mar 19 09:46:31 crc kubenswrapper[4835]: I0319 09:46:31.977395 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-jmqdb"] Mar 19 09:46:31 crc kubenswrapper[4835]: I0319 09:46:31.990009 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-jmqdb"] Mar 19 09:46:32 crc kubenswrapper[4835]: I0319 09:46:32.000361 4835 scope.go:117] "RemoveContainer" containerID="ff93c2870d7dfa38465801a7407b46d9358ad36934f6833fd3b41cc8dda1d6ab" Mar 19 09:46:32 crc kubenswrapper[4835]: E0319 09:46:32.000836 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff93c2870d7dfa38465801a7407b46d9358ad36934f6833fd3b41cc8dda1d6ab\": container with ID starting with ff93c2870d7dfa38465801a7407b46d9358ad36934f6833fd3b41cc8dda1d6ab not found: ID does not exist" 
containerID="ff93c2870d7dfa38465801a7407b46d9358ad36934f6833fd3b41cc8dda1d6ab" Mar 19 09:46:32 crc kubenswrapper[4835]: I0319 09:46:32.000867 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff93c2870d7dfa38465801a7407b46d9358ad36934f6833fd3b41cc8dda1d6ab"} err="failed to get container status \"ff93c2870d7dfa38465801a7407b46d9358ad36934f6833fd3b41cc8dda1d6ab\": rpc error: code = NotFound desc = could not find container \"ff93c2870d7dfa38465801a7407b46d9358ad36934f6833fd3b41cc8dda1d6ab\": container with ID starting with ff93c2870d7dfa38465801a7407b46d9358ad36934f6833fd3b41cc8dda1d6ab not found: ID does not exist" Mar 19 09:46:32 crc kubenswrapper[4835]: I0319 09:46:32.000889 4835 scope.go:117] "RemoveContainer" containerID="b0212c22dc9bf67763219da1272c09859d684bdcdd445d85450add27801c17c5" Mar 19 09:46:32 crc kubenswrapper[4835]: E0319 09:46:32.001333 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0212c22dc9bf67763219da1272c09859d684bdcdd445d85450add27801c17c5\": container with ID starting with b0212c22dc9bf67763219da1272c09859d684bdcdd445d85450add27801c17c5 not found: ID does not exist" containerID="b0212c22dc9bf67763219da1272c09859d684bdcdd445d85450add27801c17c5" Mar 19 09:46:32 crc kubenswrapper[4835]: I0319 09:46:32.001353 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0212c22dc9bf67763219da1272c09859d684bdcdd445d85450add27801c17c5"} err="failed to get container status \"b0212c22dc9bf67763219da1272c09859d684bdcdd445d85450add27801c17c5\": rpc error: code = NotFound desc = could not find container \"b0212c22dc9bf67763219da1272c09859d684bdcdd445d85450add27801c17c5\": container with ID starting with b0212c22dc9bf67763219da1272c09859d684bdcdd445d85450add27801c17c5 not found: ID does not exist" Mar 19 09:46:32 crc kubenswrapper[4835]: I0319 09:46:32.394784 4835 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/99792633-55f8-4a37-b7d8-ae770406c69d-etc-swift\") pod \"swift-storage-0\" (UID: \"99792633-55f8-4a37-b7d8-ae770406c69d\") " pod="openstack/swift-storage-0" Mar 19 09:46:32 crc kubenswrapper[4835]: E0319 09:46:32.394991 4835 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 09:46:32 crc kubenswrapper[4835]: E0319 09:46:32.395016 4835 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 09:46:32 crc kubenswrapper[4835]: E0319 09:46:32.395074 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/99792633-55f8-4a37-b7d8-ae770406c69d-etc-swift podName:99792633-55f8-4a37-b7d8-ae770406c69d nodeName:}" failed. No retries permitted until 2026-03-19 09:46:48.395057234 +0000 UTC m=+1463.243655821 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/99792633-55f8-4a37-b7d8-ae770406c69d-etc-swift") pod "swift-storage-0" (UID: "99792633-55f8-4a37-b7d8-ae770406c69d") : configmap "swift-ring-files" not found Mar 19 09:46:32 crc kubenswrapper[4835]: I0319 09:46:32.416533 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e1c1800-e0f5-49d6-a8ba-2f6184a6343f" path="/var/lib/kubelet/pods/5e1c1800-e0f5-49d6-a8ba-2f6184a6343f/volumes" Mar 19 09:46:32 crc kubenswrapper[4835]: I0319 09:46:32.417357 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a63f5211-b3aa-4ebd-985a-736978f6591a" path="/var/lib/kubelet/pods/a63f5211-b3aa-4ebd-985a-736978f6591a/volumes" Mar 19 09:46:33 crc kubenswrapper[4835]: I0319 09:46:33.270985 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-nfn4c" podUID="fc6a1eab-3ebc-49e6-bb19-f4f127b416a6" containerName="ovn-controller" probeResult="failure" output=< Mar 19 09:46:33 crc kubenswrapper[4835]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 19 09:46:33 crc kubenswrapper[4835]: > Mar 19 09:46:33 crc kubenswrapper[4835]: I0319 09:46:33.337186 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-dht6h" Mar 19 09:46:33 crc kubenswrapper[4835]: I0319 09:46:33.341603 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-dht6h" Mar 19 09:46:33 crc kubenswrapper[4835]: I0319 09:46:33.591187 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-nfn4c-config-579mr"] Mar 19 09:46:33 crc kubenswrapper[4835]: E0319 09:46:33.591755 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0647dcc1-ea1b-4628-92cd-50e53602f0b7" containerName="mariadb-account-create-update" Mar 19 09:46:33 crc kubenswrapper[4835]: I0319 09:46:33.591780 4835 
state_mem.go:107] "Deleted CPUSet assignment" podUID="0647dcc1-ea1b-4628-92cd-50e53602f0b7" containerName="mariadb-account-create-update" Mar 19 09:46:33 crc kubenswrapper[4835]: E0319 09:46:33.591805 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e1c1800-e0f5-49d6-a8ba-2f6184a6343f" containerName="init" Mar 19 09:46:33 crc kubenswrapper[4835]: I0319 09:46:33.591814 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e1c1800-e0f5-49d6-a8ba-2f6184a6343f" containerName="init" Mar 19 09:46:33 crc kubenswrapper[4835]: E0319 09:46:33.591846 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ca5984a-072e-4b36-840c-4a986fcd553e" containerName="mariadb-database-create" Mar 19 09:46:33 crc kubenswrapper[4835]: I0319 09:46:33.591854 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ca5984a-072e-4b36-840c-4a986fcd553e" containerName="mariadb-database-create" Mar 19 09:46:33 crc kubenswrapper[4835]: E0319 09:46:33.591866 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e1c1800-e0f5-49d6-a8ba-2f6184a6343f" containerName="dnsmasq-dns" Mar 19 09:46:33 crc kubenswrapper[4835]: I0319 09:46:33.591873 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e1c1800-e0f5-49d6-a8ba-2f6184a6343f" containerName="dnsmasq-dns" Mar 19 09:46:33 crc kubenswrapper[4835]: I0319 09:46:33.592102 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="0647dcc1-ea1b-4628-92cd-50e53602f0b7" containerName="mariadb-account-create-update" Mar 19 09:46:33 crc kubenswrapper[4835]: I0319 09:46:33.592121 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ca5984a-072e-4b36-840c-4a986fcd553e" containerName="mariadb-database-create" Mar 19 09:46:33 crc kubenswrapper[4835]: I0319 09:46:33.592129 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e1c1800-e0f5-49d6-a8ba-2f6184a6343f" containerName="dnsmasq-dns" Mar 19 09:46:33 crc kubenswrapper[4835]: I0319 
09:46:33.593032 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nfn4c-config-579mr" Mar 19 09:46:33 crc kubenswrapper[4835]: I0319 09:46:33.595715 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 19 09:46:33 crc kubenswrapper[4835]: I0319 09:46:33.600395 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nfn4c-config-579mr"] Mar 19 09:46:33 crc kubenswrapper[4835]: I0319 09:46:33.623221 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f634084c-f2f8-41c1-8a66-4c1755ba1075-var-run-ovn\") pod \"ovn-controller-nfn4c-config-579mr\" (UID: \"f634084c-f2f8-41c1-8a66-4c1755ba1075\") " pod="openstack/ovn-controller-nfn4c-config-579mr" Mar 19 09:46:33 crc kubenswrapper[4835]: I0319 09:46:33.623377 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f634084c-f2f8-41c1-8a66-4c1755ba1075-var-log-ovn\") pod \"ovn-controller-nfn4c-config-579mr\" (UID: \"f634084c-f2f8-41c1-8a66-4c1755ba1075\") " pod="openstack/ovn-controller-nfn4c-config-579mr" Mar 19 09:46:33 crc kubenswrapper[4835]: I0319 09:46:33.623398 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f634084c-f2f8-41c1-8a66-4c1755ba1075-additional-scripts\") pod \"ovn-controller-nfn4c-config-579mr\" (UID: \"f634084c-f2f8-41c1-8a66-4c1755ba1075\") " pod="openstack/ovn-controller-nfn4c-config-579mr" Mar 19 09:46:33 crc kubenswrapper[4835]: I0319 09:46:33.623430 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f634084c-f2f8-41c1-8a66-4c1755ba1075-var-run\") 
pod \"ovn-controller-nfn4c-config-579mr\" (UID: \"f634084c-f2f8-41c1-8a66-4c1755ba1075\") " pod="openstack/ovn-controller-nfn4c-config-579mr" Mar 19 09:46:33 crc kubenswrapper[4835]: I0319 09:46:33.623477 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f634084c-f2f8-41c1-8a66-4c1755ba1075-scripts\") pod \"ovn-controller-nfn4c-config-579mr\" (UID: \"f634084c-f2f8-41c1-8a66-4c1755ba1075\") " pod="openstack/ovn-controller-nfn4c-config-579mr" Mar 19 09:46:33 crc kubenswrapper[4835]: I0319 09:46:33.623495 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6df7w\" (UniqueName: \"kubernetes.io/projected/f634084c-f2f8-41c1-8a66-4c1755ba1075-kube-api-access-6df7w\") pod \"ovn-controller-nfn4c-config-579mr\" (UID: \"f634084c-f2f8-41c1-8a66-4c1755ba1075\") " pod="openstack/ovn-controller-nfn4c-config-579mr" Mar 19 09:46:33 crc kubenswrapper[4835]: I0319 09:46:33.725451 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f634084c-f2f8-41c1-8a66-4c1755ba1075-var-log-ovn\") pod \"ovn-controller-nfn4c-config-579mr\" (UID: \"f634084c-f2f8-41c1-8a66-4c1755ba1075\") " pod="openstack/ovn-controller-nfn4c-config-579mr" Mar 19 09:46:33 crc kubenswrapper[4835]: I0319 09:46:33.725523 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f634084c-f2f8-41c1-8a66-4c1755ba1075-additional-scripts\") pod \"ovn-controller-nfn4c-config-579mr\" (UID: \"f634084c-f2f8-41c1-8a66-4c1755ba1075\") " pod="openstack/ovn-controller-nfn4c-config-579mr" Mar 19 09:46:33 crc kubenswrapper[4835]: I0319 09:46:33.725573 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/f634084c-f2f8-41c1-8a66-4c1755ba1075-var-run\") pod \"ovn-controller-nfn4c-config-579mr\" (UID: \"f634084c-f2f8-41c1-8a66-4c1755ba1075\") " pod="openstack/ovn-controller-nfn4c-config-579mr" Mar 19 09:46:33 crc kubenswrapper[4835]: I0319 09:46:33.725651 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f634084c-f2f8-41c1-8a66-4c1755ba1075-scripts\") pod \"ovn-controller-nfn4c-config-579mr\" (UID: \"f634084c-f2f8-41c1-8a66-4c1755ba1075\") " pod="openstack/ovn-controller-nfn4c-config-579mr" Mar 19 09:46:33 crc kubenswrapper[4835]: I0319 09:46:33.725675 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6df7w\" (UniqueName: \"kubernetes.io/projected/f634084c-f2f8-41c1-8a66-4c1755ba1075-kube-api-access-6df7w\") pod \"ovn-controller-nfn4c-config-579mr\" (UID: \"f634084c-f2f8-41c1-8a66-4c1755ba1075\") " pod="openstack/ovn-controller-nfn4c-config-579mr" Mar 19 09:46:33 crc kubenswrapper[4835]: I0319 09:46:33.725719 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f634084c-f2f8-41c1-8a66-4c1755ba1075-var-log-ovn\") pod \"ovn-controller-nfn4c-config-579mr\" (UID: \"f634084c-f2f8-41c1-8a66-4c1755ba1075\") " pod="openstack/ovn-controller-nfn4c-config-579mr" Mar 19 09:46:33 crc kubenswrapper[4835]: I0319 09:46:33.725779 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f634084c-f2f8-41c1-8a66-4c1755ba1075-var-run-ovn\") pod \"ovn-controller-nfn4c-config-579mr\" (UID: \"f634084c-f2f8-41c1-8a66-4c1755ba1075\") " pod="openstack/ovn-controller-nfn4c-config-579mr" Mar 19 09:46:33 crc kubenswrapper[4835]: I0319 09:46:33.725901 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/f634084c-f2f8-41c1-8a66-4c1755ba1075-var-run\") pod \"ovn-controller-nfn4c-config-579mr\" (UID: \"f634084c-f2f8-41c1-8a66-4c1755ba1075\") " pod="openstack/ovn-controller-nfn4c-config-579mr" Mar 19 09:46:33 crc kubenswrapper[4835]: I0319 09:46:33.725941 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f634084c-f2f8-41c1-8a66-4c1755ba1075-var-run-ovn\") pod \"ovn-controller-nfn4c-config-579mr\" (UID: \"f634084c-f2f8-41c1-8a66-4c1755ba1075\") " pod="openstack/ovn-controller-nfn4c-config-579mr" Mar 19 09:46:33 crc kubenswrapper[4835]: I0319 09:46:33.726304 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f634084c-f2f8-41c1-8a66-4c1755ba1075-additional-scripts\") pod \"ovn-controller-nfn4c-config-579mr\" (UID: \"f634084c-f2f8-41c1-8a66-4c1755ba1075\") " pod="openstack/ovn-controller-nfn4c-config-579mr" Mar 19 09:46:33 crc kubenswrapper[4835]: I0319 09:46:33.727562 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f634084c-f2f8-41c1-8a66-4c1755ba1075-scripts\") pod \"ovn-controller-nfn4c-config-579mr\" (UID: \"f634084c-f2f8-41c1-8a66-4c1755ba1075\") " pod="openstack/ovn-controller-nfn4c-config-579mr" Mar 19 09:46:33 crc kubenswrapper[4835]: I0319 09:46:33.771265 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6df7w\" (UniqueName: \"kubernetes.io/projected/f634084c-f2f8-41c1-8a66-4c1755ba1075-kube-api-access-6df7w\") pod \"ovn-controller-nfn4c-config-579mr\" (UID: \"f634084c-f2f8-41c1-8a66-4c1755ba1075\") " pod="openstack/ovn-controller-nfn4c-config-579mr" Mar 19 09:46:33 crc kubenswrapper[4835]: I0319 09:46:33.925649 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nfn4c-config-579mr" Mar 19 09:46:33 crc kubenswrapper[4835]: I0319 09:46:33.962604 4835 generic.go:334] "Generic (PLEG): container finished" podID="53929f33-eb5f-41e4-8845-d6be1087df58" containerID="0fd03acbfd496e8ddffa973a0e4ed57cba79d144752065f9670e7ea2e42471b1" exitCode=0 Mar 19 09:46:33 crc kubenswrapper[4835]: I0319 09:46:33.963554 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-j4wd9" event={"ID":"53929f33-eb5f-41e4-8845-d6be1087df58","Type":"ContainerDied","Data":"0fd03acbfd496e8ddffa973a0e4ed57cba79d144752065f9670e7ea2e42471b1"} Mar 19 09:46:34 crc kubenswrapper[4835]: I0319 09:46:34.426557 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nfn4c-config-579mr"] Mar 19 09:46:34 crc kubenswrapper[4835]: I0319 09:46:34.973941 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nfn4c-config-579mr" event={"ID":"f634084c-f2f8-41c1-8a66-4c1755ba1075","Type":"ContainerStarted","Data":"df9838883e700d81230f5a239c45931c4f3e81bf7b9efe1db7e3363ab93184eb"} Mar 19 09:46:34 crc kubenswrapper[4835]: I0319 09:46:34.974932 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nfn4c-config-579mr" event={"ID":"f634084c-f2f8-41c1-8a66-4c1755ba1075","Type":"ContainerStarted","Data":"b2f2db96a8c1d793cdefcbd5475637c79ab8e35aae1ff1f723b36b1276c85626"} Mar 19 09:46:34 crc kubenswrapper[4835]: I0319 09:46:34.990897 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-nfn4c-config-579mr" podStartSLOduration=1.990878778 podStartE2EDuration="1.990878778s" podCreationTimestamp="2026-03-19 09:46:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:46:34.988823034 +0000 UTC m=+1449.837421631" watchObservedRunningTime="2026-03-19 09:46:34.990878778 
+0000 UTC m=+1449.839477375" Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.467784 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-j4wd9" Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.528756 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/53929f33-eb5f-41e4-8845-d6be1087df58-dispersionconf\") pod \"53929f33-eb5f-41e4-8845-d6be1087df58\" (UID: \"53929f33-eb5f-41e4-8845-d6be1087df58\") " Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.528860 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/53929f33-eb5f-41e4-8845-d6be1087df58-ring-data-devices\") pod \"53929f33-eb5f-41e4-8845-d6be1087df58\" (UID: \"53929f33-eb5f-41e4-8845-d6be1087df58\") " Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.528991 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53929f33-eb5f-41e4-8845-d6be1087df58-scripts\") pod \"53929f33-eb5f-41e4-8845-d6be1087df58\" (UID: \"53929f33-eb5f-41e4-8845-d6be1087df58\") " Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.529121 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvs6s\" (UniqueName: \"kubernetes.io/projected/53929f33-eb5f-41e4-8845-d6be1087df58-kube-api-access-tvs6s\") pod \"53929f33-eb5f-41e4-8845-d6be1087df58\" (UID: \"53929f33-eb5f-41e4-8845-d6be1087df58\") " Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.529252 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/53929f33-eb5f-41e4-8845-d6be1087df58-etc-swift\") pod \"53929f33-eb5f-41e4-8845-d6be1087df58\" (UID: \"53929f33-eb5f-41e4-8845-d6be1087df58\") " Mar 19 09:46:35 crc 
kubenswrapper[4835]: I0319 09:46:35.529421 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/53929f33-eb5f-41e4-8845-d6be1087df58-swiftconf\") pod \"53929f33-eb5f-41e4-8845-d6be1087df58\" (UID: \"53929f33-eb5f-41e4-8845-d6be1087df58\") " Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.529459 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53929f33-eb5f-41e4-8845-d6be1087df58-combined-ca-bundle\") pod \"53929f33-eb5f-41e4-8845-d6be1087df58\" (UID: \"53929f33-eb5f-41e4-8845-d6be1087df58\") " Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.534581 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53929f33-eb5f-41e4-8845-d6be1087df58-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "53929f33-eb5f-41e4-8845-d6be1087df58" (UID: "53929f33-eb5f-41e4-8845-d6be1087df58"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.538147 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53929f33-eb5f-41e4-8845-d6be1087df58-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "53929f33-eb5f-41e4-8845-d6be1087df58" (UID: "53929f33-eb5f-41e4-8845-d6be1087df58"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.579614 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53929f33-eb5f-41e4-8845-d6be1087df58-kube-api-access-tvs6s" (OuterVolumeSpecName: "kube-api-access-tvs6s") pod "53929f33-eb5f-41e4-8845-d6be1087df58" (UID: "53929f33-eb5f-41e4-8845-d6be1087df58"). InnerVolumeSpecName "kube-api-access-tvs6s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.592091 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53929f33-eb5f-41e4-8845-d6be1087df58-scripts" (OuterVolumeSpecName: "scripts") pod "53929f33-eb5f-41e4-8845-d6be1087df58" (UID: "53929f33-eb5f-41e4-8845-d6be1087df58"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.593221 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Mar 19 09:46:35 crc kubenswrapper[4835]: E0319 09:46:35.593786 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53929f33-eb5f-41e4-8845-d6be1087df58" containerName="swift-ring-rebalance" Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.593801 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="53929f33-eb5f-41e4-8845-d6be1087df58" containerName="swift-ring-rebalance" Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.594002 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="53929f33-eb5f-41e4-8845-d6be1087df58" containerName="swift-ring-rebalance" Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.600734 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.611717 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.611894 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53929f33-eb5f-41e4-8845-d6be1087df58-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "53929f33-eb5f-41e4-8845-d6be1087df58" (UID: "53929f33-eb5f-41e4-8845-d6be1087df58"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.612114 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.612593 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53929f33-eb5f-41e4-8845-d6be1087df58-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53929f33-eb5f-41e4-8845-d6be1087df58" (UID: "53929f33-eb5f-41e4-8845-d6be1087df58"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.612847 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53929f33-eb5f-41e4-8845-d6be1087df58-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "53929f33-eb5f-41e4-8845-d6be1087df58" (UID: "53929f33-eb5f-41e4-8845-d6be1087df58"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.631343 4835 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/53929f33-eb5f-41e4-8845-d6be1087df58-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.631369 4835 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/53929f33-eb5f-41e4-8845-d6be1087df58-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.631379 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53929f33-eb5f-41e4-8845-d6be1087df58-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.631388 4835 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/53929f33-eb5f-41e4-8845-d6be1087df58-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.631397 4835 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/53929f33-eb5f-41e4-8845-d6be1087df58-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.631405 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53929f33-eb5f-41e4-8845-d6be1087df58-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.631413 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvs6s\" (UniqueName: \"kubernetes.io/projected/53929f33-eb5f-41e4-8845-d6be1087df58-kube-api-access-tvs6s\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.734709 4835 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/677d6ad0-5fac-4b26-af6e-ed13a984e2ba-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"677d6ad0-5fac-4b26-af6e-ed13a984e2ba\") " pod="openstack/mysqld-exporter-0" Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.734999 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx27d\" (UniqueName: \"kubernetes.io/projected/677d6ad0-5fac-4b26-af6e-ed13a984e2ba-kube-api-access-kx27d\") pod \"mysqld-exporter-0\" (UID: \"677d6ad0-5fac-4b26-af6e-ed13a984e2ba\") " pod="openstack/mysqld-exporter-0" Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.735162 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/677d6ad0-5fac-4b26-af6e-ed13a984e2ba-config-data\") pod \"mysqld-exporter-0\" (UID: \"677d6ad0-5fac-4b26-af6e-ed13a984e2ba\") " pod="openstack/mysqld-exporter-0" Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.836785 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx27d\" (UniqueName: \"kubernetes.io/projected/677d6ad0-5fac-4b26-af6e-ed13a984e2ba-kube-api-access-kx27d\") pod \"mysqld-exporter-0\" (UID: \"677d6ad0-5fac-4b26-af6e-ed13a984e2ba\") " pod="openstack/mysqld-exporter-0" Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.836886 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/677d6ad0-5fac-4b26-af6e-ed13a984e2ba-config-data\") pod \"mysqld-exporter-0\" (UID: \"677d6ad0-5fac-4b26-af6e-ed13a984e2ba\") " pod="openstack/mysqld-exporter-0" Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.836957 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/677d6ad0-5fac-4b26-af6e-ed13a984e2ba-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"677d6ad0-5fac-4b26-af6e-ed13a984e2ba\") " pod="openstack/mysqld-exporter-0" Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.840786 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/677d6ad0-5fac-4b26-af6e-ed13a984e2ba-config-data\") pod \"mysqld-exporter-0\" (UID: \"677d6ad0-5fac-4b26-af6e-ed13a984e2ba\") " pod="openstack/mysqld-exporter-0" Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.840869 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/677d6ad0-5fac-4b26-af6e-ed13a984e2ba-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"677d6ad0-5fac-4b26-af6e-ed13a984e2ba\") " pod="openstack/mysqld-exporter-0" Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.854546 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx27d\" (UniqueName: \"kubernetes.io/projected/677d6ad0-5fac-4b26-af6e-ed13a984e2ba-kube-api-access-kx27d\") pod \"mysqld-exporter-0\" (UID: \"677d6ad0-5fac-4b26-af6e-ed13a984e2ba\") " pod="openstack/mysqld-exporter-0" Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.985961 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-j4wd9" event={"ID":"53929f33-eb5f-41e4-8845-d6be1087df58","Type":"ContainerDied","Data":"2628ed2dc417bd890efcb287f21344c95d30caa1d94923e2d93f1611478cfa0f"} Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.986019 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2628ed2dc417bd890efcb287f21344c95d30caa1d94923e2d93f1611478cfa0f" Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.985987 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-j4wd9" Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.988374 4835 generic.go:334] "Generic (PLEG): container finished" podID="f634084c-f2f8-41c1-8a66-4c1755ba1075" containerID="df9838883e700d81230f5a239c45931c4f3e81bf7b9efe1db7e3363ab93184eb" exitCode=0 Mar 19 09:46:35 crc kubenswrapper[4835]: I0319 09:46:35.988412 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nfn4c-config-579mr" event={"ID":"f634084c-f2f8-41c1-8a66-4c1755ba1075","Type":"ContainerDied","Data":"df9838883e700d81230f5a239c45931c4f3e81bf7b9efe1db7e3363ab93184eb"} Mar 19 09:46:36 crc kubenswrapper[4835]: I0319 09:46:36.002650 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 19 09:46:36 crc kubenswrapper[4835]: I0319 09:46:36.165548 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-lf6bb"] Mar 19 09:46:36 crc kubenswrapper[4835]: I0319 09:46:36.168131 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-lf6bb" Mar 19 09:46:36 crc kubenswrapper[4835]: I0319 09:46:36.170452 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 19 09:46:36 crc kubenswrapper[4835]: I0319 09:46:36.188010 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-lf6bb"] Mar 19 09:46:36 crc kubenswrapper[4835]: I0319 09:46:36.347457 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rffm\" (UniqueName: \"kubernetes.io/projected/822505cc-9aba-45f0-8e65-3e4393db08f4-kube-api-access-2rffm\") pod \"root-account-create-update-lf6bb\" (UID: \"822505cc-9aba-45f0-8e65-3e4393db08f4\") " pod="openstack/root-account-create-update-lf6bb" Mar 19 09:46:36 crc kubenswrapper[4835]: I0319 09:46:36.347523 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/822505cc-9aba-45f0-8e65-3e4393db08f4-operator-scripts\") pod \"root-account-create-update-lf6bb\" (UID: \"822505cc-9aba-45f0-8e65-3e4393db08f4\") " pod="openstack/root-account-create-update-lf6bb" Mar 19 09:46:36 crc kubenswrapper[4835]: I0319 09:46:36.449521 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rffm\" (UniqueName: \"kubernetes.io/projected/822505cc-9aba-45f0-8e65-3e4393db08f4-kube-api-access-2rffm\") pod \"root-account-create-update-lf6bb\" (UID: \"822505cc-9aba-45f0-8e65-3e4393db08f4\") " pod="openstack/root-account-create-update-lf6bb" Mar 19 09:46:36 crc kubenswrapper[4835]: I0319 09:46:36.449588 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/822505cc-9aba-45f0-8e65-3e4393db08f4-operator-scripts\") pod \"root-account-create-update-lf6bb\" (UID: 
\"822505cc-9aba-45f0-8e65-3e4393db08f4\") " pod="openstack/root-account-create-update-lf6bb" Mar 19 09:46:36 crc kubenswrapper[4835]: I0319 09:46:36.450404 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/822505cc-9aba-45f0-8e65-3e4393db08f4-operator-scripts\") pod \"root-account-create-update-lf6bb\" (UID: \"822505cc-9aba-45f0-8e65-3e4393db08f4\") " pod="openstack/root-account-create-update-lf6bb" Mar 19 09:46:36 crc kubenswrapper[4835]: I0319 09:46:36.484814 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rffm\" (UniqueName: \"kubernetes.io/projected/822505cc-9aba-45f0-8e65-3e4393db08f4-kube-api-access-2rffm\") pod \"root-account-create-update-lf6bb\" (UID: \"822505cc-9aba-45f0-8e65-3e4393db08f4\") " pod="openstack/root-account-create-update-lf6bb" Mar 19 09:46:36 crc kubenswrapper[4835]: I0319 09:46:36.491196 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-lf6bb" Mar 19 09:46:36 crc kubenswrapper[4835]: I0319 09:46:36.772241 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:36 crc kubenswrapper[4835]: I0319 09:46:36.780883 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:37 crc kubenswrapper[4835]: I0319 09:46:37.002561 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:38 crc kubenswrapper[4835]: I0319 09:46:38.261299 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-nfn4c" Mar 19 09:46:39 crc kubenswrapper[4835]: I0319 09:46:39.711618 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:46:40 crc kubenswrapper[4835]: I0319 09:46:40.037779 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="66c76655-cf6d-45e6-904c-147e07a28639" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.134:5671: connect: connection refused" Mar 19 09:46:40 crc kubenswrapper[4835]: I0319 09:46:40.121250 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-1" podUID="2209a56f-9c2a-45bd-b045-176197bf3bd1" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.135:5671: connect: connection refused" Mar 19 09:46:40 crc kubenswrapper[4835]: I0319 09:46:40.345548 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 19 09:46:40 crc kubenswrapper[4835]: I0319 09:46:40.345826 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0" 
containerName="prometheus" containerID="cri-o://41800ec9c5c6594a3915cc9e48bc8c428444659c43066b4cec9fa243b542a988" gracePeriod=600 Mar 19 09:46:40 crc kubenswrapper[4835]: I0319 09:46:40.345885 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0" containerName="thanos-sidecar" containerID="cri-o://96620919ddf56cff4cea8ce323102f07abcc81e112ae104a504bd558742c5111" gracePeriod=600 Mar 19 09:46:40 crc kubenswrapper[4835]: I0319 09:46:40.345961 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0" containerName="config-reloader" containerID="cri-o://d4fee2bd8b33cc930d583cbcb1d4d9543ca77ffd9b5b0008f9aa9e42fd846715" gracePeriod=600 Mar 19 09:46:40 crc kubenswrapper[4835]: I0319 09:46:40.429808 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.136:5671: connect: connection refused" Mar 19 09:46:41 crc kubenswrapper[4835]: I0319 09:46:41.044565 4835 generic.go:334] "Generic (PLEG): container finished" podID="19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0" containerID="96620919ddf56cff4cea8ce323102f07abcc81e112ae104a504bd558742c5111" exitCode=0 Mar 19 09:46:41 crc kubenswrapper[4835]: I0319 09:46:41.044915 4835 generic.go:334] "Generic (PLEG): container finished" podID="19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0" containerID="d4fee2bd8b33cc930d583cbcb1d4d9543ca77ffd9b5b0008f9aa9e42fd846715" exitCode=0 Mar 19 09:46:41 crc kubenswrapper[4835]: I0319 09:46:41.044926 4835 generic.go:334] "Generic (PLEG): container finished" podID="19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0" containerID="41800ec9c5c6594a3915cc9e48bc8c428444659c43066b4cec9fa243b542a988" exitCode=0 Mar 19 09:46:41 crc kubenswrapper[4835]: I0319 09:46:41.044624 
4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0","Type":"ContainerDied","Data":"96620919ddf56cff4cea8ce323102f07abcc81e112ae104a504bd558742c5111"} Mar 19 09:46:41 crc kubenswrapper[4835]: I0319 09:46:41.044961 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0","Type":"ContainerDied","Data":"d4fee2bd8b33cc930d583cbcb1d4d9543ca77ffd9b5b0008f9aa9e42fd846715"} Mar 19 09:46:41 crc kubenswrapper[4835]: I0319 09:46:41.044974 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0","Type":"ContainerDied","Data":"41800ec9c5c6594a3915cc9e48bc8c428444659c43066b4cec9fa243b542a988"} Mar 19 09:46:41 crc kubenswrapper[4835]: I0319 09:46:41.773322 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.143:9090/-/ready\": dial tcp 10.217.0.143:9090: connect: connection refused" Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.261108 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nfn4c-config-579mr" Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.333245 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f634084c-f2f8-41c1-8a66-4c1755ba1075-var-run-ovn\") pod \"f634084c-f2f8-41c1-8a66-4c1755ba1075\" (UID: \"f634084c-f2f8-41c1-8a66-4c1755ba1075\") " Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.333281 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f634084c-f2f8-41c1-8a66-4c1755ba1075-var-log-ovn\") pod \"f634084c-f2f8-41c1-8a66-4c1755ba1075\" (UID: \"f634084c-f2f8-41c1-8a66-4c1755ba1075\") " Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.333307 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f634084c-f2f8-41c1-8a66-4c1755ba1075-scripts\") pod \"f634084c-f2f8-41c1-8a66-4c1755ba1075\" (UID: \"f634084c-f2f8-41c1-8a66-4c1755ba1075\") " Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.333378 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f634084c-f2f8-41c1-8a66-4c1755ba1075-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "f634084c-f2f8-41c1-8a66-4c1755ba1075" (UID: "f634084c-f2f8-41c1-8a66-4c1755ba1075"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.333415 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f634084c-f2f8-41c1-8a66-4c1755ba1075-additional-scripts\") pod \"f634084c-f2f8-41c1-8a66-4c1755ba1075\" (UID: \"f634084c-f2f8-41c1-8a66-4c1755ba1075\") " Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.333430 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f634084c-f2f8-41c1-8a66-4c1755ba1075-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "f634084c-f2f8-41c1-8a66-4c1755ba1075" (UID: "f634084c-f2f8-41c1-8a66-4c1755ba1075"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.333458 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6df7w\" (UniqueName: \"kubernetes.io/projected/f634084c-f2f8-41c1-8a66-4c1755ba1075-kube-api-access-6df7w\") pod \"f634084c-f2f8-41c1-8a66-4c1755ba1075\" (UID: \"f634084c-f2f8-41c1-8a66-4c1755ba1075\") " Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.333509 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f634084c-f2f8-41c1-8a66-4c1755ba1075-var-run\") pod \"f634084c-f2f8-41c1-8a66-4c1755ba1075\" (UID: \"f634084c-f2f8-41c1-8a66-4c1755ba1075\") " Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.333788 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f634084c-f2f8-41c1-8a66-4c1755ba1075-var-run" (OuterVolumeSpecName: "var-run") pod "f634084c-f2f8-41c1-8a66-4c1755ba1075" (UID: "f634084c-f2f8-41c1-8a66-4c1755ba1075"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.334180 4835 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f634084c-f2f8-41c1-8a66-4c1755ba1075-var-run\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.334198 4835 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f634084c-f2f8-41c1-8a66-4c1755ba1075-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.334209 4835 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f634084c-f2f8-41c1-8a66-4c1755ba1075-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.334330 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f634084c-f2f8-41c1-8a66-4c1755ba1075-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "f634084c-f2f8-41c1-8a66-4c1755ba1075" (UID: "f634084c-f2f8-41c1-8a66-4c1755ba1075"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.334546 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f634084c-f2f8-41c1-8a66-4c1755ba1075-scripts" (OuterVolumeSpecName: "scripts") pod "f634084c-f2f8-41c1-8a66-4c1755ba1075" (UID: "f634084c-f2f8-41c1-8a66-4c1755ba1075"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.340128 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f634084c-f2f8-41c1-8a66-4c1755ba1075-kube-api-access-6df7w" (OuterVolumeSpecName: "kube-api-access-6df7w") pod "f634084c-f2f8-41c1-8a66-4c1755ba1075" (UID: "f634084c-f2f8-41c1-8a66-4c1755ba1075"). InnerVolumeSpecName "kube-api-access-6df7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.435616 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6df7w\" (UniqueName: \"kubernetes.io/projected/f634084c-f2f8-41c1-8a66-4c1755ba1075-kube-api-access-6df7w\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.436009 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f634084c-f2f8-41c1-8a66-4c1755ba1075-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.436027 4835 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f634084c-f2f8-41c1-8a66-4c1755ba1075-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.678458 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-lf6bb"] Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.718484 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.813816 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.826630 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.845508 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-prometheus-metric-storage-rulefiles-1\") pod \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.845594 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-config\") pod \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.845642 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-web-config\") pod \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.845876 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-37f6ee03-0306-4d3a-b23c-de70ece0cce7\") pod \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.845934 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-prometheus-metric-storage-rulefiles-0\") pod \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.845968 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-thanos-prometheus-http-client-file\") pod \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.846037 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfb67\" (UniqueName: \"kubernetes.io/projected/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-kube-api-access-bfb67\") pod \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.846174 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-config-out\") pod \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.846208 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-prometheus-metric-storage-rulefiles-2\") pod \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.846254 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-tls-assets\") pod \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\" (UID: \"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0\") " Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.846502 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0" (UID: "19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.847345 4835 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.847616 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0" (UID: "19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.847953 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0" (UID: "19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.851090 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-kube-api-access-bfb67" (OuterVolumeSpecName: "kube-api-access-bfb67") pod "19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0" (UID: "19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0"). InnerVolumeSpecName "kube-api-access-bfb67". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.851183 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0" (UID: "19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.852143 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-config-out" (OuterVolumeSpecName: "config-out") pod "19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0" (UID: "19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.852146 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0" (UID: "19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.852686 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-config" (OuterVolumeSpecName: "config") pod "19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0" (UID: "19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.876341 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-37f6ee03-0306-4d3a-b23c-de70ece0cce7" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0" (UID: "19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0"). InnerVolumeSpecName "pvc-37f6ee03-0306-4d3a-b23c-de70ece0cce7". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.885049 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-web-config" (OuterVolumeSpecName: "web-config") pod "19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0" (UID: "19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.950000 4835 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.950050 4835 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-config-out\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.950060 4835 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-tls-assets\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.950070 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.950077 4835 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-web-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.950111 4835 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-37f6ee03-0306-4d3a-b23c-de70ece0cce7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-37f6ee03-0306-4d3a-b23c-de70ece0cce7\") on node \"crc\" " Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.950125 4835 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-prometheus-metric-storage-rulefiles-0\") on 
node \"crc\" DevicePath \"\"" Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.950140 4835 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.950153 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfb67\" (UniqueName: \"kubernetes.io/projected/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0-kube-api-access-bfb67\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.975033 4835 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 19 09:46:44 crc kubenswrapper[4835]: I0319 09:46:44.975193 4835 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-37f6ee03-0306-4d3a-b23c-de70ece0cce7" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-37f6ee03-0306-4d3a-b23c-de70ece0cce7") on node "crc" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.052867 4835 reconciler_common.go:293] "Volume detached for volume \"pvc-37f6ee03-0306-4d3a-b23c-de70ece0cce7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-37f6ee03-0306-4d3a-b23c-de70ece0cce7\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.082117 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-lf6bb" event={"ID":"822505cc-9aba-45f0-8e65-3e4393db08f4","Type":"ContainerStarted","Data":"d117320094efac4f58b6a5d0b21f96a1b84c3f74a86b9b4c57c0d37e02780570"} Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.083254 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" 
event={"ID":"677d6ad0-5fac-4b26-af6e-ed13a984e2ba","Type":"ContainerStarted","Data":"9677cb795b2569c7616deb8b63a9d052c8ae72bdb2735d98eb0ec48e07ed8bb7"} Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.089721 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0","Type":"ContainerDied","Data":"6a5bb8a32416027a0fa3c73cc9c7415984aa47da26e67bfeb906f3c87fd7d5aa"} Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.089965 4835 scope.go:117] "RemoveContainer" containerID="96620919ddf56cff4cea8ce323102f07abcc81e112ae104a504bd558742c5111" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.090245 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.094472 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nfn4c-config-579mr" event={"ID":"f634084c-f2f8-41c1-8a66-4c1755ba1075","Type":"ContainerDied","Data":"b2f2db96a8c1d793cdefcbd5475637c79ab8e35aae1ff1f723b36b1276c85626"} Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.094515 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2f2db96a8c1d793cdefcbd5475637c79ab8e35aae1ff1f723b36b1276c85626" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.094572 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nfn4c-config-579mr" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.166130 4835 scope.go:117] "RemoveContainer" containerID="d4fee2bd8b33cc930d583cbcb1d4d9543ca77ffd9b5b0008f9aa9e42fd846715" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.177979 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.192629 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.195575 4835 scope.go:117] "RemoveContainer" containerID="41800ec9c5c6594a3915cc9e48bc8c428444659c43066b4cec9fa243b542a988" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.204544 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 19 09:46:45 crc kubenswrapper[4835]: E0319 09:46:45.205028 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f634084c-f2f8-41c1-8a66-4c1755ba1075" containerName="ovn-config" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.205051 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f634084c-f2f8-41c1-8a66-4c1755ba1075" containerName="ovn-config" Mar 19 09:46:45 crc kubenswrapper[4835]: E0319 09:46:45.205068 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0" containerName="prometheus" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.205075 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0" containerName="prometheus" Mar 19 09:46:45 crc kubenswrapper[4835]: E0319 09:46:45.205090 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0" containerName="init-config-reloader" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.205098 4835 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0" containerName="init-config-reloader" Mar 19 09:46:45 crc kubenswrapper[4835]: E0319 09:46:45.205121 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0" containerName="thanos-sidecar" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.205126 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0" containerName="thanos-sidecar" Mar 19 09:46:45 crc kubenswrapper[4835]: E0319 09:46:45.205134 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0" containerName="config-reloader" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.205140 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0" containerName="config-reloader" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.205359 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0" containerName="config-reloader" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.205381 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0" containerName="thanos-sidecar" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.205396 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="f634084c-f2f8-41c1-8a66-4c1755ba1075" containerName="ovn-config" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.205403 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0" containerName="prometheus" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.208082 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.214185 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.214377 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.214507 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.214600 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.214907 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-gn4lt" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.215123 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.215429 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.215968 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.220474 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.232734 4835 scope.go:117] "RemoveContainer" containerID="b02a6b2b42a5b4af63cb1707e7c8f6d5af9a34c6c15e26b41e4e7dbf39d129d5" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.240478 4835 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.358880 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/5614bb9c-3907-4c29-b148-fbca6c6642ad-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.358939 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5614bb9c-3907-4c29-b148-fbca6c6642ad-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.359002 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wn4r\" (UniqueName: \"kubernetes.io/projected/5614bb9c-3907-4c29-b148-fbca6c6642ad-kube-api-access-6wn4r\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.359047 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/5614bb9c-3907-4c29-b148-fbca6c6642ad-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.359366 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5614bb9c-3907-4c29-b148-fbca6c6642ad-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.359521 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-37f6ee03-0306-4d3a-b23c-de70ece0cce7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-37f6ee03-0306-4d3a-b23c-de70ece0cce7\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.359608 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5614bb9c-3907-4c29-b148-fbca6c6642ad-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.359665 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5614bb9c-3907-4c29-b148-fbca6c6642ad-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.359819 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5614bb9c-3907-4c29-b148-fbca6c6642ad-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " 
pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.359941 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/5614bb9c-3907-4c29-b148-fbca6c6642ad-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.360007 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5614bb9c-3907-4c29-b148-fbca6c6642ad-config\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.360091 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5614bb9c-3907-4c29-b148-fbca6c6642ad-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.360175 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/5614bb9c-3907-4c29-b148-fbca6c6642ad-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.368054 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-nfn4c-config-579mr"] Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 
09:46:45.381058 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-nfn4c-config-579mr"] Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.440029 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-nfn4c-config-tr55k"] Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.441448 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nfn4c-config-tr55k" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.444944 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.451283 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nfn4c-config-tr55k"] Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.461614 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/5614bb9c-3907-4c29-b148-fbca6c6642ad-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.462007 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5614bb9c-3907-4c29-b148-fbca6c6642ad-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.462265 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wn4r\" (UniqueName: \"kubernetes.io/projected/5614bb9c-3907-4c29-b148-fbca6c6642ad-kube-api-access-6wn4r\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " 
pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.462381 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/5614bb9c-3907-4c29-b148-fbca6c6642ad-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.462600 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5614bb9c-3907-4c29-b148-fbca6c6642ad-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.462784 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-37f6ee03-0306-4d3a-b23c-de70ece0cce7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-37f6ee03-0306-4d3a-b23c-de70ece0cce7\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.462935 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5614bb9c-3907-4c29-b148-fbca6c6642ad-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.463049 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/5614bb9c-3907-4c29-b148-fbca6c6642ad-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.463183 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5614bb9c-3907-4c29-b148-fbca6c6642ad-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.463326 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/5614bb9c-3907-4c29-b148-fbca6c6642ad-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.463421 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5614bb9c-3907-4c29-b148-fbca6c6642ad-config\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.463513 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5614bb9c-3907-4c29-b148-fbca6c6642ad-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.463629 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/5614bb9c-3907-4c29-b148-fbca6c6642ad-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.467224 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/5614bb9c-3907-4c29-b148-fbca6c6642ad-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.469207 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/5614bb9c-3907-4c29-b148-fbca6c6642ad-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.470664 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5614bb9c-3907-4c29-b148-fbca6c6642ad-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.470688 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/5614bb9c-3907-4c29-b148-fbca6c6642ad-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: 
\"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.470688 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5614bb9c-3907-4c29-b148-fbca6c6642ad-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.472161 4835 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.472323 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-37f6ee03-0306-4d3a-b23c-de70ece0cce7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-37f6ee03-0306-4d3a-b23c-de70ece0cce7\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ef4b4363ab97a32ad5fd78dd1a4d38afe71a7d4a1238e2cd5a3c5110718b90db/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.473361 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5614bb9c-3907-4c29-b148-fbca6c6642ad-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.474165 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/5614bb9c-3907-4c29-b148-fbca6c6642ad-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.495823 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5614bb9c-3907-4c29-b148-fbca6c6642ad-config\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.502110 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5614bb9c-3907-4c29-b148-fbca6c6642ad-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.502311 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5614bb9c-3907-4c29-b148-fbca6c6642ad-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.502438 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5614bb9c-3907-4c29-b148-fbca6c6642ad-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.507530 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wn4r\" (UniqueName: 
\"kubernetes.io/projected/5614bb9c-3907-4c29-b148-fbca6c6642ad-kube-api-access-6wn4r\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.545146 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-37f6ee03-0306-4d3a-b23c-de70ece0cce7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-37f6ee03-0306-4d3a-b23c-de70ece0cce7\") pod \"prometheus-metric-storage-0\" (UID: \"5614bb9c-3907-4c29-b148-fbca6c6642ad\") " pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.569080 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d35502bf-4eec-4081-a1ac-98f79914dfe3-var-log-ovn\") pod \"ovn-controller-nfn4c-config-tr55k\" (UID: \"d35502bf-4eec-4081-a1ac-98f79914dfe3\") " pod="openstack/ovn-controller-nfn4c-config-tr55k" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.569175 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d35502bf-4eec-4081-a1ac-98f79914dfe3-var-run-ovn\") pod \"ovn-controller-nfn4c-config-tr55k\" (UID: \"d35502bf-4eec-4081-a1ac-98f79914dfe3\") " pod="openstack/ovn-controller-nfn4c-config-tr55k" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.569200 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d35502bf-4eec-4081-a1ac-98f79914dfe3-scripts\") pod \"ovn-controller-nfn4c-config-tr55k\" (UID: \"d35502bf-4eec-4081-a1ac-98f79914dfe3\") " pod="openstack/ovn-controller-nfn4c-config-tr55k" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.569229 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-474vl\" (UniqueName: \"kubernetes.io/projected/d35502bf-4eec-4081-a1ac-98f79914dfe3-kube-api-access-474vl\") pod \"ovn-controller-nfn4c-config-tr55k\" (UID: \"d35502bf-4eec-4081-a1ac-98f79914dfe3\") " pod="openstack/ovn-controller-nfn4c-config-tr55k" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.569303 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d35502bf-4eec-4081-a1ac-98f79914dfe3-additional-scripts\") pod \"ovn-controller-nfn4c-config-tr55k\" (UID: \"d35502bf-4eec-4081-a1ac-98f79914dfe3\") " pod="openstack/ovn-controller-nfn4c-config-tr55k" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.569325 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d35502bf-4eec-4081-a1ac-98f79914dfe3-var-run\") pod \"ovn-controller-nfn4c-config-tr55k\" (UID: \"d35502bf-4eec-4081-a1ac-98f79914dfe3\") " pod="openstack/ovn-controller-nfn4c-config-tr55k" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.670531 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d35502bf-4eec-4081-a1ac-98f79914dfe3-var-log-ovn\") pod \"ovn-controller-nfn4c-config-tr55k\" (UID: \"d35502bf-4eec-4081-a1ac-98f79914dfe3\") " pod="openstack/ovn-controller-nfn4c-config-tr55k" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.670627 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d35502bf-4eec-4081-a1ac-98f79914dfe3-var-run-ovn\") pod \"ovn-controller-nfn4c-config-tr55k\" (UID: \"d35502bf-4eec-4081-a1ac-98f79914dfe3\") " pod="openstack/ovn-controller-nfn4c-config-tr55k" Mar 19 09:46:45 crc kubenswrapper[4835]: 
I0319 09:46:45.670650 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d35502bf-4eec-4081-a1ac-98f79914dfe3-scripts\") pod \"ovn-controller-nfn4c-config-tr55k\" (UID: \"d35502bf-4eec-4081-a1ac-98f79914dfe3\") " pod="openstack/ovn-controller-nfn4c-config-tr55k" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.670690 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-474vl\" (UniqueName: \"kubernetes.io/projected/d35502bf-4eec-4081-a1ac-98f79914dfe3-kube-api-access-474vl\") pod \"ovn-controller-nfn4c-config-tr55k\" (UID: \"d35502bf-4eec-4081-a1ac-98f79914dfe3\") " pod="openstack/ovn-controller-nfn4c-config-tr55k" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.670766 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d35502bf-4eec-4081-a1ac-98f79914dfe3-additional-scripts\") pod \"ovn-controller-nfn4c-config-tr55k\" (UID: \"d35502bf-4eec-4081-a1ac-98f79914dfe3\") " pod="openstack/ovn-controller-nfn4c-config-tr55k" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.670788 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d35502bf-4eec-4081-a1ac-98f79914dfe3-var-run\") pod \"ovn-controller-nfn4c-config-tr55k\" (UID: \"d35502bf-4eec-4081-a1ac-98f79914dfe3\") " pod="openstack/ovn-controller-nfn4c-config-tr55k" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.670907 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d35502bf-4eec-4081-a1ac-98f79914dfe3-var-run\") pod \"ovn-controller-nfn4c-config-tr55k\" (UID: \"d35502bf-4eec-4081-a1ac-98f79914dfe3\") " pod="openstack/ovn-controller-nfn4c-config-tr55k" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.670925 4835 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d35502bf-4eec-4081-a1ac-98f79914dfe3-var-log-ovn\") pod \"ovn-controller-nfn4c-config-tr55k\" (UID: \"d35502bf-4eec-4081-a1ac-98f79914dfe3\") " pod="openstack/ovn-controller-nfn4c-config-tr55k" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.670955 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d35502bf-4eec-4081-a1ac-98f79914dfe3-var-run-ovn\") pod \"ovn-controller-nfn4c-config-tr55k\" (UID: \"d35502bf-4eec-4081-a1ac-98f79914dfe3\") " pod="openstack/ovn-controller-nfn4c-config-tr55k" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.672133 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d35502bf-4eec-4081-a1ac-98f79914dfe3-additional-scripts\") pod \"ovn-controller-nfn4c-config-tr55k\" (UID: \"d35502bf-4eec-4081-a1ac-98f79914dfe3\") " pod="openstack/ovn-controller-nfn4c-config-tr55k" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.672984 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d35502bf-4eec-4081-a1ac-98f79914dfe3-scripts\") pod \"ovn-controller-nfn4c-config-tr55k\" (UID: \"d35502bf-4eec-4081-a1ac-98f79914dfe3\") " pod="openstack/ovn-controller-nfn4c-config-tr55k" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.689207 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-474vl\" (UniqueName: \"kubernetes.io/projected/d35502bf-4eec-4081-a1ac-98f79914dfe3-kube-api-access-474vl\") pod \"ovn-controller-nfn4c-config-tr55k\" (UID: \"d35502bf-4eec-4081-a1ac-98f79914dfe3\") " pod="openstack/ovn-controller-nfn4c-config-tr55k" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.759914 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nfn4c-config-tr55k" Mar 19 09:46:45 crc kubenswrapper[4835]: I0319 09:46:45.849015 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 19 09:46:46 crc kubenswrapper[4835]: I0319 09:46:46.122850 4835 generic.go:334] "Generic (PLEG): container finished" podID="822505cc-9aba-45f0-8e65-3e4393db08f4" containerID="ce75e54be095e2271d27bd56cf7f1e1a9313c5f6b27fff71249621b9f0a18a88" exitCode=0 Mar 19 09:46:46 crc kubenswrapper[4835]: I0319 09:46:46.123123 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-lf6bb" event={"ID":"822505cc-9aba-45f0-8e65-3e4393db08f4","Type":"ContainerDied","Data":"ce75e54be095e2271d27bd56cf7f1e1a9313c5f6b27fff71249621b9f0a18a88"} Mar 19 09:46:46 crc kubenswrapper[4835]: I0319 09:46:46.299358 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nfn4c-config-tr55k"] Mar 19 09:46:46 crc kubenswrapper[4835]: I0319 09:46:46.425711 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0" path="/var/lib/kubelet/pods/19fa63e6-aa31-4a0c-9eab-876b5e6ec9b0/volumes" Mar 19 09:46:46 crc kubenswrapper[4835]: I0319 09:46:46.428421 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f634084c-f2f8-41c1-8a66-4c1755ba1075" path="/var/lib/kubelet/pods/f634084c-f2f8-41c1-8a66-4c1755ba1075/volumes" Mar 19 09:46:46 crc kubenswrapper[4835]: I0319 09:46:46.429888 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 19 09:46:46 crc kubenswrapper[4835]: W0319 09:46:46.511280 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5614bb9c_3907_4c29_b148_fbca6c6642ad.slice/crio-f1ec7cdc24d3d61182fe62573ca036bcc0b5b75e306af45aaa4265726cf403ea WatchSource:0}: Error 
finding container f1ec7cdc24d3d61182fe62573ca036bcc0b5b75e306af45aaa4265726cf403ea: Status 404 returned error can't find the container with id f1ec7cdc24d3d61182fe62573ca036bcc0b5b75e306af45aaa4265726cf403ea Mar 19 09:46:46 crc kubenswrapper[4835]: W0319 09:46:46.512034 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd35502bf_4eec_4081_a1ac_98f79914dfe3.slice/crio-e66604e72dc11d8f3515afd36feb3bda69cb5b14a11a53362582e7fc5aed1a8e WatchSource:0}: Error finding container e66604e72dc11d8f3515afd36feb3bda69cb5b14a11a53362582e7fc5aed1a8e: Status 404 returned error can't find the container with id e66604e72dc11d8f3515afd36feb3bda69cb5b14a11a53362582e7fc5aed1a8e Mar 19 09:46:47 crc kubenswrapper[4835]: I0319 09:46:47.138084 4835 generic.go:334] "Generic (PLEG): container finished" podID="d35502bf-4eec-4081-a1ac-98f79914dfe3" containerID="34f6c9d3b2fab0858fb310e67b666ba734c54affa7c126ef4036c0e119670a7d" exitCode=0 Mar 19 09:46:47 crc kubenswrapper[4835]: I0319 09:46:47.138144 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nfn4c-config-tr55k" event={"ID":"d35502bf-4eec-4081-a1ac-98f79914dfe3","Type":"ContainerDied","Data":"34f6c9d3b2fab0858fb310e67b666ba734c54affa7c126ef4036c0e119670a7d"} Mar 19 09:46:47 crc kubenswrapper[4835]: I0319 09:46:47.138442 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nfn4c-config-tr55k" event={"ID":"d35502bf-4eec-4081-a1ac-98f79914dfe3","Type":"ContainerStarted","Data":"e66604e72dc11d8f3515afd36feb3bda69cb5b14a11a53362582e7fc5aed1a8e"} Mar 19 09:46:47 crc kubenswrapper[4835]: I0319 09:46:47.145020 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-bp5cn" event={"ID":"441782eb-a245-4484-bf4c-de0c77ca19c2","Type":"ContainerStarted","Data":"6661211b794df72a5da51ecbae09e55717a95b7f0751f47836827a97095b4745"} Mar 19 09:46:47 crc kubenswrapper[4835]: I0319 
09:46:47.147610 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5614bb9c-3907-4c29-b148-fbca6c6642ad","Type":"ContainerStarted","Data":"f1ec7cdc24d3d61182fe62573ca036bcc0b5b75e306af45aaa4265726cf403ea"} Mar 19 09:46:47 crc kubenswrapper[4835]: I0319 09:46:47.187591 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-bp5cn" podStartSLOduration=3.136867769 podStartE2EDuration="19.187566967s" podCreationTimestamp="2026-03-19 09:46:28 +0000 UTC" firstStartedPulling="2026-03-19 09:46:29.517463227 +0000 UTC m=+1444.366061814" lastFinishedPulling="2026-03-19 09:46:45.568162425 +0000 UTC m=+1460.416761012" observedRunningTime="2026-03-19 09:46:47.177708555 +0000 UTC m=+1462.026307142" watchObservedRunningTime="2026-03-19 09:46:47.187566967 +0000 UTC m=+1462.036165554" Mar 19 09:46:47 crc kubenswrapper[4835]: I0319 09:46:47.570495 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-lf6bb" Mar 19 09:46:47 crc kubenswrapper[4835]: I0319 09:46:47.623310 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rffm\" (UniqueName: \"kubernetes.io/projected/822505cc-9aba-45f0-8e65-3e4393db08f4-kube-api-access-2rffm\") pod \"822505cc-9aba-45f0-8e65-3e4393db08f4\" (UID: \"822505cc-9aba-45f0-8e65-3e4393db08f4\") " Mar 19 09:46:47 crc kubenswrapper[4835]: I0319 09:46:47.623622 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/822505cc-9aba-45f0-8e65-3e4393db08f4-operator-scripts\") pod \"822505cc-9aba-45f0-8e65-3e4393db08f4\" (UID: \"822505cc-9aba-45f0-8e65-3e4393db08f4\") " Mar 19 09:46:47 crc kubenswrapper[4835]: I0319 09:46:47.624209 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/822505cc-9aba-45f0-8e65-3e4393db08f4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "822505cc-9aba-45f0-8e65-3e4393db08f4" (UID: "822505cc-9aba-45f0-8e65-3e4393db08f4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:47 crc kubenswrapper[4835]: I0319 09:46:47.629046 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/822505cc-9aba-45f0-8e65-3e4393db08f4-kube-api-access-2rffm" (OuterVolumeSpecName: "kube-api-access-2rffm") pod "822505cc-9aba-45f0-8e65-3e4393db08f4" (UID: "822505cc-9aba-45f0-8e65-3e4393db08f4"). InnerVolumeSpecName "kube-api-access-2rffm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:46:47 crc kubenswrapper[4835]: I0319 09:46:47.726773 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rffm\" (UniqueName: \"kubernetes.io/projected/822505cc-9aba-45f0-8e65-3e4393db08f4-kube-api-access-2rffm\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:47 crc kubenswrapper[4835]: I0319 09:46:47.726811 4835 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/822505cc-9aba-45f0-8e65-3e4393db08f4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:48 crc kubenswrapper[4835]: I0319 09:46:48.163409 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"677d6ad0-5fac-4b26-af6e-ed13a984e2ba","Type":"ContainerStarted","Data":"b09873d0b1b00b68f7ca0dc7fd991975e4195021197e9a408a71e4e1381a6bac"} Mar 19 09:46:48 crc kubenswrapper[4835]: I0319 09:46:48.165397 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-lf6bb" event={"ID":"822505cc-9aba-45f0-8e65-3e4393db08f4","Type":"ContainerDied","Data":"d117320094efac4f58b6a5d0b21f96a1b84c3f74a86b9b4c57c0d37e02780570"} Mar 19 09:46:48 crc kubenswrapper[4835]: I0319 09:46:48.165542 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d117320094efac4f58b6a5d0b21f96a1b84c3f74a86b9b4c57c0d37e02780570" Mar 19 09:46:48 crc kubenswrapper[4835]: I0319 09:46:48.165598 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-lf6bb" Mar 19 09:46:48 crc kubenswrapper[4835]: I0319 09:46:48.233230 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=10.923426341999999 podStartE2EDuration="13.233206443s" podCreationTimestamp="2026-03-19 09:46:35 +0000 UTC" firstStartedPulling="2026-03-19 09:46:44.826389518 +0000 UTC m=+1459.674988105" lastFinishedPulling="2026-03-19 09:46:47.136169619 +0000 UTC m=+1461.984768206" observedRunningTime="2026-03-19 09:46:48.188963256 +0000 UTC m=+1463.037561853" watchObservedRunningTime="2026-03-19 09:46:48.233206443 +0000 UTC m=+1463.081805030" Mar 19 09:46:48 crc kubenswrapper[4835]: I0319 09:46:48.445078 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/99792633-55f8-4a37-b7d8-ae770406c69d-etc-swift\") pod \"swift-storage-0\" (UID: \"99792633-55f8-4a37-b7d8-ae770406c69d\") " pod="openstack/swift-storage-0" Mar 19 09:46:48 crc kubenswrapper[4835]: I0319 09:46:48.455849 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/99792633-55f8-4a37-b7d8-ae770406c69d-etc-swift\") pod \"swift-storage-0\" (UID: \"99792633-55f8-4a37-b7d8-ae770406c69d\") " pod="openstack/swift-storage-0" Mar 19 09:46:48 crc kubenswrapper[4835]: I0319 09:46:48.554507 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nfn4c-config-tr55k" Mar 19 09:46:48 crc kubenswrapper[4835]: I0319 09:46:48.600512 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 19 09:46:48 crc kubenswrapper[4835]: I0319 09:46:48.648776 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d35502bf-4eec-4081-a1ac-98f79914dfe3-additional-scripts\") pod \"d35502bf-4eec-4081-a1ac-98f79914dfe3\" (UID: \"d35502bf-4eec-4081-a1ac-98f79914dfe3\") " Mar 19 09:46:48 crc kubenswrapper[4835]: I0319 09:46:48.648867 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-474vl\" (UniqueName: \"kubernetes.io/projected/d35502bf-4eec-4081-a1ac-98f79914dfe3-kube-api-access-474vl\") pod \"d35502bf-4eec-4081-a1ac-98f79914dfe3\" (UID: \"d35502bf-4eec-4081-a1ac-98f79914dfe3\") " Mar 19 09:46:48 crc kubenswrapper[4835]: I0319 09:46:48.648932 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d35502bf-4eec-4081-a1ac-98f79914dfe3-scripts\") pod \"d35502bf-4eec-4081-a1ac-98f79914dfe3\" (UID: \"d35502bf-4eec-4081-a1ac-98f79914dfe3\") " Mar 19 09:46:48 crc kubenswrapper[4835]: I0319 09:46:48.649812 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d35502bf-4eec-4081-a1ac-98f79914dfe3-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "d35502bf-4eec-4081-a1ac-98f79914dfe3" (UID: "d35502bf-4eec-4081-a1ac-98f79914dfe3"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:48 crc kubenswrapper[4835]: I0319 09:46:48.650138 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d35502bf-4eec-4081-a1ac-98f79914dfe3-scripts" (OuterVolumeSpecName: "scripts") pod "d35502bf-4eec-4081-a1ac-98f79914dfe3" (UID: "d35502bf-4eec-4081-a1ac-98f79914dfe3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:48 crc kubenswrapper[4835]: I0319 09:46:48.650219 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d35502bf-4eec-4081-a1ac-98f79914dfe3-var-run\") pod \"d35502bf-4eec-4081-a1ac-98f79914dfe3\" (UID: \"d35502bf-4eec-4081-a1ac-98f79914dfe3\") " Mar 19 09:46:48 crc kubenswrapper[4835]: I0319 09:46:48.650263 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d35502bf-4eec-4081-a1ac-98f79914dfe3-var-run-ovn\") pod \"d35502bf-4eec-4081-a1ac-98f79914dfe3\" (UID: \"d35502bf-4eec-4081-a1ac-98f79914dfe3\") " Mar 19 09:46:48 crc kubenswrapper[4835]: I0319 09:46:48.650271 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d35502bf-4eec-4081-a1ac-98f79914dfe3-var-run" (OuterVolumeSpecName: "var-run") pod "d35502bf-4eec-4081-a1ac-98f79914dfe3" (UID: "d35502bf-4eec-4081-a1ac-98f79914dfe3"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:46:48 crc kubenswrapper[4835]: I0319 09:46:48.650368 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d35502bf-4eec-4081-a1ac-98f79914dfe3-var-log-ovn\") pod \"d35502bf-4eec-4081-a1ac-98f79914dfe3\" (UID: \"d35502bf-4eec-4081-a1ac-98f79914dfe3\") " Mar 19 09:46:48 crc kubenswrapper[4835]: I0319 09:46:48.650384 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d35502bf-4eec-4081-a1ac-98f79914dfe3-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "d35502bf-4eec-4081-a1ac-98f79914dfe3" (UID: "d35502bf-4eec-4081-a1ac-98f79914dfe3"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:46:48 crc kubenswrapper[4835]: I0319 09:46:48.650406 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d35502bf-4eec-4081-a1ac-98f79914dfe3-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "d35502bf-4eec-4081-a1ac-98f79914dfe3" (UID: "d35502bf-4eec-4081-a1ac-98f79914dfe3"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:46:48 crc kubenswrapper[4835]: I0319 09:46:48.651125 4835 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d35502bf-4eec-4081-a1ac-98f79914dfe3-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:48 crc kubenswrapper[4835]: I0319 09:46:48.651150 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d35502bf-4eec-4081-a1ac-98f79914dfe3-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:48 crc kubenswrapper[4835]: I0319 09:46:48.651162 4835 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d35502bf-4eec-4081-a1ac-98f79914dfe3-var-run\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:48 crc kubenswrapper[4835]: I0319 09:46:48.651176 4835 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d35502bf-4eec-4081-a1ac-98f79914dfe3-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:48 crc kubenswrapper[4835]: I0319 09:46:48.651194 4835 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d35502bf-4eec-4081-a1ac-98f79914dfe3-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:48 crc kubenswrapper[4835]: I0319 09:46:48.655135 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/d35502bf-4eec-4081-a1ac-98f79914dfe3-kube-api-access-474vl" (OuterVolumeSpecName: "kube-api-access-474vl") pod "d35502bf-4eec-4081-a1ac-98f79914dfe3" (UID: "d35502bf-4eec-4081-a1ac-98f79914dfe3"). InnerVolumeSpecName "kube-api-access-474vl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:46:48 crc kubenswrapper[4835]: I0319 09:46:48.752938 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-474vl\" (UniqueName: \"kubernetes.io/projected/d35502bf-4eec-4081-a1ac-98f79914dfe3-kube-api-access-474vl\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:49 crc kubenswrapper[4835]: I0319 09:46:49.180729 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nfn4c-config-tr55k" event={"ID":"d35502bf-4eec-4081-a1ac-98f79914dfe3","Type":"ContainerDied","Data":"e66604e72dc11d8f3515afd36feb3bda69cb5b14a11a53362582e7fc5aed1a8e"} Mar 19 09:46:49 crc kubenswrapper[4835]: I0319 09:46:49.180918 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e66604e72dc11d8f3515afd36feb3bda69cb5b14a11a53362582e7fc5aed1a8e" Mar 19 09:46:49 crc kubenswrapper[4835]: I0319 09:46:49.180765 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nfn4c-config-tr55k" Mar 19 09:46:49 crc kubenswrapper[4835]: I0319 09:46:49.231159 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 19 09:46:49 crc kubenswrapper[4835]: I0319 09:46:49.636562 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-nfn4c-config-tr55k"] Mar 19 09:46:49 crc kubenswrapper[4835]: I0319 09:46:49.645252 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-nfn4c-config-tr55k"] Mar 19 09:46:49 crc kubenswrapper[4835]: I0319 09:46:49.692565 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-nfn4c-config-5kxtg"] Mar 19 09:46:49 crc kubenswrapper[4835]: E0319 09:46:49.693128 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="822505cc-9aba-45f0-8e65-3e4393db08f4" containerName="mariadb-account-create-update" Mar 19 09:46:49 crc kubenswrapper[4835]: I0319 09:46:49.693157 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="822505cc-9aba-45f0-8e65-3e4393db08f4" containerName="mariadb-account-create-update" Mar 19 09:46:49 crc kubenswrapper[4835]: E0319 09:46:49.695162 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d35502bf-4eec-4081-a1ac-98f79914dfe3" containerName="ovn-config" Mar 19 09:46:49 crc kubenswrapper[4835]: I0319 09:46:49.695188 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="d35502bf-4eec-4081-a1ac-98f79914dfe3" containerName="ovn-config" Mar 19 09:46:49 crc kubenswrapper[4835]: I0319 09:46:49.695530 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="d35502bf-4eec-4081-a1ac-98f79914dfe3" containerName="ovn-config" Mar 19 09:46:49 crc kubenswrapper[4835]: I0319 09:46:49.695556 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="822505cc-9aba-45f0-8e65-3e4393db08f4" containerName="mariadb-account-create-update" Mar 19 09:46:49 crc kubenswrapper[4835]: 
I0319 09:46:49.701868 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nfn4c-config-5kxtg" Mar 19 09:46:49 crc kubenswrapper[4835]: I0319 09:46:49.705318 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 19 09:46:49 crc kubenswrapper[4835]: I0319 09:46:49.711727 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nfn4c-config-5kxtg"] Mar 19 09:46:49 crc kubenswrapper[4835]: I0319 09:46:49.784362 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/63ef7708-cff0-4879-a502-a060e28042b0-var-run-ovn\") pod \"ovn-controller-nfn4c-config-5kxtg\" (UID: \"63ef7708-cff0-4879-a502-a060e28042b0\") " pod="openstack/ovn-controller-nfn4c-config-5kxtg" Mar 19 09:46:49 crc kubenswrapper[4835]: I0319 09:46:49.784489 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/63ef7708-cff0-4879-a502-a060e28042b0-additional-scripts\") pod \"ovn-controller-nfn4c-config-5kxtg\" (UID: \"63ef7708-cff0-4879-a502-a060e28042b0\") " pod="openstack/ovn-controller-nfn4c-config-5kxtg" Mar 19 09:46:49 crc kubenswrapper[4835]: I0319 09:46:49.784806 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/63ef7708-cff0-4879-a502-a060e28042b0-var-log-ovn\") pod \"ovn-controller-nfn4c-config-5kxtg\" (UID: \"63ef7708-cff0-4879-a502-a060e28042b0\") " pod="openstack/ovn-controller-nfn4c-config-5kxtg" Mar 19 09:46:49 crc kubenswrapper[4835]: I0319 09:46:49.786030 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/63ef7708-cff0-4879-a502-a060e28042b0-var-run\") pod \"ovn-controller-nfn4c-config-5kxtg\" (UID: \"63ef7708-cff0-4879-a502-a060e28042b0\") " pod="openstack/ovn-controller-nfn4c-config-5kxtg" Mar 19 09:46:49 crc kubenswrapper[4835]: I0319 09:46:49.786102 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63ef7708-cff0-4879-a502-a060e28042b0-scripts\") pod \"ovn-controller-nfn4c-config-5kxtg\" (UID: \"63ef7708-cff0-4879-a502-a060e28042b0\") " pod="openstack/ovn-controller-nfn4c-config-5kxtg" Mar 19 09:46:49 crc kubenswrapper[4835]: I0319 09:46:49.786364 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69dnc\" (UniqueName: \"kubernetes.io/projected/63ef7708-cff0-4879-a502-a060e28042b0-kube-api-access-69dnc\") pod \"ovn-controller-nfn4c-config-5kxtg\" (UID: \"63ef7708-cff0-4879-a502-a060e28042b0\") " pod="openstack/ovn-controller-nfn4c-config-5kxtg" Mar 19 09:46:49 crc kubenswrapper[4835]: I0319 09:46:49.888018 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/63ef7708-cff0-4879-a502-a060e28042b0-additional-scripts\") pod \"ovn-controller-nfn4c-config-5kxtg\" (UID: \"63ef7708-cff0-4879-a502-a060e28042b0\") " pod="openstack/ovn-controller-nfn4c-config-5kxtg" Mar 19 09:46:49 crc kubenswrapper[4835]: I0319 09:46:49.888115 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/63ef7708-cff0-4879-a502-a060e28042b0-var-log-ovn\") pod \"ovn-controller-nfn4c-config-5kxtg\" (UID: \"63ef7708-cff0-4879-a502-a060e28042b0\") " pod="openstack/ovn-controller-nfn4c-config-5kxtg" Mar 19 09:46:49 crc kubenswrapper[4835]: I0319 09:46:49.888279 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-run\" (UniqueName: \"kubernetes.io/host-path/63ef7708-cff0-4879-a502-a060e28042b0-var-run\") pod \"ovn-controller-nfn4c-config-5kxtg\" (UID: \"63ef7708-cff0-4879-a502-a060e28042b0\") " pod="openstack/ovn-controller-nfn4c-config-5kxtg" Mar 19 09:46:49 crc kubenswrapper[4835]: I0319 09:46:49.888312 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63ef7708-cff0-4879-a502-a060e28042b0-scripts\") pod \"ovn-controller-nfn4c-config-5kxtg\" (UID: \"63ef7708-cff0-4879-a502-a060e28042b0\") " pod="openstack/ovn-controller-nfn4c-config-5kxtg" Mar 19 09:46:49 crc kubenswrapper[4835]: I0319 09:46:49.888398 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/63ef7708-cff0-4879-a502-a060e28042b0-var-log-ovn\") pod \"ovn-controller-nfn4c-config-5kxtg\" (UID: \"63ef7708-cff0-4879-a502-a060e28042b0\") " pod="openstack/ovn-controller-nfn4c-config-5kxtg" Mar 19 09:46:49 crc kubenswrapper[4835]: I0319 09:46:49.888410 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69dnc\" (UniqueName: \"kubernetes.io/projected/63ef7708-cff0-4879-a502-a060e28042b0-kube-api-access-69dnc\") pod \"ovn-controller-nfn4c-config-5kxtg\" (UID: \"63ef7708-cff0-4879-a502-a060e28042b0\") " pod="openstack/ovn-controller-nfn4c-config-5kxtg" Mar 19 09:46:49 crc kubenswrapper[4835]: I0319 09:46:49.888483 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/63ef7708-cff0-4879-a502-a060e28042b0-var-run\") pod \"ovn-controller-nfn4c-config-5kxtg\" (UID: \"63ef7708-cff0-4879-a502-a060e28042b0\") " pod="openstack/ovn-controller-nfn4c-config-5kxtg" Mar 19 09:46:49 crc kubenswrapper[4835]: I0319 09:46:49.888550 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/63ef7708-cff0-4879-a502-a060e28042b0-var-run-ovn\") pod \"ovn-controller-nfn4c-config-5kxtg\" (UID: \"63ef7708-cff0-4879-a502-a060e28042b0\") " pod="openstack/ovn-controller-nfn4c-config-5kxtg" Mar 19 09:46:49 crc kubenswrapper[4835]: I0319 09:46:49.888796 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/63ef7708-cff0-4879-a502-a060e28042b0-var-run-ovn\") pod \"ovn-controller-nfn4c-config-5kxtg\" (UID: \"63ef7708-cff0-4879-a502-a060e28042b0\") " pod="openstack/ovn-controller-nfn4c-config-5kxtg" Mar 19 09:46:49 crc kubenswrapper[4835]: I0319 09:46:49.888946 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/63ef7708-cff0-4879-a502-a060e28042b0-additional-scripts\") pod \"ovn-controller-nfn4c-config-5kxtg\" (UID: \"63ef7708-cff0-4879-a502-a060e28042b0\") " pod="openstack/ovn-controller-nfn4c-config-5kxtg" Mar 19 09:46:49 crc kubenswrapper[4835]: I0319 09:46:49.890917 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63ef7708-cff0-4879-a502-a060e28042b0-scripts\") pod \"ovn-controller-nfn4c-config-5kxtg\" (UID: \"63ef7708-cff0-4879-a502-a060e28042b0\") " pod="openstack/ovn-controller-nfn4c-config-5kxtg" Mar 19 09:46:49 crc kubenswrapper[4835]: I0319 09:46:49.913474 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69dnc\" (UniqueName: \"kubernetes.io/projected/63ef7708-cff0-4879-a502-a060e28042b0-kube-api-access-69dnc\") pod \"ovn-controller-nfn4c-config-5kxtg\" (UID: \"63ef7708-cff0-4879-a502-a060e28042b0\") " pod="openstack/ovn-controller-nfn4c-config-5kxtg" Mar 19 09:46:50 crc kubenswrapper[4835]: I0319 09:46:50.029641 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nfn4c-config-5kxtg" Mar 19 09:46:50 crc kubenswrapper[4835]: I0319 09:46:50.037964 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 19 09:46:50 crc kubenswrapper[4835]: I0319 09:46:50.120881 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-1" Mar 19 09:46:50 crc kubenswrapper[4835]: I0319 09:46:50.195941 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"99792633-55f8-4a37-b7d8-ae770406c69d","Type":"ContainerStarted","Data":"91745c40f22d86141df672c90a56583113550a98af0e5eae15ba37158e3d2e2d"} Mar 19 09:46:50 crc kubenswrapper[4835]: I0319 09:46:50.203038 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5614bb9c-3907-4c29-b148-fbca6c6642ad","Type":"ContainerStarted","Data":"6b7b668243420d44cad56f56d5c7018869002b4a0d38d2824542b484567ebdde"} Mar 19 09:46:50 crc kubenswrapper[4835]: I0319 09:46:50.428696 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.136:5671: connect: connection refused" Mar 19 09:46:50 crc kubenswrapper[4835]: I0319 09:46:50.438527 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d35502bf-4eec-4081-a1ac-98f79914dfe3" path="/var/lib/kubelet/pods/d35502bf-4eec-4081-a1ac-98f79914dfe3/volumes" Mar 19 09:46:50 crc kubenswrapper[4835]: I0319 09:46:50.867290 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-nfn4c-config-5kxtg"] Mar 19 09:46:50 crc kubenswrapper[4835]: W0319 09:46:50.877092 4835 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63ef7708_cff0_4879_a502_a060e28042b0.slice/crio-432f7cf69d0bc08bd699a78fc8f8f0266365a969f7f9bde83523197d57df9a5d WatchSource:0}: Error finding container 432f7cf69d0bc08bd699a78fc8f8f0266365a969f7f9bde83523197d57df9a5d: Status 404 returned error can't find the container with id 432f7cf69d0bc08bd699a78fc8f8f0266365a969f7f9bde83523197d57df9a5d Mar 19 09:46:51 crc kubenswrapper[4835]: I0319 09:46:51.214646 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nfn4c-config-5kxtg" event={"ID":"63ef7708-cff0-4879-a502-a060e28042b0","Type":"ContainerStarted","Data":"432f7cf69d0bc08bd699a78fc8f8f0266365a969f7f9bde83523197d57df9a5d"} Mar 19 09:46:51 crc kubenswrapper[4835]: I0319 09:46:51.218315 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"99792633-55f8-4a37-b7d8-ae770406c69d","Type":"ContainerStarted","Data":"d2f83e0dd8d64f56de2d6949237385e9d9d23f722a5df0e4e8d1d15d08685607"} Mar 19 09:46:51 crc kubenswrapper[4835]: I0319 09:46:51.218363 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"99792633-55f8-4a37-b7d8-ae770406c69d","Type":"ContainerStarted","Data":"bf4226a9337122cbbbf037c01e30e4e6d354d6d099893fdfdee8cb96285a22f6"} Mar 19 09:46:51 crc kubenswrapper[4835]: I0319 09:46:51.218381 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"99792633-55f8-4a37-b7d8-ae770406c69d","Type":"ContainerStarted","Data":"68e1de87a3ca5b66c44974864d8070c7b4da518322f0fe65a7713e452f4d4287"} Mar 19 09:46:52 crc kubenswrapper[4835]: I0319 09:46:52.235205 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"99792633-55f8-4a37-b7d8-ae770406c69d","Type":"ContainerStarted","Data":"51d28fa38eae033493aaa1b16e75d31dcbf1f9a7222c503d87b41951f2842e13"} Mar 19 09:46:52 crc kubenswrapper[4835]: I0319 
09:46:52.237030 4835 generic.go:334] "Generic (PLEG): container finished" podID="63ef7708-cff0-4879-a502-a060e28042b0" containerID="56bc0f4cbc81b8de82bdb7e1483474926f06bf1097472c7e50752965da149097" exitCode=0 Mar 19 09:46:52 crc kubenswrapper[4835]: I0319 09:46:52.237080 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nfn4c-config-5kxtg" event={"ID":"63ef7708-cff0-4879-a502-a060e28042b0","Type":"ContainerDied","Data":"56bc0f4cbc81b8de82bdb7e1483474926f06bf1097472c7e50752965da149097"} Mar 19 09:46:53 crc kubenswrapper[4835]: I0319 09:46:53.250601 4835 generic.go:334] "Generic (PLEG): container finished" podID="441782eb-a245-4484-bf4c-de0c77ca19c2" containerID="6661211b794df72a5da51ecbae09e55717a95b7f0751f47836827a97095b4745" exitCode=0 Mar 19 09:46:53 crc kubenswrapper[4835]: I0319 09:46:53.250694 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-bp5cn" event={"ID":"441782eb-a245-4484-bf4c-de0c77ca19c2","Type":"ContainerDied","Data":"6661211b794df72a5da51ecbae09e55717a95b7f0751f47836827a97095b4745"} Mar 19 09:46:53 crc kubenswrapper[4835]: I0319 09:46:53.259642 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"99792633-55f8-4a37-b7d8-ae770406c69d","Type":"ContainerStarted","Data":"8ba40755228c179cf4fe54c79738fcf93189eaa45ffc13f86992da8dda834a0d"} Mar 19 09:46:53 crc kubenswrapper[4835]: I0319 09:46:53.259697 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"99792633-55f8-4a37-b7d8-ae770406c69d","Type":"ContainerStarted","Data":"64428278aff3ee1419a4eded5186cffcbbb45904346323b464642c922ed20ded"} Mar 19 09:46:53 crc kubenswrapper[4835]: I0319 09:46:53.259715 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"99792633-55f8-4a37-b7d8-ae770406c69d","Type":"ContainerStarted","Data":"f4ff3ae919b8290866be271d8b3571ea4d3ff3ef429bee8d2e7a38869657eb4b"} Mar 19 
09:46:53 crc kubenswrapper[4835]: I0319 09:46:53.259732 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"99792633-55f8-4a37-b7d8-ae770406c69d","Type":"ContainerStarted","Data":"eb7ff5b5f1fb09a4d10235cb3217da975a95bba8000d95fb0a97d0269be7de29"} Mar 19 09:46:53 crc kubenswrapper[4835]: I0319 09:46:53.659731 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-nfn4c-config-5kxtg" Mar 19 09:46:53 crc kubenswrapper[4835]: I0319 09:46:53.778941 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63ef7708-cff0-4879-a502-a060e28042b0-scripts\") pod \"63ef7708-cff0-4879-a502-a060e28042b0\" (UID: \"63ef7708-cff0-4879-a502-a060e28042b0\") " Mar 19 09:46:53 crc kubenswrapper[4835]: I0319 09:46:53.779080 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/63ef7708-cff0-4879-a502-a060e28042b0-var-log-ovn\") pod \"63ef7708-cff0-4879-a502-a060e28042b0\" (UID: \"63ef7708-cff0-4879-a502-a060e28042b0\") " Mar 19 09:46:53 crc kubenswrapper[4835]: I0319 09:46:53.779148 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/63ef7708-cff0-4879-a502-a060e28042b0-var-run-ovn\") pod \"63ef7708-cff0-4879-a502-a060e28042b0\" (UID: \"63ef7708-cff0-4879-a502-a060e28042b0\") " Mar 19 09:46:53 crc kubenswrapper[4835]: I0319 09:46:53.779205 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/63ef7708-cff0-4879-a502-a060e28042b0-var-run\") pod \"63ef7708-cff0-4879-a502-a060e28042b0\" (UID: \"63ef7708-cff0-4879-a502-a060e28042b0\") " Mar 19 09:46:53 crc kubenswrapper[4835]: I0319 09:46:53.779203 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/host-path/63ef7708-cff0-4879-a502-a060e28042b0-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "63ef7708-cff0-4879-a502-a060e28042b0" (UID: "63ef7708-cff0-4879-a502-a060e28042b0"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:46:53 crc kubenswrapper[4835]: I0319 09:46:53.779230 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69dnc\" (UniqueName: \"kubernetes.io/projected/63ef7708-cff0-4879-a502-a060e28042b0-kube-api-access-69dnc\") pod \"63ef7708-cff0-4879-a502-a060e28042b0\" (UID: \"63ef7708-cff0-4879-a502-a060e28042b0\") " Mar 19 09:46:53 crc kubenswrapper[4835]: I0319 09:46:53.779255 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/63ef7708-cff0-4879-a502-a060e28042b0-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "63ef7708-cff0-4879-a502-a060e28042b0" (UID: "63ef7708-cff0-4879-a502-a060e28042b0"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:46:53 crc kubenswrapper[4835]: I0319 09:46:53.779303 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/63ef7708-cff0-4879-a502-a060e28042b0-additional-scripts\") pod \"63ef7708-cff0-4879-a502-a060e28042b0\" (UID: \"63ef7708-cff0-4879-a502-a060e28042b0\") " Mar 19 09:46:53 crc kubenswrapper[4835]: I0319 09:46:53.779330 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/63ef7708-cff0-4879-a502-a060e28042b0-var-run" (OuterVolumeSpecName: "var-run") pod "63ef7708-cff0-4879-a502-a060e28042b0" (UID: "63ef7708-cff0-4879-a502-a060e28042b0"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:46:53 crc kubenswrapper[4835]: I0319 09:46:53.779897 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63ef7708-cff0-4879-a502-a060e28042b0-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "63ef7708-cff0-4879-a502-a060e28042b0" (UID: "63ef7708-cff0-4879-a502-a060e28042b0"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:53 crc kubenswrapper[4835]: I0319 09:46:53.779994 4835 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/63ef7708-cff0-4879-a502-a060e28042b0-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:53 crc kubenswrapper[4835]: I0319 09:46:53.780013 4835 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/63ef7708-cff0-4879-a502-a060e28042b0-var-run\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:53 crc kubenswrapper[4835]: I0319 09:46:53.780024 4835 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/63ef7708-cff0-4879-a502-a060e28042b0-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:53 crc kubenswrapper[4835]: I0319 09:46:53.780129 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63ef7708-cff0-4879-a502-a060e28042b0-scripts" (OuterVolumeSpecName: "scripts") pod "63ef7708-cff0-4879-a502-a060e28042b0" (UID: "63ef7708-cff0-4879-a502-a060e28042b0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:53 crc kubenswrapper[4835]: I0319 09:46:53.786387 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63ef7708-cff0-4879-a502-a060e28042b0-kube-api-access-69dnc" (OuterVolumeSpecName: "kube-api-access-69dnc") pod "63ef7708-cff0-4879-a502-a060e28042b0" (UID: "63ef7708-cff0-4879-a502-a060e28042b0"). InnerVolumeSpecName "kube-api-access-69dnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:46:53 crc kubenswrapper[4835]: I0319 09:46:53.882068 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69dnc\" (UniqueName: \"kubernetes.io/projected/63ef7708-cff0-4879-a502-a060e28042b0-kube-api-access-69dnc\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:53 crc kubenswrapper[4835]: I0319 09:46:53.882103 4835 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/63ef7708-cff0-4879-a502-a060e28042b0-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:53 crc kubenswrapper[4835]: I0319 09:46:53.882115 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63ef7708-cff0-4879-a502-a060e28042b0-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:54 crc kubenswrapper[4835]: I0319 09:46:54.271438 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-nfn4c-config-5kxtg" event={"ID":"63ef7708-cff0-4879-a502-a060e28042b0","Type":"ContainerDied","Data":"432f7cf69d0bc08bd699a78fc8f8f0266365a969f7f9bde83523197d57df9a5d"} Mar 19 09:46:54 crc kubenswrapper[4835]: I0319 09:46:54.271789 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="432f7cf69d0bc08bd699a78fc8f8f0266365a969f7f9bde83523197d57df9a5d" Mar 19 09:46:54 crc kubenswrapper[4835]: I0319 09:46:54.271455 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-nfn4c-config-5kxtg" Mar 19 09:46:54 crc kubenswrapper[4835]: E0319 09:46:54.487324 4835 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5614bb9c_3907_4c29_b148_fbca6c6642ad.slice/crio-6b7b668243420d44cad56f56d5c7018869002b4a0d38d2824542b484567ebdde.scope\": RecentStats: unable to find data in memory cache]" Mar 19 09:46:54 crc kubenswrapper[4835]: I0319 09:46:54.744148 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-bp5cn" Mar 19 09:46:54 crc kubenswrapper[4835]: I0319 09:46:54.746425 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-nfn4c-config-5kxtg"] Mar 19 09:46:54 crc kubenswrapper[4835]: I0319 09:46:54.758917 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-nfn4c-config-5kxtg"] Mar 19 09:46:54 crc kubenswrapper[4835]: I0319 09:46:54.810604 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/441782eb-a245-4484-bf4c-de0c77ca19c2-db-sync-config-data\") pod \"441782eb-a245-4484-bf4c-de0c77ca19c2\" (UID: \"441782eb-a245-4484-bf4c-de0c77ca19c2\") " Mar 19 09:46:54 crc kubenswrapper[4835]: I0319 09:46:54.810700 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/441782eb-a245-4484-bf4c-de0c77ca19c2-combined-ca-bundle\") pod \"441782eb-a245-4484-bf4c-de0c77ca19c2\" (UID: \"441782eb-a245-4484-bf4c-de0c77ca19c2\") " Mar 19 09:46:54 crc kubenswrapper[4835]: I0319 09:46:54.810731 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/441782eb-a245-4484-bf4c-de0c77ca19c2-config-data\") pod 
\"441782eb-a245-4484-bf4c-de0c77ca19c2\" (UID: \"441782eb-a245-4484-bf4c-de0c77ca19c2\") " Mar 19 09:46:54 crc kubenswrapper[4835]: I0319 09:46:54.810809 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cn6bs\" (UniqueName: \"kubernetes.io/projected/441782eb-a245-4484-bf4c-de0c77ca19c2-kube-api-access-cn6bs\") pod \"441782eb-a245-4484-bf4c-de0c77ca19c2\" (UID: \"441782eb-a245-4484-bf4c-de0c77ca19c2\") " Mar 19 09:46:54 crc kubenswrapper[4835]: I0319 09:46:54.818943 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/441782eb-a245-4484-bf4c-de0c77ca19c2-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "441782eb-a245-4484-bf4c-de0c77ca19c2" (UID: "441782eb-a245-4484-bf4c-de0c77ca19c2"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:46:54 crc kubenswrapper[4835]: I0319 09:46:54.830028 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/441782eb-a245-4484-bf4c-de0c77ca19c2-kube-api-access-cn6bs" (OuterVolumeSpecName: "kube-api-access-cn6bs") pod "441782eb-a245-4484-bf4c-de0c77ca19c2" (UID: "441782eb-a245-4484-bf4c-de0c77ca19c2"). InnerVolumeSpecName "kube-api-access-cn6bs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:46:54 crc kubenswrapper[4835]: I0319 09:46:54.861014 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/441782eb-a245-4484-bf4c-de0c77ca19c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "441782eb-a245-4484-bf4c-de0c77ca19c2" (UID: "441782eb-a245-4484-bf4c-de0c77ca19c2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:46:54 crc kubenswrapper[4835]: I0319 09:46:54.895074 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/441782eb-a245-4484-bf4c-de0c77ca19c2-config-data" (OuterVolumeSpecName: "config-data") pod "441782eb-a245-4484-bf4c-de0c77ca19c2" (UID: "441782eb-a245-4484-bf4c-de0c77ca19c2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:46:54 crc kubenswrapper[4835]: I0319 09:46:54.914148 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/441782eb-a245-4484-bf4c-de0c77ca19c2-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:54 crc kubenswrapper[4835]: I0319 09:46:54.914180 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cn6bs\" (UniqueName: \"kubernetes.io/projected/441782eb-a245-4484-bf4c-de0c77ca19c2-kube-api-access-cn6bs\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:54 crc kubenswrapper[4835]: I0319 09:46:54.914190 4835 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/441782eb-a245-4484-bf4c-de0c77ca19c2-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:54 crc kubenswrapper[4835]: I0319 09:46:54.914200 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/441782eb-a245-4484-bf4c-de0c77ca19c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:55 crc kubenswrapper[4835]: I0319 09:46:55.284697 4835 generic.go:334] "Generic (PLEG): container finished" podID="5614bb9c-3907-4c29-b148-fbca6c6642ad" containerID="6b7b668243420d44cad56f56d5c7018869002b4a0d38d2824542b484567ebdde" exitCode=0 Mar 19 09:46:55 crc kubenswrapper[4835]: I0319 09:46:55.285029 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"5614bb9c-3907-4c29-b148-fbca6c6642ad","Type":"ContainerDied","Data":"6b7b668243420d44cad56f56d5c7018869002b4a0d38d2824542b484567ebdde"} Mar 19 09:46:55 crc kubenswrapper[4835]: I0319 09:46:55.312061 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"99792633-55f8-4a37-b7d8-ae770406c69d","Type":"ContainerStarted","Data":"b4a6ce59c24f91f8e736702a7b91eeed79ebfca732165125d534c5ad4e69e7be"} Mar 19 09:46:55 crc kubenswrapper[4835]: I0319 09:46:55.312132 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"99792633-55f8-4a37-b7d8-ae770406c69d","Type":"ContainerStarted","Data":"c600d52e87f8c7dc3d4ab625e274162dd2975b547c17509192e56eb3a98813b8"} Mar 19 09:46:55 crc kubenswrapper[4835]: I0319 09:46:55.312153 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"99792633-55f8-4a37-b7d8-ae770406c69d","Type":"ContainerStarted","Data":"5566997bdc9840f6a19ad9cfd1fb4326a9d0b4043cc11eb5f12d07525c45008d"} Mar 19 09:46:55 crc kubenswrapper[4835]: I0319 09:46:55.312166 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"99792633-55f8-4a37-b7d8-ae770406c69d","Type":"ContainerStarted","Data":"7ecddcf3372103efe9f87dea33c31995e0278ff4aa6caa8ee3c68494de2e05c7"} Mar 19 09:46:55 crc kubenswrapper[4835]: I0319 09:46:55.314173 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-bp5cn" event={"ID":"441782eb-a245-4484-bf4c-de0c77ca19c2","Type":"ContainerDied","Data":"302d387ff7de73e079277ed6175bc697f1f7cbd29075bbf53c608fa8bdfddb96"} Mar 19 09:46:55 crc kubenswrapper[4835]: I0319 09:46:55.314210 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="302d387ff7de73e079277ed6175bc697f1f7cbd29075bbf53c608fa8bdfddb96" Mar 19 09:46:55 crc kubenswrapper[4835]: I0319 09:46:55.314265 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-bp5cn" Mar 19 09:46:55 crc kubenswrapper[4835]: I0319 09:46:55.839290 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-wcnzw"] Mar 19 09:46:55 crc kubenswrapper[4835]: E0319 09:46:55.840049 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63ef7708-cff0-4879-a502-a060e28042b0" containerName="ovn-config" Mar 19 09:46:55 crc kubenswrapper[4835]: I0319 09:46:55.840066 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="63ef7708-cff0-4879-a502-a060e28042b0" containerName="ovn-config" Mar 19 09:46:55 crc kubenswrapper[4835]: E0319 09:46:55.840100 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="441782eb-a245-4484-bf4c-de0c77ca19c2" containerName="glance-db-sync" Mar 19 09:46:55 crc kubenswrapper[4835]: I0319 09:46:55.840107 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="441782eb-a245-4484-bf4c-de0c77ca19c2" containerName="glance-db-sync" Mar 19 09:46:55 crc kubenswrapper[4835]: I0319 09:46:55.840505 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="63ef7708-cff0-4879-a502-a060e28042b0" containerName="ovn-config" Mar 19 09:46:55 crc kubenswrapper[4835]: I0319 09:46:55.840548 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="441782eb-a245-4484-bf4c-de0c77ca19c2" containerName="glance-db-sync" Mar 19 09:46:55 crc kubenswrapper[4835]: I0319 09:46:55.842964 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-wcnzw" Mar 19 09:46:55 crc kubenswrapper[4835]: I0319 09:46:55.912071 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-wcnzw"] Mar 19 09:46:55 crc kubenswrapper[4835]: I0319 09:46:55.937348 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0eaa61b1-a854-46c8-a126-6165e18df71a-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-wcnzw\" (UID: \"0eaa61b1-a854-46c8-a126-6165e18df71a\") " pod="openstack/dnsmasq-dns-5b946c75cc-wcnzw" Mar 19 09:46:55 crc kubenswrapper[4835]: I0319 09:46:55.937401 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0eaa61b1-a854-46c8-a126-6165e18df71a-config\") pod \"dnsmasq-dns-5b946c75cc-wcnzw\" (UID: \"0eaa61b1-a854-46c8-a126-6165e18df71a\") " pod="openstack/dnsmasq-dns-5b946c75cc-wcnzw" Mar 19 09:46:55 crc kubenswrapper[4835]: I0319 09:46:55.937423 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0eaa61b1-a854-46c8-a126-6165e18df71a-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-wcnzw\" (UID: \"0eaa61b1-a854-46c8-a126-6165e18df71a\") " pod="openstack/dnsmasq-dns-5b946c75cc-wcnzw" Mar 19 09:46:55 crc kubenswrapper[4835]: I0319 09:46:55.937874 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8dw6\" (UniqueName: \"kubernetes.io/projected/0eaa61b1-a854-46c8-a126-6165e18df71a-kube-api-access-t8dw6\") pod \"dnsmasq-dns-5b946c75cc-wcnzw\" (UID: \"0eaa61b1-a854-46c8-a126-6165e18df71a\") " pod="openstack/dnsmasq-dns-5b946c75cc-wcnzw" Mar 19 09:46:55 crc kubenswrapper[4835]: I0319 09:46:55.937981 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0eaa61b1-a854-46c8-a126-6165e18df71a-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-wcnzw\" (UID: \"0eaa61b1-a854-46c8-a126-6165e18df71a\") " pod="openstack/dnsmasq-dns-5b946c75cc-wcnzw" Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.039637 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0eaa61b1-a854-46c8-a126-6165e18df71a-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-wcnzw\" (UID: \"0eaa61b1-a854-46c8-a126-6165e18df71a\") " pod="openstack/dnsmasq-dns-5b946c75cc-wcnzw" Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.039809 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8dw6\" (UniqueName: \"kubernetes.io/projected/0eaa61b1-a854-46c8-a126-6165e18df71a-kube-api-access-t8dw6\") pod \"dnsmasq-dns-5b946c75cc-wcnzw\" (UID: \"0eaa61b1-a854-46c8-a126-6165e18df71a\") " pod="openstack/dnsmasq-dns-5b946c75cc-wcnzw" Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.039846 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0eaa61b1-a854-46c8-a126-6165e18df71a-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-wcnzw\" (UID: \"0eaa61b1-a854-46c8-a126-6165e18df71a\") " pod="openstack/dnsmasq-dns-5b946c75cc-wcnzw" Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.039887 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0eaa61b1-a854-46c8-a126-6165e18df71a-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-wcnzw\" (UID: \"0eaa61b1-a854-46c8-a126-6165e18df71a\") " pod="openstack/dnsmasq-dns-5b946c75cc-wcnzw" Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.039912 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/0eaa61b1-a854-46c8-a126-6165e18df71a-config\") pod \"dnsmasq-dns-5b946c75cc-wcnzw\" (UID: \"0eaa61b1-a854-46c8-a126-6165e18df71a\") " pod="openstack/dnsmasq-dns-5b946c75cc-wcnzw" Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.040963 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0eaa61b1-a854-46c8-a126-6165e18df71a-config\") pod \"dnsmasq-dns-5b946c75cc-wcnzw\" (UID: \"0eaa61b1-a854-46c8-a126-6165e18df71a\") " pod="openstack/dnsmasq-dns-5b946c75cc-wcnzw" Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.041120 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0eaa61b1-a854-46c8-a126-6165e18df71a-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-wcnzw\" (UID: \"0eaa61b1-a854-46c8-a126-6165e18df71a\") " pod="openstack/dnsmasq-dns-5b946c75cc-wcnzw" Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.041126 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0eaa61b1-a854-46c8-a126-6165e18df71a-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-wcnzw\" (UID: \"0eaa61b1-a854-46c8-a126-6165e18df71a\") " pod="openstack/dnsmasq-dns-5b946c75cc-wcnzw" Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.041562 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0eaa61b1-a854-46c8-a126-6165e18df71a-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-wcnzw\" (UID: \"0eaa61b1-a854-46c8-a126-6165e18df71a\") " pod="openstack/dnsmasq-dns-5b946c75cc-wcnzw" Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.061887 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8dw6\" (UniqueName: \"kubernetes.io/projected/0eaa61b1-a854-46c8-a126-6165e18df71a-kube-api-access-t8dw6\") pod 
\"dnsmasq-dns-5b946c75cc-wcnzw\" (UID: \"0eaa61b1-a854-46c8-a126-6165e18df71a\") " pod="openstack/dnsmasq-dns-5b946c75cc-wcnzw" Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.314263 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-wcnzw" Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.324722 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5614bb9c-3907-4c29-b148-fbca6c6642ad","Type":"ContainerStarted","Data":"b3bc97354dd751864fa557076e787e609a7e7b2c0436fc4b4767dd89f04470fc"} Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.330676 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"99792633-55f8-4a37-b7d8-ae770406c69d","Type":"ContainerStarted","Data":"acc8bd7650e04f40775cde221ee336b15327442e92e98f71f3fdde61ee38228a"} Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.330720 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"99792633-55f8-4a37-b7d8-ae770406c69d","Type":"ContainerStarted","Data":"a91357167b29d1c922e5bbdb306701342e0e75f5471564ced11f314cc610f5e6"} Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.330730 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"99792633-55f8-4a37-b7d8-ae770406c69d","Type":"ContainerStarted","Data":"d95bfc6b4735cfb190c3738547dfe4d5c0588950b73caf4c54b029b857541038"} Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.384471 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=36.478323454 podStartE2EDuration="41.384426733s" podCreationTimestamp="2026-03-19 09:46:15 +0000 UTC" firstStartedPulling="2026-03-19 09:46:49.229855666 +0000 UTC m=+1464.078454263" lastFinishedPulling="2026-03-19 09:46:54.135958965 +0000 UTC m=+1468.984557542" 
observedRunningTime="2026-03-19 09:46:56.369954918 +0000 UTC m=+1471.218553505" watchObservedRunningTime="2026-03-19 09:46:56.384426733 +0000 UTC m=+1471.233025320" Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.416391 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63ef7708-cff0-4879-a502-a060e28042b0" path="/var/lib/kubelet/pods/63ef7708-cff0-4879-a502-a060e28042b0/volumes" Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.711633 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-wcnzw"] Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.730157 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-rrsfb"] Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.732657 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-rrsfb" Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.737944 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.753241 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71a2a8d4-8f41-4f53-a725-067d509fa616-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-rrsfb\" (UID: \"71a2a8d4-8f41-4f53-a725-067d509fa616\") " pod="openstack/dnsmasq-dns-74f6bcbc87-rrsfb" Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.753281 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71a2a8d4-8f41-4f53-a725-067d509fa616-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-rrsfb\" (UID: \"71a2a8d4-8f41-4f53-a725-067d509fa616\") " pod="openstack/dnsmasq-dns-74f6bcbc87-rrsfb" Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.753327 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/71a2a8d4-8f41-4f53-a725-067d509fa616-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-rrsfb\" (UID: \"71a2a8d4-8f41-4f53-a725-067d509fa616\") " pod="openstack/dnsmasq-dns-74f6bcbc87-rrsfb" Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.753363 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71a2a8d4-8f41-4f53-a725-067d509fa616-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-rrsfb\" (UID: \"71a2a8d4-8f41-4f53-a725-067d509fa616\") " pod="openstack/dnsmasq-dns-74f6bcbc87-rrsfb" Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.753398 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71a2a8d4-8f41-4f53-a725-067d509fa616-config\") pod \"dnsmasq-dns-74f6bcbc87-rrsfb\" (UID: \"71a2a8d4-8f41-4f53-a725-067d509fa616\") " pod="openstack/dnsmasq-dns-74f6bcbc87-rrsfb" Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.753495 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wnzg\" (UniqueName: \"kubernetes.io/projected/71a2a8d4-8f41-4f53-a725-067d509fa616-kube-api-access-8wnzg\") pod \"dnsmasq-dns-74f6bcbc87-rrsfb\" (UID: \"71a2a8d4-8f41-4f53-a725-067d509fa616\") " pod="openstack/dnsmasq-dns-74f6bcbc87-rrsfb" Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.753647 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-rrsfb"] Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.855290 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wnzg\" (UniqueName: \"kubernetes.io/projected/71a2a8d4-8f41-4f53-a725-067d509fa616-kube-api-access-8wnzg\") pod 
\"dnsmasq-dns-74f6bcbc87-rrsfb\" (UID: \"71a2a8d4-8f41-4f53-a725-067d509fa616\") " pod="openstack/dnsmasq-dns-74f6bcbc87-rrsfb" Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.855354 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71a2a8d4-8f41-4f53-a725-067d509fa616-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-rrsfb\" (UID: \"71a2a8d4-8f41-4f53-a725-067d509fa616\") " pod="openstack/dnsmasq-dns-74f6bcbc87-rrsfb" Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.855372 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71a2a8d4-8f41-4f53-a725-067d509fa616-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-rrsfb\" (UID: \"71a2a8d4-8f41-4f53-a725-067d509fa616\") " pod="openstack/dnsmasq-dns-74f6bcbc87-rrsfb" Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.855411 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/71a2a8d4-8f41-4f53-a725-067d509fa616-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-rrsfb\" (UID: \"71a2a8d4-8f41-4f53-a725-067d509fa616\") " pod="openstack/dnsmasq-dns-74f6bcbc87-rrsfb" Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.855441 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71a2a8d4-8f41-4f53-a725-067d509fa616-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-rrsfb\" (UID: \"71a2a8d4-8f41-4f53-a725-067d509fa616\") " pod="openstack/dnsmasq-dns-74f6bcbc87-rrsfb" Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.855471 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71a2a8d4-8f41-4f53-a725-067d509fa616-config\") pod \"dnsmasq-dns-74f6bcbc87-rrsfb\" (UID: 
\"71a2a8d4-8f41-4f53-a725-067d509fa616\") " pod="openstack/dnsmasq-dns-74f6bcbc87-rrsfb" Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.856413 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71a2a8d4-8f41-4f53-a725-067d509fa616-config\") pod \"dnsmasq-dns-74f6bcbc87-rrsfb\" (UID: \"71a2a8d4-8f41-4f53-a725-067d509fa616\") " pod="openstack/dnsmasq-dns-74f6bcbc87-rrsfb" Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.856449 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71a2a8d4-8f41-4f53-a725-067d509fa616-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-rrsfb\" (UID: \"71a2a8d4-8f41-4f53-a725-067d509fa616\") " pod="openstack/dnsmasq-dns-74f6bcbc87-rrsfb" Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.856453 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71a2a8d4-8f41-4f53-a725-067d509fa616-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-rrsfb\" (UID: \"71a2a8d4-8f41-4f53-a725-067d509fa616\") " pod="openstack/dnsmasq-dns-74f6bcbc87-rrsfb" Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.856464 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71a2a8d4-8f41-4f53-a725-067d509fa616-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-rrsfb\" (UID: \"71a2a8d4-8f41-4f53-a725-067d509fa616\") " pod="openstack/dnsmasq-dns-74f6bcbc87-rrsfb" Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.856583 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/71a2a8d4-8f41-4f53-a725-067d509fa616-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-rrsfb\" (UID: \"71a2a8d4-8f41-4f53-a725-067d509fa616\") " pod="openstack/dnsmasq-dns-74f6bcbc87-rrsfb" Mar 19 09:46:56 crc 
kubenswrapper[4835]: I0319 09:46:56.882524 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wnzg\" (UniqueName: \"kubernetes.io/projected/71a2a8d4-8f41-4f53-a725-067d509fa616-kube-api-access-8wnzg\") pod \"dnsmasq-dns-74f6bcbc87-rrsfb\" (UID: \"71a2a8d4-8f41-4f53-a725-067d509fa616\") " pod="openstack/dnsmasq-dns-74f6bcbc87-rrsfb" Mar 19 09:46:56 crc kubenswrapper[4835]: I0319 09:46:56.888609 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-wcnzw"] Mar 19 09:46:57 crc kubenswrapper[4835]: I0319 09:46:57.059916 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-rrsfb" Mar 19 09:46:57 crc kubenswrapper[4835]: I0319 09:46:57.345486 4835 generic.go:334] "Generic (PLEG): container finished" podID="0eaa61b1-a854-46c8-a126-6165e18df71a" containerID="f58cd32370d8c3a533aae0de9cc9862edaadb8902a68446c09cd7ec862d1ee98" exitCode=0 Mar 19 09:46:57 crc kubenswrapper[4835]: I0319 09:46:57.345839 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-wcnzw" event={"ID":"0eaa61b1-a854-46c8-a126-6165e18df71a","Type":"ContainerDied","Data":"f58cd32370d8c3a533aae0de9cc9862edaadb8902a68446c09cd7ec862d1ee98"} Mar 19 09:46:57 crc kubenswrapper[4835]: I0319 09:46:57.345898 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-wcnzw" event={"ID":"0eaa61b1-a854-46c8-a126-6165e18df71a","Type":"ContainerStarted","Data":"cf9df909dc740af1b5c48499bab65e68937648dd7318d38a1922786c3c3bed2b"} Mar 19 09:46:57 crc kubenswrapper[4835]: I0319 09:46:57.588553 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-rrsfb"] Mar 19 09:46:58 crc kubenswrapper[4835]: I0319 09:46:58.100950 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-wcnzw" Mar 19 09:46:58 crc kubenswrapper[4835]: I0319 09:46:58.288407 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0eaa61b1-a854-46c8-a126-6165e18df71a-config\") pod \"0eaa61b1-a854-46c8-a126-6165e18df71a\" (UID: \"0eaa61b1-a854-46c8-a126-6165e18df71a\") " Mar 19 09:46:58 crc kubenswrapper[4835]: I0319 09:46:58.288514 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0eaa61b1-a854-46c8-a126-6165e18df71a-dns-svc\") pod \"0eaa61b1-a854-46c8-a126-6165e18df71a\" (UID: \"0eaa61b1-a854-46c8-a126-6165e18df71a\") " Mar 19 09:46:58 crc kubenswrapper[4835]: I0319 09:46:58.288543 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0eaa61b1-a854-46c8-a126-6165e18df71a-ovsdbserver-nb\") pod \"0eaa61b1-a854-46c8-a126-6165e18df71a\" (UID: \"0eaa61b1-a854-46c8-a126-6165e18df71a\") " Mar 19 09:46:58 crc kubenswrapper[4835]: I0319 09:46:58.288811 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0eaa61b1-a854-46c8-a126-6165e18df71a-ovsdbserver-sb\") pod \"0eaa61b1-a854-46c8-a126-6165e18df71a\" (UID: \"0eaa61b1-a854-46c8-a126-6165e18df71a\") " Mar 19 09:46:58 crc kubenswrapper[4835]: I0319 09:46:58.288843 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8dw6\" (UniqueName: \"kubernetes.io/projected/0eaa61b1-a854-46c8-a126-6165e18df71a-kube-api-access-t8dw6\") pod \"0eaa61b1-a854-46c8-a126-6165e18df71a\" (UID: \"0eaa61b1-a854-46c8-a126-6165e18df71a\") " Mar 19 09:46:58 crc kubenswrapper[4835]: I0319 09:46:58.394848 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/0eaa61b1-a854-46c8-a126-6165e18df71a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0eaa61b1-a854-46c8-a126-6165e18df71a" (UID: "0eaa61b1-a854-46c8-a126-6165e18df71a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:58 crc kubenswrapper[4835]: I0319 09:46:58.394914 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eaa61b1-a854-46c8-a126-6165e18df71a-kube-api-access-t8dw6" (OuterVolumeSpecName: "kube-api-access-t8dw6") pod "0eaa61b1-a854-46c8-a126-6165e18df71a" (UID: "0eaa61b1-a854-46c8-a126-6165e18df71a"). InnerVolumeSpecName "kube-api-access-t8dw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:46:58 crc kubenswrapper[4835]: I0319 09:46:58.395064 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0eaa61b1-a854-46c8-a126-6165e18df71a-config" (OuterVolumeSpecName: "config") pod "0eaa61b1-a854-46c8-a126-6165e18df71a" (UID: "0eaa61b1-a854-46c8-a126-6165e18df71a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:58 crc kubenswrapper[4835]: I0319 09:46:58.395095 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0eaa61b1-a854-46c8-a126-6165e18df71a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0eaa61b1-a854-46c8-a126-6165e18df71a" (UID: "0eaa61b1-a854-46c8-a126-6165e18df71a"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:58 crc kubenswrapper[4835]: I0319 09:46:58.396631 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0eaa61b1-a854-46c8-a126-6165e18df71a-ovsdbserver-sb\") pod \"0eaa61b1-a854-46c8-a126-6165e18df71a\" (UID: \"0eaa61b1-a854-46c8-a126-6165e18df71a\") " Mar 19 09:46:58 crc kubenswrapper[4835]: W0319 09:46:58.397878 4835 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/0eaa61b1-a854-46c8-a126-6165e18df71a/volumes/kubernetes.io~configmap/ovsdbserver-sb Mar 19 09:46:58 crc kubenswrapper[4835]: I0319 09:46:58.397908 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0eaa61b1-a854-46c8-a126-6165e18df71a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0eaa61b1-a854-46c8-a126-6165e18df71a" (UID: "0eaa61b1-a854-46c8-a126-6165e18df71a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:58 crc kubenswrapper[4835]: I0319 09:46:58.435418 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0eaa61b1-a854-46c8-a126-6165e18df71a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0eaa61b1-a854-46c8-a126-6165e18df71a" (UID: "0eaa61b1-a854-46c8-a126-6165e18df71a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:46:58 crc kubenswrapper[4835]: I0319 09:46:58.436845 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0eaa61b1-a854-46c8-a126-6165e18df71a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:58 crc kubenswrapper[4835]: I0319 09:46:58.437431 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0eaa61b1-a854-46c8-a126-6165e18df71a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:58 crc kubenswrapper[4835]: I0319 09:46:58.437553 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8dw6\" (UniqueName: \"kubernetes.io/projected/0eaa61b1-a854-46c8-a126-6165e18df71a-kube-api-access-t8dw6\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:58 crc kubenswrapper[4835]: I0319 09:46:58.437636 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0eaa61b1-a854-46c8-a126-6165e18df71a-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:58 crc kubenswrapper[4835]: I0319 09:46:58.460456 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-wcnzw" event={"ID":"0eaa61b1-a854-46c8-a126-6165e18df71a","Type":"ContainerDied","Data":"cf9df909dc740af1b5c48499bab65e68937648dd7318d38a1922786c3c3bed2b"} Mar 19 09:46:58 crc kubenswrapper[4835]: I0319 09:46:58.460525 4835 scope.go:117] "RemoveContainer" containerID="f58cd32370d8c3a533aae0de9cc9862edaadb8902a68446c09cd7ec862d1ee98" Mar 19 09:46:58 crc kubenswrapper[4835]: I0319 09:46:58.460694 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-wcnzw" Mar 19 09:46:58 crc kubenswrapper[4835]: I0319 09:46:58.475245 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-rrsfb" event={"ID":"71a2a8d4-8f41-4f53-a725-067d509fa616","Type":"ContainerStarted","Data":"d6cd8743ad520470e7f4f858eaaca7b3ca8abdf34b505b7aebf28cf03899d084"} Mar 19 09:46:58 crc kubenswrapper[4835]: I0319 09:46:58.475377 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-rrsfb" event={"ID":"71a2a8d4-8f41-4f53-a725-067d509fa616","Type":"ContainerStarted","Data":"8c3b4800b98a369b3a87a686b5c1694721c209e5128babafd807a662f5e39d22"} Mar 19 09:46:58 crc kubenswrapper[4835]: I0319 09:46:58.542755 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0eaa61b1-a854-46c8-a126-6165e18df71a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 09:46:58 crc kubenswrapper[4835]: I0319 09:46:58.583797 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-wcnzw"] Mar 19 09:46:58 crc kubenswrapper[4835]: I0319 09:46:58.589511 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-wcnzw"] Mar 19 09:46:59 crc kubenswrapper[4835]: I0319 09:46:59.486945 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5614bb9c-3907-4c29-b148-fbca6c6642ad","Type":"ContainerStarted","Data":"37482d480d2ad8091eb70bdd4cf88e9d107bd3210cd8c777c9653ff6a120a665"} Mar 19 09:46:59 crc kubenswrapper[4835]: I0319 09:46:59.486983 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5614bb9c-3907-4c29-b148-fbca6c6642ad","Type":"ContainerStarted","Data":"8ce2996c795bbe4240d870eed02c89ccc54cf272d03491456334b52c1f327b32"} Mar 19 09:46:59 crc kubenswrapper[4835]: I0319 09:46:59.488406 4835 generic.go:334] 
"Generic (PLEG): container finished" podID="71a2a8d4-8f41-4f53-a725-067d509fa616" containerID="d6cd8743ad520470e7f4f858eaaca7b3ca8abdf34b505b7aebf28cf03899d084" exitCode=0 Mar 19 09:46:59 crc kubenswrapper[4835]: I0319 09:46:59.488493 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-rrsfb" event={"ID":"71a2a8d4-8f41-4f53-a725-067d509fa616","Type":"ContainerDied","Data":"d6cd8743ad520470e7f4f858eaaca7b3ca8abdf34b505b7aebf28cf03899d084"} Mar 19 09:46:59 crc kubenswrapper[4835]: I0319 09:46:59.525484 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=14.525438132 podStartE2EDuration="14.525438132s" podCreationTimestamp="2026-03-19 09:46:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:46:59.520233123 +0000 UTC m=+1474.368831720" watchObservedRunningTime="2026-03-19 09:46:59.525438132 +0000 UTC m=+1474.374036729" Mar 19 09:47:00 crc kubenswrapper[4835]: I0319 09:47:00.418855 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0eaa61b1-a854-46c8-a126-6165e18df71a" path="/var/lib/kubelet/pods/0eaa61b1-a854-46c8-a126-6165e18df71a/volumes" Mar 19 09:47:00 crc kubenswrapper[4835]: I0319 09:47:00.428931 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-2" Mar 19 09:47:00 crc kubenswrapper[4835]: I0319 09:47:00.500820 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-rrsfb" event={"ID":"71a2a8d4-8f41-4f53-a725-067d509fa616","Type":"ContainerStarted","Data":"6725e552866d61a67df83d48c088c8fc8fd24968468253d22c27509d1f40f7bf"} Mar 19 09:47:00 crc kubenswrapper[4835]: I0319 09:47:00.501183 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-rrsfb" Mar 19 09:47:00 crc 
kubenswrapper[4835]: I0319 09:47:00.826254 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-rrsfb" podStartSLOduration=4.826236622 podStartE2EDuration="4.826236622s" podCreationTimestamp="2026-03-19 09:46:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:47:00.548349083 +0000 UTC m=+1475.396947680" watchObservedRunningTime="2026-03-19 09:47:00.826236622 +0000 UTC m=+1475.674835209" Mar 19 09:47:00 crc kubenswrapper[4835]: I0319 09:47:00.832707 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-2hr8p"] Mar 19 09:47:00 crc kubenswrapper[4835]: E0319 09:47:00.833199 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eaa61b1-a854-46c8-a126-6165e18df71a" containerName="init" Mar 19 09:47:00 crc kubenswrapper[4835]: I0319 09:47:00.833241 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eaa61b1-a854-46c8-a126-6165e18df71a" containerName="init" Mar 19 09:47:00 crc kubenswrapper[4835]: I0319 09:47:00.833449 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="0eaa61b1-a854-46c8-a126-6165e18df71a" containerName="init" Mar 19 09:47:00 crc kubenswrapper[4835]: I0319 09:47:00.834218 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-2hr8p" Mar 19 09:47:00 crc kubenswrapper[4835]: I0319 09:47:00.849914 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 19 09:47:00 crc kubenswrapper[4835]: I0319 09:47:00.849951 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 19 09:47:00 crc kubenswrapper[4835]: I0319 09:47:00.853321 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-2hr8p"] Mar 19 09:47:00 crc kubenswrapper[4835]: I0319 09:47:00.868443 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 19 09:47:00 crc kubenswrapper[4835]: I0319 09:47:00.963621 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-v6pb2"] Mar 19 09:47:00 crc kubenswrapper[4835]: I0319 09:47:00.965593 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-v6pb2" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.009864 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09fbcd28-9104-4ddf-9e9f-8d7029c330bf-operator-scripts\") pod \"heat-db-create-2hr8p\" (UID: \"09fbcd28-9104-4ddf-9e9f-8d7029c330bf\") " pod="openstack/heat-db-create-2hr8p" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.037075 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t46jm\" (UniqueName: \"kubernetes.io/projected/09fbcd28-9104-4ddf-9e9f-8d7029c330bf-kube-api-access-t46jm\") pod \"heat-db-create-2hr8p\" (UID: \"09fbcd28-9104-4ddf-9e9f-8d7029c330bf\") " pod="openstack/heat-db-create-2hr8p" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.052546 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-4f53-account-create-update-77jbh"] Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.055140 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-4f53-account-create-update-77jbh" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.078112 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.133198 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-v6pb2"] Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.143070 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c9a0780-2d49-40d7-83f9-2f6fcec39523-operator-scripts\") pod \"cinder-db-create-v6pb2\" (UID: \"8c9a0780-2d49-40d7-83f9-2f6fcec39523\") " pod="openstack/cinder-db-create-v6pb2" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.143127 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mnqh\" (UniqueName: \"kubernetes.io/projected/8bd1dfd6-8c2d-45cc-9bea-2b30b7e52f64-kube-api-access-8mnqh\") pod \"heat-4f53-account-create-update-77jbh\" (UID: \"8bd1dfd6-8c2d-45cc-9bea-2b30b7e52f64\") " pod="openstack/heat-4f53-account-create-update-77jbh" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.143193 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t46jm\" (UniqueName: \"kubernetes.io/projected/09fbcd28-9104-4ddf-9e9f-8d7029c330bf-kube-api-access-t46jm\") pod \"heat-db-create-2hr8p\" (UID: \"09fbcd28-9104-4ddf-9e9f-8d7029c330bf\") " pod="openstack/heat-db-create-2hr8p" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.143278 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bd1dfd6-8c2d-45cc-9bea-2b30b7e52f64-operator-scripts\") pod \"heat-4f53-account-create-update-77jbh\" (UID: \"8bd1dfd6-8c2d-45cc-9bea-2b30b7e52f64\") " 
pod="openstack/heat-4f53-account-create-update-77jbh" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.143319 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdppq\" (UniqueName: \"kubernetes.io/projected/8c9a0780-2d49-40d7-83f9-2f6fcec39523-kube-api-access-fdppq\") pod \"cinder-db-create-v6pb2\" (UID: \"8c9a0780-2d49-40d7-83f9-2f6fcec39523\") " pod="openstack/cinder-db-create-v6pb2" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.143341 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09fbcd28-9104-4ddf-9e9f-8d7029c330bf-operator-scripts\") pod \"heat-db-create-2hr8p\" (UID: \"09fbcd28-9104-4ddf-9e9f-8d7029c330bf\") " pod="openstack/heat-db-create-2hr8p" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.144562 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09fbcd28-9104-4ddf-9e9f-8d7029c330bf-operator-scripts\") pod \"heat-db-create-2hr8p\" (UID: \"09fbcd28-9104-4ddf-9e9f-8d7029c330bf\") " pod="openstack/heat-db-create-2hr8p" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.170816 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-4f53-account-create-update-77jbh"] Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.176621 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t46jm\" (UniqueName: \"kubernetes.io/projected/09fbcd28-9104-4ddf-9e9f-8d7029c330bf-kube-api-access-t46jm\") pod \"heat-db-create-2hr8p\" (UID: \"09fbcd28-9104-4ddf-9e9f-8d7029c330bf\") " pod="openstack/heat-db-create-2hr8p" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.243940 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/8bd1dfd6-8c2d-45cc-9bea-2b30b7e52f64-operator-scripts\") pod \"heat-4f53-account-create-update-77jbh\" (UID: \"8bd1dfd6-8c2d-45cc-9bea-2b30b7e52f64\") " pod="openstack/heat-4f53-account-create-update-77jbh" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.244033 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdppq\" (UniqueName: \"kubernetes.io/projected/8c9a0780-2d49-40d7-83f9-2f6fcec39523-kube-api-access-fdppq\") pod \"cinder-db-create-v6pb2\" (UID: \"8c9a0780-2d49-40d7-83f9-2f6fcec39523\") " pod="openstack/cinder-db-create-v6pb2" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.244076 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c9a0780-2d49-40d7-83f9-2f6fcec39523-operator-scripts\") pod \"cinder-db-create-v6pb2\" (UID: \"8c9a0780-2d49-40d7-83f9-2f6fcec39523\") " pod="openstack/cinder-db-create-v6pb2" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.244123 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mnqh\" (UniqueName: \"kubernetes.io/projected/8bd1dfd6-8c2d-45cc-9bea-2b30b7e52f64-kube-api-access-8mnqh\") pod \"heat-4f53-account-create-update-77jbh\" (UID: \"8bd1dfd6-8c2d-45cc-9bea-2b30b7e52f64\") " pod="openstack/heat-4f53-account-create-update-77jbh" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.245112 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c9a0780-2d49-40d7-83f9-2f6fcec39523-operator-scripts\") pod \"cinder-db-create-v6pb2\" (UID: \"8c9a0780-2d49-40d7-83f9-2f6fcec39523\") " pod="openstack/cinder-db-create-v6pb2" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.245280 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/8bd1dfd6-8c2d-45cc-9bea-2b30b7e52f64-operator-scripts\") pod \"heat-4f53-account-create-update-77jbh\" (UID: \"8bd1dfd6-8c2d-45cc-9bea-2b30b7e52f64\") " pod="openstack/heat-4f53-account-create-update-77jbh" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.264209 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mnqh\" (UniqueName: \"kubernetes.io/projected/8bd1dfd6-8c2d-45cc-9bea-2b30b7e52f64-kube-api-access-8mnqh\") pod \"heat-4f53-account-create-update-77jbh\" (UID: \"8bd1dfd6-8c2d-45cc-9bea-2b30b7e52f64\") " pod="openstack/heat-4f53-account-create-update-77jbh" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.272317 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdppq\" (UniqueName: \"kubernetes.io/projected/8c9a0780-2d49-40d7-83f9-2f6fcec39523-kube-api-access-fdppq\") pod \"cinder-db-create-v6pb2\" (UID: \"8c9a0780-2d49-40d7-83f9-2f6fcec39523\") " pod="openstack/cinder-db-create-v6pb2" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.337025 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-8a40-account-create-update-ql6kp"] Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.338514 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-8a40-account-create-update-ql6kp" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.343079 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.345356 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0af06506-57b0-4b38-b57a-701b74ab2866-operator-scripts\") pod \"cinder-8a40-account-create-update-ql6kp\" (UID: \"0af06506-57b0-4b38-b57a-701b74ab2866\") " pod="openstack/cinder-8a40-account-create-update-ql6kp" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.345457 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmr5r\" (UniqueName: \"kubernetes.io/projected/0af06506-57b0-4b38-b57a-701b74ab2866-kube-api-access-gmr5r\") pod \"cinder-8a40-account-create-update-ql6kp\" (UID: \"0af06506-57b0-4b38-b57a-701b74ab2866\") " pod="openstack/cinder-8a40-account-create-update-ql6kp" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.351934 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8a40-account-create-update-ql6kp"] Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.379854 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-v6pb2" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.433239 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-4f53-account-create-update-77jbh" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.441760 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-jggwj"] Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.443678 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-jggwj" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.446645 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmr5r\" (UniqueName: \"kubernetes.io/projected/0af06506-57b0-4b38-b57a-701b74ab2866-kube-api-access-gmr5r\") pod \"cinder-8a40-account-create-update-ql6kp\" (UID: \"0af06506-57b0-4b38-b57a-701b74ab2866\") " pod="openstack/cinder-8a40-account-create-update-ql6kp" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.446839 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0af06506-57b0-4b38-b57a-701b74ab2866-operator-scripts\") pod \"cinder-8a40-account-create-update-ql6kp\" (UID: \"0af06506-57b0-4b38-b57a-701b74ab2866\") " pod="openstack/cinder-8a40-account-create-update-ql6kp" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.447599 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0af06506-57b0-4b38-b57a-701b74ab2866-operator-scripts\") pod \"cinder-8a40-account-create-update-ql6kp\" (UID: \"0af06506-57b0-4b38-b57a-701b74ab2866\") " pod="openstack/cinder-8a40-account-create-update-ql6kp" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.452070 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-jggwj"] Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.466671 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-2hr8p" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.521015 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmr5r\" (UniqueName: \"kubernetes.io/projected/0af06506-57b0-4b38-b57a-701b74ab2866-kube-api-access-gmr5r\") pod \"cinder-8a40-account-create-update-ql6kp\" (UID: \"0af06506-57b0-4b38-b57a-701b74ab2866\") " pod="openstack/cinder-8a40-account-create-update-ql6kp" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.551152 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/378b5128-0ef4-44b3-bfdd-0956f404c583-operator-scripts\") pod \"neutron-db-create-jggwj\" (UID: \"378b5128-0ef4-44b3-bfdd-0956f404c583\") " pod="openstack/neutron-db-create-jggwj" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.551328 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24r4h\" (UniqueName: \"kubernetes.io/projected/378b5128-0ef4-44b3-bfdd-0956f404c583-kube-api-access-24r4h\") pod \"neutron-db-create-jggwj\" (UID: \"378b5128-0ef4-44b3-bfdd-0956f404c583\") " pod="openstack/neutron-db-create-jggwj" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.551999 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.555563 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-5qg94"] Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.556904 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-5qg94" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.575177 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.575495 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.575651 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.575831 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rp4k2" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.609002 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-bd2xq"] Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.611494 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bd2xq" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.635444 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-bd2xq"] Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.652672 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24r4h\" (UniqueName: \"kubernetes.io/projected/378b5128-0ef4-44b3-bfdd-0956f404c583-kube-api-access-24r4h\") pod \"neutron-db-create-jggwj\" (UID: \"378b5128-0ef4-44b3-bfdd-0956f404c583\") " pod="openstack/neutron-db-create-jggwj" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.652826 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/378b5128-0ef4-44b3-bfdd-0956f404c583-operator-scripts\") pod \"neutron-db-create-jggwj\" (UID: \"378b5128-0ef4-44b3-bfdd-0956f404c583\") " pod="openstack/neutron-db-create-jggwj" Mar 
19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.665016 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/378b5128-0ef4-44b3-bfdd-0956f404c583-operator-scripts\") pod \"neutron-db-create-jggwj\" (UID: \"378b5128-0ef4-44b3-bfdd-0956f404c583\") " pod="openstack/neutron-db-create-jggwj" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.666404 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-5qg94"] Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.685580 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-72fa-account-create-update-dj55b"] Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.687520 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8a40-account-create-update-ql6kp" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.694379 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-72fa-account-create-update-dj55b" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.696677 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.708966 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24r4h\" (UniqueName: \"kubernetes.io/projected/378b5128-0ef4-44b3-bfdd-0956f404c583-kube-api-access-24r4h\") pod \"neutron-db-create-jggwj\" (UID: \"378b5128-0ef4-44b3-bfdd-0956f404c583\") " pod="openstack/neutron-db-create-jggwj" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.750896 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-72fa-account-create-update-dj55b"] Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.768012 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhs99\" (UniqueName: \"kubernetes.io/projected/57dd95b0-04b3-4903-ac51-05ad58cefcf5-kube-api-access-dhs99\") pod \"keystone-db-sync-5qg94\" (UID: \"57dd95b0-04b3-4903-ac51-05ad58cefcf5\") " pod="openstack/keystone-db-sync-5qg94" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.768116 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d47e492d-e555-4219-92f5-ef1f36ad41c5-operator-scripts\") pod \"barbican-db-create-bd2xq\" (UID: \"d47e492d-e555-4219-92f5-ef1f36ad41c5\") " pod="openstack/barbican-db-create-bd2xq" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.768177 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57dd95b0-04b3-4903-ac51-05ad58cefcf5-combined-ca-bundle\") pod \"keystone-db-sync-5qg94\" (UID: \"57dd95b0-04b3-4903-ac51-05ad58cefcf5\") " 
pod="openstack/keystone-db-sync-5qg94" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.768449 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hprbm\" (UniqueName: \"kubernetes.io/projected/d47e492d-e555-4219-92f5-ef1f36ad41c5-kube-api-access-hprbm\") pod \"barbican-db-create-bd2xq\" (UID: \"d47e492d-e555-4219-92f5-ef1f36ad41c5\") " pod="openstack/barbican-db-create-bd2xq" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.768524 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57dd95b0-04b3-4903-ac51-05ad58cefcf5-config-data\") pod \"keystone-db-sync-5qg94\" (UID: \"57dd95b0-04b3-4903-ac51-05ad58cefcf5\") " pod="openstack/keystone-db-sync-5qg94" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.791271 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-ed21-account-create-update-5pvxc"] Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.799333 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ed21-account-create-update-5pvxc" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.814441 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.839025 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ed21-account-create-update-5pvxc"] Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.850285 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-jggwj" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.871161 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e4d6aef-f764-455c-a076-cf225b70b33c-operator-scripts\") pod \"barbican-72fa-account-create-update-dj55b\" (UID: \"3e4d6aef-f764-455c-a076-cf225b70b33c\") " pod="openstack/barbican-72fa-account-create-update-dj55b" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.871215 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dqll\" (UniqueName: \"kubernetes.io/projected/3e4d6aef-f764-455c-a076-cf225b70b33c-kube-api-access-2dqll\") pod \"barbican-72fa-account-create-update-dj55b\" (UID: \"3e4d6aef-f764-455c-a076-cf225b70b33c\") " pod="openstack/barbican-72fa-account-create-update-dj55b" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.871288 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hprbm\" (UniqueName: \"kubernetes.io/projected/d47e492d-e555-4219-92f5-ef1f36ad41c5-kube-api-access-hprbm\") pod \"barbican-db-create-bd2xq\" (UID: \"d47e492d-e555-4219-92f5-ef1f36ad41c5\") " pod="openstack/barbican-db-create-bd2xq" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.871331 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57dd95b0-04b3-4903-ac51-05ad58cefcf5-config-data\") pod \"keystone-db-sync-5qg94\" (UID: \"57dd95b0-04b3-4903-ac51-05ad58cefcf5\") " pod="openstack/keystone-db-sync-5qg94" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.871376 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhs99\" (UniqueName: \"kubernetes.io/projected/57dd95b0-04b3-4903-ac51-05ad58cefcf5-kube-api-access-dhs99\") pod 
\"keystone-db-sync-5qg94\" (UID: \"57dd95b0-04b3-4903-ac51-05ad58cefcf5\") " pod="openstack/keystone-db-sync-5qg94" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.871412 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d47e492d-e555-4219-92f5-ef1f36ad41c5-operator-scripts\") pod \"barbican-db-create-bd2xq\" (UID: \"d47e492d-e555-4219-92f5-ef1f36ad41c5\") " pod="openstack/barbican-db-create-bd2xq" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.871454 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57dd95b0-04b3-4903-ac51-05ad58cefcf5-combined-ca-bundle\") pod \"keystone-db-sync-5qg94\" (UID: \"57dd95b0-04b3-4903-ac51-05ad58cefcf5\") " pod="openstack/keystone-db-sync-5qg94" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.873210 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d47e492d-e555-4219-92f5-ef1f36ad41c5-operator-scripts\") pod \"barbican-db-create-bd2xq\" (UID: \"d47e492d-e555-4219-92f5-ef1f36ad41c5\") " pod="openstack/barbican-db-create-bd2xq" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.875881 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57dd95b0-04b3-4903-ac51-05ad58cefcf5-config-data\") pod \"keystone-db-sync-5qg94\" (UID: \"57dd95b0-04b3-4903-ac51-05ad58cefcf5\") " pod="openstack/keystone-db-sync-5qg94" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.876118 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57dd95b0-04b3-4903-ac51-05ad58cefcf5-combined-ca-bundle\") pod \"keystone-db-sync-5qg94\" (UID: \"57dd95b0-04b3-4903-ac51-05ad58cefcf5\") " pod="openstack/keystone-db-sync-5qg94" Mar 
19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.889470 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhs99\" (UniqueName: \"kubernetes.io/projected/57dd95b0-04b3-4903-ac51-05ad58cefcf5-kube-api-access-dhs99\") pod \"keystone-db-sync-5qg94\" (UID: \"57dd95b0-04b3-4903-ac51-05ad58cefcf5\") " pod="openstack/keystone-db-sync-5qg94" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.889820 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hprbm\" (UniqueName: \"kubernetes.io/projected/d47e492d-e555-4219-92f5-ef1f36ad41c5-kube-api-access-hprbm\") pod \"barbican-db-create-bd2xq\" (UID: \"d47e492d-e555-4219-92f5-ef1f36ad41c5\") " pod="openstack/barbican-db-create-bd2xq" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.909127 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-5qg94" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.973604 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr88l\" (UniqueName: \"kubernetes.io/projected/5678f837-9cc0-4a1c-a32b-70b4e303c350-kube-api-access-gr88l\") pod \"neutron-ed21-account-create-update-5pvxc\" (UID: \"5678f837-9cc0-4a1c-a32b-70b4e303c350\") " pod="openstack/neutron-ed21-account-create-update-5pvxc" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.973870 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e4d6aef-f764-455c-a076-cf225b70b33c-operator-scripts\") pod \"barbican-72fa-account-create-update-dj55b\" (UID: \"3e4d6aef-f764-455c-a076-cf225b70b33c\") " pod="openstack/barbican-72fa-account-create-update-dj55b" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.973925 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dqll\" (UniqueName: 
\"kubernetes.io/projected/3e4d6aef-f764-455c-a076-cf225b70b33c-kube-api-access-2dqll\") pod \"barbican-72fa-account-create-update-dj55b\" (UID: \"3e4d6aef-f764-455c-a076-cf225b70b33c\") " pod="openstack/barbican-72fa-account-create-update-dj55b" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.973961 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5678f837-9cc0-4a1c-a32b-70b4e303c350-operator-scripts\") pod \"neutron-ed21-account-create-update-5pvxc\" (UID: \"5678f837-9cc0-4a1c-a32b-70b4e303c350\") " pod="openstack/neutron-ed21-account-create-update-5pvxc" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.975377 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e4d6aef-f764-455c-a076-cf225b70b33c-operator-scripts\") pod \"barbican-72fa-account-create-update-dj55b\" (UID: \"3e4d6aef-f764-455c-a076-cf225b70b33c\") " pod="openstack/barbican-72fa-account-create-update-dj55b" Mar 19 09:47:01 crc kubenswrapper[4835]: I0319 09:47:01.994515 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dqll\" (UniqueName: \"kubernetes.io/projected/3e4d6aef-f764-455c-a076-cf225b70b33c-kube-api-access-2dqll\") pod \"barbican-72fa-account-create-update-dj55b\" (UID: \"3e4d6aef-f764-455c-a076-cf225b70b33c\") " pod="openstack/barbican-72fa-account-create-update-dj55b" Mar 19 09:47:02 crc kubenswrapper[4835]: I0319 09:47:02.014809 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bd2xq" Mar 19 09:47:02 crc kubenswrapper[4835]: I0319 09:47:02.057170 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-72fa-account-create-update-dj55b" Mar 19 09:47:02 crc kubenswrapper[4835]: I0319 09:47:02.076222 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr88l\" (UniqueName: \"kubernetes.io/projected/5678f837-9cc0-4a1c-a32b-70b4e303c350-kube-api-access-gr88l\") pod \"neutron-ed21-account-create-update-5pvxc\" (UID: \"5678f837-9cc0-4a1c-a32b-70b4e303c350\") " pod="openstack/neutron-ed21-account-create-update-5pvxc" Mar 19 09:47:02 crc kubenswrapper[4835]: I0319 09:47:02.076463 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5678f837-9cc0-4a1c-a32b-70b4e303c350-operator-scripts\") pod \"neutron-ed21-account-create-update-5pvxc\" (UID: \"5678f837-9cc0-4a1c-a32b-70b4e303c350\") " pod="openstack/neutron-ed21-account-create-update-5pvxc" Mar 19 09:47:02 crc kubenswrapper[4835]: I0319 09:47:02.077182 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5678f837-9cc0-4a1c-a32b-70b4e303c350-operator-scripts\") pod \"neutron-ed21-account-create-update-5pvxc\" (UID: \"5678f837-9cc0-4a1c-a32b-70b4e303c350\") " pod="openstack/neutron-ed21-account-create-update-5pvxc" Mar 19 09:47:02 crc kubenswrapper[4835]: I0319 09:47:02.095526 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr88l\" (UniqueName: \"kubernetes.io/projected/5678f837-9cc0-4a1c-a32b-70b4e303c350-kube-api-access-gr88l\") pod \"neutron-ed21-account-create-update-5pvxc\" (UID: \"5678f837-9cc0-4a1c-a32b-70b4e303c350\") " pod="openstack/neutron-ed21-account-create-update-5pvxc" Mar 19 09:47:02 crc kubenswrapper[4835]: I0319 09:47:02.133947 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-2hr8p"] Mar 19 09:47:02 crc kubenswrapper[4835]: W0319 09:47:02.162303 4835 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09fbcd28_9104_4ddf_9e9f_8d7029c330bf.slice/crio-28dfb217470836130459a3f5134c8455327461a3f9be5c80655528841fdffc67 WatchSource:0}: Error finding container 28dfb217470836130459a3f5134c8455327461a3f9be5c80655528841fdffc67: Status 404 returned error can't find the container with id 28dfb217470836130459a3f5134c8455327461a3f9be5c80655528841fdffc67 Mar 19 09:47:02 crc kubenswrapper[4835]: I0319 09:47:02.168257 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ed21-account-create-update-5pvxc" Mar 19 09:47:02 crc kubenswrapper[4835]: I0319 09:47:02.168833 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-v6pb2"] Mar 19 09:47:02 crc kubenswrapper[4835]: I0319 09:47:02.340348 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-4f53-account-create-update-77jbh"] Mar 19 09:47:02 crc kubenswrapper[4835]: I0319 09:47:02.543216 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-4f53-account-create-update-77jbh" event={"ID":"8bd1dfd6-8c2d-45cc-9bea-2b30b7e52f64","Type":"ContainerStarted","Data":"274d9f51111cc9652f843e1ed52683449c947ca955da380211a2f8b501593ae7"} Mar 19 09:47:02 crc kubenswrapper[4835]: I0319 09:47:02.551420 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-2hr8p" event={"ID":"09fbcd28-9104-4ddf-9e9f-8d7029c330bf","Type":"ContainerStarted","Data":"28dfb217470836130459a3f5134c8455327461a3f9be5c80655528841fdffc67"} Mar 19 09:47:02 crc kubenswrapper[4835]: I0319 09:47:02.554162 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-v6pb2" event={"ID":"8c9a0780-2d49-40d7-83f9-2f6fcec39523","Type":"ContainerStarted","Data":"1d5426e75198df7b7f0de1f8530bb0f77ce33cd05c8feba2683137b5517afd40"} Mar 19 09:47:02 crc kubenswrapper[4835]: I0319 09:47:02.633851 4835 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-jggwj"] Mar 19 09:47:02 crc kubenswrapper[4835]: I0319 09:47:02.658210 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8a40-account-create-update-ql6kp"] Mar 19 09:47:03 crc kubenswrapper[4835]: W0319 09:47:03.029689 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57dd95b0_04b3_4903_ac51_05ad58cefcf5.slice/crio-9233f8b84de6ce0c149f7aa3178b1cbf0cfcda2daa25d412d5b00a7af924fccf WatchSource:0}: Error finding container 9233f8b84de6ce0c149f7aa3178b1cbf0cfcda2daa25d412d5b00a7af924fccf: Status 404 returned error can't find the container with id 9233f8b84de6ce0c149f7aa3178b1cbf0cfcda2daa25d412d5b00a7af924fccf Mar 19 09:47:03 crc kubenswrapper[4835]: I0319 09:47:03.051182 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-5qg94"] Mar 19 09:47:03 crc kubenswrapper[4835]: W0319 09:47:03.087066 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd47e492d_e555_4219_92f5_ef1f36ad41c5.slice/crio-97906516adb8c437fb0dc2cfe47b49dac22ef5ddb8a1e6ab5f56b4468890ed14 WatchSource:0}: Error finding container 97906516adb8c437fb0dc2cfe47b49dac22ef5ddb8a1e6ab5f56b4468890ed14: Status 404 returned error can't find the container with id 97906516adb8c437fb0dc2cfe47b49dac22ef5ddb8a1e6ab5f56b4468890ed14 Mar 19 09:47:03 crc kubenswrapper[4835]: I0319 09:47:03.092375 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-bd2xq"] Mar 19 09:47:03 crc kubenswrapper[4835]: I0319 09:47:03.119317 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-72fa-account-create-update-dj55b"] Mar 19 09:47:03 crc kubenswrapper[4835]: I0319 09:47:03.134651 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-ed21-account-create-update-5pvxc"] Mar 19 09:47:03 crc kubenswrapper[4835]: I0319 09:47:03.588235 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bd2xq" event={"ID":"d47e492d-e555-4219-92f5-ef1f36ad41c5","Type":"ContainerStarted","Data":"97906516adb8c437fb0dc2cfe47b49dac22ef5ddb8a1e6ab5f56b4468890ed14"} Mar 19 09:47:03 crc kubenswrapper[4835]: I0319 09:47:03.592751 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-5qg94" event={"ID":"57dd95b0-04b3-4903-ac51-05ad58cefcf5","Type":"ContainerStarted","Data":"9233f8b84de6ce0c149f7aa3178b1cbf0cfcda2daa25d412d5b00a7af924fccf"} Mar 19 09:47:03 crc kubenswrapper[4835]: I0319 09:47:03.593652 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8a40-account-create-update-ql6kp" event={"ID":"0af06506-57b0-4b38-b57a-701b74ab2866","Type":"ContainerStarted","Data":"828bb9e66a9c228bf154c8564f13115cd23ffbc24c46e591e28283dbaa594d68"} Mar 19 09:47:03 crc kubenswrapper[4835]: I0319 09:47:03.594600 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-jggwj" event={"ID":"378b5128-0ef4-44b3-bfdd-0956f404c583","Type":"ContainerStarted","Data":"cfe696a3a85a73c1689eee17fea4d3f2ef233b6e433b704133db124498602014"} Mar 19 09:47:03 crc kubenswrapper[4835]: I0319 09:47:03.595535 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ed21-account-create-update-5pvxc" event={"ID":"5678f837-9cc0-4a1c-a32b-70b4e303c350","Type":"ContainerStarted","Data":"8f260e9b12b8904127bf605a3ed7418ebece6442fc4de56efe41ea1f1b3c473a"} Mar 19 09:47:03 crc kubenswrapper[4835]: I0319 09:47:03.596409 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-72fa-account-create-update-dj55b" event={"ID":"3e4d6aef-f764-455c-a076-cf225b70b33c","Type":"ContainerStarted","Data":"57e57c81f0dc59eab12ad5bb8049c6f5be8de07f9824b8c2acb897989f0e0d62"} Mar 19 09:47:04 crc 
kubenswrapper[4835]: E0319 09:47:04.537863 4835 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8bd1dfd6_8c2d_45cc_9bea_2b30b7e52f64.slice/crio-conmon-8839e0e09f529dd580390bd8bf319bf7cf31905425247c60454359b2a79530ae.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8bd1dfd6_8c2d_45cc_9bea_2b30b7e52f64.slice/crio-8839e0e09f529dd580390bd8bf319bf7cf31905425247c60454359b2a79530ae.scope\": RecentStats: unable to find data in memory cache]" Mar 19 09:47:04 crc kubenswrapper[4835]: I0319 09:47:04.612643 4835 generic.go:334] "Generic (PLEG): container finished" podID="3e4d6aef-f764-455c-a076-cf225b70b33c" containerID="79974ad04ee5703d2e03e6ccdf3b1d17e7c856e0344187a13e1f55ab40a2b9ac" exitCode=0 Mar 19 09:47:04 crc kubenswrapper[4835]: I0319 09:47:04.612738 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-72fa-account-create-update-dj55b" event={"ID":"3e4d6aef-f764-455c-a076-cf225b70b33c","Type":"ContainerDied","Data":"79974ad04ee5703d2e03e6ccdf3b1d17e7c856e0344187a13e1f55ab40a2b9ac"} Mar 19 09:47:04 crc kubenswrapper[4835]: I0319 09:47:04.616438 4835 generic.go:334] "Generic (PLEG): container finished" podID="d47e492d-e555-4219-92f5-ef1f36ad41c5" containerID="f5132bd050fd77a6f296b03946881f806e7287dad44e9e5c62ff57a965fa930d" exitCode=0 Mar 19 09:47:04 crc kubenswrapper[4835]: I0319 09:47:04.616515 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bd2xq" event={"ID":"d47e492d-e555-4219-92f5-ef1f36ad41c5","Type":"ContainerDied","Data":"f5132bd050fd77a6f296b03946881f806e7287dad44e9e5c62ff57a965fa930d"} Mar 19 09:47:04 crc kubenswrapper[4835]: I0319 09:47:04.618439 4835 generic.go:334] "Generic (PLEG): container finished" podID="8bd1dfd6-8c2d-45cc-9bea-2b30b7e52f64" 
containerID="8839e0e09f529dd580390bd8bf319bf7cf31905425247c60454359b2a79530ae" exitCode=0 Mar 19 09:47:04 crc kubenswrapper[4835]: I0319 09:47:04.618506 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-4f53-account-create-update-77jbh" event={"ID":"8bd1dfd6-8c2d-45cc-9bea-2b30b7e52f64","Type":"ContainerDied","Data":"8839e0e09f529dd580390bd8bf319bf7cf31905425247c60454359b2a79530ae"} Mar 19 09:47:04 crc kubenswrapper[4835]: I0319 09:47:04.619780 4835 generic.go:334] "Generic (PLEG): container finished" podID="09fbcd28-9104-4ddf-9e9f-8d7029c330bf" containerID="bfdb3d9a3f546d0f8f972c5609f39ce5804f8773c23188edc67a7d9d445a9a34" exitCode=0 Mar 19 09:47:04 crc kubenswrapper[4835]: I0319 09:47:04.619837 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-2hr8p" event={"ID":"09fbcd28-9104-4ddf-9e9f-8d7029c330bf","Type":"ContainerDied","Data":"bfdb3d9a3f546d0f8f972c5609f39ce5804f8773c23188edc67a7d9d445a9a34"} Mar 19 09:47:04 crc kubenswrapper[4835]: I0319 09:47:04.635499 4835 generic.go:334] "Generic (PLEG): container finished" podID="0af06506-57b0-4b38-b57a-701b74ab2866" containerID="8a4c39f9b4e73c41b323d40cb52f3e574c298f35d66a42754a7fde2dc612feff" exitCode=0 Mar 19 09:47:04 crc kubenswrapper[4835]: I0319 09:47:04.635581 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8a40-account-create-update-ql6kp" event={"ID":"0af06506-57b0-4b38-b57a-701b74ab2866","Type":"ContainerDied","Data":"8a4c39f9b4e73c41b323d40cb52f3e574c298f35d66a42754a7fde2dc612feff"} Mar 19 09:47:04 crc kubenswrapper[4835]: I0319 09:47:04.638468 4835 generic.go:334] "Generic (PLEG): container finished" podID="378b5128-0ef4-44b3-bfdd-0956f404c583" containerID="a5971122082f5207f6845a0c0f278ea5f990d404320e26b84dfb5aa85ce6859c" exitCode=0 Mar 19 09:47:04 crc kubenswrapper[4835]: I0319 09:47:04.638538 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-jggwj" 
event={"ID":"378b5128-0ef4-44b3-bfdd-0956f404c583","Type":"ContainerDied","Data":"a5971122082f5207f6845a0c0f278ea5f990d404320e26b84dfb5aa85ce6859c"} Mar 19 09:47:04 crc kubenswrapper[4835]: I0319 09:47:04.640682 4835 generic.go:334] "Generic (PLEG): container finished" podID="8c9a0780-2d49-40d7-83f9-2f6fcec39523" containerID="e9dfbcd95c4d6eb4fc24553b5b14b0d8caa895d497ddeddfdea725fecb9e3d03" exitCode=0 Mar 19 09:47:04 crc kubenswrapper[4835]: I0319 09:47:04.640805 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-v6pb2" event={"ID":"8c9a0780-2d49-40d7-83f9-2f6fcec39523","Type":"ContainerDied","Data":"e9dfbcd95c4d6eb4fc24553b5b14b0d8caa895d497ddeddfdea725fecb9e3d03"} Mar 19 09:47:04 crc kubenswrapper[4835]: I0319 09:47:04.642216 4835 generic.go:334] "Generic (PLEG): container finished" podID="5678f837-9cc0-4a1c-a32b-70b4e303c350" containerID="3f20636aa9d4ccf58558d41609cb16e89c7d7be9c3d5449f9069346ed04d8059" exitCode=0 Mar 19 09:47:04 crc kubenswrapper[4835]: I0319 09:47:04.642257 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ed21-account-create-update-5pvxc" event={"ID":"5678f837-9cc0-4a1c-a32b-70b4e303c350","Type":"ContainerDied","Data":"3f20636aa9d4ccf58558d41609cb16e89c7d7be9c3d5449f9069346ed04d8059"} Mar 19 09:47:07 crc kubenswrapper[4835]: I0319 09:47:07.061968 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6bcbc87-rrsfb" Mar 19 09:47:07 crc kubenswrapper[4835]: I0319 09:47:07.196869 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-z5zlv"] Mar 19 09:47:07 crc kubenswrapper[4835]: I0319 09:47:07.197089 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-z5zlv" podUID="8825e955-0683-4e69-ae3d-6bddcc9c92e1" containerName="dnsmasq-dns" containerID="cri-o://53b32e6cbfdb2061cf92d560e23b64c9dfe6812bf0ff8e5fe13da0eb152e91dc" gracePeriod=10 Mar 
19 09:47:07 crc kubenswrapper[4835]: I0319 09:47:07.678937 4835 generic.go:334] "Generic (PLEG): container finished" podID="8825e955-0683-4e69-ae3d-6bddcc9c92e1" containerID="53b32e6cbfdb2061cf92d560e23b64c9dfe6812bf0ff8e5fe13da0eb152e91dc" exitCode=0 Mar 19 09:47:07 crc kubenswrapper[4835]: I0319 09:47:07.679029 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-z5zlv" event={"ID":"8825e955-0683-4e69-ae3d-6bddcc9c92e1","Type":"ContainerDied","Data":"53b32e6cbfdb2061cf92d560e23b64c9dfe6812bf0ff8e5fe13da0eb152e91dc"} Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.110078 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-v6pb2" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.123590 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-jggwj" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.136459 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-4f53-account-create-update-77jbh" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.158967 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bd2xq" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.176236 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-72fa-account-create-update-dj55b" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.181333 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-2hr8p" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.195554 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c9a0780-2d49-40d7-83f9-2f6fcec39523-operator-scripts\") pod \"8c9a0780-2d49-40d7-83f9-2f6fcec39523\" (UID: \"8c9a0780-2d49-40d7-83f9-2f6fcec39523\") " Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.195882 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdppq\" (UniqueName: \"kubernetes.io/projected/8c9a0780-2d49-40d7-83f9-2f6fcec39523-kube-api-access-fdppq\") pod \"8c9a0780-2d49-40d7-83f9-2f6fcec39523\" (UID: \"8c9a0780-2d49-40d7-83f9-2f6fcec39523\") " Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.197395 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c9a0780-2d49-40d7-83f9-2f6fcec39523-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8c9a0780-2d49-40d7-83f9-2f6fcec39523" (UID: "8c9a0780-2d49-40d7-83f9-2f6fcec39523"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.216037 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c9a0780-2d49-40d7-83f9-2f6fcec39523-kube-api-access-fdppq" (OuterVolumeSpecName: "kube-api-access-fdppq") pod "8c9a0780-2d49-40d7-83f9-2f6fcec39523" (UID: "8c9a0780-2d49-40d7-83f9-2f6fcec39523"). InnerVolumeSpecName "kube-api-access-fdppq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.216541 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-8a40-account-create-update-ql6kp" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.220195 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ed21-account-create-update-5pvxc" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.297271 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24r4h\" (UniqueName: \"kubernetes.io/projected/378b5128-0ef4-44b3-bfdd-0956f404c583-kube-api-access-24r4h\") pod \"378b5128-0ef4-44b3-bfdd-0956f404c583\" (UID: \"378b5128-0ef4-44b3-bfdd-0956f404c583\") " Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.297328 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bd1dfd6-8c2d-45cc-9bea-2b30b7e52f64-operator-scripts\") pod \"8bd1dfd6-8c2d-45cc-9bea-2b30b7e52f64\" (UID: \"8bd1dfd6-8c2d-45cc-9bea-2b30b7e52f64\") " Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.297366 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t46jm\" (UniqueName: \"kubernetes.io/projected/09fbcd28-9104-4ddf-9e9f-8d7029c330bf-kube-api-access-t46jm\") pod \"09fbcd28-9104-4ddf-9e9f-8d7029c330bf\" (UID: \"09fbcd28-9104-4ddf-9e9f-8d7029c330bf\") " Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.297411 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmr5r\" (UniqueName: \"kubernetes.io/projected/0af06506-57b0-4b38-b57a-701b74ab2866-kube-api-access-gmr5r\") pod \"0af06506-57b0-4b38-b57a-701b74ab2866\" (UID: \"0af06506-57b0-4b38-b57a-701b74ab2866\") " Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.297436 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dqll\" (UniqueName: 
\"kubernetes.io/projected/3e4d6aef-f764-455c-a076-cf225b70b33c-kube-api-access-2dqll\") pod \"3e4d6aef-f764-455c-a076-cf225b70b33c\" (UID: \"3e4d6aef-f764-455c-a076-cf225b70b33c\") " Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.297481 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mnqh\" (UniqueName: \"kubernetes.io/projected/8bd1dfd6-8c2d-45cc-9bea-2b30b7e52f64-kube-api-access-8mnqh\") pod \"8bd1dfd6-8c2d-45cc-9bea-2b30b7e52f64\" (UID: \"8bd1dfd6-8c2d-45cc-9bea-2b30b7e52f64\") " Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.297515 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0af06506-57b0-4b38-b57a-701b74ab2866-operator-scripts\") pod \"0af06506-57b0-4b38-b57a-701b74ab2866\" (UID: \"0af06506-57b0-4b38-b57a-701b74ab2866\") " Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.297530 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d47e492d-e555-4219-92f5-ef1f36ad41c5-operator-scripts\") pod \"d47e492d-e555-4219-92f5-ef1f36ad41c5\" (UID: \"d47e492d-e555-4219-92f5-ef1f36ad41c5\") " Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.297564 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5678f837-9cc0-4a1c-a32b-70b4e303c350-operator-scripts\") pod \"5678f837-9cc0-4a1c-a32b-70b4e303c350\" (UID: \"5678f837-9cc0-4a1c-a32b-70b4e303c350\") " Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.297579 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/378b5128-0ef4-44b3-bfdd-0956f404c583-operator-scripts\") pod \"378b5128-0ef4-44b3-bfdd-0956f404c583\" (UID: \"378b5128-0ef4-44b3-bfdd-0956f404c583\") " Mar 19 09:47:09 
crc kubenswrapper[4835]: I0319 09:47:09.297617 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gr88l\" (UniqueName: \"kubernetes.io/projected/5678f837-9cc0-4a1c-a32b-70b4e303c350-kube-api-access-gr88l\") pod \"5678f837-9cc0-4a1c-a32b-70b4e303c350\" (UID: \"5678f837-9cc0-4a1c-a32b-70b4e303c350\") " Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.297684 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hprbm\" (UniqueName: \"kubernetes.io/projected/d47e492d-e555-4219-92f5-ef1f36ad41c5-kube-api-access-hprbm\") pod \"d47e492d-e555-4219-92f5-ef1f36ad41c5\" (UID: \"d47e492d-e555-4219-92f5-ef1f36ad41c5\") " Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.297704 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09fbcd28-9104-4ddf-9e9f-8d7029c330bf-operator-scripts\") pod \"09fbcd28-9104-4ddf-9e9f-8d7029c330bf\" (UID: \"09fbcd28-9104-4ddf-9e9f-8d7029c330bf\") " Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.297724 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e4d6aef-f764-455c-a076-cf225b70b33c-operator-scripts\") pod \"3e4d6aef-f764-455c-a076-cf225b70b33c\" (UID: \"3e4d6aef-f764-455c-a076-cf225b70b33c\") " Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.298184 4835 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c9a0780-2d49-40d7-83f9-2f6fcec39523-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.298203 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdppq\" (UniqueName: \"kubernetes.io/projected/8c9a0780-2d49-40d7-83f9-2f6fcec39523-kube-api-access-fdppq\") on node \"crc\" DevicePath \"\"" Mar 19 
09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.298777 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e4d6aef-f764-455c-a076-cf225b70b33c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3e4d6aef-f764-455c-a076-cf225b70b33c" (UID: "3e4d6aef-f764-455c-a076-cf225b70b33c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.299325 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/378b5128-0ef4-44b3-bfdd-0956f404c583-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "378b5128-0ef4-44b3-bfdd-0956f404c583" (UID: "378b5128-0ef4-44b3-bfdd-0956f404c583"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.299353 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0af06506-57b0-4b38-b57a-701b74ab2866-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0af06506-57b0-4b38-b57a-701b74ab2866" (UID: "0af06506-57b0-4b38-b57a-701b74ab2866"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.299417 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bd1dfd6-8c2d-45cc-9bea-2b30b7e52f64-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8bd1dfd6-8c2d-45cc-9bea-2b30b7e52f64" (UID: "8bd1dfd6-8c2d-45cc-9bea-2b30b7e52f64"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.299638 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d47e492d-e555-4219-92f5-ef1f36ad41c5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d47e492d-e555-4219-92f5-ef1f36ad41c5" (UID: "d47e492d-e555-4219-92f5-ef1f36ad41c5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.299966 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5678f837-9cc0-4a1c-a32b-70b4e303c350-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5678f837-9cc0-4a1c-a32b-70b4e303c350" (UID: "5678f837-9cc0-4a1c-a32b-70b4e303c350"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.300097 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09fbcd28-9104-4ddf-9e9f-8d7029c330bf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "09fbcd28-9104-4ddf-9e9f-8d7029c330bf" (UID: "09fbcd28-9104-4ddf-9e9f-8d7029c330bf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.310505 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0af06506-57b0-4b38-b57a-701b74ab2866-kube-api-access-gmr5r" (OuterVolumeSpecName: "kube-api-access-gmr5r") pod "0af06506-57b0-4b38-b57a-701b74ab2866" (UID: "0af06506-57b0-4b38-b57a-701b74ab2866"). InnerVolumeSpecName "kube-api-access-gmr5r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.310560 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5678f837-9cc0-4a1c-a32b-70b4e303c350-kube-api-access-gr88l" (OuterVolumeSpecName: "kube-api-access-gr88l") pod "5678f837-9cc0-4a1c-a32b-70b4e303c350" (UID: "5678f837-9cc0-4a1c-a32b-70b4e303c350"). InnerVolumeSpecName "kube-api-access-gr88l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.310554 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d47e492d-e555-4219-92f5-ef1f36ad41c5-kube-api-access-hprbm" (OuterVolumeSpecName: "kube-api-access-hprbm") pod "d47e492d-e555-4219-92f5-ef1f36ad41c5" (UID: "d47e492d-e555-4219-92f5-ef1f36ad41c5"). InnerVolumeSpecName "kube-api-access-hprbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.310776 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/378b5128-0ef4-44b3-bfdd-0956f404c583-kube-api-access-24r4h" (OuterVolumeSpecName: "kube-api-access-24r4h") pod "378b5128-0ef4-44b3-bfdd-0956f404c583" (UID: "378b5128-0ef4-44b3-bfdd-0956f404c583"). InnerVolumeSpecName "kube-api-access-24r4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.311357 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e4d6aef-f764-455c-a076-cf225b70b33c-kube-api-access-2dqll" (OuterVolumeSpecName: "kube-api-access-2dqll") pod "3e4d6aef-f764-455c-a076-cf225b70b33c" (UID: "3e4d6aef-f764-455c-a076-cf225b70b33c"). InnerVolumeSpecName "kube-api-access-2dqll". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.311633 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09fbcd28-9104-4ddf-9e9f-8d7029c330bf-kube-api-access-t46jm" (OuterVolumeSpecName: "kube-api-access-t46jm") pod "09fbcd28-9104-4ddf-9e9f-8d7029c330bf" (UID: "09fbcd28-9104-4ddf-9e9f-8d7029c330bf"). InnerVolumeSpecName "kube-api-access-t46jm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.314054 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bd1dfd6-8c2d-45cc-9bea-2b30b7e52f64-kube-api-access-8mnqh" (OuterVolumeSpecName: "kube-api-access-8mnqh") pod "8bd1dfd6-8c2d-45cc-9bea-2b30b7e52f64" (UID: "8bd1dfd6-8c2d-45cc-9bea-2b30b7e52f64"). InnerVolumeSpecName "kube-api-access-8mnqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.400453 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmr5r\" (UniqueName: \"kubernetes.io/projected/0af06506-57b0-4b38-b57a-701b74ab2866-kube-api-access-gmr5r\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.400488 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dqll\" (UniqueName: \"kubernetes.io/projected/3e4d6aef-f764-455c-a076-cf225b70b33c-kube-api-access-2dqll\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.400498 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mnqh\" (UniqueName: \"kubernetes.io/projected/8bd1dfd6-8c2d-45cc-9bea-2b30b7e52f64-kube-api-access-8mnqh\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.400508 4835 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/0af06506-57b0-4b38-b57a-701b74ab2866-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.400516 4835 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d47e492d-e555-4219-92f5-ef1f36ad41c5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.400524 4835 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5678f837-9cc0-4a1c-a32b-70b4e303c350-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.400533 4835 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/378b5128-0ef4-44b3-bfdd-0956f404c583-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.400542 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gr88l\" (UniqueName: \"kubernetes.io/projected/5678f837-9cc0-4a1c-a32b-70b4e303c350-kube-api-access-gr88l\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.400550 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hprbm\" (UniqueName: \"kubernetes.io/projected/d47e492d-e555-4219-92f5-ef1f36ad41c5-kube-api-access-hprbm\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.400558 4835 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09fbcd28-9104-4ddf-9e9f-8d7029c330bf-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.400566 4835 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/3e4d6aef-f764-455c-a076-cf225b70b33c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.400574 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24r4h\" (UniqueName: \"kubernetes.io/projected/378b5128-0ef4-44b3-bfdd-0956f404c583-kube-api-access-24r4h\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.400582 4835 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bd1dfd6-8c2d-45cc-9bea-2b30b7e52f64-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.400590 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t46jm\" (UniqueName: \"kubernetes.io/projected/09fbcd28-9104-4ddf-9e9f-8d7029c330bf-kube-api-access-t46jm\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.460399 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-z5zlv" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.630286 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8825e955-0683-4e69-ae3d-6bddcc9c92e1-ovsdbserver-nb\") pod \"8825e955-0683-4e69-ae3d-6bddcc9c92e1\" (UID: \"8825e955-0683-4e69-ae3d-6bddcc9c92e1\") " Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.630469 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8825e955-0683-4e69-ae3d-6bddcc9c92e1-ovsdbserver-sb\") pod \"8825e955-0683-4e69-ae3d-6bddcc9c92e1\" (UID: \"8825e955-0683-4e69-ae3d-6bddcc9c92e1\") " Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.630612 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbmsx\" (UniqueName: \"kubernetes.io/projected/8825e955-0683-4e69-ae3d-6bddcc9c92e1-kube-api-access-jbmsx\") pod \"8825e955-0683-4e69-ae3d-6bddcc9c92e1\" (UID: \"8825e955-0683-4e69-ae3d-6bddcc9c92e1\") " Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.630761 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8825e955-0683-4e69-ae3d-6bddcc9c92e1-dns-svc\") pod \"8825e955-0683-4e69-ae3d-6bddcc9c92e1\" (UID: \"8825e955-0683-4e69-ae3d-6bddcc9c92e1\") " Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.631026 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8825e955-0683-4e69-ae3d-6bddcc9c92e1-config\") pod \"8825e955-0683-4e69-ae3d-6bddcc9c92e1\" (UID: \"8825e955-0683-4e69-ae3d-6bddcc9c92e1\") " Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.635640 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/8825e955-0683-4e69-ae3d-6bddcc9c92e1-kube-api-access-jbmsx" (OuterVolumeSpecName: "kube-api-access-jbmsx") pod "8825e955-0683-4e69-ae3d-6bddcc9c92e1" (UID: "8825e955-0683-4e69-ae3d-6bddcc9c92e1"). InnerVolumeSpecName "kube-api-access-jbmsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.682851 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8825e955-0683-4e69-ae3d-6bddcc9c92e1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8825e955-0683-4e69-ae3d-6bddcc9c92e1" (UID: "8825e955-0683-4e69-ae3d-6bddcc9c92e1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.694848 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8825e955-0683-4e69-ae3d-6bddcc9c92e1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8825e955-0683-4e69-ae3d-6bddcc9c92e1" (UID: "8825e955-0683-4e69-ae3d-6bddcc9c92e1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.698934 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8825e955-0683-4e69-ae3d-6bddcc9c92e1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8825e955-0683-4e69-ae3d-6bddcc9c92e1" (UID: "8825e955-0683-4e69-ae3d-6bddcc9c92e1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.699833 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8825e955-0683-4e69-ae3d-6bddcc9c92e1-config" (OuterVolumeSpecName: "config") pod "8825e955-0683-4e69-ae3d-6bddcc9c92e1" (UID: "8825e955-0683-4e69-ae3d-6bddcc9c92e1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.702962 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-2hr8p" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.702958 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-2hr8p" event={"ID":"09fbcd28-9104-4ddf-9e9f-8d7029c330bf","Type":"ContainerDied","Data":"28dfb217470836130459a3f5134c8455327461a3f9be5c80655528841fdffc67"} Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.703084 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28dfb217470836130459a3f5134c8455327461a3f9be5c80655528841fdffc67" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.704726 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8a40-account-create-update-ql6kp" event={"ID":"0af06506-57b0-4b38-b57a-701b74ab2866","Type":"ContainerDied","Data":"828bb9e66a9c228bf154c8564f13115cd23ffbc24c46e591e28283dbaa594d68"} Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.704783 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="828bb9e66a9c228bf154c8564f13115cd23ffbc24c46e591e28283dbaa594d68" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.704826 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-8a40-account-create-update-ql6kp" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.707299 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ed21-account-create-update-5pvxc" event={"ID":"5678f837-9cc0-4a1c-a32b-70b4e303c350","Type":"ContainerDied","Data":"8f260e9b12b8904127bf605a3ed7418ebece6442fc4de56efe41ea1f1b3c473a"} Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.707336 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f260e9b12b8904127bf605a3ed7418ebece6442fc4de56efe41ea1f1b3c473a" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.707315 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ed21-account-create-update-5pvxc" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.711442 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-jggwj" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.711519 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-jggwj" event={"ID":"378b5128-0ef4-44b3-bfdd-0956f404c583","Type":"ContainerDied","Data":"cfe696a3a85a73c1689eee17fea4d3f2ef233b6e433b704133db124498602014"} Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.711549 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfe696a3a85a73c1689eee17fea4d3f2ef233b6e433b704133db124498602014" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.713186 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-v6pb2" event={"ID":"8c9a0780-2d49-40d7-83f9-2f6fcec39523","Type":"ContainerDied","Data":"1d5426e75198df7b7f0de1f8530bb0f77ce33cd05c8feba2683137b5517afd40"} Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.713214 4835 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="1d5426e75198df7b7f0de1f8530bb0f77ce33cd05c8feba2683137b5517afd40" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.713250 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-v6pb2" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.715263 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-72fa-account-create-update-dj55b" event={"ID":"3e4d6aef-f764-455c-a076-cf225b70b33c","Type":"ContainerDied","Data":"57e57c81f0dc59eab12ad5bb8049c6f5be8de07f9824b8c2acb897989f0e0d62"} Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.715315 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57e57c81f0dc59eab12ad5bb8049c6f5be8de07f9824b8c2acb897989f0e0d62" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.715284 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-72fa-account-create-update-dj55b" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.716622 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-bd2xq" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.716616 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bd2xq" event={"ID":"d47e492d-e555-4219-92f5-ef1f36ad41c5","Type":"ContainerDied","Data":"97906516adb8c437fb0dc2cfe47b49dac22ef5ddb8a1e6ab5f56b4468890ed14"} Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.716754 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97906516adb8c437fb0dc2cfe47b49dac22ef5ddb8a1e6ab5f56b4468890ed14" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.718230 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-z5zlv" event={"ID":"8825e955-0683-4e69-ae3d-6bddcc9c92e1","Type":"ContainerDied","Data":"d31d56e413bda3809e531f94b4c607790a8ae6880fd1b1425613eea0a35ea742"} Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.718288 4835 scope.go:117] "RemoveContainer" containerID="53b32e6cbfdb2061cf92d560e23b64c9dfe6812bf0ff8e5fe13da0eb152e91dc" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.720126 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-z5zlv" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.721311 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-4f53-account-create-update-77jbh" event={"ID":"8bd1dfd6-8c2d-45cc-9bea-2b30b7e52f64","Type":"ContainerDied","Data":"274d9f51111cc9652f843e1ed52683449c947ca955da380211a2f8b501593ae7"} Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.721363 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="274d9f51111cc9652f843e1ed52683449c947ca955da380211a2f8b501593ae7" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.721345 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-4f53-account-create-update-77jbh" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.734781 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbmsx\" (UniqueName: \"kubernetes.io/projected/8825e955-0683-4e69-ae3d-6bddcc9c92e1-kube-api-access-jbmsx\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.734808 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8825e955-0683-4e69-ae3d-6bddcc9c92e1-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.734818 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8825e955-0683-4e69-ae3d-6bddcc9c92e1-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.734827 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8825e955-0683-4e69-ae3d-6bddcc9c92e1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.734834 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8825e955-0683-4e69-ae3d-6bddcc9c92e1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.746311 4835 scope.go:117] "RemoveContainer" containerID="9d75b5baa61382fa1fc389d72195aaceae51ebdb619f9d6626acd57a0d20ae70" Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.802949 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-z5zlv"] Mar 19 09:47:09 crc kubenswrapper[4835]: I0319 09:47:09.811584 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-z5zlv"] Mar 19 09:47:10 crc kubenswrapper[4835]: I0319 09:47:10.414700 4835 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8825e955-0683-4e69-ae3d-6bddcc9c92e1" path="/var/lib/kubelet/pods/8825e955-0683-4e69-ae3d-6bddcc9c92e1/volumes" Mar 19 09:47:10 crc kubenswrapper[4835]: I0319 09:47:10.735014 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-5qg94" event={"ID":"57dd95b0-04b3-4903-ac51-05ad58cefcf5","Type":"ContainerStarted","Data":"0755e58c2018ed18c4e26fc17ed86fd63a6214a090aa486a80de7e59c3d22bad"} Mar 19 09:47:10 crc kubenswrapper[4835]: I0319 09:47:10.765805 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-5qg94" podStartSLOduration=3.218007077 podStartE2EDuration="9.76578448s" podCreationTimestamp="2026-03-19 09:47:01 +0000 UTC" firstStartedPulling="2026-03-19 09:47:03.048856751 +0000 UTC m=+1477.897455338" lastFinishedPulling="2026-03-19 09:47:09.596634154 +0000 UTC m=+1484.445232741" observedRunningTime="2026-03-19 09:47:10.759226834 +0000 UTC m=+1485.607825441" watchObservedRunningTime="2026-03-19 09:47:10.76578448 +0000 UTC m=+1485.614383077" Mar 19 09:47:13 crc kubenswrapper[4835]: I0319 09:47:13.782553 4835 generic.go:334] "Generic (PLEG): container finished" podID="57dd95b0-04b3-4903-ac51-05ad58cefcf5" containerID="0755e58c2018ed18c4e26fc17ed86fd63a6214a090aa486a80de7e59c3d22bad" exitCode=0 Mar 19 09:47:13 crc kubenswrapper[4835]: I0319 09:47:13.782633 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-5qg94" event={"ID":"57dd95b0-04b3-4903-ac51-05ad58cefcf5","Type":"ContainerDied","Data":"0755e58c2018ed18c4e26fc17ed86fd63a6214a090aa486a80de7e59c3d22bad"} Mar 19 09:47:15 crc kubenswrapper[4835]: I0319 09:47:15.178275 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-5qg94" Mar 19 09:47:15 crc kubenswrapper[4835]: I0319 09:47:15.371117 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57dd95b0-04b3-4903-ac51-05ad58cefcf5-combined-ca-bundle\") pod \"57dd95b0-04b3-4903-ac51-05ad58cefcf5\" (UID: \"57dd95b0-04b3-4903-ac51-05ad58cefcf5\") " Mar 19 09:47:15 crc kubenswrapper[4835]: I0319 09:47:15.371514 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57dd95b0-04b3-4903-ac51-05ad58cefcf5-config-data\") pod \"57dd95b0-04b3-4903-ac51-05ad58cefcf5\" (UID: \"57dd95b0-04b3-4903-ac51-05ad58cefcf5\") " Mar 19 09:47:15 crc kubenswrapper[4835]: I0319 09:47:15.371630 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhs99\" (UniqueName: \"kubernetes.io/projected/57dd95b0-04b3-4903-ac51-05ad58cefcf5-kube-api-access-dhs99\") pod \"57dd95b0-04b3-4903-ac51-05ad58cefcf5\" (UID: \"57dd95b0-04b3-4903-ac51-05ad58cefcf5\") " Mar 19 09:47:15 crc kubenswrapper[4835]: I0319 09:47:15.379531 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57dd95b0-04b3-4903-ac51-05ad58cefcf5-kube-api-access-dhs99" (OuterVolumeSpecName: "kube-api-access-dhs99") pod "57dd95b0-04b3-4903-ac51-05ad58cefcf5" (UID: "57dd95b0-04b3-4903-ac51-05ad58cefcf5"). InnerVolumeSpecName "kube-api-access-dhs99". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:47:15 crc kubenswrapper[4835]: I0319 09:47:15.408367 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57dd95b0-04b3-4903-ac51-05ad58cefcf5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57dd95b0-04b3-4903-ac51-05ad58cefcf5" (UID: "57dd95b0-04b3-4903-ac51-05ad58cefcf5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:47:15 crc kubenswrapper[4835]: I0319 09:47:15.428717 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57dd95b0-04b3-4903-ac51-05ad58cefcf5-config-data" (OuterVolumeSpecName: "config-data") pod "57dd95b0-04b3-4903-ac51-05ad58cefcf5" (UID: "57dd95b0-04b3-4903-ac51-05ad58cefcf5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:47:15 crc kubenswrapper[4835]: I0319 09:47:15.473792 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57dd95b0-04b3-4903-ac51-05ad58cefcf5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:15 crc kubenswrapper[4835]: I0319 09:47:15.473822 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57dd95b0-04b3-4903-ac51-05ad58cefcf5-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:15 crc kubenswrapper[4835]: I0319 09:47:15.473832 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhs99\" (UniqueName: \"kubernetes.io/projected/57dd95b0-04b3-4903-ac51-05ad58cefcf5-kube-api-access-dhs99\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:15 crc kubenswrapper[4835]: I0319 09:47:15.804468 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-5qg94" event={"ID":"57dd95b0-04b3-4903-ac51-05ad58cefcf5","Type":"ContainerDied","Data":"9233f8b84de6ce0c149f7aa3178b1cbf0cfcda2daa25d412d5b00a7af924fccf"} Mar 19 09:47:15 crc kubenswrapper[4835]: I0319 09:47:15.804507 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9233f8b84de6ce0c149f7aa3178b1cbf0cfcda2daa25d412d5b00a7af924fccf" Mar 19 09:47:15 crc kubenswrapper[4835]: I0319 09:47:15.804532 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-5qg94" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.098869 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-c9nxs"] Mar 19 09:47:16 crc kubenswrapper[4835]: E0319 09:47:16.099453 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bd1dfd6-8c2d-45cc-9bea-2b30b7e52f64" containerName="mariadb-account-create-update" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.099483 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bd1dfd6-8c2d-45cc-9bea-2b30b7e52f64" containerName="mariadb-account-create-update" Mar 19 09:47:16 crc kubenswrapper[4835]: E0319 09:47:16.099494 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d47e492d-e555-4219-92f5-ef1f36ad41c5" containerName="mariadb-database-create" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.099502 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="d47e492d-e555-4219-92f5-ef1f36ad41c5" containerName="mariadb-database-create" Mar 19 09:47:16 crc kubenswrapper[4835]: E0319 09:47:16.099519 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c9a0780-2d49-40d7-83f9-2f6fcec39523" containerName="mariadb-database-create" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.099529 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c9a0780-2d49-40d7-83f9-2f6fcec39523" containerName="mariadb-database-create" Mar 19 09:47:16 crc kubenswrapper[4835]: E0319 09:47:16.099549 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="378b5128-0ef4-44b3-bfdd-0956f404c583" containerName="mariadb-database-create" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.099557 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="378b5128-0ef4-44b3-bfdd-0956f404c583" containerName="mariadb-database-create" Mar 19 09:47:16 crc kubenswrapper[4835]: E0319 09:47:16.099573 4835 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="8825e955-0683-4e69-ae3d-6bddcc9c92e1" containerName="init" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.099580 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="8825e955-0683-4e69-ae3d-6bddcc9c92e1" containerName="init" Mar 19 09:47:16 crc kubenswrapper[4835]: E0319 09:47:16.099592 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0af06506-57b0-4b38-b57a-701b74ab2866" containerName="mariadb-account-create-update" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.099599 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="0af06506-57b0-4b38-b57a-701b74ab2866" containerName="mariadb-account-create-update" Mar 19 09:47:16 crc kubenswrapper[4835]: E0319 09:47:16.099616 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5678f837-9cc0-4a1c-a32b-70b4e303c350" containerName="mariadb-account-create-update" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.099624 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="5678f837-9cc0-4a1c-a32b-70b4e303c350" containerName="mariadb-account-create-update" Mar 19 09:47:16 crc kubenswrapper[4835]: E0319 09:47:16.099642 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57dd95b0-04b3-4903-ac51-05ad58cefcf5" containerName="keystone-db-sync" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.099650 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="57dd95b0-04b3-4903-ac51-05ad58cefcf5" containerName="keystone-db-sync" Mar 19 09:47:16 crc kubenswrapper[4835]: E0319 09:47:16.099670 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8825e955-0683-4e69-ae3d-6bddcc9c92e1" containerName="dnsmasq-dns" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.099678 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="8825e955-0683-4e69-ae3d-6bddcc9c92e1" containerName="dnsmasq-dns" Mar 19 09:47:16 crc kubenswrapper[4835]: E0319 09:47:16.099697 4835 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="09fbcd28-9104-4ddf-9e9f-8d7029c330bf" containerName="mariadb-database-create" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.099705 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="09fbcd28-9104-4ddf-9e9f-8d7029c330bf" containerName="mariadb-database-create" Mar 19 09:47:16 crc kubenswrapper[4835]: E0319 09:47:16.099718 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e4d6aef-f764-455c-a076-cf225b70b33c" containerName="mariadb-account-create-update" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.099725 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e4d6aef-f764-455c-a076-cf225b70b33c" containerName="mariadb-account-create-update" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.100044 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="0af06506-57b0-4b38-b57a-701b74ab2866" containerName="mariadb-account-create-update" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.100067 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="09fbcd28-9104-4ddf-9e9f-8d7029c330bf" containerName="mariadb-database-create" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.100078 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c9a0780-2d49-40d7-83f9-2f6fcec39523" containerName="mariadb-database-create" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.100094 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="57dd95b0-04b3-4903-ac51-05ad58cefcf5" containerName="keystone-db-sync" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.100111 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="5678f837-9cc0-4a1c-a32b-70b4e303c350" containerName="mariadb-account-create-update" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.100121 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="d47e492d-e555-4219-92f5-ef1f36ad41c5" 
containerName="mariadb-database-create" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.100135 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="8825e955-0683-4e69-ae3d-6bddcc9c92e1" containerName="dnsmasq-dns" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.100147 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e4d6aef-f764-455c-a076-cf225b70b33c" containerName="mariadb-account-create-update" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.100158 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bd1dfd6-8c2d-45cc-9bea-2b30b7e52f64" containerName="mariadb-account-create-update" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.100168 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="378b5128-0ef4-44b3-bfdd-0956f404c583" containerName="mariadb-database-create" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.101153 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-c9nxs" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.104947 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.105296 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.113385 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.113581 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.116294 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rp4k2" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.134571 4835 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-847c4cc679-j8vqq"] Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.136502 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-j8vqq" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.149362 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-c9nxs"] Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.222092 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-j8vqq"] Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.302642 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe-dns-svc\") pod \"dnsmasq-dns-847c4cc679-j8vqq\" (UID: \"2f82c9c2-4f39-412a-8a03-4a463b4b4bfe\") " pod="openstack/dnsmasq-dns-847c4cc679-j8vqq" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.302760 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhm7h\" (UniqueName: \"kubernetes.io/projected/8678740e-623f-4062-8b98-eba5fe57f88e-kube-api-access-lhm7h\") pod \"keystone-bootstrap-c9nxs\" (UID: \"8678740e-623f-4062-8b98-eba5fe57f88e\") " pod="openstack/keystone-bootstrap-c9nxs" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.302883 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe-config\") pod \"dnsmasq-dns-847c4cc679-j8vqq\" (UID: \"2f82c9c2-4f39-412a-8a03-4a463b4b4bfe\") " pod="openstack/dnsmasq-dns-847c4cc679-j8vqq" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.303024 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-j8vqq\" (UID: \"2f82c9c2-4f39-412a-8a03-4a463b4b4bfe\") " pod="openstack/dnsmasq-dns-847c4cc679-j8vqq" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.303066 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8678740e-623f-4062-8b98-eba5fe57f88e-combined-ca-bundle\") pod \"keystone-bootstrap-c9nxs\" (UID: \"8678740e-623f-4062-8b98-eba5fe57f88e\") " pod="openstack/keystone-bootstrap-c9nxs" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.303131 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8678740e-623f-4062-8b98-eba5fe57f88e-fernet-keys\") pod \"keystone-bootstrap-c9nxs\" (UID: \"8678740e-623f-4062-8b98-eba5fe57f88e\") " pod="openstack/keystone-bootstrap-c9nxs" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.303175 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8678740e-623f-4062-8b98-eba5fe57f88e-scripts\") pod \"keystone-bootstrap-c9nxs\" (UID: \"8678740e-623f-4062-8b98-eba5fe57f88e\") " pod="openstack/keystone-bootstrap-c9nxs" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.303231 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-j8vqq\" (UID: \"2f82c9c2-4f39-412a-8a03-4a463b4b4bfe\") " pod="openstack/dnsmasq-dns-847c4cc679-j8vqq" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.303270 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" 
(UniqueName: \"kubernetes.io/secret/8678740e-623f-4062-8b98-eba5fe57f88e-credential-keys\") pod \"keystone-bootstrap-c9nxs\" (UID: \"8678740e-623f-4062-8b98-eba5fe57f88e\") " pod="openstack/keystone-bootstrap-c9nxs" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.303301 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdprf\" (UniqueName: \"kubernetes.io/projected/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe-kube-api-access-rdprf\") pod \"dnsmasq-dns-847c4cc679-j8vqq\" (UID: \"2f82c9c2-4f39-412a-8a03-4a463b4b4bfe\") " pod="openstack/dnsmasq-dns-847c4cc679-j8vqq" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.303335 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-j8vqq\" (UID: \"2f82c9c2-4f39-412a-8a03-4a463b4b4bfe\") " pod="openstack/dnsmasq-dns-847c4cc679-j8vqq" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.303413 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8678740e-623f-4062-8b98-eba5fe57f88e-config-data\") pod \"keystone-bootstrap-c9nxs\" (UID: \"8678740e-623f-4062-8b98-eba5fe57f88e\") " pod="openstack/keystone-bootstrap-c9nxs" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.341487 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-4jmpv"] Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.343692 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-4jmpv" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.362681 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-4jmpv"] Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.371265 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.371965 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-4xtcm" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.412948 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-j8vqq\" (UID: \"2f82c9c2-4f39-412a-8a03-4a463b4b4bfe\") " pod="openstack/dnsmasq-dns-847c4cc679-j8vqq" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.412996 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8678740e-623f-4062-8b98-eba5fe57f88e-combined-ca-bundle\") pod \"keystone-bootstrap-c9nxs\" (UID: \"8678740e-623f-4062-8b98-eba5fe57f88e\") " pod="openstack/keystone-bootstrap-c9nxs" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.413034 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8678740e-623f-4062-8b98-eba5fe57f88e-fernet-keys\") pod \"keystone-bootstrap-c9nxs\" (UID: \"8678740e-623f-4062-8b98-eba5fe57f88e\") " pod="openstack/keystone-bootstrap-c9nxs" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.413058 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8678740e-623f-4062-8b98-eba5fe57f88e-scripts\") pod \"keystone-bootstrap-c9nxs\" (UID: 
\"8678740e-623f-4062-8b98-eba5fe57f88e\") " pod="openstack/keystone-bootstrap-c9nxs" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.413095 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-j8vqq\" (UID: \"2f82c9c2-4f39-412a-8a03-4a463b4b4bfe\") " pod="openstack/dnsmasq-dns-847c4cc679-j8vqq" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.413117 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8678740e-623f-4062-8b98-eba5fe57f88e-credential-keys\") pod \"keystone-bootstrap-c9nxs\" (UID: \"8678740e-623f-4062-8b98-eba5fe57f88e\") " pod="openstack/keystone-bootstrap-c9nxs" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.413136 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdprf\" (UniqueName: \"kubernetes.io/projected/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe-kube-api-access-rdprf\") pod \"dnsmasq-dns-847c4cc679-j8vqq\" (UID: \"2f82c9c2-4f39-412a-8a03-4a463b4b4bfe\") " pod="openstack/dnsmasq-dns-847c4cc679-j8vqq" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.413154 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-j8vqq\" (UID: \"2f82c9c2-4f39-412a-8a03-4a463b4b4bfe\") " pod="openstack/dnsmasq-dns-847c4cc679-j8vqq" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.413199 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8678740e-623f-4062-8b98-eba5fe57f88e-config-data\") pod \"keystone-bootstrap-c9nxs\" (UID: \"8678740e-623f-4062-8b98-eba5fe57f88e\") " 
pod="openstack/keystone-bootstrap-c9nxs" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.413216 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe-dns-svc\") pod \"dnsmasq-dns-847c4cc679-j8vqq\" (UID: \"2f82c9c2-4f39-412a-8a03-4a463b4b4bfe\") " pod="openstack/dnsmasq-dns-847c4cc679-j8vqq" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.413244 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhm7h\" (UniqueName: \"kubernetes.io/projected/8678740e-623f-4062-8b98-eba5fe57f88e-kube-api-access-lhm7h\") pod \"keystone-bootstrap-c9nxs\" (UID: \"8678740e-623f-4062-8b98-eba5fe57f88e\") " pod="openstack/keystone-bootstrap-c9nxs" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.413278 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe-config\") pod \"dnsmasq-dns-847c4cc679-j8vqq\" (UID: \"2f82c9c2-4f39-412a-8a03-4a463b4b4bfe\") " pod="openstack/dnsmasq-dns-847c4cc679-j8vqq" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.414166 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe-config\") pod \"dnsmasq-dns-847c4cc679-j8vqq\" (UID: \"2f82c9c2-4f39-412a-8a03-4a463b4b4bfe\") " pod="openstack/dnsmasq-dns-847c4cc679-j8vqq" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.415016 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-j8vqq\" (UID: \"2f82c9c2-4f39-412a-8a03-4a463b4b4bfe\") " pod="openstack/dnsmasq-dns-847c4cc679-j8vqq" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.415607 4835 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe-dns-svc\") pod \"dnsmasq-dns-847c4cc679-j8vqq\" (UID: \"2f82c9c2-4f39-412a-8a03-4a463b4b4bfe\") " pod="openstack/dnsmasq-dns-847c4cc679-j8vqq" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.418429 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8678740e-623f-4062-8b98-eba5fe57f88e-fernet-keys\") pod \"keystone-bootstrap-c9nxs\" (UID: \"8678740e-623f-4062-8b98-eba5fe57f88e\") " pod="openstack/keystone-bootstrap-c9nxs" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.423019 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-j8vqq\" (UID: \"2f82c9c2-4f39-412a-8a03-4a463b4b4bfe\") " pod="openstack/dnsmasq-dns-847c4cc679-j8vqq" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.423215 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8678740e-623f-4062-8b98-eba5fe57f88e-credential-keys\") pod \"keystone-bootstrap-c9nxs\" (UID: \"8678740e-623f-4062-8b98-eba5fe57f88e\") " pod="openstack/keystone-bootstrap-c9nxs" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.424483 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-j8vqq\" (UID: \"2f82c9c2-4f39-412a-8a03-4a463b4b4bfe\") " pod="openstack/dnsmasq-dns-847c4cc679-j8vqq" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.451623 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8678740e-623f-4062-8b98-eba5fe57f88e-config-data\") pod \"keystone-bootstrap-c9nxs\" (UID: \"8678740e-623f-4062-8b98-eba5fe57f88e\") " pod="openstack/keystone-bootstrap-c9nxs" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.454559 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8678740e-623f-4062-8b98-eba5fe57f88e-combined-ca-bundle\") pod \"keystone-bootstrap-c9nxs\" (UID: \"8678740e-623f-4062-8b98-eba5fe57f88e\") " pod="openstack/keystone-bootstrap-c9nxs" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.457134 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8678740e-623f-4062-8b98-eba5fe57f88e-scripts\") pod \"keystone-bootstrap-c9nxs\" (UID: \"8678740e-623f-4062-8b98-eba5fe57f88e\") " pod="openstack/keystone-bootstrap-c9nxs" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.462516 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhm7h\" (UniqueName: \"kubernetes.io/projected/8678740e-623f-4062-8b98-eba5fe57f88e-kube-api-access-lhm7h\") pod \"keystone-bootstrap-c9nxs\" (UID: \"8678740e-623f-4062-8b98-eba5fe57f88e\") " pod="openstack/keystone-bootstrap-c9nxs" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.466397 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdprf\" (UniqueName: \"kubernetes.io/projected/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe-kube-api-access-rdprf\") pod \"dnsmasq-dns-847c4cc679-j8vqq\" (UID: \"2f82c9c2-4f39-412a-8a03-4a463b4b4bfe\") " pod="openstack/dnsmasq-dns-847c4cc679-j8vqq" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.469806 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-mkr8b"] Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.471256 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-mkr8b" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.477725 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.478337 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.479581 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-j8vqq" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.488355 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-mkr8b"] Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.491499 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-z9xjq" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.515001 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgbf2\" (UniqueName: \"kubernetes.io/projected/89dd21d1-34a1-4d91-a7cb-32840eda818e-kube-api-access-qgbf2\") pod \"heat-db-sync-4jmpv\" (UID: \"89dd21d1-34a1-4d91-a7cb-32840eda818e\") " pod="openstack/heat-db-sync-4jmpv" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.515105 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89dd21d1-34a1-4d91-a7cb-32840eda818e-config-data\") pod \"heat-db-sync-4jmpv\" (UID: \"89dd21d1-34a1-4d91-a7cb-32840eda818e\") " pod="openstack/heat-db-sync-4jmpv" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.515148 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89dd21d1-34a1-4d91-a7cb-32840eda818e-combined-ca-bundle\") pod \"heat-db-sync-4jmpv\" (UID: 
\"89dd21d1-34a1-4d91-a7cb-32840eda818e\") " pod="openstack/heat-db-sync-4jmpv" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.617322 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2161ebbe-ed84-49a9-b0b1-b74304fd8b28-scripts\") pod \"cinder-db-sync-mkr8b\" (UID: \"2161ebbe-ed84-49a9-b0b1-b74304fd8b28\") " pod="openstack/cinder-db-sync-mkr8b" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.617408 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2161ebbe-ed84-49a9-b0b1-b74304fd8b28-combined-ca-bundle\") pod \"cinder-db-sync-mkr8b\" (UID: \"2161ebbe-ed84-49a9-b0b1-b74304fd8b28\") " pod="openstack/cinder-db-sync-mkr8b" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.617449 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgbf2\" (UniqueName: \"kubernetes.io/projected/89dd21d1-34a1-4d91-a7cb-32840eda818e-kube-api-access-qgbf2\") pod \"heat-db-sync-4jmpv\" (UID: \"89dd21d1-34a1-4d91-a7cb-32840eda818e\") " pod="openstack/heat-db-sync-4jmpv" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.617497 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2161ebbe-ed84-49a9-b0b1-b74304fd8b28-config-data\") pod \"cinder-db-sync-mkr8b\" (UID: \"2161ebbe-ed84-49a9-b0b1-b74304fd8b28\") " pod="openstack/cinder-db-sync-mkr8b" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.617546 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jsvr\" (UniqueName: \"kubernetes.io/projected/2161ebbe-ed84-49a9-b0b1-b74304fd8b28-kube-api-access-8jsvr\") pod \"cinder-db-sync-mkr8b\" (UID: \"2161ebbe-ed84-49a9-b0b1-b74304fd8b28\") " 
pod="openstack/cinder-db-sync-mkr8b" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.617565 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2161ebbe-ed84-49a9-b0b1-b74304fd8b28-db-sync-config-data\") pod \"cinder-db-sync-mkr8b\" (UID: \"2161ebbe-ed84-49a9-b0b1-b74304fd8b28\") " pod="openstack/cinder-db-sync-mkr8b" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.617605 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2161ebbe-ed84-49a9-b0b1-b74304fd8b28-etc-machine-id\") pod \"cinder-db-sync-mkr8b\" (UID: \"2161ebbe-ed84-49a9-b0b1-b74304fd8b28\") " pod="openstack/cinder-db-sync-mkr8b" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.617643 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89dd21d1-34a1-4d91-a7cb-32840eda818e-config-data\") pod \"heat-db-sync-4jmpv\" (UID: \"89dd21d1-34a1-4d91-a7cb-32840eda818e\") " pod="openstack/heat-db-sync-4jmpv" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.617683 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89dd21d1-34a1-4d91-a7cb-32840eda818e-combined-ca-bundle\") pod \"heat-db-sync-4jmpv\" (UID: \"89dd21d1-34a1-4d91-a7cb-32840eda818e\") " pod="openstack/heat-db-sync-4jmpv" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.624155 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-c5lzw"] Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.625491 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-c5lzw" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.640982 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89dd21d1-34a1-4d91-a7cb-32840eda818e-config-data\") pod \"heat-db-sync-4jmpv\" (UID: \"89dd21d1-34a1-4d91-a7cb-32840eda818e\") " pod="openstack/heat-db-sync-4jmpv" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.644201 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.644624 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.644805 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-55tz7" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.668377 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89dd21d1-34a1-4d91-a7cb-32840eda818e-combined-ca-bundle\") pod \"heat-db-sync-4jmpv\" (UID: \"89dd21d1-34a1-4d91-a7cb-32840eda818e\") " pod="openstack/heat-db-sync-4jmpv" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.682820 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-c5lzw"] Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.687552 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgbf2\" (UniqueName: \"kubernetes.io/projected/89dd21d1-34a1-4d91-a7cb-32840eda818e-kube-api-access-qgbf2\") pod \"heat-db-sync-4jmpv\" (UID: \"89dd21d1-34a1-4d91-a7cb-32840eda818e\") " pod="openstack/heat-db-sync-4jmpv" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.688174 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-4jmpv" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.719038 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2161ebbe-ed84-49a9-b0b1-b74304fd8b28-combined-ca-bundle\") pod \"cinder-db-sync-mkr8b\" (UID: \"2161ebbe-ed84-49a9-b0b1-b74304fd8b28\") " pod="openstack/cinder-db-sync-mkr8b" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.719107 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cdf7c073-aa50-4b3b-aedb-7e77be63f85a-config\") pod \"neutron-db-sync-c5lzw\" (UID: \"cdf7c073-aa50-4b3b-aedb-7e77be63f85a\") " pod="openstack/neutron-db-sync-c5lzw" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.719146 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2161ebbe-ed84-49a9-b0b1-b74304fd8b28-config-data\") pod \"cinder-db-sync-mkr8b\" (UID: \"2161ebbe-ed84-49a9-b0b1-b74304fd8b28\") " pod="openstack/cinder-db-sync-mkr8b" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.719194 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jsvr\" (UniqueName: \"kubernetes.io/projected/2161ebbe-ed84-49a9-b0b1-b74304fd8b28-kube-api-access-8jsvr\") pod \"cinder-db-sync-mkr8b\" (UID: \"2161ebbe-ed84-49a9-b0b1-b74304fd8b28\") " pod="openstack/cinder-db-sync-mkr8b" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.719219 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2161ebbe-ed84-49a9-b0b1-b74304fd8b28-db-sync-config-data\") pod \"cinder-db-sync-mkr8b\" (UID: \"2161ebbe-ed84-49a9-b0b1-b74304fd8b28\") " pod="openstack/cinder-db-sync-mkr8b" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 
09:47:16.719248 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdf7c073-aa50-4b3b-aedb-7e77be63f85a-combined-ca-bundle\") pod \"neutron-db-sync-c5lzw\" (UID: \"cdf7c073-aa50-4b3b-aedb-7e77be63f85a\") " pod="openstack/neutron-db-sync-c5lzw" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.719282 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2161ebbe-ed84-49a9-b0b1-b74304fd8b28-etc-machine-id\") pod \"cinder-db-sync-mkr8b\" (UID: \"2161ebbe-ed84-49a9-b0b1-b74304fd8b28\") " pod="openstack/cinder-db-sync-mkr8b" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.719424 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2161ebbe-ed84-49a9-b0b1-b74304fd8b28-scripts\") pod \"cinder-db-sync-mkr8b\" (UID: \"2161ebbe-ed84-49a9-b0b1-b74304fd8b28\") " pod="openstack/cinder-db-sync-mkr8b" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.719465 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mht49\" (UniqueName: \"kubernetes.io/projected/cdf7c073-aa50-4b3b-aedb-7e77be63f85a-kube-api-access-mht49\") pod \"neutron-db-sync-c5lzw\" (UID: \"cdf7c073-aa50-4b3b-aedb-7e77be63f85a\") " pod="openstack/neutron-db-sync-c5lzw" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.725577 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2161ebbe-ed84-49a9-b0b1-b74304fd8b28-combined-ca-bundle\") pod \"cinder-db-sync-mkr8b\" (UID: \"2161ebbe-ed84-49a9-b0b1-b74304fd8b28\") " pod="openstack/cinder-db-sync-mkr8b" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.736942 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/2161ebbe-ed84-49a9-b0b1-b74304fd8b28-config-data\") pod \"cinder-db-sync-mkr8b\" (UID: \"2161ebbe-ed84-49a9-b0b1-b74304fd8b28\") " pod="openstack/cinder-db-sync-mkr8b" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.744117 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-j8vqq"] Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.744217 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2161ebbe-ed84-49a9-b0b1-b74304fd8b28-etc-machine-id\") pod \"cinder-db-sync-mkr8b\" (UID: \"2161ebbe-ed84-49a9-b0b1-b74304fd8b28\") " pod="openstack/cinder-db-sync-mkr8b" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.744874 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2161ebbe-ed84-49a9-b0b1-b74304fd8b28-db-sync-config-data\") pod \"cinder-db-sync-mkr8b\" (UID: \"2161ebbe-ed84-49a9-b0b1-b74304fd8b28\") " pod="openstack/cinder-db-sync-mkr8b" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.745918 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-c9nxs" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.748499 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2161ebbe-ed84-49a9-b0b1-b74304fd8b28-scripts\") pod \"cinder-db-sync-mkr8b\" (UID: \"2161ebbe-ed84-49a9-b0b1-b74304fd8b28\") " pod="openstack/cinder-db-sync-mkr8b" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.770110 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jsvr\" (UniqueName: \"kubernetes.io/projected/2161ebbe-ed84-49a9-b0b1-b74304fd8b28-kube-api-access-8jsvr\") pod \"cinder-db-sync-mkr8b\" (UID: \"2161ebbe-ed84-49a9-b0b1-b74304fd8b28\") " pod="openstack/cinder-db-sync-mkr8b" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.789696 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-2rdm5"] Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.791143 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2rdm5" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.797326 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-n69wh" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.799389 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.800083 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.803503 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-mkr8b" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.827252 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mht49\" (UniqueName: \"kubernetes.io/projected/cdf7c073-aa50-4b3b-aedb-7e77be63f85a-kube-api-access-mht49\") pod \"neutron-db-sync-c5lzw\" (UID: \"cdf7c073-aa50-4b3b-aedb-7e77be63f85a\") " pod="openstack/neutron-db-sync-c5lzw" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.827381 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cdf7c073-aa50-4b3b-aedb-7e77be63f85a-config\") pod \"neutron-db-sync-c5lzw\" (UID: \"cdf7c073-aa50-4b3b-aedb-7e77be63f85a\") " pod="openstack/neutron-db-sync-c5lzw" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.827481 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdf7c073-aa50-4b3b-aedb-7e77be63f85a-combined-ca-bundle\") pod \"neutron-db-sync-c5lzw\" (UID: \"cdf7c073-aa50-4b3b-aedb-7e77be63f85a\") " pod="openstack/neutron-db-sync-c5lzw" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.843319 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdf7c073-aa50-4b3b-aedb-7e77be63f85a-combined-ca-bundle\") pod \"neutron-db-sync-c5lzw\" (UID: \"cdf7c073-aa50-4b3b-aedb-7e77be63f85a\") " pod="openstack/neutron-db-sync-c5lzw" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.849468 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cdf7c073-aa50-4b3b-aedb-7e77be63f85a-config\") pod \"neutron-db-sync-c5lzw\" (UID: \"cdf7c073-aa50-4b3b-aedb-7e77be63f85a\") " pod="openstack/neutron-db-sync-c5lzw" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.891815 4835 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2rdm5"] Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.928141 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mht49\" (UniqueName: \"kubernetes.io/projected/cdf7c073-aa50-4b3b-aedb-7e77be63f85a-kube-api-access-mht49\") pod \"neutron-db-sync-c5lzw\" (UID: \"cdf7c073-aa50-4b3b-aedb-7e77be63f85a\") " pod="openstack/neutron-db-sync-c5lzw" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.930134 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/777ab445-6d41-49a9-b87c-67ed7503cc5b-config-data\") pod \"placement-db-sync-2rdm5\" (UID: \"777ab445-6d41-49a9-b87c-67ed7503cc5b\") " pod="openstack/placement-db-sync-2rdm5" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.930179 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/777ab445-6d41-49a9-b87c-67ed7503cc5b-scripts\") pod \"placement-db-sync-2rdm5\" (UID: \"777ab445-6d41-49a9-b87c-67ed7503cc5b\") " pod="openstack/placement-db-sync-2rdm5" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.930255 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpmrv\" (UniqueName: \"kubernetes.io/projected/777ab445-6d41-49a9-b87c-67ed7503cc5b-kube-api-access-rpmrv\") pod \"placement-db-sync-2rdm5\" (UID: \"777ab445-6d41-49a9-b87c-67ed7503cc5b\") " pod="openstack/placement-db-sync-2rdm5" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.930323 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/777ab445-6d41-49a9-b87c-67ed7503cc5b-logs\") pod \"placement-db-sync-2rdm5\" (UID: \"777ab445-6d41-49a9-b87c-67ed7503cc5b\") " 
pod="openstack/placement-db-sync-2rdm5" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.930415 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/777ab445-6d41-49a9-b87c-67ed7503cc5b-combined-ca-bundle\") pod \"placement-db-sync-2rdm5\" (UID: \"777ab445-6d41-49a9-b87c-67ed7503cc5b\") " pod="openstack/placement-db-sync-2rdm5" Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.981288 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-lwf7t"] Mar 19 09:47:16 crc kubenswrapper[4835]: I0319 09:47:16.983217 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-lwf7t" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.003496 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-mw6s2"] Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.006849 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-mw6s2" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.012878 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.013071 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-966xp" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.014979 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-lwf7t"] Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.028307 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-mw6s2"] Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.034859 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/777ab445-6d41-49a9-b87c-67ed7503cc5b-logs\") pod \"placement-db-sync-2rdm5\" (UID: \"777ab445-6d41-49a9-b87c-67ed7503cc5b\") " pod="openstack/placement-db-sync-2rdm5" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.034961 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/777ab445-6d41-49a9-b87c-67ed7503cc5b-combined-ca-bundle\") pod \"placement-db-sync-2rdm5\" (UID: \"777ab445-6d41-49a9-b87c-67ed7503cc5b\") " pod="openstack/placement-db-sync-2rdm5" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.035078 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/777ab445-6d41-49a9-b87c-67ed7503cc5b-config-data\") pod \"placement-db-sync-2rdm5\" (UID: \"777ab445-6d41-49a9-b87c-67ed7503cc5b\") " pod="openstack/placement-db-sync-2rdm5" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.035103 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/777ab445-6d41-49a9-b87c-67ed7503cc5b-scripts\") pod \"placement-db-sync-2rdm5\" (UID: \"777ab445-6d41-49a9-b87c-67ed7503cc5b\") " pod="openstack/placement-db-sync-2rdm5" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.035166 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpmrv\" (UniqueName: \"kubernetes.io/projected/777ab445-6d41-49a9-b87c-67ed7503cc5b-kube-api-access-rpmrv\") pod \"placement-db-sync-2rdm5\" (UID: \"777ab445-6d41-49a9-b87c-67ed7503cc5b\") " pod="openstack/placement-db-sync-2rdm5" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.035530 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/777ab445-6d41-49a9-b87c-67ed7503cc5b-logs\") pod \"placement-db-sync-2rdm5\" (UID: \"777ab445-6d41-49a9-b87c-67ed7503cc5b\") " pod="openstack/placement-db-sync-2rdm5" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.040619 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/777ab445-6d41-49a9-b87c-67ed7503cc5b-combined-ca-bundle\") pod \"placement-db-sync-2rdm5\" (UID: \"777ab445-6d41-49a9-b87c-67ed7503cc5b\") " pod="openstack/placement-db-sync-2rdm5" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.051776 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/777ab445-6d41-49a9-b87c-67ed7503cc5b-scripts\") pod \"placement-db-sync-2rdm5\" (UID: \"777ab445-6d41-49a9-b87c-67ed7503cc5b\") " pod="openstack/placement-db-sync-2rdm5" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.055701 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/777ab445-6d41-49a9-b87c-67ed7503cc5b-config-data\") pod \"placement-db-sync-2rdm5\" (UID: \"777ab445-6d41-49a9-b87c-67ed7503cc5b\") 
" pod="openstack/placement-db-sync-2rdm5" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.057666 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.069486 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.070438 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.074762 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpmrv\" (UniqueName: \"kubernetes.io/projected/777ab445-6d41-49a9-b87c-67ed7503cc5b-kube-api-access-rpmrv\") pod \"placement-db-sync-2rdm5\" (UID: \"777ab445-6d41-49a9-b87c-67ed7503cc5b\") " pod="openstack/placement-db-sync-2rdm5" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.087196 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.087342 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.137348 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s5np\" (UniqueName: \"kubernetes.io/projected/410b03bf-d2bc-4992-b467-db7947c52078-kube-api-access-4s5np\") pod \"barbican-db-sync-mw6s2\" (UID: \"410b03bf-d2bc-4992-b467-db7947c52078\") " pod="openstack/barbican-db-sync-mw6s2" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.137398 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/24665d64-cd24-4bb5-a1e4-48c734e7c525-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"24665d64-cd24-4bb5-a1e4-48c734e7c525\") " 
pod="openstack/ceilometer-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.137427 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe4973bb-df39-4221-bbd0-f637a10dd5b0-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-lwf7t\" (UID: \"fe4973bb-df39-4221-bbd0-f637a10dd5b0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lwf7t" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.137458 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24665d64-cd24-4bb5-a1e4-48c734e7c525-log-httpd\") pod \"ceilometer-0\" (UID: \"24665d64-cd24-4bb5-a1e4-48c734e7c525\") " pod="openstack/ceilometer-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.137512 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe4973bb-df39-4221-bbd0-f637a10dd5b0-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-lwf7t\" (UID: \"fe4973bb-df39-4221-bbd0-f637a10dd5b0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lwf7t" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.137553 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/410b03bf-d2bc-4992-b467-db7947c52078-combined-ca-bundle\") pod \"barbican-db-sync-mw6s2\" (UID: \"410b03bf-d2bc-4992-b467-db7947c52078\") " pod="openstack/barbican-db-sync-mw6s2" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.137593 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe4973bb-df39-4221-bbd0-f637a10dd5b0-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-lwf7t\" (UID: \"fe4973bb-df39-4221-bbd0-f637a10dd5b0\") " 
pod="openstack/dnsmasq-dns-785d8bcb8c-lwf7t" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.137798 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24665d64-cd24-4bb5-a1e4-48c734e7c525-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"24665d64-cd24-4bb5-a1e4-48c734e7c525\") " pod="openstack/ceilometer-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.137976 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4wng\" (UniqueName: \"kubernetes.io/projected/fe4973bb-df39-4221-bbd0-f637a10dd5b0-kube-api-access-f4wng\") pod \"dnsmasq-dns-785d8bcb8c-lwf7t\" (UID: \"fe4973bb-df39-4221-bbd0-f637a10dd5b0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lwf7t" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.138034 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe4973bb-df39-4221-bbd0-f637a10dd5b0-config\") pod \"dnsmasq-dns-785d8bcb8c-lwf7t\" (UID: \"fe4973bb-df39-4221-bbd0-f637a10dd5b0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lwf7t" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.138158 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/410b03bf-d2bc-4992-b467-db7947c52078-db-sync-config-data\") pod \"barbican-db-sync-mw6s2\" (UID: \"410b03bf-d2bc-4992-b467-db7947c52078\") " pod="openstack/barbican-db-sync-mw6s2" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.138237 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe4973bb-df39-4221-bbd0-f637a10dd5b0-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-lwf7t\" (UID: 
\"fe4973bb-df39-4221-bbd0-f637a10dd5b0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lwf7t" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.138360 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24665d64-cd24-4bb5-a1e4-48c734e7c525-config-data\") pod \"ceilometer-0\" (UID: \"24665d64-cd24-4bb5-a1e4-48c734e7c525\") " pod="openstack/ceilometer-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.138402 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5q4d\" (UniqueName: \"kubernetes.io/projected/24665d64-cd24-4bb5-a1e4-48c734e7c525-kube-api-access-v5q4d\") pod \"ceilometer-0\" (UID: \"24665d64-cd24-4bb5-a1e4-48c734e7c525\") " pod="openstack/ceilometer-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.138437 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24665d64-cd24-4bb5-a1e4-48c734e7c525-scripts\") pod \"ceilometer-0\" (UID: \"24665d64-cd24-4bb5-a1e4-48c734e7c525\") " pod="openstack/ceilometer-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.138510 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24665d64-cd24-4bb5-a1e4-48c734e7c525-run-httpd\") pod \"ceilometer-0\" (UID: \"24665d64-cd24-4bb5-a1e4-48c734e7c525\") " pod="openstack/ceilometer-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.166185 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-c5lzw" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.184599 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-2rdm5" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.241444 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24665d64-cd24-4bb5-a1e4-48c734e7c525-run-httpd\") pod \"ceilometer-0\" (UID: \"24665d64-cd24-4bb5-a1e4-48c734e7c525\") " pod="openstack/ceilometer-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.241537 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s5np\" (UniqueName: \"kubernetes.io/projected/410b03bf-d2bc-4992-b467-db7947c52078-kube-api-access-4s5np\") pod \"barbican-db-sync-mw6s2\" (UID: \"410b03bf-d2bc-4992-b467-db7947c52078\") " pod="openstack/barbican-db-sync-mw6s2" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.241556 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/24665d64-cd24-4bb5-a1e4-48c734e7c525-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"24665d64-cd24-4bb5-a1e4-48c734e7c525\") " pod="openstack/ceilometer-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.241574 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe4973bb-df39-4221-bbd0-f637a10dd5b0-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-lwf7t\" (UID: \"fe4973bb-df39-4221-bbd0-f637a10dd5b0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lwf7t" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.241596 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24665d64-cd24-4bb5-a1e4-48c734e7c525-log-httpd\") pod \"ceilometer-0\" (UID: \"24665d64-cd24-4bb5-a1e4-48c734e7c525\") " pod="openstack/ceilometer-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.241628 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe4973bb-df39-4221-bbd0-f637a10dd5b0-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-lwf7t\" (UID: \"fe4973bb-df39-4221-bbd0-f637a10dd5b0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lwf7t" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.241656 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/410b03bf-d2bc-4992-b467-db7947c52078-combined-ca-bundle\") pod \"barbican-db-sync-mw6s2\" (UID: \"410b03bf-d2bc-4992-b467-db7947c52078\") " pod="openstack/barbican-db-sync-mw6s2" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.241679 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe4973bb-df39-4221-bbd0-f637a10dd5b0-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-lwf7t\" (UID: \"fe4973bb-df39-4221-bbd0-f637a10dd5b0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lwf7t" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.241705 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24665d64-cd24-4bb5-a1e4-48c734e7c525-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"24665d64-cd24-4bb5-a1e4-48c734e7c525\") " pod="openstack/ceilometer-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.241769 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4wng\" (UniqueName: \"kubernetes.io/projected/fe4973bb-df39-4221-bbd0-f637a10dd5b0-kube-api-access-f4wng\") pod \"dnsmasq-dns-785d8bcb8c-lwf7t\" (UID: \"fe4973bb-df39-4221-bbd0-f637a10dd5b0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lwf7t" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.241790 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/fe4973bb-df39-4221-bbd0-f637a10dd5b0-config\") pod \"dnsmasq-dns-785d8bcb8c-lwf7t\" (UID: \"fe4973bb-df39-4221-bbd0-f637a10dd5b0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lwf7t" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.241825 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/410b03bf-d2bc-4992-b467-db7947c52078-db-sync-config-data\") pod \"barbican-db-sync-mw6s2\" (UID: \"410b03bf-d2bc-4992-b467-db7947c52078\") " pod="openstack/barbican-db-sync-mw6s2" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.241849 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe4973bb-df39-4221-bbd0-f637a10dd5b0-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-lwf7t\" (UID: \"fe4973bb-df39-4221-bbd0-f637a10dd5b0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lwf7t" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.241900 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24665d64-cd24-4bb5-a1e4-48c734e7c525-config-data\") pod \"ceilometer-0\" (UID: \"24665d64-cd24-4bb5-a1e4-48c734e7c525\") " pod="openstack/ceilometer-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.241919 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5q4d\" (UniqueName: \"kubernetes.io/projected/24665d64-cd24-4bb5-a1e4-48c734e7c525-kube-api-access-v5q4d\") pod \"ceilometer-0\" (UID: \"24665d64-cd24-4bb5-a1e4-48c734e7c525\") " pod="openstack/ceilometer-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.241941 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24665d64-cd24-4bb5-a1e4-48c734e7c525-scripts\") pod \"ceilometer-0\" (UID: 
\"24665d64-cd24-4bb5-a1e4-48c734e7c525\") " pod="openstack/ceilometer-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.242081 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24665d64-cd24-4bb5-a1e4-48c734e7c525-run-httpd\") pod \"ceilometer-0\" (UID: \"24665d64-cd24-4bb5-a1e4-48c734e7c525\") " pod="openstack/ceilometer-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.242811 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe4973bb-df39-4221-bbd0-f637a10dd5b0-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-lwf7t\" (UID: \"fe4973bb-df39-4221-bbd0-f637a10dd5b0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lwf7t" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.244462 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe4973bb-df39-4221-bbd0-f637a10dd5b0-config\") pod \"dnsmasq-dns-785d8bcb8c-lwf7t\" (UID: \"fe4973bb-df39-4221-bbd0-f637a10dd5b0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lwf7t" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.244599 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24665d64-cd24-4bb5-a1e4-48c734e7c525-log-httpd\") pod \"ceilometer-0\" (UID: \"24665d64-cd24-4bb5-a1e4-48c734e7c525\") " pod="openstack/ceilometer-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.245046 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe4973bb-df39-4221-bbd0-f637a10dd5b0-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-lwf7t\" (UID: \"fe4973bb-df39-4221-bbd0-f637a10dd5b0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lwf7t" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.245229 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe4973bb-df39-4221-bbd0-f637a10dd5b0-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-lwf7t\" (UID: \"fe4973bb-df39-4221-bbd0-f637a10dd5b0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lwf7t" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.245796 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe4973bb-df39-4221-bbd0-f637a10dd5b0-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-lwf7t\" (UID: \"fe4973bb-df39-4221-bbd0-f637a10dd5b0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lwf7t" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.247199 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24665d64-cd24-4bb5-a1e4-48c734e7c525-scripts\") pod \"ceilometer-0\" (UID: \"24665d64-cd24-4bb5-a1e4-48c734e7c525\") " pod="openstack/ceilometer-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.248930 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24665d64-cd24-4bb5-a1e4-48c734e7c525-config-data\") pod \"ceilometer-0\" (UID: \"24665d64-cd24-4bb5-a1e4-48c734e7c525\") " pod="openstack/ceilometer-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.249516 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/24665d64-cd24-4bb5-a1e4-48c734e7c525-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"24665d64-cd24-4bb5-a1e4-48c734e7c525\") " pod="openstack/ceilometer-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.250371 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/410b03bf-d2bc-4992-b467-db7947c52078-combined-ca-bundle\") pod \"barbican-db-sync-mw6s2\" (UID: 
\"410b03bf-d2bc-4992-b467-db7947c52078\") " pod="openstack/barbican-db-sync-mw6s2" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.251543 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24665d64-cd24-4bb5-a1e4-48c734e7c525-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"24665d64-cd24-4bb5-a1e4-48c734e7c525\") " pod="openstack/ceilometer-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.258411 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.260519 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.267127 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/410b03bf-d2bc-4992-b467-db7947c52078-db-sync-config-data\") pod \"barbican-db-sync-mw6s2\" (UID: \"410b03bf-d2bc-4992-b467-db7947c52078\") " pod="openstack/barbican-db-sync-mw6s2" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.267571 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.271772 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.273696 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.273902 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-2c5rl" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.280474 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-f4wng\" (UniqueName: \"kubernetes.io/projected/fe4973bb-df39-4221-bbd0-f637a10dd5b0-kube-api-access-f4wng\") pod \"dnsmasq-dns-785d8bcb8c-lwf7t\" (UID: \"fe4973bb-df39-4221-bbd0-f637a10dd5b0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lwf7t" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.282410 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.284413 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s5np\" (UniqueName: \"kubernetes.io/projected/410b03bf-d2bc-4992-b467-db7947c52078-kube-api-access-4s5np\") pod \"barbican-db-sync-mw6s2\" (UID: \"410b03bf-d2bc-4992-b467-db7947c52078\") " pod="openstack/barbican-db-sync-mw6s2" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.289081 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5q4d\" (UniqueName: \"kubernetes.io/projected/24665d64-cd24-4bb5-a1e4-48c734e7c525-kube-api-access-v5q4d\") pod \"ceilometer-0\" (UID: \"24665d64-cd24-4bb5-a1e4-48c734e7c525\") " pod="openstack/ceilometer-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.343958 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-lwf7t" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.344689 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhtpw\" (UniqueName: \"kubernetes.io/projected/1284d13d-bf1b-49b6-85be-954fa52a3c10-kube-api-access-nhtpw\") pod \"glance-default-external-api-0\" (UID: \"1284d13d-bf1b-49b6-85be-954fa52a3c10\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.344725 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f195f04e-c0f1-495c-870b-981275804688\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f195f04e-c0f1-495c-870b-981275804688\") pod \"glance-default-external-api-0\" (UID: \"1284d13d-bf1b-49b6-85be-954fa52a3c10\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.345020 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1284d13d-bf1b-49b6-85be-954fa52a3c10-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1284d13d-bf1b-49b6-85be-954fa52a3c10\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.345074 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1284d13d-bf1b-49b6-85be-954fa52a3c10-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1284d13d-bf1b-49b6-85be-954fa52a3c10\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.345106 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1284d13d-bf1b-49b6-85be-954fa52a3c10-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1284d13d-bf1b-49b6-85be-954fa52a3c10\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.345297 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1284d13d-bf1b-49b6-85be-954fa52a3c10-logs\") pod \"glance-default-external-api-0\" (UID: \"1284d13d-bf1b-49b6-85be-954fa52a3c10\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.345372 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1284d13d-bf1b-49b6-85be-954fa52a3c10-scripts\") pod \"glance-default-external-api-0\" (UID: \"1284d13d-bf1b-49b6-85be-954fa52a3c10\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.345461 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1284d13d-bf1b-49b6-85be-954fa52a3c10-config-data\") pod \"glance-default-external-api-0\" (UID: \"1284d13d-bf1b-49b6-85be-954fa52a3c10\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.357935 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-mw6s2" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.439308 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.447032 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1284d13d-bf1b-49b6-85be-954fa52a3c10-logs\") pod \"glance-default-external-api-0\" (UID: \"1284d13d-bf1b-49b6-85be-954fa52a3c10\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.447080 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1284d13d-bf1b-49b6-85be-954fa52a3c10-scripts\") pod \"glance-default-external-api-0\" (UID: \"1284d13d-bf1b-49b6-85be-954fa52a3c10\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.447129 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1284d13d-bf1b-49b6-85be-954fa52a3c10-config-data\") pod \"glance-default-external-api-0\" (UID: \"1284d13d-bf1b-49b6-85be-954fa52a3c10\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.447150 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhtpw\" (UniqueName: \"kubernetes.io/projected/1284d13d-bf1b-49b6-85be-954fa52a3c10-kube-api-access-nhtpw\") pod \"glance-default-external-api-0\" (UID: \"1284d13d-bf1b-49b6-85be-954fa52a3c10\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.447172 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f195f04e-c0f1-495c-870b-981275804688\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f195f04e-c0f1-495c-870b-981275804688\") pod \"glance-default-external-api-0\" (UID: \"1284d13d-bf1b-49b6-85be-954fa52a3c10\") " 
pod="openstack/glance-default-external-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.447237 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1284d13d-bf1b-49b6-85be-954fa52a3c10-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1284d13d-bf1b-49b6-85be-954fa52a3c10\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.447276 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1284d13d-bf1b-49b6-85be-954fa52a3c10-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1284d13d-bf1b-49b6-85be-954fa52a3c10\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.447293 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1284d13d-bf1b-49b6-85be-954fa52a3c10-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1284d13d-bf1b-49b6-85be-954fa52a3c10\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.456677 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1284d13d-bf1b-49b6-85be-954fa52a3c10-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1284d13d-bf1b-49b6-85be-954fa52a3c10\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.457765 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1284d13d-bf1b-49b6-85be-954fa52a3c10-config-data\") pod \"glance-default-external-api-0\" (UID: \"1284d13d-bf1b-49b6-85be-954fa52a3c10\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:17 crc 
kubenswrapper[4835]: I0319 09:47:17.462156 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1284d13d-bf1b-49b6-85be-954fa52a3c10-logs\") pod \"glance-default-external-api-0\" (UID: \"1284d13d-bf1b-49b6-85be-954fa52a3c10\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.467249 4835 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.467298 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f195f04e-c0f1-495c-870b-981275804688\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f195f04e-c0f1-495c-870b-981275804688\") pod \"glance-default-external-api-0\" (UID: \"1284d13d-bf1b-49b6-85be-954fa52a3c10\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4d183d018680155306ca255d36c5357de351d0f1f91a4bb616234d2bb906544a/globalmount\"" pod="openstack/glance-default-external-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.469546 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1284d13d-bf1b-49b6-85be-954fa52a3c10-scripts\") pod \"glance-default-external-api-0\" (UID: \"1284d13d-bf1b-49b6-85be-954fa52a3c10\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.473345 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1284d13d-bf1b-49b6-85be-954fa52a3c10-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1284d13d-bf1b-49b6-85be-954fa52a3c10\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.476324 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1284d13d-bf1b-49b6-85be-954fa52a3c10-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1284d13d-bf1b-49b6-85be-954fa52a3c10\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.490436 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.505545 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhtpw\" (UniqueName: \"kubernetes.io/projected/1284d13d-bf1b-49b6-85be-954fa52a3c10-kube-api-access-nhtpw\") pod \"glance-default-external-api-0\" (UID: \"1284d13d-bf1b-49b6-85be-954fa52a3c10\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.512361 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.516525 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.516836 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.549932 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16f10f93-a979-44fb-92e3-3b96b5f9354e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"16f10f93-a979-44fb-92e3-3b96b5f9354e\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.550021 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh44p\" (UniqueName: 
\"kubernetes.io/projected/16f10f93-a979-44fb-92e3-3b96b5f9354e-kube-api-access-jh44p\") pod \"glance-default-internal-api-0\" (UID: \"16f10f93-a979-44fb-92e3-3b96b5f9354e\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.550067 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16f10f93-a979-44fb-92e3-3b96b5f9354e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"16f10f93-a979-44fb-92e3-3b96b5f9354e\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.550145 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16f10f93-a979-44fb-92e3-3b96b5f9354e-logs\") pod \"glance-default-internal-api-0\" (UID: \"16f10f93-a979-44fb-92e3-3b96b5f9354e\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.550264 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9eba352b-5554-4046-b92a-7c10e5b6276a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9eba352b-5554-4046-b92a-7c10e5b6276a\") pod \"glance-default-internal-api-0\" (UID: \"16f10f93-a979-44fb-92e3-3b96b5f9354e\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.550365 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16f10f93-a979-44fb-92e3-3b96b5f9354e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"16f10f93-a979-44fb-92e3-3b96b5f9354e\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.550487 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16f10f93-a979-44fb-92e3-3b96b5f9354e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"16f10f93-a979-44fb-92e3-3b96b5f9354e\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.550591 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16f10f93-a979-44fb-92e3-3b96b5f9354e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"16f10f93-a979-44fb-92e3-3b96b5f9354e\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.597188 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.653832 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16f10f93-a979-44fb-92e3-3b96b5f9354e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"16f10f93-a979-44fb-92e3-3b96b5f9354e\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.655966 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16f10f93-a979-44fb-92e3-3b96b5f9354e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"16f10f93-a979-44fb-92e3-3b96b5f9354e\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.656322 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh44p\" (UniqueName: \"kubernetes.io/projected/16f10f93-a979-44fb-92e3-3b96b5f9354e-kube-api-access-jh44p\") pod \"glance-default-internal-api-0\" (UID: \"16f10f93-a979-44fb-92e3-3b96b5f9354e\") " 
pod="openstack/glance-default-internal-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.656449 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16f10f93-a979-44fb-92e3-3b96b5f9354e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"16f10f93-a979-44fb-92e3-3b96b5f9354e\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.656539 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16f10f93-a979-44fb-92e3-3b96b5f9354e-logs\") pod \"glance-default-internal-api-0\" (UID: \"16f10f93-a979-44fb-92e3-3b96b5f9354e\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.656752 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9eba352b-5554-4046-b92a-7c10e5b6276a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9eba352b-5554-4046-b92a-7c10e5b6276a\") pod \"glance-default-internal-api-0\" (UID: \"16f10f93-a979-44fb-92e3-3b96b5f9354e\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.656911 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16f10f93-a979-44fb-92e3-3b96b5f9354e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"16f10f93-a979-44fb-92e3-3b96b5f9354e\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.657128 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16f10f93-a979-44fb-92e3-3b96b5f9354e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"16f10f93-a979-44fb-92e3-3b96b5f9354e\") " 
pod="openstack/glance-default-internal-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.657156 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16f10f93-a979-44fb-92e3-3b96b5f9354e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"16f10f93-a979-44fb-92e3-3b96b5f9354e\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.657886 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16f10f93-a979-44fb-92e3-3b96b5f9354e-logs\") pod \"glance-default-internal-api-0\" (UID: \"16f10f93-a979-44fb-92e3-3b96b5f9354e\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.664029 4835 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.664068 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9eba352b-5554-4046-b92a-7c10e5b6276a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9eba352b-5554-4046-b92a-7c10e5b6276a\") pod \"glance-default-internal-api-0\" (UID: \"16f10f93-a979-44fb-92e3-3b96b5f9354e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/89205a9f8fafa059e3c8d5aabb19b28af3510e2ed70a778ed91ec4c662cc8e20/globalmount\"" pod="openstack/glance-default-internal-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.668827 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16f10f93-a979-44fb-92e3-3b96b5f9354e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"16f10f93-a979-44fb-92e3-3b96b5f9354e\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 
09:47:17.672657 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16f10f93-a979-44fb-92e3-3b96b5f9354e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"16f10f93-a979-44fb-92e3-3b96b5f9354e\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.688834 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16f10f93-a979-44fb-92e3-3b96b5f9354e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"16f10f93-a979-44fb-92e3-3b96b5f9354e\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.689495 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16f10f93-a979-44fb-92e3-3b96b5f9354e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"16f10f93-a979-44fb-92e3-3b96b5f9354e\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.703284 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-j8vqq"] Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.725506 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f195f04e-c0f1-495c-870b-981275804688\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f195f04e-c0f1-495c-870b-981275804688\") pod \"glance-default-external-api-0\" (UID: \"1284d13d-bf1b-49b6-85be-954fa52a3c10\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.729638 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh44p\" (UniqueName: \"kubernetes.io/projected/16f10f93-a979-44fb-92e3-3b96b5f9354e-kube-api-access-jh44p\") pod \"glance-default-internal-api-0\" (UID: 
\"16f10f93-a979-44fb-92e3-3b96b5f9354e\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.739690 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-4jmpv"] Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.788613 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-c9nxs"] Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.798876 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9eba352b-5554-4046-b92a-7c10e5b6276a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9eba352b-5554-4046-b92a-7c10e5b6276a\") pod \"glance-default-internal-api-0\" (UID: \"16f10f93-a979-44fb-92e3-3b96b5f9354e\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: W0319 09:47:17.831271 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89dd21d1_34a1_4d91_a7cb_32840eda818e.slice/crio-9e0a80a50e439143f00c83889f37a8cc4235fd62787eea1fef3ff29f8168e834 WatchSource:0}: Error finding container 9e0a80a50e439143f00c83889f37a8cc4235fd62787eea1fef3ff29f8168e834: Status 404 returned error can't find the container with id 9e0a80a50e439143f00c83889f37a8cc4235fd62787eea1fef3ff29f8168e834 Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.892413 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.908331 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.930126 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-j8vqq" event={"ID":"2f82c9c2-4f39-412a-8a03-4a463b4b4bfe","Type":"ContainerStarted","Data":"e109c41cd5f12dd3e04d7e004534f88a7214220158d5493b4a89e9a5ee621563"} Mar 19 09:47:17 crc kubenswrapper[4835]: I0319 09:47:17.932433 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-4jmpv" event={"ID":"89dd21d1-34a1-4d91-a7cb-32840eda818e","Type":"ContainerStarted","Data":"9e0a80a50e439143f00c83889f37a8cc4235fd62787eea1fef3ff29f8168e834"} Mar 19 09:47:18 crc kubenswrapper[4835]: I0319 09:47:18.043826 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-mkr8b"] Mar 19 09:47:18 crc kubenswrapper[4835]: I0319 09:47:18.287297 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-c5lzw"] Mar 19 09:47:18 crc kubenswrapper[4835]: I0319 09:47:18.300815 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2rdm5"] Mar 19 09:47:18 crc kubenswrapper[4835]: W0319 09:47:18.303897 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdf7c073_aa50_4b3b_aedb_7e77be63f85a.slice/crio-ba1e6a5af3d4c239ea19b4f46d6dab354543e8df7bb78589a023164140b1400c WatchSource:0}: Error finding container ba1e6a5af3d4c239ea19b4f46d6dab354543e8df7bb78589a023164140b1400c: Status 404 returned error can't find the container with id ba1e6a5af3d4c239ea19b4f46d6dab354543e8df7bb78589a023164140b1400c Mar 19 09:47:18 crc kubenswrapper[4835]: I0319 09:47:18.310578 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-mw6s2"] Mar 19 09:47:18 crc kubenswrapper[4835]: W0319 09:47:18.326590 4835 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod777ab445_6d41_49a9_b87c_67ed7503cc5b.slice/crio-d9954561d1d9003dbeb467c1d40a367ed737101426ad23d8050d95c2edeab59c WatchSource:0}: Error finding container d9954561d1d9003dbeb467c1d40a367ed737101426ad23d8050d95c2edeab59c: Status 404 returned error can't find the container with id d9954561d1d9003dbeb467c1d40a367ed737101426ad23d8050d95c2edeab59c Mar 19 09:47:18 crc kubenswrapper[4835]: W0319 09:47:18.346671 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod410b03bf_d2bc_4992_b467_db7947c52078.slice/crio-546b9836cae5779b5bd8f750cd87dcef230bd6c28b9bb41f37badde20b15dc83 WatchSource:0}: Error finding container 546b9836cae5779b5bd8f750cd87dcef230bd6c28b9bb41f37badde20b15dc83: Status 404 returned error can't find the container with id 546b9836cae5779b5bd8f750cd87dcef230bd6c28b9bb41f37badde20b15dc83 Mar 19 09:47:18 crc kubenswrapper[4835]: I0319 09:47:18.515813 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-lwf7t"] Mar 19 09:47:18 crc kubenswrapper[4835]: W0319 09:47:18.556069 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe4973bb_df39_4221_bbd0_f637a10dd5b0.slice/crio-1dc58df72f68eeb61c007fc15e07f18577ce9c8e16ce4d2db078538f01b1b602 WatchSource:0}: Error finding container 1dc58df72f68eeb61c007fc15e07f18577ce9c8e16ce4d2db078538f01b1b602: Status 404 returned error can't find the container with id 1dc58df72f68eeb61c007fc15e07f18577ce9c8e16ce4d2db078538f01b1b602 Mar 19 09:47:18 crc kubenswrapper[4835]: I0319 09:47:18.916701 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:47:18 crc kubenswrapper[4835]: I0319 09:47:18.961252 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-lwf7t" 
event={"ID":"fe4973bb-df39-4221-bbd0-f637a10dd5b0","Type":"ContainerStarted","Data":"1dc58df72f68eeb61c007fc15e07f18577ce9c8e16ce4d2db078538f01b1b602"} Mar 19 09:47:18 crc kubenswrapper[4835]: I0319 09:47:18.963971 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-c9nxs" event={"ID":"8678740e-623f-4062-8b98-eba5fe57f88e","Type":"ContainerStarted","Data":"091c3a8c428655cd1829849fef315dfcc00ba127038164eb8bce08bf31962482"} Mar 19 09:47:18 crc kubenswrapper[4835]: I0319 09:47:18.964008 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-c9nxs" event={"ID":"8678740e-623f-4062-8b98-eba5fe57f88e","Type":"ContainerStarted","Data":"6ce2c2ead237b71c6ccf7d118f3660b174836699d45ea073fcb12b5f84861790"} Mar 19 09:47:18 crc kubenswrapper[4835]: I0319 09:47:18.972874 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-c5lzw" event={"ID":"cdf7c073-aa50-4b3b-aedb-7e77be63f85a","Type":"ContainerStarted","Data":"5892f1f672103d6dcb8484f3156fe6192a7117db7deab0d80d609b3dba5c4dfb"} Mar 19 09:47:18 crc kubenswrapper[4835]: I0319 09:47:18.972918 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-c5lzw" event={"ID":"cdf7c073-aa50-4b3b-aedb-7e77be63f85a","Type":"ContainerStarted","Data":"ba1e6a5af3d4c239ea19b4f46d6dab354543e8df7bb78589a023164140b1400c"} Mar 19 09:47:18 crc kubenswrapper[4835]: I0319 09:47:18.980647 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mkr8b" event={"ID":"2161ebbe-ed84-49a9-b0b1-b74304fd8b28","Type":"ContainerStarted","Data":"cb53017e265d6e0ef886b00800733b13bf896f0bfdfdb47fa62e9940428c2938"} Mar 19 09:47:18 crc kubenswrapper[4835]: I0319 09:47:18.987034 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2rdm5" 
event={"ID":"777ab445-6d41-49a9-b87c-67ed7503cc5b","Type":"ContainerStarted","Data":"d9954561d1d9003dbeb467c1d40a367ed737101426ad23d8050d95c2edeab59c"} Mar 19 09:47:18 crc kubenswrapper[4835]: I0319 09:47:18.990695 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-c9nxs" podStartSLOduration=2.99067564 podStartE2EDuration="2.99067564s" podCreationTimestamp="2026-03-19 09:47:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:47:18.983365375 +0000 UTC m=+1493.831963962" watchObservedRunningTime="2026-03-19 09:47:18.99067564 +0000 UTC m=+1493.839274227" Mar 19 09:47:18 crc kubenswrapper[4835]: I0319 09:47:18.993947 4835 generic.go:334] "Generic (PLEG): container finished" podID="2f82c9c2-4f39-412a-8a03-4a463b4b4bfe" containerID="b70e7e9a25a3b381640732199f60f686771bc13c098b1d819ed7fab8e85d7a2c" exitCode=0 Mar 19 09:47:18 crc kubenswrapper[4835]: I0319 09:47:18.994051 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-j8vqq" event={"ID":"2f82c9c2-4f39-412a-8a03-4a463b4b4bfe","Type":"ContainerDied","Data":"b70e7e9a25a3b381640732199f60f686771bc13c098b1d819ed7fab8e85d7a2c"} Mar 19 09:47:18 crc kubenswrapper[4835]: I0319 09:47:18.995934 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mw6s2" event={"ID":"410b03bf-d2bc-4992-b467-db7947c52078","Type":"ContainerStarted","Data":"546b9836cae5779b5bd8f750cd87dcef230bd6c28b9bb41f37badde20b15dc83"} Mar 19 09:47:19 crc kubenswrapper[4835]: I0319 09:47:19.041264 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-c5lzw" podStartSLOduration=3.041246766 podStartE2EDuration="3.041246766s" podCreationTimestamp="2026-03-19 09:47:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-19 09:47:19.008295089 +0000 UTC m=+1493.856893686" watchObservedRunningTime="2026-03-19 09:47:19.041246766 +0000 UTC m=+1493.889845343" Mar 19 09:47:19 crc kubenswrapper[4835]: I0319 09:47:19.056722 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 09:47:19 crc kubenswrapper[4835]: W0319 09:47:19.102897 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16f10f93_a979_44fb_92e3_3b96b5f9354e.slice/crio-f59ee6a2c7f4845f43efa0e5d914a9883ddd95039fefe3eb6499003ce3cbbf3a WatchSource:0}: Error finding container f59ee6a2c7f4845f43efa0e5d914a9883ddd95039fefe3eb6499003ce3cbbf3a: Status 404 returned error can't find the container with id f59ee6a2c7f4845f43efa0e5d914a9883ddd95039fefe3eb6499003ce3cbbf3a Mar 19 09:47:19 crc kubenswrapper[4835]: I0319 09:47:19.710996 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-j8vqq" Mar 19 09:47:19 crc kubenswrapper[4835]: I0319 09:47:19.867967 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe-dns-swift-storage-0\") pod \"2f82c9c2-4f39-412a-8a03-4a463b4b4bfe\" (UID: \"2f82c9c2-4f39-412a-8a03-4a463b4b4bfe\") " Mar 19 09:47:19 crc kubenswrapper[4835]: I0319 09:47:19.868202 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe-ovsdbserver-sb\") pod \"2f82c9c2-4f39-412a-8a03-4a463b4b4bfe\" (UID: \"2f82c9c2-4f39-412a-8a03-4a463b4b4bfe\") " Mar 19 09:47:19 crc kubenswrapper[4835]: I0319 09:47:19.868242 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe-config\") pod \"2f82c9c2-4f39-412a-8a03-4a463b4b4bfe\" (UID: \"2f82c9c2-4f39-412a-8a03-4a463b4b4bfe\") " Mar 19 09:47:19 crc kubenswrapper[4835]: I0319 09:47:19.868266 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdprf\" (UniqueName: \"kubernetes.io/projected/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe-kube-api-access-rdprf\") pod \"2f82c9c2-4f39-412a-8a03-4a463b4b4bfe\" (UID: \"2f82c9c2-4f39-412a-8a03-4a463b4b4bfe\") " Mar 19 09:47:19 crc kubenswrapper[4835]: I0319 09:47:19.868290 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe-dns-svc\") pod \"2f82c9c2-4f39-412a-8a03-4a463b4b4bfe\" (UID: \"2f82c9c2-4f39-412a-8a03-4a463b4b4bfe\") " Mar 19 09:47:19 crc kubenswrapper[4835]: I0319 09:47:19.868356 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe-ovsdbserver-nb\") pod \"2f82c9c2-4f39-412a-8a03-4a463b4b4bfe\" (UID: \"2f82c9c2-4f39-412a-8a03-4a463b4b4bfe\") " Mar 19 09:47:19 crc kubenswrapper[4835]: I0319 09:47:19.972190 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe-kube-api-access-rdprf" (OuterVolumeSpecName: "kube-api-access-rdprf") pod "2f82c9c2-4f39-412a-8a03-4a463b4b4bfe" (UID: "2f82c9c2-4f39-412a-8a03-4a463b4b4bfe"). InnerVolumeSpecName "kube-api-access-rdprf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:47:19 crc kubenswrapper[4835]: I0319 09:47:19.990194 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe-config" (OuterVolumeSpecName: "config") pod "2f82c9c2-4f39-412a-8a03-4a463b4b4bfe" (UID: "2f82c9c2-4f39-412a-8a03-4a463b4b4bfe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:47:19 crc kubenswrapper[4835]: I0319 09:47:19.990780 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2f82c9c2-4f39-412a-8a03-4a463b4b4bfe" (UID: "2f82c9c2-4f39-412a-8a03-4a463b4b4bfe"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:47:20 crc kubenswrapper[4835]: I0319 09:47:20.037334 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 09:47:20 crc kubenswrapper[4835]: I0319 09:47:20.044257 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2f82c9c2-4f39-412a-8a03-4a463b4b4bfe" (UID: "2f82c9c2-4f39-412a-8a03-4a463b4b4bfe"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:47:20 crc kubenswrapper[4835]: I0319 09:47:20.069730 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 09:47:20 crc kubenswrapper[4835]: I0319 09:47:20.079030 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:20 crc kubenswrapper[4835]: I0319 09:47:20.091825 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:20 crc kubenswrapper[4835]: I0319 09:47:20.092068 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdprf\" (UniqueName: \"kubernetes.io/projected/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe-kube-api-access-rdprf\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:20 crc kubenswrapper[4835]: I0319 09:47:20.092143 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:20 crc kubenswrapper[4835]: I0319 09:47:20.079073 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-j8vqq" event={"ID":"2f82c9c2-4f39-412a-8a03-4a463b4b4bfe","Type":"ContainerDied","Data":"e109c41cd5f12dd3e04d7e004534f88a7214220158d5493b4a89e9a5ee621563"} Mar 19 09:47:20 crc kubenswrapper[4835]: I0319 09:47:20.092340 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"16f10f93-a979-44fb-92e3-3b96b5f9354e","Type":"ContainerStarted","Data":"f59ee6a2c7f4845f43efa0e5d914a9883ddd95039fefe3eb6499003ce3cbbf3a"} Mar 19 09:47:20 crc kubenswrapper[4835]: I0319 09:47:20.085820 4835 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2f82c9c2-4f39-412a-8a03-4a463b4b4bfe" (UID: "2f82c9c2-4f39-412a-8a03-4a463b4b4bfe"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:47:20 crc kubenswrapper[4835]: I0319 09:47:20.079150 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-j8vqq" Mar 19 09:47:20 crc kubenswrapper[4835]: I0319 09:47:20.092853 4835 scope.go:117] "RemoveContainer" containerID="b70e7e9a25a3b381640732199f60f686771bc13c098b1d819ed7fab8e85d7a2c" Mar 19 09:47:20 crc kubenswrapper[4835]: I0319 09:47:20.111417 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1284d13d-bf1b-49b6-85be-954fa52a3c10","Type":"ContainerStarted","Data":"ae723b8a930ff1e5cc46088c43ce47275f464a69fce7922deed8b55ba0174ca9"} Mar 19 09:47:20 crc kubenswrapper[4835]: I0319 09:47:20.121356 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2f82c9c2-4f39-412a-8a03-4a463b4b4bfe" (UID: "2f82c9c2-4f39-412a-8a03-4a463b4b4bfe"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:47:20 crc kubenswrapper[4835]: I0319 09:47:20.121459 4835 generic.go:334] "Generic (PLEG): container finished" podID="fe4973bb-df39-4221-bbd0-f637a10dd5b0" containerID="940fc3ad402a4fa86a009f90dfa613197e63ae02d9eae1b3fe25fbe07494cc54" exitCode=0 Mar 19 09:47:20 crc kubenswrapper[4835]: I0319 09:47:20.121510 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-lwf7t" event={"ID":"fe4973bb-df39-4221-bbd0-f637a10dd5b0","Type":"ContainerDied","Data":"940fc3ad402a4fa86a009f90dfa613197e63ae02d9eae1b3fe25fbe07494cc54"} Mar 19 09:47:20 crc kubenswrapper[4835]: I0319 09:47:20.138793 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24665d64-cd24-4bb5-a1e4-48c734e7c525","Type":"ContainerStarted","Data":"c10ef31bd99a4afa96bd1f7ae669d1f9d52c4cef490465b3e965864f39207d3f"} Mar 19 09:47:20 crc kubenswrapper[4835]: I0319 09:47:20.154370 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 09:47:20 crc kubenswrapper[4835]: I0319 09:47:20.198137 4835 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:20 crc kubenswrapper[4835]: I0319 09:47:20.199918 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:20 crc kubenswrapper[4835]: I0319 09:47:20.244269 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:47:21 crc kubenswrapper[4835]: I0319 09:47:21.181302 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"16f10f93-a979-44fb-92e3-3b96b5f9354e","Type":"ContainerStarted","Data":"98f188a300841fb0911040d4b33ec63992a1f3be89cc608b7de5756e45ddaebc"} Mar 19 09:47:21 crc kubenswrapper[4835]: I0319 09:47:21.186512 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1284d13d-bf1b-49b6-85be-954fa52a3c10","Type":"ContainerStarted","Data":"81c6493f998adccb90b836fe660d4f766ecc0eca5eab4277fd63b5cd3fc9580f"} Mar 19 09:47:21 crc kubenswrapper[4835]: I0319 09:47:21.190881 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-lwf7t" event={"ID":"fe4973bb-df39-4221-bbd0-f637a10dd5b0","Type":"ContainerStarted","Data":"ed6522297857d17fb514001daaa8e6e2772e749cc5a3b4d0dddca4097ce0aaf1"} Mar 19 09:47:21 crc kubenswrapper[4835]: I0319 09:47:21.191201 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-lwf7t" Mar 19 09:47:21 crc kubenswrapper[4835]: I0319 09:47:21.218701 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-lwf7t" podStartSLOduration=5.218678603 podStartE2EDuration="5.218678603s" podCreationTimestamp="2026-03-19 09:47:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:47:21.213038663 +0000 UTC m=+1496.061637250" watchObservedRunningTime="2026-03-19 09:47:21.218678603 +0000 UTC m=+1496.067277190" Mar 19 09:47:22 crc kubenswrapper[4835]: I0319 09:47:22.248206 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"16f10f93-a979-44fb-92e3-3b96b5f9354e","Type":"ContainerStarted","Data":"e3bd3441a25b539a03723c617fe371c050379b4f47e2f7788f0d8f0fcfd94156"} Mar 19 09:47:22 crc kubenswrapper[4835]: I0319 09:47:22.248285 4835 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-internal-api-0" podUID="16f10f93-a979-44fb-92e3-3b96b5f9354e" containerName="glance-log" containerID="cri-o://98f188a300841fb0911040d4b33ec63992a1f3be89cc608b7de5756e45ddaebc" gracePeriod=30 Mar 19 09:47:22 crc kubenswrapper[4835]: I0319 09:47:22.248558 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="16f10f93-a979-44fb-92e3-3b96b5f9354e" containerName="glance-httpd" containerID="cri-o://e3bd3441a25b539a03723c617fe371c050379b4f47e2f7788f0d8f0fcfd94156" gracePeriod=30 Mar 19 09:47:22 crc kubenswrapper[4835]: I0319 09:47:22.281941 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.281923319 podStartE2EDuration="6.281923319s" podCreationTimestamp="2026-03-19 09:47:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:47:22.281189289 +0000 UTC m=+1497.129787876" watchObservedRunningTime="2026-03-19 09:47:22.281923319 +0000 UTC m=+1497.130521906" Mar 19 09:47:22 crc kubenswrapper[4835]: I0319 09:47:22.291432 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1284d13d-bf1b-49b6-85be-954fa52a3c10" containerName="glance-log" containerID="cri-o://81c6493f998adccb90b836fe660d4f766ecc0eca5eab4277fd63b5cd3fc9580f" gracePeriod=30 Mar 19 09:47:22 crc kubenswrapper[4835]: I0319 09:47:22.291692 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1284d13d-bf1b-49b6-85be-954fa52a3c10","Type":"ContainerStarted","Data":"4358b3c08080a8d7e02c033ed19d5d490b2b79fa015f8d676c80c02e7d27dd5e"} Mar 19 09:47:22 crc kubenswrapper[4835]: I0319 09:47:22.291979 4835 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-external-api-0" podUID="1284d13d-bf1b-49b6-85be-954fa52a3c10" containerName="glance-httpd" containerID="cri-o://4358b3c08080a8d7e02c033ed19d5d490b2b79fa015f8d676c80c02e7d27dd5e" gracePeriod=30 Mar 19 09:47:22 crc kubenswrapper[4835]: I0319 09:47:22.343624 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.343601121 podStartE2EDuration="6.343601121s" podCreationTimestamp="2026-03-19 09:47:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:47:22.32931731 +0000 UTC m=+1497.177915897" watchObservedRunningTime="2026-03-19 09:47:22.343601121 +0000 UTC m=+1497.192199708" Mar 19 09:47:23 crc kubenswrapper[4835]: I0319 09:47:23.308608 4835 generic.go:334] "Generic (PLEG): container finished" podID="1284d13d-bf1b-49b6-85be-954fa52a3c10" containerID="4358b3c08080a8d7e02c033ed19d5d490b2b79fa015f8d676c80c02e7d27dd5e" exitCode=143 Mar 19 09:47:23 crc kubenswrapper[4835]: I0319 09:47:23.308984 4835 generic.go:334] "Generic (PLEG): container finished" podID="1284d13d-bf1b-49b6-85be-954fa52a3c10" containerID="81c6493f998adccb90b836fe660d4f766ecc0eca5eab4277fd63b5cd3fc9580f" exitCode=143 Mar 19 09:47:23 crc kubenswrapper[4835]: I0319 09:47:23.308676 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1284d13d-bf1b-49b6-85be-954fa52a3c10","Type":"ContainerDied","Data":"4358b3c08080a8d7e02c033ed19d5d490b2b79fa015f8d676c80c02e7d27dd5e"} Mar 19 09:47:23 crc kubenswrapper[4835]: I0319 09:47:23.309091 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1284d13d-bf1b-49b6-85be-954fa52a3c10","Type":"ContainerDied","Data":"81c6493f998adccb90b836fe660d4f766ecc0eca5eab4277fd63b5cd3fc9580f"} Mar 19 09:47:23 crc kubenswrapper[4835]: I0319 09:47:23.314403 
4835 generic.go:334] "Generic (PLEG): container finished" podID="16f10f93-a979-44fb-92e3-3b96b5f9354e" containerID="e3bd3441a25b539a03723c617fe371c050379b4f47e2f7788f0d8f0fcfd94156" exitCode=0 Mar 19 09:47:23 crc kubenswrapper[4835]: I0319 09:47:23.314521 4835 generic.go:334] "Generic (PLEG): container finished" podID="16f10f93-a979-44fb-92e3-3b96b5f9354e" containerID="98f188a300841fb0911040d4b33ec63992a1f3be89cc608b7de5756e45ddaebc" exitCode=143 Mar 19 09:47:23 crc kubenswrapper[4835]: I0319 09:47:23.314521 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"16f10f93-a979-44fb-92e3-3b96b5f9354e","Type":"ContainerDied","Data":"e3bd3441a25b539a03723c617fe371c050379b4f47e2f7788f0d8f0fcfd94156"} Mar 19 09:47:23 crc kubenswrapper[4835]: I0319 09:47:23.314576 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"16f10f93-a979-44fb-92e3-3b96b5f9354e","Type":"ContainerDied","Data":"98f188a300841fb0911040d4b33ec63992a1f3be89cc608b7de5756e45ddaebc"} Mar 19 09:47:24 crc kubenswrapper[4835]: I0319 09:47:24.326452 4835 generic.go:334] "Generic (PLEG): container finished" podID="8678740e-623f-4062-8b98-eba5fe57f88e" containerID="091c3a8c428655cd1829849fef315dfcc00ba127038164eb8bce08bf31962482" exitCode=0 Mar 19 09:47:24 crc kubenswrapper[4835]: I0319 09:47:24.326524 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-c9nxs" event={"ID":"8678740e-623f-4062-8b98-eba5fe57f88e","Type":"ContainerDied","Data":"091c3a8c428655cd1829849fef315dfcc00ba127038164eb8bce08bf31962482"} Mar 19 09:47:25 crc kubenswrapper[4835]: I0319 09:47:25.543963 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 09:47:25 crc kubenswrapper[4835]: I0319 09:47:25.693364 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1284d13d-bf1b-49b6-85be-954fa52a3c10-httpd-run\") pod \"1284d13d-bf1b-49b6-85be-954fa52a3c10\" (UID: \"1284d13d-bf1b-49b6-85be-954fa52a3c10\") " Mar 19 09:47:25 crc kubenswrapper[4835]: I0319 09:47:25.693490 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1284d13d-bf1b-49b6-85be-954fa52a3c10-logs\") pod \"1284d13d-bf1b-49b6-85be-954fa52a3c10\" (UID: \"1284d13d-bf1b-49b6-85be-954fa52a3c10\") " Mar 19 09:47:25 crc kubenswrapper[4835]: I0319 09:47:25.693515 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1284d13d-bf1b-49b6-85be-954fa52a3c10-scripts\") pod \"1284d13d-bf1b-49b6-85be-954fa52a3c10\" (UID: \"1284d13d-bf1b-49b6-85be-954fa52a3c10\") " Mar 19 09:47:25 crc kubenswrapper[4835]: I0319 09:47:25.693678 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1284d13d-bf1b-49b6-85be-954fa52a3c10-config-data\") pod \"1284d13d-bf1b-49b6-85be-954fa52a3c10\" (UID: \"1284d13d-bf1b-49b6-85be-954fa52a3c10\") " Mar 19 09:47:25 crc kubenswrapper[4835]: I0319 09:47:25.693794 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f195f04e-c0f1-495c-870b-981275804688\") pod \"1284d13d-bf1b-49b6-85be-954fa52a3c10\" (UID: \"1284d13d-bf1b-49b6-85be-954fa52a3c10\") " Mar 19 09:47:25 crc kubenswrapper[4835]: I0319 09:47:25.693840 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1284d13d-bf1b-49b6-85be-954fa52a3c10-combined-ca-bundle\") pod \"1284d13d-bf1b-49b6-85be-954fa52a3c10\" (UID: \"1284d13d-bf1b-49b6-85be-954fa52a3c10\") " Mar 19 09:47:25 crc kubenswrapper[4835]: I0319 09:47:25.693926 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhtpw\" (UniqueName: \"kubernetes.io/projected/1284d13d-bf1b-49b6-85be-954fa52a3c10-kube-api-access-nhtpw\") pod \"1284d13d-bf1b-49b6-85be-954fa52a3c10\" (UID: \"1284d13d-bf1b-49b6-85be-954fa52a3c10\") " Mar 19 09:47:25 crc kubenswrapper[4835]: I0319 09:47:25.693950 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1284d13d-bf1b-49b6-85be-954fa52a3c10-public-tls-certs\") pod \"1284d13d-bf1b-49b6-85be-954fa52a3c10\" (UID: \"1284d13d-bf1b-49b6-85be-954fa52a3c10\") " Mar 19 09:47:25 crc kubenswrapper[4835]: I0319 09:47:25.694125 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1284d13d-bf1b-49b6-85be-954fa52a3c10-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1284d13d-bf1b-49b6-85be-954fa52a3c10" (UID: "1284d13d-bf1b-49b6-85be-954fa52a3c10"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:47:25 crc kubenswrapper[4835]: I0319 09:47:25.694500 4835 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1284d13d-bf1b-49b6-85be-954fa52a3c10-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:25 crc kubenswrapper[4835]: I0319 09:47:25.694834 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1284d13d-bf1b-49b6-85be-954fa52a3c10-logs" (OuterVolumeSpecName: "logs") pod "1284d13d-bf1b-49b6-85be-954fa52a3c10" (UID: "1284d13d-bf1b-49b6-85be-954fa52a3c10"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:47:25 crc kubenswrapper[4835]: I0319 09:47:25.701423 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1284d13d-bf1b-49b6-85be-954fa52a3c10-scripts" (OuterVolumeSpecName: "scripts") pod "1284d13d-bf1b-49b6-85be-954fa52a3c10" (UID: "1284d13d-bf1b-49b6-85be-954fa52a3c10"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:47:25 crc kubenswrapper[4835]: I0319 09:47:25.710556 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1284d13d-bf1b-49b6-85be-954fa52a3c10-kube-api-access-nhtpw" (OuterVolumeSpecName: "kube-api-access-nhtpw") pod "1284d13d-bf1b-49b6-85be-954fa52a3c10" (UID: "1284d13d-bf1b-49b6-85be-954fa52a3c10"). InnerVolumeSpecName "kube-api-access-nhtpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:47:25 crc kubenswrapper[4835]: I0319 09:47:25.713879 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f195f04e-c0f1-495c-870b-981275804688" (OuterVolumeSpecName: "glance") pod "1284d13d-bf1b-49b6-85be-954fa52a3c10" (UID: "1284d13d-bf1b-49b6-85be-954fa52a3c10"). InnerVolumeSpecName "pvc-f195f04e-c0f1-495c-870b-981275804688". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 09:47:25 crc kubenswrapper[4835]: I0319 09:47:25.738394 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1284d13d-bf1b-49b6-85be-954fa52a3c10-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1284d13d-bf1b-49b6-85be-954fa52a3c10" (UID: "1284d13d-bf1b-49b6-85be-954fa52a3c10"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:47:25 crc kubenswrapper[4835]: E0319 09:47:25.773840 4835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1284d13d-bf1b-49b6-85be-954fa52a3c10-public-tls-certs podName:1284d13d-bf1b-49b6-85be-954fa52a3c10 nodeName:}" failed. No retries permitted until 2026-03-19 09:47:26.273815828 +0000 UTC m=+1501.122414415 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/1284d13d-bf1b-49b6-85be-954fa52a3c10-public-tls-certs") pod "1284d13d-bf1b-49b6-85be-954fa52a3c10" (UID: "1284d13d-bf1b-49b6-85be-954fa52a3c10") : error deleting /var/lib/kubelet/pods/1284d13d-bf1b-49b6-85be-954fa52a3c10/volume-subpaths: remove /var/lib/kubelet/pods/1284d13d-bf1b-49b6-85be-954fa52a3c10/volume-subpaths: no such file or directory Mar 19 09:47:25 crc kubenswrapper[4835]: I0319 09:47:25.777112 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1284d13d-bf1b-49b6-85be-954fa52a3c10-config-data" (OuterVolumeSpecName: "config-data") pod "1284d13d-bf1b-49b6-85be-954fa52a3c10" (UID: "1284d13d-bf1b-49b6-85be-954fa52a3c10"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:47:25 crc kubenswrapper[4835]: I0319 09:47:25.797063 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhtpw\" (UniqueName: \"kubernetes.io/projected/1284d13d-bf1b-49b6-85be-954fa52a3c10-kube-api-access-nhtpw\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:25 crc kubenswrapper[4835]: I0319 09:47:25.797100 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1284d13d-bf1b-49b6-85be-954fa52a3c10-logs\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:25 crc kubenswrapper[4835]: I0319 09:47:25.797114 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1284d13d-bf1b-49b6-85be-954fa52a3c10-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:25 crc kubenswrapper[4835]: I0319 09:47:25.797126 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1284d13d-bf1b-49b6-85be-954fa52a3c10-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:25 crc kubenswrapper[4835]: I0319 09:47:25.797159 4835 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-f195f04e-c0f1-495c-870b-981275804688\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f195f04e-c0f1-495c-870b-981275804688\") on node \"crc\" " Mar 19 09:47:25 crc kubenswrapper[4835]: I0319 09:47:25.797173 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1284d13d-bf1b-49b6-85be-954fa52a3c10-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:25 crc kubenswrapper[4835]: I0319 09:47:25.834838 4835 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 19 09:47:25 crc kubenswrapper[4835]: I0319 09:47:25.835004 4835 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-f195f04e-c0f1-495c-870b-981275804688" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f195f04e-c0f1-495c-870b-981275804688") on node "crc" Mar 19 09:47:25 crc kubenswrapper[4835]: I0319 09:47:25.899287 4835 reconciler_common.go:293] "Volume detached for volume \"pvc-f195f04e-c0f1-495c-870b-981275804688\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f195f04e-c0f1-495c-870b-981275804688\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.307617 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1284d13d-bf1b-49b6-85be-954fa52a3c10-public-tls-certs\") pod \"1284d13d-bf1b-49b6-85be-954fa52a3c10\" (UID: \"1284d13d-bf1b-49b6-85be-954fa52a3c10\") " Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.318237 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1284d13d-bf1b-49b6-85be-954fa52a3c10-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1284d13d-bf1b-49b6-85be-954fa52a3c10" (UID: "1284d13d-bf1b-49b6-85be-954fa52a3c10"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.356005 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1284d13d-bf1b-49b6-85be-954fa52a3c10","Type":"ContainerDied","Data":"ae723b8a930ff1e5cc46088c43ce47275f464a69fce7922deed8b55ba0174ca9"} Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.356058 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.356071 4835 scope.go:117] "RemoveContainer" containerID="4358b3c08080a8d7e02c033ed19d5d490b2b79fa015f8d676c80c02e7d27dd5e" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.416576 4835 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1284d13d-bf1b-49b6-85be-954fa52a3c10-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.589907 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.603205 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.642889 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 09:47:26 crc kubenswrapper[4835]: E0319 09:47:26.643522 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f82c9c2-4f39-412a-8a03-4a463b4b4bfe" containerName="init" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.643589 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f82c9c2-4f39-412a-8a03-4a463b4b4bfe" containerName="init" Mar 19 09:47:26 crc kubenswrapper[4835]: E0319 09:47:26.643667 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1284d13d-bf1b-49b6-85be-954fa52a3c10" containerName="glance-httpd" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.643718 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1284d13d-bf1b-49b6-85be-954fa52a3c10" containerName="glance-httpd" Mar 19 09:47:26 crc kubenswrapper[4835]: E0319 09:47:26.643848 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1284d13d-bf1b-49b6-85be-954fa52a3c10" containerName="glance-log" Mar 19 09:47:26 crc 
kubenswrapper[4835]: I0319 09:47:26.643938 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1284d13d-bf1b-49b6-85be-954fa52a3c10" containerName="glance-log" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.644211 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f82c9c2-4f39-412a-8a03-4a463b4b4bfe" containerName="init" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.644289 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="1284d13d-bf1b-49b6-85be-954fa52a3c10" containerName="glance-log" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.644354 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="1284d13d-bf1b-49b6-85be-954fa52a3c10" containerName="glance-httpd" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.645952 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.649355 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.649544 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.661373 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.726162 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6dp8\" (UniqueName: \"kubernetes.io/projected/bf435b6c-8025-4835-9e66-dad5ca6a95c3-kube-api-access-q6dp8\") pod \"glance-default-external-api-0\" (UID: \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.726249 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf435b6c-8025-4835-9e66-dad5ca6a95c3-logs\") pod \"glance-default-external-api-0\" (UID: \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.726369 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf435b6c-8025-4835-9e66-dad5ca6a95c3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.730893 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f195f04e-c0f1-495c-870b-981275804688\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f195f04e-c0f1-495c-870b-981275804688\") pod \"glance-default-external-api-0\" (UID: \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.731014 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bf435b6c-8025-4835-9e66-dad5ca6a95c3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.731135 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf435b6c-8025-4835-9e66-dad5ca6a95c3-config-data\") pod \"glance-default-external-api-0\" (UID: \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:26 crc 
kubenswrapper[4835]: I0319 09:47:26.731280 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf435b6c-8025-4835-9e66-dad5ca6a95c3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.731380 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf435b6c-8025-4835-9e66-dad5ca6a95c3-scripts\") pod \"glance-default-external-api-0\" (UID: \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.832946 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf435b6c-8025-4835-9e66-dad5ca6a95c3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.833008 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f195f04e-c0f1-495c-870b-981275804688\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f195f04e-c0f1-495c-870b-981275804688\") pod \"glance-default-external-api-0\" (UID: \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.833055 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bf435b6c-8025-4835-9e66-dad5ca6a95c3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\") " pod="openstack/glance-default-external-api-0" Mar 
19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.833110 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf435b6c-8025-4835-9e66-dad5ca6a95c3-config-data\") pod \"glance-default-external-api-0\" (UID: \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.833153 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf435b6c-8025-4835-9e66-dad5ca6a95c3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.833194 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf435b6c-8025-4835-9e66-dad5ca6a95c3-scripts\") pod \"glance-default-external-api-0\" (UID: \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.833275 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6dp8\" (UniqueName: \"kubernetes.io/projected/bf435b6c-8025-4835-9e66-dad5ca6a95c3-kube-api-access-q6dp8\") pod \"glance-default-external-api-0\" (UID: \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.833316 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf435b6c-8025-4835-9e66-dad5ca6a95c3-logs\") pod \"glance-default-external-api-0\" (UID: \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.833804 4835 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf435b6c-8025-4835-9e66-dad5ca6a95c3-logs\") pod \"glance-default-external-api-0\" (UID: \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.834590 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bf435b6c-8025-4835-9e66-dad5ca6a95c3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.836153 4835 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.836182 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f195f04e-c0f1-495c-870b-981275804688\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f195f04e-c0f1-495c-870b-981275804688\") pod \"glance-default-external-api-0\" (UID: \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4d183d018680155306ca255d36c5357de351d0f1f91a4bb616234d2bb906544a/globalmount\"" pod="openstack/glance-default-external-api-0" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.844433 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf435b6c-8025-4835-9e66-dad5ca6a95c3-config-data\") pod \"glance-default-external-api-0\" (UID: \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.845327 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/bf435b6c-8025-4835-9e66-dad5ca6a95c3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.848298 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf435b6c-8025-4835-9e66-dad5ca6a95c3-scripts\") pod \"glance-default-external-api-0\" (UID: \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.851838 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6dp8\" (UniqueName: \"kubernetes.io/projected/bf435b6c-8025-4835-9e66-dad5ca6a95c3-kube-api-access-q6dp8\") pod \"glance-default-external-api-0\" (UID: \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.856420 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf435b6c-8025-4835-9e66-dad5ca6a95c3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.894526 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f195f04e-c0f1-495c-870b-981275804688\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f195f04e-c0f1-495c-870b-981275804688\") pod \"glance-default-external-api-0\" (UID: \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:47:26 crc kubenswrapper[4835]: I0319 09:47:26.963111 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 09:47:27 crc kubenswrapper[4835]: I0319 09:47:27.345934 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-lwf7t" Mar 19 09:47:27 crc kubenswrapper[4835]: I0319 09:47:27.435376 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-rrsfb"] Mar 19 09:47:27 crc kubenswrapper[4835]: I0319 09:47:27.435611 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-rrsfb" podUID="71a2a8d4-8f41-4f53-a725-067d509fa616" containerName="dnsmasq-dns" containerID="cri-o://6725e552866d61a67df83d48c088c8fc8fd24968468253d22c27509d1f40f7bf" gracePeriod=10 Mar 19 09:47:28 crc kubenswrapper[4835]: I0319 09:47:28.448287 4835 generic.go:334] "Generic (PLEG): container finished" podID="71a2a8d4-8f41-4f53-a725-067d509fa616" containerID="6725e552866d61a67df83d48c088c8fc8fd24968468253d22c27509d1f40f7bf" exitCode=0 Mar 19 09:47:28 crc kubenswrapper[4835]: I0319 09:47:28.452366 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1284d13d-bf1b-49b6-85be-954fa52a3c10" path="/var/lib/kubelet/pods/1284d13d-bf1b-49b6-85be-954fa52a3c10/volumes" Mar 19 09:47:28 crc kubenswrapper[4835]: I0319 09:47:28.454210 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-rrsfb" event={"ID":"71a2a8d4-8f41-4f53-a725-067d509fa616","Type":"ContainerDied","Data":"6725e552866d61a67df83d48c088c8fc8fd24968468253d22c27509d1f40f7bf"} Mar 19 09:47:28 crc kubenswrapper[4835]: I0319 09:47:28.828625 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 09:47:28 crc kubenswrapper[4835]: I0319 09:47:28.913816 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16f10f93-a979-44fb-92e3-3b96b5f9354e-logs\") pod \"16f10f93-a979-44fb-92e3-3b96b5f9354e\" (UID: \"16f10f93-a979-44fb-92e3-3b96b5f9354e\") " Mar 19 09:47:28 crc kubenswrapper[4835]: I0319 09:47:28.913951 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16f10f93-a979-44fb-92e3-3b96b5f9354e-scripts\") pod \"16f10f93-a979-44fb-92e3-3b96b5f9354e\" (UID: \"16f10f93-a979-44fb-92e3-3b96b5f9354e\") " Mar 19 09:47:28 crc kubenswrapper[4835]: I0319 09:47:28.914005 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16f10f93-a979-44fb-92e3-3b96b5f9354e-httpd-run\") pod \"16f10f93-a979-44fb-92e3-3b96b5f9354e\" (UID: \"16f10f93-a979-44fb-92e3-3b96b5f9354e\") " Mar 19 09:47:28 crc kubenswrapper[4835]: I0319 09:47:28.914061 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jh44p\" (UniqueName: \"kubernetes.io/projected/16f10f93-a979-44fb-92e3-3b96b5f9354e-kube-api-access-jh44p\") pod \"16f10f93-a979-44fb-92e3-3b96b5f9354e\" (UID: \"16f10f93-a979-44fb-92e3-3b96b5f9354e\") " Mar 19 09:47:28 crc kubenswrapper[4835]: I0319 09:47:28.914100 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16f10f93-a979-44fb-92e3-3b96b5f9354e-internal-tls-certs\") pod \"16f10f93-a979-44fb-92e3-3b96b5f9354e\" (UID: \"16f10f93-a979-44fb-92e3-3b96b5f9354e\") " Mar 19 09:47:28 crc kubenswrapper[4835]: I0319 09:47:28.914175 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/16f10f93-a979-44fb-92e3-3b96b5f9354e-config-data\") pod \"16f10f93-a979-44fb-92e3-3b96b5f9354e\" (UID: \"16f10f93-a979-44fb-92e3-3b96b5f9354e\") " Mar 19 09:47:28 crc kubenswrapper[4835]: I0319 09:47:28.914281 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9eba352b-5554-4046-b92a-7c10e5b6276a\") pod \"16f10f93-a979-44fb-92e3-3b96b5f9354e\" (UID: \"16f10f93-a979-44fb-92e3-3b96b5f9354e\") " Mar 19 09:47:28 crc kubenswrapper[4835]: I0319 09:47:28.914354 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16f10f93-a979-44fb-92e3-3b96b5f9354e-combined-ca-bundle\") pod \"16f10f93-a979-44fb-92e3-3b96b5f9354e\" (UID: \"16f10f93-a979-44fb-92e3-3b96b5f9354e\") " Mar 19 09:47:28 crc kubenswrapper[4835]: I0319 09:47:28.914431 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16f10f93-a979-44fb-92e3-3b96b5f9354e-logs" (OuterVolumeSpecName: "logs") pod "16f10f93-a979-44fb-92e3-3b96b5f9354e" (UID: "16f10f93-a979-44fb-92e3-3b96b5f9354e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:47:28 crc kubenswrapper[4835]: I0319 09:47:28.914502 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16f10f93-a979-44fb-92e3-3b96b5f9354e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "16f10f93-a979-44fb-92e3-3b96b5f9354e" (UID: "16f10f93-a979-44fb-92e3-3b96b5f9354e"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:47:28 crc kubenswrapper[4835]: I0319 09:47:28.914858 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16f10f93-a979-44fb-92e3-3b96b5f9354e-logs\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:28 crc kubenswrapper[4835]: I0319 09:47:28.914876 4835 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16f10f93-a979-44fb-92e3-3b96b5f9354e-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:28 crc kubenswrapper[4835]: I0319 09:47:28.923326 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16f10f93-a979-44fb-92e3-3b96b5f9354e-kube-api-access-jh44p" (OuterVolumeSpecName: "kube-api-access-jh44p") pod "16f10f93-a979-44fb-92e3-3b96b5f9354e" (UID: "16f10f93-a979-44fb-92e3-3b96b5f9354e"). InnerVolumeSpecName "kube-api-access-jh44p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:47:28 crc kubenswrapper[4835]: I0319 09:47:28.930884 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16f10f93-a979-44fb-92e3-3b96b5f9354e-scripts" (OuterVolumeSpecName: "scripts") pod "16f10f93-a979-44fb-92e3-3b96b5f9354e" (UID: "16f10f93-a979-44fb-92e3-3b96b5f9354e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:47:28 crc kubenswrapper[4835]: I0319 09:47:28.956182 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9eba352b-5554-4046-b92a-7c10e5b6276a" (OuterVolumeSpecName: "glance") pod "16f10f93-a979-44fb-92e3-3b96b5f9354e" (UID: "16f10f93-a979-44fb-92e3-3b96b5f9354e"). InnerVolumeSpecName "pvc-9eba352b-5554-4046-b92a-7c10e5b6276a". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.014540 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16f10f93-a979-44fb-92e3-3b96b5f9354e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "16f10f93-a979-44fb-92e3-3b96b5f9354e" (UID: "16f10f93-a979-44fb-92e3-3b96b5f9354e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.016693 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16f10f93-a979-44fb-92e3-3b96b5f9354e-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.016731 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jh44p\" (UniqueName: \"kubernetes.io/projected/16f10f93-a979-44fb-92e3-3b96b5f9354e-kube-api-access-jh44p\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.016766 4835 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/16f10f93-a979-44fb-92e3-3b96b5f9354e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.016796 4835 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-9eba352b-5554-4046-b92a-7c10e5b6276a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9eba352b-5554-4046-b92a-7c10e5b6276a\") on node \"crc\" " Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.023451 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16f10f93-a979-44fb-92e3-3b96b5f9354e-config-data" (OuterVolumeSpecName: "config-data") pod "16f10f93-a979-44fb-92e3-3b96b5f9354e" (UID: "16f10f93-a979-44fb-92e3-3b96b5f9354e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.051890 4835 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.052067 4835 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-9eba352b-5554-4046-b92a-7c10e5b6276a" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9eba352b-5554-4046-b92a-7c10e5b6276a") on node "crc" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.053241 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16f10f93-a979-44fb-92e3-3b96b5f9354e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16f10f93-a979-44fb-92e3-3b96b5f9354e" (UID: "16f10f93-a979-44fb-92e3-3b96b5f9354e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.119224 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16f10f93-a979-44fb-92e3-3b96b5f9354e-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.119267 4835 reconciler_common.go:293] "Volume detached for volume \"pvc-9eba352b-5554-4046-b92a-7c10e5b6276a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9eba352b-5554-4046-b92a-7c10e5b6276a\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.119277 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16f10f93-a979-44fb-92e3-3b96b5f9354e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.464913 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"16f10f93-a979-44fb-92e3-3b96b5f9354e","Type":"ContainerDied","Data":"f59ee6a2c7f4845f43efa0e5d914a9883ddd95039fefe3eb6499003ce3cbbf3a"} Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.464955 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.512076 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.521637 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.550014 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 09:47:29 crc kubenswrapper[4835]: E0319 09:47:29.550643 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16f10f93-a979-44fb-92e3-3b96b5f9354e" containerName="glance-log" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.550661 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="16f10f93-a979-44fb-92e3-3b96b5f9354e" containerName="glance-log" Mar 19 09:47:29 crc kubenswrapper[4835]: E0319 09:47:29.550672 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16f10f93-a979-44fb-92e3-3b96b5f9354e" containerName="glance-httpd" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.550680 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="16f10f93-a979-44fb-92e3-3b96b5f9354e" containerName="glance-httpd" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.551582 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="16f10f93-a979-44fb-92e3-3b96b5f9354e" containerName="glance-httpd" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.551646 4835 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="16f10f93-a979-44fb-92e3-3b96b5f9354e" containerName="glance-log" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.553706 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.557418 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.557482 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.561099 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.633480 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltvpq\" (UniqueName: \"kubernetes.io/projected/236685e9-820f-4a95-85fc-c47b24cc0a73-kube-api-access-ltvpq\") pod \"glance-default-internal-api-0\" (UID: \"236685e9-820f-4a95-85fc-c47b24cc0a73\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.633685 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/236685e9-820f-4a95-85fc-c47b24cc0a73-logs\") pod \"glance-default-internal-api-0\" (UID: \"236685e9-820f-4a95-85fc-c47b24cc0a73\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.633734 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/236685e9-820f-4a95-85fc-c47b24cc0a73-config-data\") pod \"glance-default-internal-api-0\" (UID: \"236685e9-820f-4a95-85fc-c47b24cc0a73\") " pod="openstack/glance-default-internal-api-0" Mar 
19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.633815 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/236685e9-820f-4a95-85fc-c47b24cc0a73-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"236685e9-820f-4a95-85fc-c47b24cc0a73\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.633908 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/236685e9-820f-4a95-85fc-c47b24cc0a73-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"236685e9-820f-4a95-85fc-c47b24cc0a73\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.633966 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/236685e9-820f-4a95-85fc-c47b24cc0a73-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"236685e9-820f-4a95-85fc-c47b24cc0a73\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.634006 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/236685e9-820f-4a95-85fc-c47b24cc0a73-scripts\") pod \"glance-default-internal-api-0\" (UID: \"236685e9-820f-4a95-85fc-c47b24cc0a73\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.634070 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9eba352b-5554-4046-b92a-7c10e5b6276a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9eba352b-5554-4046-b92a-7c10e5b6276a\") pod \"glance-default-internal-api-0\" (UID: 
\"236685e9-820f-4a95-85fc-c47b24cc0a73\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.735547 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/236685e9-820f-4a95-85fc-c47b24cc0a73-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"236685e9-820f-4a95-85fc-c47b24cc0a73\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.735973 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/236685e9-820f-4a95-85fc-c47b24cc0a73-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"236685e9-820f-4a95-85fc-c47b24cc0a73\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.736107 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/236685e9-820f-4a95-85fc-c47b24cc0a73-scripts\") pod \"glance-default-internal-api-0\" (UID: \"236685e9-820f-4a95-85fc-c47b24cc0a73\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.736449 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9eba352b-5554-4046-b92a-7c10e5b6276a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9eba352b-5554-4046-b92a-7c10e5b6276a\") pod \"glance-default-internal-api-0\" (UID: \"236685e9-820f-4a95-85fc-c47b24cc0a73\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.736625 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltvpq\" (UniqueName: \"kubernetes.io/projected/236685e9-820f-4a95-85fc-c47b24cc0a73-kube-api-access-ltvpq\") pod \"glance-default-internal-api-0\" (UID: 
\"236685e9-820f-4a95-85fc-c47b24cc0a73\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.736633 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/236685e9-820f-4a95-85fc-c47b24cc0a73-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"236685e9-820f-4a95-85fc-c47b24cc0a73\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.737446 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/236685e9-820f-4a95-85fc-c47b24cc0a73-logs\") pod \"glance-default-internal-api-0\" (UID: \"236685e9-820f-4a95-85fc-c47b24cc0a73\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.738122 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/236685e9-820f-4a95-85fc-c47b24cc0a73-config-data\") pod \"glance-default-internal-api-0\" (UID: \"236685e9-820f-4a95-85fc-c47b24cc0a73\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.738031 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/236685e9-820f-4a95-85fc-c47b24cc0a73-logs\") pod \"glance-default-internal-api-0\" (UID: \"236685e9-820f-4a95-85fc-c47b24cc0a73\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.740023 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/236685e9-820f-4a95-85fc-c47b24cc0a73-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"236685e9-820f-4a95-85fc-c47b24cc0a73\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:29 crc 
kubenswrapper[4835]: I0319 09:47:29.740598 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/236685e9-820f-4a95-85fc-c47b24cc0a73-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"236685e9-820f-4a95-85fc-c47b24cc0a73\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.740859 4835 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.740898 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9eba352b-5554-4046-b92a-7c10e5b6276a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9eba352b-5554-4046-b92a-7c10e5b6276a\") pod \"glance-default-internal-api-0\" (UID: \"236685e9-820f-4a95-85fc-c47b24cc0a73\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/89205a9f8fafa059e3c8d5aabb19b28af3510e2ed70a778ed91ec4c662cc8e20/globalmount\"" pod="openstack/glance-default-internal-api-0" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.743484 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/236685e9-820f-4a95-85fc-c47b24cc0a73-config-data\") pod \"glance-default-internal-api-0\" (UID: \"236685e9-820f-4a95-85fc-c47b24cc0a73\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.744086 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/236685e9-820f-4a95-85fc-c47b24cc0a73-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"236685e9-820f-4a95-85fc-c47b24cc0a73\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.750354 4835 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/236685e9-820f-4a95-85fc-c47b24cc0a73-scripts\") pod \"glance-default-internal-api-0\" (UID: \"236685e9-820f-4a95-85fc-c47b24cc0a73\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.759602 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltvpq\" (UniqueName: \"kubernetes.io/projected/236685e9-820f-4a95-85fc-c47b24cc0a73-kube-api-access-ltvpq\") pod \"glance-default-internal-api-0\" (UID: \"236685e9-820f-4a95-85fc-c47b24cc0a73\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.800084 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9eba352b-5554-4046-b92a-7c10e5b6276a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9eba352b-5554-4046-b92a-7c10e5b6276a\") pod \"glance-default-internal-api-0\" (UID: \"236685e9-820f-4a95-85fc-c47b24cc0a73\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:47:29 crc kubenswrapper[4835]: I0319 09:47:29.901625 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 09:47:30 crc kubenswrapper[4835]: I0319 09:47:30.419469 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16f10f93-a979-44fb-92e3-3b96b5f9354e" path="/var/lib/kubelet/pods/16f10f93-a979-44fb-92e3-3b96b5f9354e/volumes" Mar 19 09:47:32 crc kubenswrapper[4835]: I0319 09:47:32.061451 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-rrsfb" podUID="71a2a8d4-8f41-4f53-a725-067d509fa616" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.175:5353: connect: connection refused" Mar 19 09:47:37 crc kubenswrapper[4835]: I0319 09:47:37.061125 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-rrsfb" podUID="71a2a8d4-8f41-4f53-a725-067d509fa616" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.175:5353: connect: connection refused" Mar 19 09:47:37 crc kubenswrapper[4835]: I0319 09:47:37.244780 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-c9nxs" Mar 19 09:47:37 crc kubenswrapper[4835]: I0319 09:47:37.313209 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhm7h\" (UniqueName: \"kubernetes.io/projected/8678740e-623f-4062-8b98-eba5fe57f88e-kube-api-access-lhm7h\") pod \"8678740e-623f-4062-8b98-eba5fe57f88e\" (UID: \"8678740e-623f-4062-8b98-eba5fe57f88e\") " Mar 19 09:47:37 crc kubenswrapper[4835]: I0319 09:47:37.313363 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8678740e-623f-4062-8b98-eba5fe57f88e-credential-keys\") pod \"8678740e-623f-4062-8b98-eba5fe57f88e\" (UID: \"8678740e-623f-4062-8b98-eba5fe57f88e\") " Mar 19 09:47:37 crc kubenswrapper[4835]: I0319 09:47:37.313416 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8678740e-623f-4062-8b98-eba5fe57f88e-fernet-keys\") pod \"8678740e-623f-4062-8b98-eba5fe57f88e\" (UID: \"8678740e-623f-4062-8b98-eba5fe57f88e\") " Mar 19 09:47:37 crc kubenswrapper[4835]: I0319 09:47:37.313513 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8678740e-623f-4062-8b98-eba5fe57f88e-scripts\") pod \"8678740e-623f-4062-8b98-eba5fe57f88e\" (UID: \"8678740e-623f-4062-8b98-eba5fe57f88e\") " Mar 19 09:47:37 crc kubenswrapper[4835]: I0319 09:47:37.313566 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8678740e-623f-4062-8b98-eba5fe57f88e-config-data\") pod \"8678740e-623f-4062-8b98-eba5fe57f88e\" (UID: \"8678740e-623f-4062-8b98-eba5fe57f88e\") " Mar 19 09:47:37 crc kubenswrapper[4835]: I0319 09:47:37.313660 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8678740e-623f-4062-8b98-eba5fe57f88e-combined-ca-bundle\") pod \"8678740e-623f-4062-8b98-eba5fe57f88e\" (UID: \"8678740e-623f-4062-8b98-eba5fe57f88e\") " Mar 19 09:47:37 crc kubenswrapper[4835]: I0319 09:47:37.318825 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8678740e-623f-4062-8b98-eba5fe57f88e-scripts" (OuterVolumeSpecName: "scripts") pod "8678740e-623f-4062-8b98-eba5fe57f88e" (UID: "8678740e-623f-4062-8b98-eba5fe57f88e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:47:37 crc kubenswrapper[4835]: I0319 09:47:37.319300 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8678740e-623f-4062-8b98-eba5fe57f88e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8678740e-623f-4062-8b98-eba5fe57f88e" (UID: "8678740e-623f-4062-8b98-eba5fe57f88e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:47:37 crc kubenswrapper[4835]: I0319 09:47:37.333016 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8678740e-623f-4062-8b98-eba5fe57f88e-kube-api-access-lhm7h" (OuterVolumeSpecName: "kube-api-access-lhm7h") pod "8678740e-623f-4062-8b98-eba5fe57f88e" (UID: "8678740e-623f-4062-8b98-eba5fe57f88e"). InnerVolumeSpecName "kube-api-access-lhm7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:47:37 crc kubenswrapper[4835]: I0319 09:47:37.341479 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8678740e-623f-4062-8b98-eba5fe57f88e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "8678740e-623f-4062-8b98-eba5fe57f88e" (UID: "8678740e-623f-4062-8b98-eba5fe57f88e"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:47:37 crc kubenswrapper[4835]: I0319 09:47:37.378156 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8678740e-623f-4062-8b98-eba5fe57f88e-config-data" (OuterVolumeSpecName: "config-data") pod "8678740e-623f-4062-8b98-eba5fe57f88e" (UID: "8678740e-623f-4062-8b98-eba5fe57f88e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:47:37 crc kubenswrapper[4835]: I0319 09:47:37.385987 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8678740e-623f-4062-8b98-eba5fe57f88e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8678740e-623f-4062-8b98-eba5fe57f88e" (UID: "8678740e-623f-4062-8b98-eba5fe57f88e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:47:37 crc kubenswrapper[4835]: I0319 09:47:37.419554 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhm7h\" (UniqueName: \"kubernetes.io/projected/8678740e-623f-4062-8b98-eba5fe57f88e-kube-api-access-lhm7h\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:37 crc kubenswrapper[4835]: I0319 09:47:37.419585 4835 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8678740e-623f-4062-8b98-eba5fe57f88e-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:37 crc kubenswrapper[4835]: I0319 09:47:37.419597 4835 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8678740e-623f-4062-8b98-eba5fe57f88e-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:37 crc kubenswrapper[4835]: I0319 09:47:37.419608 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8678740e-623f-4062-8b98-eba5fe57f88e-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 
09:47:37 crc kubenswrapper[4835]: I0319 09:47:37.419620 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8678740e-623f-4062-8b98-eba5fe57f88e-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:37 crc kubenswrapper[4835]: I0319 09:47:37.419630 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8678740e-623f-4062-8b98-eba5fe57f88e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:37 crc kubenswrapper[4835]: I0319 09:47:37.564337 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-c9nxs" event={"ID":"8678740e-623f-4062-8b98-eba5fe57f88e","Type":"ContainerDied","Data":"6ce2c2ead237b71c6ccf7d118f3660b174836699d45ea073fcb12b5f84861790"} Mar 19 09:47:37 crc kubenswrapper[4835]: I0319 09:47:37.564379 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ce2c2ead237b71c6ccf7d118f3660b174836699d45ea073fcb12b5f84861790" Mar 19 09:47:37 crc kubenswrapper[4835]: I0319 09:47:37.564433 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-c9nxs" Mar 19 09:47:38 crc kubenswrapper[4835]: I0319 09:47:38.348427 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-c9nxs"] Mar 19 09:47:38 crc kubenswrapper[4835]: I0319 09:47:38.362670 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-c9nxs"] Mar 19 09:47:38 crc kubenswrapper[4835]: I0319 09:47:38.430742 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8678740e-623f-4062-8b98-eba5fe57f88e" path="/var/lib/kubelet/pods/8678740e-623f-4062-8b98-eba5fe57f88e/volumes" Mar 19 09:47:38 crc kubenswrapper[4835]: I0319 09:47:38.443363 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-xrdjb"] Mar 19 09:47:38 crc kubenswrapper[4835]: E0319 09:47:38.444126 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8678740e-623f-4062-8b98-eba5fe57f88e" containerName="keystone-bootstrap" Mar 19 09:47:38 crc kubenswrapper[4835]: I0319 09:47:38.444217 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="8678740e-623f-4062-8b98-eba5fe57f88e" containerName="keystone-bootstrap" Mar 19 09:47:38 crc kubenswrapper[4835]: I0319 09:47:38.444607 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="8678740e-623f-4062-8b98-eba5fe57f88e" containerName="keystone-bootstrap" Mar 19 09:47:38 crc kubenswrapper[4835]: I0319 09:47:38.445608 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xrdjb" Mar 19 09:47:38 crc kubenswrapper[4835]: I0319 09:47:38.449452 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 19 09:47:38 crc kubenswrapper[4835]: I0319 09:47:38.449605 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rp4k2" Mar 19 09:47:38 crc kubenswrapper[4835]: I0319 09:47:38.449748 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 19 09:47:38 crc kubenswrapper[4835]: I0319 09:47:38.449846 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 19 09:47:38 crc kubenswrapper[4835]: I0319 09:47:38.449985 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 19 09:47:38 crc kubenswrapper[4835]: I0319 09:47:38.466555 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xrdjb"] Mar 19 09:47:38 crc kubenswrapper[4835]: I0319 09:47:38.547639 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67-scripts\") pod \"keystone-bootstrap-xrdjb\" (UID: \"08dbbf0e-f67b-448a-b9b6-cbd738b6bf67\") " pod="openstack/keystone-bootstrap-xrdjb" Mar 19 09:47:38 crc kubenswrapper[4835]: I0319 09:47:38.547736 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67-combined-ca-bundle\") pod \"keystone-bootstrap-xrdjb\" (UID: \"08dbbf0e-f67b-448a-b9b6-cbd738b6bf67\") " pod="openstack/keystone-bootstrap-xrdjb" Mar 19 09:47:38 crc kubenswrapper[4835]: I0319 09:47:38.547806 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67-fernet-keys\") pod \"keystone-bootstrap-xrdjb\" (UID: \"08dbbf0e-f67b-448a-b9b6-cbd738b6bf67\") " pod="openstack/keystone-bootstrap-xrdjb" Mar 19 09:47:38 crc kubenswrapper[4835]: I0319 09:47:38.547945 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67-credential-keys\") pod \"keystone-bootstrap-xrdjb\" (UID: \"08dbbf0e-f67b-448a-b9b6-cbd738b6bf67\") " pod="openstack/keystone-bootstrap-xrdjb" Mar 19 09:47:38 crc kubenswrapper[4835]: I0319 09:47:38.547970 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67-config-data\") pod \"keystone-bootstrap-xrdjb\" (UID: \"08dbbf0e-f67b-448a-b9b6-cbd738b6bf67\") " pod="openstack/keystone-bootstrap-xrdjb" Mar 19 09:47:38 crc kubenswrapper[4835]: I0319 09:47:38.548039 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qgwv\" (UniqueName: \"kubernetes.io/projected/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67-kube-api-access-2qgwv\") pod \"keystone-bootstrap-xrdjb\" (UID: \"08dbbf0e-f67b-448a-b9b6-cbd738b6bf67\") " pod="openstack/keystone-bootstrap-xrdjb" Mar 19 09:47:38 crc kubenswrapper[4835]: I0319 09:47:38.652261 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67-fernet-keys\") pod \"keystone-bootstrap-xrdjb\" (UID: \"08dbbf0e-f67b-448a-b9b6-cbd738b6bf67\") " pod="openstack/keystone-bootstrap-xrdjb" Mar 19 09:47:38 crc kubenswrapper[4835]: I0319 09:47:38.652459 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67-credential-keys\") pod \"keystone-bootstrap-xrdjb\" (UID: \"08dbbf0e-f67b-448a-b9b6-cbd738b6bf67\") " pod="openstack/keystone-bootstrap-xrdjb" Mar 19 09:47:38 crc kubenswrapper[4835]: I0319 09:47:38.652486 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67-config-data\") pod \"keystone-bootstrap-xrdjb\" (UID: \"08dbbf0e-f67b-448a-b9b6-cbd738b6bf67\") " pod="openstack/keystone-bootstrap-xrdjb" Mar 19 09:47:38 crc kubenswrapper[4835]: I0319 09:47:38.652568 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qgwv\" (UniqueName: \"kubernetes.io/projected/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67-kube-api-access-2qgwv\") pod \"keystone-bootstrap-xrdjb\" (UID: \"08dbbf0e-f67b-448a-b9b6-cbd738b6bf67\") " pod="openstack/keystone-bootstrap-xrdjb" Mar 19 09:47:38 crc kubenswrapper[4835]: I0319 09:47:38.652630 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67-scripts\") pod \"keystone-bootstrap-xrdjb\" (UID: \"08dbbf0e-f67b-448a-b9b6-cbd738b6bf67\") " pod="openstack/keystone-bootstrap-xrdjb" Mar 19 09:47:38 crc kubenswrapper[4835]: I0319 09:47:38.652683 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67-combined-ca-bundle\") pod \"keystone-bootstrap-xrdjb\" (UID: \"08dbbf0e-f67b-448a-b9b6-cbd738b6bf67\") " pod="openstack/keystone-bootstrap-xrdjb" Mar 19 09:47:38 crc kubenswrapper[4835]: I0319 09:47:38.664801 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67-combined-ca-bundle\") pod 
\"keystone-bootstrap-xrdjb\" (UID: \"08dbbf0e-f67b-448a-b9b6-cbd738b6bf67\") " pod="openstack/keystone-bootstrap-xrdjb" Mar 19 09:47:38 crc kubenswrapper[4835]: I0319 09:47:38.680614 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67-fernet-keys\") pod \"keystone-bootstrap-xrdjb\" (UID: \"08dbbf0e-f67b-448a-b9b6-cbd738b6bf67\") " pod="openstack/keystone-bootstrap-xrdjb" Mar 19 09:47:38 crc kubenswrapper[4835]: I0319 09:47:38.683093 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67-scripts\") pod \"keystone-bootstrap-xrdjb\" (UID: \"08dbbf0e-f67b-448a-b9b6-cbd738b6bf67\") " pod="openstack/keystone-bootstrap-xrdjb" Mar 19 09:47:38 crc kubenswrapper[4835]: I0319 09:47:38.683185 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67-credential-keys\") pod \"keystone-bootstrap-xrdjb\" (UID: \"08dbbf0e-f67b-448a-b9b6-cbd738b6bf67\") " pod="openstack/keystone-bootstrap-xrdjb" Mar 19 09:47:38 crc kubenswrapper[4835]: I0319 09:47:38.685696 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67-config-data\") pod \"keystone-bootstrap-xrdjb\" (UID: \"08dbbf0e-f67b-448a-b9b6-cbd738b6bf67\") " pod="openstack/keystone-bootstrap-xrdjb" Mar 19 09:47:38 crc kubenswrapper[4835]: I0319 09:47:38.703315 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qgwv\" (UniqueName: \"kubernetes.io/projected/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67-kube-api-access-2qgwv\") pod \"keystone-bootstrap-xrdjb\" (UID: \"08dbbf0e-f67b-448a-b9b6-cbd738b6bf67\") " pod="openstack/keystone-bootstrap-xrdjb" Mar 19 09:47:38 crc 
kubenswrapper[4835]: I0319 09:47:38.780267 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xrdjb" Mar 19 09:47:40 crc kubenswrapper[4835]: I0319 09:47:40.610959 4835 generic.go:334] "Generic (PLEG): container finished" podID="cdf7c073-aa50-4b3b-aedb-7e77be63f85a" containerID="5892f1f672103d6dcb8484f3156fe6192a7117db7deab0d80d609b3dba5c4dfb" exitCode=0 Mar 19 09:47:40 crc kubenswrapper[4835]: I0319 09:47:40.611544 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-c5lzw" event={"ID":"cdf7c073-aa50-4b3b-aedb-7e77be63f85a","Type":"ContainerDied","Data":"5892f1f672103d6dcb8484f3156fe6192a7117db7deab0d80d609b3dba5c4dfb"} Mar 19 09:47:46 crc kubenswrapper[4835]: E0319 09:47:46.287687 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 19 09:47:46 crc kubenswrapper[4835]: E0319 09:47:46.288367 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4s5np,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-mw6s2_openstack(410b03bf-d2bc-4992-b467-db7947c52078): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 09:47:46 crc kubenswrapper[4835]: E0319 09:47:46.289557 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-mw6s2" 
podUID="410b03bf-d2bc-4992-b467-db7947c52078" Mar 19 09:47:46 crc kubenswrapper[4835]: E0319 09:47:46.676847 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-mw6s2" podUID="410b03bf-d2bc-4992-b467-db7947c52078" Mar 19 09:47:46 crc kubenswrapper[4835]: E0319 09:47:46.736945 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified" Mar 19 09:47:46 crc kubenswrapper[4835]: E0319 09:47:46.737092 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d 
db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qgbf2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-4jmpv_openstack(89dd21d1-34a1-4d91-a7cb-32840eda818e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 
19 09:47:46 crc kubenswrapper[4835]: E0319 09:47:46.738255 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-4jmpv" podUID="89dd21d1-34a1-4d91-a7cb-32840eda818e" Mar 19 09:47:46 crc kubenswrapper[4835]: I0319 09:47:46.744677 4835 scope.go:117] "RemoveContainer" containerID="81c6493f998adccb90b836fe660d4f766ecc0eca5eab4277fd63b5cd3fc9580f" Mar 19 09:47:46 crc kubenswrapper[4835]: I0319 09:47:46.898003 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-rrsfb" Mar 19 09:47:46 crc kubenswrapper[4835]: I0319 09:47:46.904336 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-c5lzw" Mar 19 09:47:46 crc kubenswrapper[4835]: I0319 09:47:46.963828 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdf7c073-aa50-4b3b-aedb-7e77be63f85a-combined-ca-bundle\") pod \"cdf7c073-aa50-4b3b-aedb-7e77be63f85a\" (UID: \"cdf7c073-aa50-4b3b-aedb-7e77be63f85a\") " Mar 19 09:47:46 crc kubenswrapper[4835]: I0319 09:47:46.963913 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71a2a8d4-8f41-4f53-a725-067d509fa616-dns-svc\") pod \"71a2a8d4-8f41-4f53-a725-067d509fa616\" (UID: \"71a2a8d4-8f41-4f53-a725-067d509fa616\") " Mar 19 09:47:46 crc kubenswrapper[4835]: I0319 09:47:46.963997 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71a2a8d4-8f41-4f53-a725-067d509fa616-ovsdbserver-sb\") pod \"71a2a8d4-8f41-4f53-a725-067d509fa616\" (UID: \"71a2a8d4-8f41-4f53-a725-067d509fa616\") " Mar 19 09:47:46 crc kubenswrapper[4835]: I0319 
09:47:46.964041 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mht49\" (UniqueName: \"kubernetes.io/projected/cdf7c073-aa50-4b3b-aedb-7e77be63f85a-kube-api-access-mht49\") pod \"cdf7c073-aa50-4b3b-aedb-7e77be63f85a\" (UID: \"cdf7c073-aa50-4b3b-aedb-7e77be63f85a\") " Mar 19 09:47:46 crc kubenswrapper[4835]: I0319 09:47:46.964131 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/71a2a8d4-8f41-4f53-a725-067d509fa616-dns-swift-storage-0\") pod \"71a2a8d4-8f41-4f53-a725-067d509fa616\" (UID: \"71a2a8d4-8f41-4f53-a725-067d509fa616\") " Mar 19 09:47:46 crc kubenswrapper[4835]: I0319 09:47:46.964157 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wnzg\" (UniqueName: \"kubernetes.io/projected/71a2a8d4-8f41-4f53-a725-067d509fa616-kube-api-access-8wnzg\") pod \"71a2a8d4-8f41-4f53-a725-067d509fa616\" (UID: \"71a2a8d4-8f41-4f53-a725-067d509fa616\") " Mar 19 09:47:46 crc kubenswrapper[4835]: I0319 09:47:46.964199 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71a2a8d4-8f41-4f53-a725-067d509fa616-ovsdbserver-nb\") pod \"71a2a8d4-8f41-4f53-a725-067d509fa616\" (UID: \"71a2a8d4-8f41-4f53-a725-067d509fa616\") " Mar 19 09:47:46 crc kubenswrapper[4835]: I0319 09:47:46.964229 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cdf7c073-aa50-4b3b-aedb-7e77be63f85a-config\") pod \"cdf7c073-aa50-4b3b-aedb-7e77be63f85a\" (UID: \"cdf7c073-aa50-4b3b-aedb-7e77be63f85a\") " Mar 19 09:47:46 crc kubenswrapper[4835]: I0319 09:47:46.964279 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71a2a8d4-8f41-4f53-a725-067d509fa616-config\") pod 
\"71a2a8d4-8f41-4f53-a725-067d509fa616\" (UID: \"71a2a8d4-8f41-4f53-a725-067d509fa616\") " Mar 19 09:47:46 crc kubenswrapper[4835]: I0319 09:47:46.988774 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71a2a8d4-8f41-4f53-a725-067d509fa616-kube-api-access-8wnzg" (OuterVolumeSpecName: "kube-api-access-8wnzg") pod "71a2a8d4-8f41-4f53-a725-067d509fa616" (UID: "71a2a8d4-8f41-4f53-a725-067d509fa616"). InnerVolumeSpecName "kube-api-access-8wnzg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:47:46 crc kubenswrapper[4835]: I0319 09:47:46.988859 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdf7c073-aa50-4b3b-aedb-7e77be63f85a-kube-api-access-mht49" (OuterVolumeSpecName: "kube-api-access-mht49") pod "cdf7c073-aa50-4b3b-aedb-7e77be63f85a" (UID: "cdf7c073-aa50-4b3b-aedb-7e77be63f85a"). InnerVolumeSpecName "kube-api-access-mht49". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:47:47 crc kubenswrapper[4835]: I0319 09:47:47.041567 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71a2a8d4-8f41-4f53-a725-067d509fa616-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "71a2a8d4-8f41-4f53-a725-067d509fa616" (UID: "71a2a8d4-8f41-4f53-a725-067d509fa616"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:47:47 crc kubenswrapper[4835]: I0319 09:47:47.060039 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdf7c073-aa50-4b3b-aedb-7e77be63f85a-config" (OuterVolumeSpecName: "config") pod "cdf7c073-aa50-4b3b-aedb-7e77be63f85a" (UID: "cdf7c073-aa50-4b3b-aedb-7e77be63f85a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:47:47 crc kubenswrapper[4835]: I0319 09:47:47.062063 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-rrsfb" podUID="71a2a8d4-8f41-4f53-a725-067d509fa616" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.175:5353: i/o timeout" Mar 19 09:47:47 crc kubenswrapper[4835]: I0319 09:47:47.062162 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-rrsfb" Mar 19 09:47:47 crc kubenswrapper[4835]: I0319 09:47:47.065562 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mht49\" (UniqueName: \"kubernetes.io/projected/cdf7c073-aa50-4b3b-aedb-7e77be63f85a-kube-api-access-mht49\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:47 crc kubenswrapper[4835]: I0319 09:47:47.065591 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wnzg\" (UniqueName: \"kubernetes.io/projected/71a2a8d4-8f41-4f53-a725-067d509fa616-kube-api-access-8wnzg\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:47 crc kubenswrapper[4835]: I0319 09:47:47.065601 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/cdf7c073-aa50-4b3b-aedb-7e77be63f85a-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:47 crc kubenswrapper[4835]: I0319 09:47:47.065610 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71a2a8d4-8f41-4f53-a725-067d509fa616-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:47 crc kubenswrapper[4835]: I0319 09:47:47.076075 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdf7c073-aa50-4b3b-aedb-7e77be63f85a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cdf7c073-aa50-4b3b-aedb-7e77be63f85a" (UID: "cdf7c073-aa50-4b3b-aedb-7e77be63f85a"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:47:47 crc kubenswrapper[4835]: I0319 09:47:47.089391 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71a2a8d4-8f41-4f53-a725-067d509fa616-config" (OuterVolumeSpecName: "config") pod "71a2a8d4-8f41-4f53-a725-067d509fa616" (UID: "71a2a8d4-8f41-4f53-a725-067d509fa616"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:47:47 crc kubenswrapper[4835]: I0319 09:47:47.101381 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71a2a8d4-8f41-4f53-a725-067d509fa616-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "71a2a8d4-8f41-4f53-a725-067d509fa616" (UID: "71a2a8d4-8f41-4f53-a725-067d509fa616"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:47:47 crc kubenswrapper[4835]: I0319 09:47:47.102001 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71a2a8d4-8f41-4f53-a725-067d509fa616-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "71a2a8d4-8f41-4f53-a725-067d509fa616" (UID: "71a2a8d4-8f41-4f53-a725-067d509fa616"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:47:47 crc kubenswrapper[4835]: I0319 09:47:47.107785 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71a2a8d4-8f41-4f53-a725-067d509fa616-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "71a2a8d4-8f41-4f53-a725-067d509fa616" (UID: "71a2a8d4-8f41-4f53-a725-067d509fa616"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:47:47 crc kubenswrapper[4835]: I0319 09:47:47.168363 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71a2a8d4-8f41-4f53-a725-067d509fa616-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:47 crc kubenswrapper[4835]: I0319 09:47:47.168401 4835 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/71a2a8d4-8f41-4f53-a725-067d509fa616-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:47 crc kubenswrapper[4835]: I0319 09:47:47.168411 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71a2a8d4-8f41-4f53-a725-067d509fa616-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:47 crc kubenswrapper[4835]: I0319 09:47:47.168422 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71a2a8d4-8f41-4f53-a725-067d509fa616-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:47 crc kubenswrapper[4835]: I0319 09:47:47.168431 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdf7c073-aa50-4b3b-aedb-7e77be63f85a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:47 crc kubenswrapper[4835]: I0319 09:47:47.685345 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-c5lzw" event={"ID":"cdf7c073-aa50-4b3b-aedb-7e77be63f85a","Type":"ContainerDied","Data":"ba1e6a5af3d4c239ea19b4f46d6dab354543e8df7bb78589a023164140b1400c"} Mar 19 09:47:47 crc kubenswrapper[4835]: I0319 09:47:47.685374 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-c5lzw" Mar 19 09:47:47 crc kubenswrapper[4835]: I0319 09:47:47.685395 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba1e6a5af3d4c239ea19b4f46d6dab354543e8df7bb78589a023164140b1400c" Mar 19 09:47:47 crc kubenswrapper[4835]: I0319 09:47:47.687325 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-rrsfb" event={"ID":"71a2a8d4-8f41-4f53-a725-067d509fa616","Type":"ContainerDied","Data":"8c3b4800b98a369b3a87a686b5c1694721c209e5128babafd807a662f5e39d22"} Mar 19 09:47:47 crc kubenswrapper[4835]: I0319 09:47:47.687443 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-rrsfb" Mar 19 09:47:47 crc kubenswrapper[4835]: E0319 09:47:47.699723 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified\\\"\"" pod="openstack/heat-db-sync-4jmpv" podUID="89dd21d1-34a1-4d91-a7cb-32840eda818e" Mar 19 09:47:47 crc kubenswrapper[4835]: I0319 09:47:47.771517 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-rrsfb"] Mar 19 09:47:47 crc kubenswrapper[4835]: I0319 09:47:47.781564 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-rrsfb"] Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.265756 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-sjjm8"] Mar 19 09:47:48 crc kubenswrapper[4835]: E0319 09:47:48.266613 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71a2a8d4-8f41-4f53-a725-067d509fa616" containerName="dnsmasq-dns" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.266626 4835 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="71a2a8d4-8f41-4f53-a725-067d509fa616" containerName="dnsmasq-dns" Mar 19 09:47:48 crc kubenswrapper[4835]: E0319 09:47:48.266651 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdf7c073-aa50-4b3b-aedb-7e77be63f85a" containerName="neutron-db-sync" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.266658 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdf7c073-aa50-4b3b-aedb-7e77be63f85a" containerName="neutron-db-sync" Mar 19 09:47:48 crc kubenswrapper[4835]: E0319 09:47:48.266672 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71a2a8d4-8f41-4f53-a725-067d509fa616" containerName="init" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.266678 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a2a8d4-8f41-4f53-a725-067d509fa616" containerName="init" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.266892 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdf7c073-aa50-4b3b-aedb-7e77be63f85a" containerName="neutron-db-sync" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.266908 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="71a2a8d4-8f41-4f53-a725-067d509fa616" containerName="dnsmasq-dns" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.267994 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-sjjm8" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.301340 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-sjjm8"] Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.308573 4835 scope.go:117] "RemoveContainer" containerID="e3bd3441a25b539a03723c617fe371c050379b4f47e2f7788f0d8f0fcfd94156" Mar 19 09:47:48 crc kubenswrapper[4835]: E0319 09:47:48.327866 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 19 09:47:48 crc kubenswrapper[4835]: E0319 09:47:48.328028 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropa
gation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8jsvr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-mkr8b_openstack(2161ebbe-ed84-49a9-b0b1-b74304fd8b28): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 09:47:48 crc kubenswrapper[4835]: E0319 09:47:48.329141 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-mkr8b" podUID="2161ebbe-ed84-49a9-b0b1-b74304fd8b28" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.350150 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b7bcd9fc8-zjbvb"] Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.354886 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b7bcd9fc8-zjbvb" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.359787 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.360047 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.360103 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-55tz7" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.360161 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.397561 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39529821-be7f-4bb6-9e90-3b6ba425e639-config\") pod \"dnsmasq-dns-55f844cf75-sjjm8\" (UID: \"39529821-be7f-4bb6-9e90-3b6ba425e639\") " pod="openstack/dnsmasq-dns-55f844cf75-sjjm8" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.397601 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39529821-be7f-4bb6-9e90-3b6ba425e639-dns-svc\") pod \"dnsmasq-dns-55f844cf75-sjjm8\" (UID: \"39529821-be7f-4bb6-9e90-3b6ba425e639\") " pod="openstack/dnsmasq-dns-55f844cf75-sjjm8" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.397621 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39529821-be7f-4bb6-9e90-3b6ba425e639-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-sjjm8\" (UID: \"39529821-be7f-4bb6-9e90-3b6ba425e639\") " pod="openstack/dnsmasq-dns-55f844cf75-sjjm8" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.397655 
4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzrbk\" (UniqueName: \"kubernetes.io/projected/39529821-be7f-4bb6-9e90-3b6ba425e639-kube-api-access-fzrbk\") pod \"dnsmasq-dns-55f844cf75-sjjm8\" (UID: \"39529821-be7f-4bb6-9e90-3b6ba425e639\") " pod="openstack/dnsmasq-dns-55f844cf75-sjjm8" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.399176 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39529821-be7f-4bb6-9e90-3b6ba425e639-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-sjjm8\" (UID: \"39529821-be7f-4bb6-9e90-3b6ba425e639\") " pod="openstack/dnsmasq-dns-55f844cf75-sjjm8" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.400247 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39529821-be7f-4bb6-9e90-3b6ba425e639-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-sjjm8\" (UID: \"39529821-be7f-4bb6-9e90-3b6ba425e639\") " pod="openstack/dnsmasq-dns-55f844cf75-sjjm8" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.400599 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b7bcd9fc8-zjbvb"] Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.432650 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71a2a8d4-8f41-4f53-a725-067d509fa616" path="/var/lib/kubelet/pods/71a2a8d4-8f41-4f53-a725-067d509fa616/volumes" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.479990 4835 scope.go:117] "RemoveContainer" containerID="98f188a300841fb0911040d4b33ec63992a1f3be89cc608b7de5756e45ddaebc" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.503246 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e3f9f7c2-7672-4af7-8857-76a91f8c8ce8-ovndb-tls-certs\") pod \"neutron-b7bcd9fc8-zjbvb\" (UID: \"e3f9f7c2-7672-4af7-8857-76a91f8c8ce8\") " pod="openstack/neutron-b7bcd9fc8-zjbvb" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.503373 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39529821-be7f-4bb6-9e90-3b6ba425e639-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-sjjm8\" (UID: \"39529821-be7f-4bb6-9e90-3b6ba425e639\") " pod="openstack/dnsmasq-dns-55f844cf75-sjjm8" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.503435 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e3f9f7c2-7672-4af7-8857-76a91f8c8ce8-config\") pod \"neutron-b7bcd9fc8-zjbvb\" (UID: \"e3f9f7c2-7672-4af7-8857-76a91f8c8ce8\") " pod="openstack/neutron-b7bcd9fc8-zjbvb" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.503549 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39529821-be7f-4bb6-9e90-3b6ba425e639-config\") pod \"dnsmasq-dns-55f844cf75-sjjm8\" (UID: \"39529821-be7f-4bb6-9e90-3b6ba425e639\") " pod="openstack/dnsmasq-dns-55f844cf75-sjjm8" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.503570 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39529821-be7f-4bb6-9e90-3b6ba425e639-dns-svc\") pod \"dnsmasq-dns-55f844cf75-sjjm8\" (UID: \"39529821-be7f-4bb6-9e90-3b6ba425e639\") " pod="openstack/dnsmasq-dns-55f844cf75-sjjm8" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.503587 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39529821-be7f-4bb6-9e90-3b6ba425e639-dns-swift-storage-0\") pod 
\"dnsmasq-dns-55f844cf75-sjjm8\" (UID: \"39529821-be7f-4bb6-9e90-3b6ba425e639\") " pod="openstack/dnsmasq-dns-55f844cf75-sjjm8" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.503638 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzrbk\" (UniqueName: \"kubernetes.io/projected/39529821-be7f-4bb6-9e90-3b6ba425e639-kube-api-access-fzrbk\") pod \"dnsmasq-dns-55f844cf75-sjjm8\" (UID: \"39529821-be7f-4bb6-9e90-3b6ba425e639\") " pod="openstack/dnsmasq-dns-55f844cf75-sjjm8" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.503679 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3f9f7c2-7672-4af7-8857-76a91f8c8ce8-combined-ca-bundle\") pod \"neutron-b7bcd9fc8-zjbvb\" (UID: \"e3f9f7c2-7672-4af7-8857-76a91f8c8ce8\") " pod="openstack/neutron-b7bcd9fc8-zjbvb" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.503699 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h98vw\" (UniqueName: \"kubernetes.io/projected/e3f9f7c2-7672-4af7-8857-76a91f8c8ce8-kube-api-access-h98vw\") pod \"neutron-b7bcd9fc8-zjbvb\" (UID: \"e3f9f7c2-7672-4af7-8857-76a91f8c8ce8\") " pod="openstack/neutron-b7bcd9fc8-zjbvb" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.503795 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e3f9f7c2-7672-4af7-8857-76a91f8c8ce8-httpd-config\") pod \"neutron-b7bcd9fc8-zjbvb\" (UID: \"e3f9f7c2-7672-4af7-8857-76a91f8c8ce8\") " pod="openstack/neutron-b7bcd9fc8-zjbvb" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.503821 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39529821-be7f-4bb6-9e90-3b6ba425e639-ovsdbserver-nb\") pod 
\"dnsmasq-dns-55f844cf75-sjjm8\" (UID: \"39529821-be7f-4bb6-9e90-3b6ba425e639\") " pod="openstack/dnsmasq-dns-55f844cf75-sjjm8" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.505301 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39529821-be7f-4bb6-9e90-3b6ba425e639-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-sjjm8\" (UID: \"39529821-be7f-4bb6-9e90-3b6ba425e639\") " pod="openstack/dnsmasq-dns-55f844cf75-sjjm8" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.505820 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39529821-be7f-4bb6-9e90-3b6ba425e639-config\") pod \"dnsmasq-dns-55f844cf75-sjjm8\" (UID: \"39529821-be7f-4bb6-9e90-3b6ba425e639\") " pod="openstack/dnsmasq-dns-55f844cf75-sjjm8" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.507601 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39529821-be7f-4bb6-9e90-3b6ba425e639-dns-svc\") pod \"dnsmasq-dns-55f844cf75-sjjm8\" (UID: \"39529821-be7f-4bb6-9e90-3b6ba425e639\") " pod="openstack/dnsmasq-dns-55f844cf75-sjjm8" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.516494 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39529821-be7f-4bb6-9e90-3b6ba425e639-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-sjjm8\" (UID: \"39529821-be7f-4bb6-9e90-3b6ba425e639\") " pod="openstack/dnsmasq-dns-55f844cf75-sjjm8" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.517020 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39529821-be7f-4bb6-9e90-3b6ba425e639-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-sjjm8\" (UID: \"39529821-be7f-4bb6-9e90-3b6ba425e639\") " 
pod="openstack/dnsmasq-dns-55f844cf75-sjjm8" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.545683 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzrbk\" (UniqueName: \"kubernetes.io/projected/39529821-be7f-4bb6-9e90-3b6ba425e639-kube-api-access-fzrbk\") pod \"dnsmasq-dns-55f844cf75-sjjm8\" (UID: \"39529821-be7f-4bb6-9e90-3b6ba425e639\") " pod="openstack/dnsmasq-dns-55f844cf75-sjjm8" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.590397 4835 scope.go:117] "RemoveContainer" containerID="6725e552866d61a67df83d48c088c8fc8fd24968468253d22c27509d1f40f7bf" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.607106 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e3f9f7c2-7672-4af7-8857-76a91f8c8ce8-config\") pod \"neutron-b7bcd9fc8-zjbvb\" (UID: \"e3f9f7c2-7672-4af7-8857-76a91f8c8ce8\") " pod="openstack/neutron-b7bcd9fc8-zjbvb" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.607220 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3f9f7c2-7672-4af7-8857-76a91f8c8ce8-combined-ca-bundle\") pod \"neutron-b7bcd9fc8-zjbvb\" (UID: \"e3f9f7c2-7672-4af7-8857-76a91f8c8ce8\") " pod="openstack/neutron-b7bcd9fc8-zjbvb" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.607244 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h98vw\" (UniqueName: \"kubernetes.io/projected/e3f9f7c2-7672-4af7-8857-76a91f8c8ce8-kube-api-access-h98vw\") pod \"neutron-b7bcd9fc8-zjbvb\" (UID: \"e3f9f7c2-7672-4af7-8857-76a91f8c8ce8\") " pod="openstack/neutron-b7bcd9fc8-zjbvb" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.607279 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/e3f9f7c2-7672-4af7-8857-76a91f8c8ce8-httpd-config\") pod \"neutron-b7bcd9fc8-zjbvb\" (UID: \"e3f9f7c2-7672-4af7-8857-76a91f8c8ce8\") " pod="openstack/neutron-b7bcd9fc8-zjbvb" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.607330 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3f9f7c2-7672-4af7-8857-76a91f8c8ce8-ovndb-tls-certs\") pod \"neutron-b7bcd9fc8-zjbvb\" (UID: \"e3f9f7c2-7672-4af7-8857-76a91f8c8ce8\") " pod="openstack/neutron-b7bcd9fc8-zjbvb" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.614791 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-sjjm8" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.621720 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3f9f7c2-7672-4af7-8857-76a91f8c8ce8-combined-ca-bundle\") pod \"neutron-b7bcd9fc8-zjbvb\" (UID: \"e3f9f7c2-7672-4af7-8857-76a91f8c8ce8\") " pod="openstack/neutron-b7bcd9fc8-zjbvb" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.627008 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e3f9f7c2-7672-4af7-8857-76a91f8c8ce8-config\") pod \"neutron-b7bcd9fc8-zjbvb\" (UID: \"e3f9f7c2-7672-4af7-8857-76a91f8c8ce8\") " pod="openstack/neutron-b7bcd9fc8-zjbvb" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.627587 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3f9f7c2-7672-4af7-8857-76a91f8c8ce8-ovndb-tls-certs\") pod \"neutron-b7bcd9fc8-zjbvb\" (UID: \"e3f9f7c2-7672-4af7-8857-76a91f8c8ce8\") " pod="openstack/neutron-b7bcd9fc8-zjbvb" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.627836 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"httpd-config\" (UniqueName: \"kubernetes.io/secret/e3f9f7c2-7672-4af7-8857-76a91f8c8ce8-httpd-config\") pod \"neutron-b7bcd9fc8-zjbvb\" (UID: \"e3f9f7c2-7672-4af7-8857-76a91f8c8ce8\") " pod="openstack/neutron-b7bcd9fc8-zjbvb" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.634596 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h98vw\" (UniqueName: \"kubernetes.io/projected/e3f9f7c2-7672-4af7-8857-76a91f8c8ce8-kube-api-access-h98vw\") pod \"neutron-b7bcd9fc8-zjbvb\" (UID: \"e3f9f7c2-7672-4af7-8857-76a91f8c8ce8\") " pod="openstack/neutron-b7bcd9fc8-zjbvb" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.639832 4835 scope.go:117] "RemoveContainer" containerID="d6cd8743ad520470e7f4f858eaaca7b3ca8abdf34b505b7aebf28cf03899d084" Mar 19 09:47:48 crc kubenswrapper[4835]: E0319 09:47:48.774191 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-mkr8b" podUID="2161ebbe-ed84-49a9-b0b1-b74304fd8b28" Mar 19 09:47:48 crc kubenswrapper[4835]: I0319 09:47:48.845563 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b7bcd9fc8-zjbvb" Mar 19 09:47:49 crc kubenswrapper[4835]: I0319 09:47:49.113069 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 09:47:49 crc kubenswrapper[4835]: I0319 09:47:49.197548 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xrdjb"] Mar 19 09:47:49 crc kubenswrapper[4835]: I0319 09:47:49.407241 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 09:47:49 crc kubenswrapper[4835]: I0319 09:47:49.710949 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-sjjm8"] Mar 19 09:47:49 crc kubenswrapper[4835]: I0319 09:47:49.733758 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b7bcd9fc8-zjbvb"] Mar 19 09:47:49 crc kubenswrapper[4835]: I0319 09:47:49.862004 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bf435b6c-8025-4835-9e66-dad5ca6a95c3","Type":"ContainerStarted","Data":"ac7e4c0970e84d95e6c65037a78e2addc754236d5e42a603a6f92c7a801534e1"} Mar 19 09:47:49 crc kubenswrapper[4835]: I0319 09:47:49.872236 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24665d64-cd24-4bb5-a1e4-48c734e7c525","Type":"ContainerStarted","Data":"184d89364695c9a53b6bd8b307402f0d9347bb174b02395966e18a44e6fe8a2a"} Mar 19 09:47:49 crc kubenswrapper[4835]: I0319 09:47:49.891796 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-sjjm8" event={"ID":"39529821-be7f-4bb6-9e90-3b6ba425e639","Type":"ContainerStarted","Data":"fbcdc183374d0c00688ab5d93fc32abb97157a1e691d27d397a91be1b21f9240"} Mar 19 09:47:49 crc kubenswrapper[4835]: I0319 09:47:49.893814 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2rdm5" 
event={"ID":"777ab445-6d41-49a9-b87c-67ed7503cc5b","Type":"ContainerStarted","Data":"055506f95cd38c467b5deff11a4e9a1b6c84166480d1511368ff316282f5ce74"} Mar 19 09:47:49 crc kubenswrapper[4835]: I0319 09:47:49.900091 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"236685e9-820f-4a95-85fc-c47b24cc0a73","Type":"ContainerStarted","Data":"bdb5b9aab1e7d9d20a30ba02a0bd6cb67f4b1a3d7411457aded4deac37be1795"} Mar 19 09:47:49 crc kubenswrapper[4835]: I0319 09:47:49.906840 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xrdjb" event={"ID":"08dbbf0e-f67b-448a-b9b6-cbd738b6bf67","Type":"ContainerStarted","Data":"ba0ca96b9b64c651feb894366d6f433e21922c6447b06238873253ea22ae1733"} Mar 19 09:47:49 crc kubenswrapper[4835]: I0319 09:47:49.931700 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-2rdm5" podStartSLOduration=5.496800887 podStartE2EDuration="33.931678683s" podCreationTimestamp="2026-03-19 09:47:16 +0000 UTC" firstStartedPulling="2026-03-19 09:47:18.328682996 +0000 UTC m=+1493.177281583" lastFinishedPulling="2026-03-19 09:47:46.763560792 +0000 UTC m=+1521.612159379" observedRunningTime="2026-03-19 09:47:49.926914926 +0000 UTC m=+1524.775513533" watchObservedRunningTime="2026-03-19 09:47:49.931678683 +0000 UTC m=+1524.780277270" Mar 19 09:47:50 crc kubenswrapper[4835]: I0319 09:47:50.503085 4835 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod2f82c9c2-4f39-412a-8a03-4a463b4b4bfe"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod2f82c9c2-4f39-412a-8a03-4a463b4b4bfe] : Timed out while waiting for systemd to remove kubepods-besteffort-pod2f82c9c2_4f39_412a_8a03_4a463b4b4bfe.slice" Mar 19 09:47:50 crc kubenswrapper[4835]: E0319 09:47:50.503337 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for 
[kubepods besteffort pod2f82c9c2-4f39-412a-8a03-4a463b4b4bfe] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod2f82c9c2-4f39-412a-8a03-4a463b4b4bfe] : Timed out while waiting for systemd to remove kubepods-besteffort-pod2f82c9c2_4f39_412a_8a03_4a463b4b4bfe.slice" pod="openstack/dnsmasq-dns-847c4cc679-j8vqq" podUID="2f82c9c2-4f39-412a-8a03-4a463b4b4bfe" Mar 19 09:47:50 crc kubenswrapper[4835]: I0319 09:47:50.919112 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bf435b6c-8025-4835-9e66-dad5ca6a95c3","Type":"ContainerStarted","Data":"0176626d9fd33b2d9cd7286414558f248abac68798ee6b1d280ebbb62215e2c5"} Mar 19 09:47:50 crc kubenswrapper[4835]: I0319 09:47:50.922253 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b7bcd9fc8-zjbvb" event={"ID":"e3f9f7c2-7672-4af7-8857-76a91f8c8ce8","Type":"ContainerStarted","Data":"78a3c42296f55f6367bb88e0e1ed4ef75f39fcbe1fbacc010f3b939394f468ea"} Mar 19 09:47:50 crc kubenswrapper[4835]: I0319 09:47:50.922277 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b7bcd9fc8-zjbvb" event={"ID":"e3f9f7c2-7672-4af7-8857-76a91f8c8ce8","Type":"ContainerStarted","Data":"d6970c892339bf0566200f3dc9127a92fedb590a2694385e9d175311825617b0"} Mar 19 09:47:50 crc kubenswrapper[4835]: I0319 09:47:50.922288 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b7bcd9fc8-zjbvb" event={"ID":"e3f9f7c2-7672-4af7-8857-76a91f8c8ce8","Type":"ContainerStarted","Data":"18f6a4bb6986edb573efaa6bfd395b3483d4362efb98960130b22243215b18f6"} Mar 19 09:47:50 crc kubenswrapper[4835]: I0319 09:47:50.922650 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-b7bcd9fc8-zjbvb" Mar 19 09:47:50 crc kubenswrapper[4835]: I0319 09:47:50.924270 4835 generic.go:334] "Generic (PLEG): container finished" podID="39529821-be7f-4bb6-9e90-3b6ba425e639" 
containerID="9982c40df49c0c62443658109b845382155cff7eb41e05d10c473e87e3de734a" exitCode=0 Mar 19 09:47:50 crc kubenswrapper[4835]: I0319 09:47:50.924339 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-sjjm8" event={"ID":"39529821-be7f-4bb6-9e90-3b6ba425e639","Type":"ContainerDied","Data":"9982c40df49c0c62443658109b845382155cff7eb41e05d10c473e87e3de734a"} Mar 19 09:47:50 crc kubenswrapper[4835]: I0319 09:47:50.926504 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"236685e9-820f-4a95-85fc-c47b24cc0a73","Type":"ContainerStarted","Data":"6145fee7cb692671de67b9f4efa736538c1b33e71134adacfff51490c26d1bb3"} Mar 19 09:47:50 crc kubenswrapper[4835]: I0319 09:47:50.928471 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-j8vqq" Mar 19 09:47:50 crc kubenswrapper[4835]: I0319 09:47:50.928475 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xrdjb" event={"ID":"08dbbf0e-f67b-448a-b9b6-cbd738b6bf67","Type":"ContainerStarted","Data":"2b1fa8fa12a3487a89626ace1ecaf3bb551ec78dcfe5eb5de0a4c5f7ab6c9714"} Mar 19 09:47:50 crc kubenswrapper[4835]: I0319 09:47:50.966607 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-xrdjb" podStartSLOduration=12.966586824 podStartE2EDuration="12.966586824s" podCreationTimestamp="2026-03-19 09:47:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:47:50.963856141 +0000 UTC m=+1525.812454738" watchObservedRunningTime="2026-03-19 09:47:50.966586824 +0000 UTC m=+1525.815185411" Mar 19 09:47:50 crc kubenswrapper[4835]: I0319 09:47:50.972633 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-b7bcd9fc8-zjbvb" podStartSLOduration=2.972618734 
podStartE2EDuration="2.972618734s" podCreationTimestamp="2026-03-19 09:47:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:47:50.94240852 +0000 UTC m=+1525.791007127" watchObservedRunningTime="2026-03-19 09:47:50.972618734 +0000 UTC m=+1525.821217321" Mar 19 09:47:51 crc kubenswrapper[4835]: I0319 09:47:51.043831 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-j8vqq"] Mar 19 09:47:51 crc kubenswrapper[4835]: I0319 09:47:51.063803 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-j8vqq"] Mar 19 09:47:51 crc kubenswrapper[4835]: I0319 09:47:51.134072 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5c75cdcf85-fhsqj"] Mar 19 09:47:51 crc kubenswrapper[4835]: I0319 09:47:51.141832 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5c75cdcf85-fhsqj" Mar 19 09:47:51 crc kubenswrapper[4835]: I0319 09:47:51.147192 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 19 09:47:51 crc kubenswrapper[4835]: I0319 09:47:51.147923 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 19 09:47:51 crc kubenswrapper[4835]: I0319 09:47:51.164335 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5c75cdcf85-fhsqj"] Mar 19 09:47:51 crc kubenswrapper[4835]: I0319 09:47:51.334882 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77af32b3-205b-4928-bed1-719937fee8fa-public-tls-certs\") pod \"neutron-5c75cdcf85-fhsqj\" (UID: \"77af32b3-205b-4928-bed1-719937fee8fa\") " pod="openstack/neutron-5c75cdcf85-fhsqj" Mar 19 09:47:51 crc kubenswrapper[4835]: I0319 09:47:51.334924 4835 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/77af32b3-205b-4928-bed1-719937fee8fa-config\") pod \"neutron-5c75cdcf85-fhsqj\" (UID: \"77af32b3-205b-4928-bed1-719937fee8fa\") " pod="openstack/neutron-5c75cdcf85-fhsqj" Mar 19 09:47:51 crc kubenswrapper[4835]: I0319 09:47:51.334964 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/77af32b3-205b-4928-bed1-719937fee8fa-httpd-config\") pod \"neutron-5c75cdcf85-fhsqj\" (UID: \"77af32b3-205b-4928-bed1-719937fee8fa\") " pod="openstack/neutron-5c75cdcf85-fhsqj" Mar 19 09:47:51 crc kubenswrapper[4835]: I0319 09:47:51.335405 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzr8n\" (UniqueName: \"kubernetes.io/projected/77af32b3-205b-4928-bed1-719937fee8fa-kube-api-access-gzr8n\") pod \"neutron-5c75cdcf85-fhsqj\" (UID: \"77af32b3-205b-4928-bed1-719937fee8fa\") " pod="openstack/neutron-5c75cdcf85-fhsqj" Mar 19 09:47:51 crc kubenswrapper[4835]: I0319 09:47:51.335520 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/77af32b3-205b-4928-bed1-719937fee8fa-ovndb-tls-certs\") pod \"neutron-5c75cdcf85-fhsqj\" (UID: \"77af32b3-205b-4928-bed1-719937fee8fa\") " pod="openstack/neutron-5c75cdcf85-fhsqj" Mar 19 09:47:51 crc kubenswrapper[4835]: I0319 09:47:51.335919 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77af32b3-205b-4928-bed1-719937fee8fa-combined-ca-bundle\") pod \"neutron-5c75cdcf85-fhsqj\" (UID: \"77af32b3-205b-4928-bed1-719937fee8fa\") " pod="openstack/neutron-5c75cdcf85-fhsqj" Mar 19 09:47:51 crc kubenswrapper[4835]: I0319 09:47:51.336026 4835 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77af32b3-205b-4928-bed1-719937fee8fa-internal-tls-certs\") pod \"neutron-5c75cdcf85-fhsqj\" (UID: \"77af32b3-205b-4928-bed1-719937fee8fa\") " pod="openstack/neutron-5c75cdcf85-fhsqj" Mar 19 09:47:51 crc kubenswrapper[4835]: I0319 09:47:51.437408 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77af32b3-205b-4928-bed1-719937fee8fa-combined-ca-bundle\") pod \"neutron-5c75cdcf85-fhsqj\" (UID: \"77af32b3-205b-4928-bed1-719937fee8fa\") " pod="openstack/neutron-5c75cdcf85-fhsqj" Mar 19 09:47:51 crc kubenswrapper[4835]: I0319 09:47:51.437481 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77af32b3-205b-4928-bed1-719937fee8fa-internal-tls-certs\") pod \"neutron-5c75cdcf85-fhsqj\" (UID: \"77af32b3-205b-4928-bed1-719937fee8fa\") " pod="openstack/neutron-5c75cdcf85-fhsqj" Mar 19 09:47:51 crc kubenswrapper[4835]: I0319 09:47:51.437511 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77af32b3-205b-4928-bed1-719937fee8fa-public-tls-certs\") pod \"neutron-5c75cdcf85-fhsqj\" (UID: \"77af32b3-205b-4928-bed1-719937fee8fa\") " pod="openstack/neutron-5c75cdcf85-fhsqj" Mar 19 09:47:51 crc kubenswrapper[4835]: I0319 09:47:51.437527 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/77af32b3-205b-4928-bed1-719937fee8fa-config\") pod \"neutron-5c75cdcf85-fhsqj\" (UID: \"77af32b3-205b-4928-bed1-719937fee8fa\") " pod="openstack/neutron-5c75cdcf85-fhsqj" Mar 19 09:47:51 crc kubenswrapper[4835]: I0319 09:47:51.437565 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"httpd-config\" (UniqueName: \"kubernetes.io/secret/77af32b3-205b-4928-bed1-719937fee8fa-httpd-config\") pod \"neutron-5c75cdcf85-fhsqj\" (UID: \"77af32b3-205b-4928-bed1-719937fee8fa\") " pod="openstack/neutron-5c75cdcf85-fhsqj" Mar 19 09:47:51 crc kubenswrapper[4835]: I0319 09:47:51.437611 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzr8n\" (UniqueName: \"kubernetes.io/projected/77af32b3-205b-4928-bed1-719937fee8fa-kube-api-access-gzr8n\") pod \"neutron-5c75cdcf85-fhsqj\" (UID: \"77af32b3-205b-4928-bed1-719937fee8fa\") " pod="openstack/neutron-5c75cdcf85-fhsqj" Mar 19 09:47:51 crc kubenswrapper[4835]: I0319 09:47:51.437633 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/77af32b3-205b-4928-bed1-719937fee8fa-ovndb-tls-certs\") pod \"neutron-5c75cdcf85-fhsqj\" (UID: \"77af32b3-205b-4928-bed1-719937fee8fa\") " pod="openstack/neutron-5c75cdcf85-fhsqj" Mar 19 09:47:51 crc kubenswrapper[4835]: I0319 09:47:51.450781 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/77af32b3-205b-4928-bed1-719937fee8fa-httpd-config\") pod \"neutron-5c75cdcf85-fhsqj\" (UID: \"77af32b3-205b-4928-bed1-719937fee8fa\") " pod="openstack/neutron-5c75cdcf85-fhsqj" Mar 19 09:47:51 crc kubenswrapper[4835]: I0319 09:47:51.458466 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/77af32b3-205b-4928-bed1-719937fee8fa-config\") pod \"neutron-5c75cdcf85-fhsqj\" (UID: \"77af32b3-205b-4928-bed1-719937fee8fa\") " pod="openstack/neutron-5c75cdcf85-fhsqj" Mar 19 09:47:51 crc kubenswrapper[4835]: I0319 09:47:51.461880 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77af32b3-205b-4928-bed1-719937fee8fa-combined-ca-bundle\") pod 
\"neutron-5c75cdcf85-fhsqj\" (UID: \"77af32b3-205b-4928-bed1-719937fee8fa\") " pod="openstack/neutron-5c75cdcf85-fhsqj" Mar 19 09:47:51 crc kubenswrapper[4835]: I0319 09:47:51.465146 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77af32b3-205b-4928-bed1-719937fee8fa-internal-tls-certs\") pod \"neutron-5c75cdcf85-fhsqj\" (UID: \"77af32b3-205b-4928-bed1-719937fee8fa\") " pod="openstack/neutron-5c75cdcf85-fhsqj" Mar 19 09:47:51 crc kubenswrapper[4835]: I0319 09:47:51.466049 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77af32b3-205b-4928-bed1-719937fee8fa-public-tls-certs\") pod \"neutron-5c75cdcf85-fhsqj\" (UID: \"77af32b3-205b-4928-bed1-719937fee8fa\") " pod="openstack/neutron-5c75cdcf85-fhsqj" Mar 19 09:47:51 crc kubenswrapper[4835]: I0319 09:47:51.466760 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzr8n\" (UniqueName: \"kubernetes.io/projected/77af32b3-205b-4928-bed1-719937fee8fa-kube-api-access-gzr8n\") pod \"neutron-5c75cdcf85-fhsqj\" (UID: \"77af32b3-205b-4928-bed1-719937fee8fa\") " pod="openstack/neutron-5c75cdcf85-fhsqj" Mar 19 09:47:51 crc kubenswrapper[4835]: I0319 09:47:51.467817 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/77af32b3-205b-4928-bed1-719937fee8fa-ovndb-tls-certs\") pod \"neutron-5c75cdcf85-fhsqj\" (UID: \"77af32b3-205b-4928-bed1-719937fee8fa\") " pod="openstack/neutron-5c75cdcf85-fhsqj" Mar 19 09:47:51 crc kubenswrapper[4835]: I0319 09:47:51.766369 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5c75cdcf85-fhsqj" Mar 19 09:47:51 crc kubenswrapper[4835]: I0319 09:47:51.961455 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bf435b6c-8025-4835-9e66-dad5ca6a95c3","Type":"ContainerStarted","Data":"ff636b8ace26f3a68c898e60bc6db9609e6fb220bca6425a914cba993210d0b7"} Mar 19 09:47:52 crc kubenswrapper[4835]: I0319 09:47:52.422086 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f82c9c2-4f39-412a-8a03-4a463b4b4bfe" path="/var/lib/kubelet/pods/2f82c9c2-4f39-412a-8a03-4a463b4b4bfe/volumes" Mar 19 09:47:52 crc kubenswrapper[4835]: I0319 09:47:52.719258 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=26.719239013 podStartE2EDuration="26.719239013s" podCreationTimestamp="2026-03-19 09:47:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:47:52.022098094 +0000 UTC m=+1526.870696701" watchObservedRunningTime="2026-03-19 09:47:52.719239013 +0000 UTC m=+1527.567837600" Mar 19 09:47:52 crc kubenswrapper[4835]: I0319 09:47:52.727465 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5c75cdcf85-fhsqj"] Mar 19 09:47:52 crc kubenswrapper[4835]: W0319 09:47:52.727877 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77af32b3_205b_4928_bed1_719937fee8fa.slice/crio-0606f1448058cf48375352f314d9b06cc0102936f45cfd61d0ebf7c2a253d270 WatchSource:0}: Error finding container 0606f1448058cf48375352f314d9b06cc0102936f45cfd61d0ebf7c2a253d270: Status 404 returned error can't find the container with id 0606f1448058cf48375352f314d9b06cc0102936f45cfd61d0ebf7c2a253d270 Mar 19 09:47:52 crc kubenswrapper[4835]: I0319 09:47:52.972520 4835 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c75cdcf85-fhsqj" event={"ID":"77af32b3-205b-4928-bed1-719937fee8fa","Type":"ContainerStarted","Data":"0606f1448058cf48375352f314d9b06cc0102936f45cfd61d0ebf7c2a253d270"} Mar 19 09:47:56 crc kubenswrapper[4835]: I0319 09:47:56.003305 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-sjjm8" event={"ID":"39529821-be7f-4bb6-9e90-3b6ba425e639","Type":"ContainerStarted","Data":"2e5bea20da4fe0d37d396c723aae58e858ade156b04de961ee99399f03ea7842"} Mar 19 09:47:56 crc kubenswrapper[4835]: I0319 09:47:56.003930 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-sjjm8" Mar 19 09:47:56 crc kubenswrapper[4835]: I0319 09:47:56.006153 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c75cdcf85-fhsqj" event={"ID":"77af32b3-205b-4928-bed1-719937fee8fa","Type":"ContainerStarted","Data":"913f1f440bdd8254f62a7f864525ce98361ace610a879f24685813adab987e7e"} Mar 19 09:47:56 crc kubenswrapper[4835]: I0319 09:47:56.009613 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"236685e9-820f-4a95-85fc-c47b24cc0a73","Type":"ContainerStarted","Data":"0147565355a1e16a24b8881476ad11558f8d50a68c44e87936dd43a70e0530ff"} Mar 19 09:47:56 crc kubenswrapper[4835]: I0319 09:47:56.051881 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-sjjm8" podStartSLOduration=8.051855543 podStartE2EDuration="8.051855543s" podCreationTimestamp="2026-03-19 09:47:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:47:56.027080293 +0000 UTC m=+1530.875678880" watchObservedRunningTime="2026-03-19 09:47:56.051855543 +0000 UTC m=+1530.900454130" Mar 19 09:47:56 crc kubenswrapper[4835]: I0319 09:47:56.075254 4835 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=27.075226606 podStartE2EDuration="27.075226606s" podCreationTimestamp="2026-03-19 09:47:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:47:56.052170991 +0000 UTC m=+1530.900769578" watchObservedRunningTime="2026-03-19 09:47:56.075226606 +0000 UTC m=+1530.923825193" Mar 19 09:47:56 crc kubenswrapper[4835]: I0319 09:47:56.964872 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 19 09:47:56 crc kubenswrapper[4835]: I0319 09:47:56.965242 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 19 09:47:56 crc kubenswrapper[4835]: I0319 09:47:56.965260 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 19 09:47:56 crc kubenswrapper[4835]: I0319 09:47:56.965269 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 19 09:47:57 crc kubenswrapper[4835]: I0319 09:47:57.001014 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 19 09:47:57 crc kubenswrapper[4835]: I0319 09:47:57.019241 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 19 09:47:57 crc kubenswrapper[4835]: I0319 09:47:57.025211 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24665d64-cd24-4bb5-a1e4-48c734e7c525","Type":"ContainerStarted","Data":"e35ff45d123572ad5f48d800141817f914920bd04a77a91bb7fd659079dec340"} Mar 19 09:47:57 crc kubenswrapper[4835]: I0319 09:47:57.027690 4835 generic.go:334] "Generic (PLEG): 
container finished" podID="777ab445-6d41-49a9-b87c-67ed7503cc5b" containerID="055506f95cd38c467b5deff11a4e9a1b6c84166480d1511368ff316282f5ce74" exitCode=0 Mar 19 09:47:57 crc kubenswrapper[4835]: I0319 09:47:57.027797 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2rdm5" event={"ID":"777ab445-6d41-49a9-b87c-67ed7503cc5b","Type":"ContainerDied","Data":"055506f95cd38c467b5deff11a4e9a1b6c84166480d1511368ff316282f5ce74"} Mar 19 09:47:57 crc kubenswrapper[4835]: I0319 09:47:57.030791 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c75cdcf85-fhsqj" event={"ID":"77af32b3-205b-4928-bed1-719937fee8fa","Type":"ContainerStarted","Data":"d0b2bb052eea070f42b7e38055bcfa0b4a4bde4272a400cfe87afd8664681ca2"} Mar 19 09:47:57 crc kubenswrapper[4835]: I0319 09:47:57.030880 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5c75cdcf85-fhsqj" Mar 19 09:47:57 crc kubenswrapper[4835]: I0319 09:47:57.033115 4835 generic.go:334] "Generic (PLEG): container finished" podID="08dbbf0e-f67b-448a-b9b6-cbd738b6bf67" containerID="2b1fa8fa12a3487a89626ace1ecaf3bb551ec78dcfe5eb5de0a4c5f7ab6c9714" exitCode=0 Mar 19 09:47:57 crc kubenswrapper[4835]: I0319 09:47:57.033229 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xrdjb" event={"ID":"08dbbf0e-f67b-448a-b9b6-cbd738b6bf67","Type":"ContainerDied","Data":"2b1fa8fa12a3487a89626ace1ecaf3bb551ec78dcfe5eb5de0a4c5f7ab6c9714"} Mar 19 09:47:57 crc kubenswrapper[4835]: I0319 09:47:57.113502 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5c75cdcf85-fhsqj" podStartSLOduration=6.113483705 podStartE2EDuration="6.113483705s" podCreationTimestamp="2026-03-19 09:47:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:47:57.104888096 +0000 UTC m=+1531.953486683" 
watchObservedRunningTime="2026-03-19 09:47:57.113483705 +0000 UTC m=+1531.962082292" Mar 19 09:47:58 crc kubenswrapper[4835]: I0319 09:47:58.579516 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2rdm5" Mar 19 09:47:58 crc kubenswrapper[4835]: I0319 09:47:58.589620 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xrdjb" Mar 19 09:47:58 crc kubenswrapper[4835]: I0319 09:47:58.650973 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67-fernet-keys\") pod \"08dbbf0e-f67b-448a-b9b6-cbd738b6bf67\" (UID: \"08dbbf0e-f67b-448a-b9b6-cbd738b6bf67\") " Mar 19 09:47:58 crc kubenswrapper[4835]: I0319 09:47:58.651730 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qgwv\" (UniqueName: \"kubernetes.io/projected/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67-kube-api-access-2qgwv\") pod \"08dbbf0e-f67b-448a-b9b6-cbd738b6bf67\" (UID: \"08dbbf0e-f67b-448a-b9b6-cbd738b6bf67\") " Mar 19 09:47:58 crc kubenswrapper[4835]: I0319 09:47:58.651828 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/777ab445-6d41-49a9-b87c-67ed7503cc5b-config-data\") pod \"777ab445-6d41-49a9-b87c-67ed7503cc5b\" (UID: \"777ab445-6d41-49a9-b87c-67ed7503cc5b\") " Mar 19 09:47:58 crc kubenswrapper[4835]: I0319 09:47:58.651933 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/777ab445-6d41-49a9-b87c-67ed7503cc5b-logs\") pod \"777ab445-6d41-49a9-b87c-67ed7503cc5b\" (UID: \"777ab445-6d41-49a9-b87c-67ed7503cc5b\") " Mar 19 09:47:58 crc kubenswrapper[4835]: I0319 09:47:58.651968 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67-scripts\") pod \"08dbbf0e-f67b-448a-b9b6-cbd738b6bf67\" (UID: \"08dbbf0e-f67b-448a-b9b6-cbd738b6bf67\") " Mar 19 09:47:58 crc kubenswrapper[4835]: I0319 09:47:58.651999 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/777ab445-6d41-49a9-b87c-67ed7503cc5b-scripts\") pod \"777ab445-6d41-49a9-b87c-67ed7503cc5b\" (UID: \"777ab445-6d41-49a9-b87c-67ed7503cc5b\") " Mar 19 09:47:58 crc kubenswrapper[4835]: I0319 09:47:58.652028 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/777ab445-6d41-49a9-b87c-67ed7503cc5b-combined-ca-bundle\") pod \"777ab445-6d41-49a9-b87c-67ed7503cc5b\" (UID: \"777ab445-6d41-49a9-b87c-67ed7503cc5b\") " Mar 19 09:47:58 crc kubenswrapper[4835]: I0319 09:47:58.652079 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpmrv\" (UniqueName: \"kubernetes.io/projected/777ab445-6d41-49a9-b87c-67ed7503cc5b-kube-api-access-rpmrv\") pod \"777ab445-6d41-49a9-b87c-67ed7503cc5b\" (UID: \"777ab445-6d41-49a9-b87c-67ed7503cc5b\") " Mar 19 09:47:58 crc kubenswrapper[4835]: I0319 09:47:58.652108 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67-credential-keys\") pod \"08dbbf0e-f67b-448a-b9b6-cbd738b6bf67\" (UID: \"08dbbf0e-f67b-448a-b9b6-cbd738b6bf67\") " Mar 19 09:47:58 crc kubenswrapper[4835]: I0319 09:47:58.652174 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67-combined-ca-bundle\") pod \"08dbbf0e-f67b-448a-b9b6-cbd738b6bf67\" (UID: \"08dbbf0e-f67b-448a-b9b6-cbd738b6bf67\") " Mar 19 09:47:58 crc 
kubenswrapper[4835]: I0319 09:47:58.652263 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67-config-data\") pod \"08dbbf0e-f67b-448a-b9b6-cbd738b6bf67\" (UID: \"08dbbf0e-f67b-448a-b9b6-cbd738b6bf67\") " Mar 19 09:47:58 crc kubenswrapper[4835]: I0319 09:47:58.654473 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/777ab445-6d41-49a9-b87c-67ed7503cc5b-logs" (OuterVolumeSpecName: "logs") pod "777ab445-6d41-49a9-b87c-67ed7503cc5b" (UID: "777ab445-6d41-49a9-b87c-67ed7503cc5b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:47:58 crc kubenswrapper[4835]: I0319 09:47:58.660879 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67-kube-api-access-2qgwv" (OuterVolumeSpecName: "kube-api-access-2qgwv") pod "08dbbf0e-f67b-448a-b9b6-cbd738b6bf67" (UID: "08dbbf0e-f67b-448a-b9b6-cbd738b6bf67"). InnerVolumeSpecName "kube-api-access-2qgwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:47:58 crc kubenswrapper[4835]: I0319 09:47:58.664639 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/777ab445-6d41-49a9-b87c-67ed7503cc5b-kube-api-access-rpmrv" (OuterVolumeSpecName: "kube-api-access-rpmrv") pod "777ab445-6d41-49a9-b87c-67ed7503cc5b" (UID: "777ab445-6d41-49a9-b87c-67ed7503cc5b"). InnerVolumeSpecName "kube-api-access-rpmrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:47:58 crc kubenswrapper[4835]: I0319 09:47:58.670152 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67-scripts" (OuterVolumeSpecName: "scripts") pod "08dbbf0e-f67b-448a-b9b6-cbd738b6bf67" (UID: "08dbbf0e-f67b-448a-b9b6-cbd738b6bf67"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:47:58 crc kubenswrapper[4835]: I0319 09:47:58.671226 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "08dbbf0e-f67b-448a-b9b6-cbd738b6bf67" (UID: "08dbbf0e-f67b-448a-b9b6-cbd738b6bf67"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:47:58 crc kubenswrapper[4835]: I0319 09:47:58.673683 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "08dbbf0e-f67b-448a-b9b6-cbd738b6bf67" (UID: "08dbbf0e-f67b-448a-b9b6-cbd738b6bf67"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:47:58 crc kubenswrapper[4835]: I0319 09:47:58.676867 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/777ab445-6d41-49a9-b87c-67ed7503cc5b-scripts" (OuterVolumeSpecName: "scripts") pod "777ab445-6d41-49a9-b87c-67ed7503cc5b" (UID: "777ab445-6d41-49a9-b87c-67ed7503cc5b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:47:58 crc kubenswrapper[4835]: I0319 09:47:58.697558 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/777ab445-6d41-49a9-b87c-67ed7503cc5b-config-data" (OuterVolumeSpecName: "config-data") pod "777ab445-6d41-49a9-b87c-67ed7503cc5b" (UID: "777ab445-6d41-49a9-b87c-67ed7503cc5b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:47:58 crc kubenswrapper[4835]: I0319 09:47:58.709160 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08dbbf0e-f67b-448a-b9b6-cbd738b6bf67" (UID: "08dbbf0e-f67b-448a-b9b6-cbd738b6bf67"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:47:58 crc kubenswrapper[4835]: I0319 09:47:58.723026 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/777ab445-6d41-49a9-b87c-67ed7503cc5b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "777ab445-6d41-49a9-b87c-67ed7503cc5b" (UID: "777ab445-6d41-49a9-b87c-67ed7503cc5b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:47:58 crc kubenswrapper[4835]: I0319 09:47:58.736606 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67-config-data" (OuterVolumeSpecName: "config-data") pod "08dbbf0e-f67b-448a-b9b6-cbd738b6bf67" (UID: "08dbbf0e-f67b-448a-b9b6-cbd738b6bf67"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:47:58 crc kubenswrapper[4835]: I0319 09:47:58.755875 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/777ab445-6d41-49a9-b87c-67ed7503cc5b-logs\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:58 crc kubenswrapper[4835]: I0319 09:47:58.755919 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:58 crc kubenswrapper[4835]: I0319 09:47:58.755928 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/777ab445-6d41-49a9-b87c-67ed7503cc5b-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:58 crc kubenswrapper[4835]: I0319 09:47:58.755938 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/777ab445-6d41-49a9-b87c-67ed7503cc5b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:58 crc kubenswrapper[4835]: I0319 09:47:58.755950 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpmrv\" (UniqueName: \"kubernetes.io/projected/777ab445-6d41-49a9-b87c-67ed7503cc5b-kube-api-access-rpmrv\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:58 crc kubenswrapper[4835]: I0319 09:47:58.755959 4835 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:58 crc kubenswrapper[4835]: I0319 09:47:58.755967 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:58 crc kubenswrapper[4835]: I0319 09:47:58.755975 4835 reconciler_common.go:293] "Volume 
detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:58 crc kubenswrapper[4835]: I0319 09:47:58.755984 4835 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:58 crc kubenswrapper[4835]: I0319 09:47:58.755993 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qgwv\" (UniqueName: \"kubernetes.io/projected/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67-kube-api-access-2qgwv\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:58 crc kubenswrapper[4835]: I0319 09:47:58.756000 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/777ab445-6d41-49a9-b87c-67ed7503cc5b-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.056816 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xrdjb" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.056810 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xrdjb" event={"ID":"08dbbf0e-f67b-448a-b9b6-cbd738b6bf67","Type":"ContainerDied","Data":"ba0ca96b9b64c651feb894366d6f433e21922c6447b06238873253ea22ae1733"} Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.057224 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba0ca96b9b64c651feb894366d6f433e21922c6447b06238873253ea22ae1733" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.067249 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2rdm5" event={"ID":"777ab445-6d41-49a9-b87c-67ed7503cc5b","Type":"ContainerDied","Data":"d9954561d1d9003dbeb467c1d40a367ed737101426ad23d8050d95c2edeab59c"} Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.067297 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9954561d1d9003dbeb467c1d40a367ed737101426ad23d8050d95c2edeab59c" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.067312 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-2rdm5" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.253715 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-f8d94d56-dsx7s"] Mar 19 09:47:59 crc kubenswrapper[4835]: E0319 09:47:59.254246 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08dbbf0e-f67b-448a-b9b6-cbd738b6bf67" containerName="keystone-bootstrap" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.254263 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="08dbbf0e-f67b-448a-b9b6-cbd738b6bf67" containerName="keystone-bootstrap" Mar 19 09:47:59 crc kubenswrapper[4835]: E0319 09:47:59.254293 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="777ab445-6d41-49a9-b87c-67ed7503cc5b" containerName="placement-db-sync" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.254299 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="777ab445-6d41-49a9-b87c-67ed7503cc5b" containerName="placement-db-sync" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.254488 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="777ab445-6d41-49a9-b87c-67ed7503cc5b" containerName="placement-db-sync" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.254503 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="08dbbf0e-f67b-448a-b9b6-cbd738b6bf67" containerName="keystone-bootstrap" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.255318 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-f8d94d56-dsx7s" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.265410 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.265608 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.265725 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rp4k2" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.265865 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.265731 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.269906 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.288209 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-f8d94d56-dsx7s"] Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.327440 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-cbc57b49b-x9wm9"] Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.329199 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-cbc57b49b-x9wm9" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.332125 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.332932 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.333120 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.333263 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-n69wh" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.333406 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.369412 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/71b8fa5f-56e9-49ee-b55d-6136c30a460d-internal-tls-certs\") pod \"keystone-f8d94d56-dsx7s\" (UID: \"71b8fa5f-56e9-49ee-b55d-6136c30a460d\") " pod="openstack/keystone-f8d94d56-dsx7s" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.369498 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71b8fa5f-56e9-49ee-b55d-6136c30a460d-combined-ca-bundle\") pod \"keystone-f8d94d56-dsx7s\" (UID: \"71b8fa5f-56e9-49ee-b55d-6136c30a460d\") " pod="openstack/keystone-f8d94d56-dsx7s" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.369566 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71b8fa5f-56e9-49ee-b55d-6136c30a460d-config-data\") pod 
\"keystone-f8d94d56-dsx7s\" (UID: \"71b8fa5f-56e9-49ee-b55d-6136c30a460d\") " pod="openstack/keystone-f8d94d56-dsx7s" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.369603 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/71b8fa5f-56e9-49ee-b55d-6136c30a460d-public-tls-certs\") pod \"keystone-f8d94d56-dsx7s\" (UID: \"71b8fa5f-56e9-49ee-b55d-6136c30a460d\") " pod="openstack/keystone-f8d94d56-dsx7s" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.369637 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f27q\" (UniqueName: \"kubernetes.io/projected/71b8fa5f-56e9-49ee-b55d-6136c30a460d-kube-api-access-5f27q\") pod \"keystone-f8d94d56-dsx7s\" (UID: \"71b8fa5f-56e9-49ee-b55d-6136c30a460d\") " pod="openstack/keystone-f8d94d56-dsx7s" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.369656 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/71b8fa5f-56e9-49ee-b55d-6136c30a460d-fernet-keys\") pod \"keystone-f8d94d56-dsx7s\" (UID: \"71b8fa5f-56e9-49ee-b55d-6136c30a460d\") " pod="openstack/keystone-f8d94d56-dsx7s" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.369696 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/71b8fa5f-56e9-49ee-b55d-6136c30a460d-credential-keys\") pod \"keystone-f8d94d56-dsx7s\" (UID: \"71b8fa5f-56e9-49ee-b55d-6136c30a460d\") " pod="openstack/keystone-f8d94d56-dsx7s" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.369734 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71b8fa5f-56e9-49ee-b55d-6136c30a460d-scripts\") pod 
\"keystone-f8d94d56-dsx7s\" (UID: \"71b8fa5f-56e9-49ee-b55d-6136c30a460d\") " pod="openstack/keystone-f8d94d56-dsx7s" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.398843 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-cbc57b49b-x9wm9"] Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.471346 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71b8fa5f-56e9-49ee-b55d-6136c30a460d-config-data\") pod \"keystone-f8d94d56-dsx7s\" (UID: \"71b8fa5f-56e9-49ee-b55d-6136c30a460d\") " pod="openstack/keystone-f8d94d56-dsx7s" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.471403 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-scripts\") pod \"placement-cbc57b49b-x9wm9\" (UID: \"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed\") " pod="openstack/placement-cbc57b49b-x9wm9" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.471438 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-config-data\") pod \"placement-cbc57b49b-x9wm9\" (UID: \"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed\") " pod="openstack/placement-cbc57b49b-x9wm9" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.471457 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-public-tls-certs\") pod \"placement-cbc57b49b-x9wm9\" (UID: \"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed\") " pod="openstack/placement-cbc57b49b-x9wm9" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.471485 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-ckdtd\" (UniqueName: \"kubernetes.io/projected/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-kube-api-access-ckdtd\") pod \"placement-cbc57b49b-x9wm9\" (UID: \"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed\") " pod="openstack/placement-cbc57b49b-x9wm9" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.471509 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/71b8fa5f-56e9-49ee-b55d-6136c30a460d-public-tls-certs\") pod \"keystone-f8d94d56-dsx7s\" (UID: \"71b8fa5f-56e9-49ee-b55d-6136c30a460d\") " pod="openstack/keystone-f8d94d56-dsx7s" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.471553 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5f27q\" (UniqueName: \"kubernetes.io/projected/71b8fa5f-56e9-49ee-b55d-6136c30a460d-kube-api-access-5f27q\") pod \"keystone-f8d94d56-dsx7s\" (UID: \"71b8fa5f-56e9-49ee-b55d-6136c30a460d\") " pod="openstack/keystone-f8d94d56-dsx7s" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.471578 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/71b8fa5f-56e9-49ee-b55d-6136c30a460d-fernet-keys\") pod \"keystone-f8d94d56-dsx7s\" (UID: \"71b8fa5f-56e9-49ee-b55d-6136c30a460d\") " pod="openstack/keystone-f8d94d56-dsx7s" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.471634 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/71b8fa5f-56e9-49ee-b55d-6136c30a460d-credential-keys\") pod \"keystone-f8d94d56-dsx7s\" (UID: \"71b8fa5f-56e9-49ee-b55d-6136c30a460d\") " pod="openstack/keystone-f8d94d56-dsx7s" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.471683 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/71b8fa5f-56e9-49ee-b55d-6136c30a460d-scripts\") pod \"keystone-f8d94d56-dsx7s\" (UID: \"71b8fa5f-56e9-49ee-b55d-6136c30a460d\") " pod="openstack/keystone-f8d94d56-dsx7s" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.471713 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/71b8fa5f-56e9-49ee-b55d-6136c30a460d-internal-tls-certs\") pod \"keystone-f8d94d56-dsx7s\" (UID: \"71b8fa5f-56e9-49ee-b55d-6136c30a460d\") " pod="openstack/keystone-f8d94d56-dsx7s" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.471776 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-logs\") pod \"placement-cbc57b49b-x9wm9\" (UID: \"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed\") " pod="openstack/placement-cbc57b49b-x9wm9" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.471830 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71b8fa5f-56e9-49ee-b55d-6136c30a460d-combined-ca-bundle\") pod \"keystone-f8d94d56-dsx7s\" (UID: \"71b8fa5f-56e9-49ee-b55d-6136c30a460d\") " pod="openstack/keystone-f8d94d56-dsx7s" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.471858 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-internal-tls-certs\") pod \"placement-cbc57b49b-x9wm9\" (UID: \"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed\") " pod="openstack/placement-cbc57b49b-x9wm9" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.471896 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-combined-ca-bundle\") pod \"placement-cbc57b49b-x9wm9\" (UID: \"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed\") " pod="openstack/placement-cbc57b49b-x9wm9" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.479441 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/71b8fa5f-56e9-49ee-b55d-6136c30a460d-internal-tls-certs\") pod \"keystone-f8d94d56-dsx7s\" (UID: \"71b8fa5f-56e9-49ee-b55d-6136c30a460d\") " pod="openstack/keystone-f8d94d56-dsx7s" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.487609 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71b8fa5f-56e9-49ee-b55d-6136c30a460d-config-data\") pod \"keystone-f8d94d56-dsx7s\" (UID: \"71b8fa5f-56e9-49ee-b55d-6136c30a460d\") " pod="openstack/keystone-f8d94d56-dsx7s" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.488636 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/71b8fa5f-56e9-49ee-b55d-6136c30a460d-credential-keys\") pod \"keystone-f8d94d56-dsx7s\" (UID: \"71b8fa5f-56e9-49ee-b55d-6136c30a460d\") " pod="openstack/keystone-f8d94d56-dsx7s" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.489182 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/71b8fa5f-56e9-49ee-b55d-6136c30a460d-public-tls-certs\") pod \"keystone-f8d94d56-dsx7s\" (UID: \"71b8fa5f-56e9-49ee-b55d-6136c30a460d\") " pod="openstack/keystone-f8d94d56-dsx7s" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.491357 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/71b8fa5f-56e9-49ee-b55d-6136c30a460d-fernet-keys\") pod \"keystone-f8d94d56-dsx7s\" (UID: 
\"71b8fa5f-56e9-49ee-b55d-6136c30a460d\") " pod="openstack/keystone-f8d94d56-dsx7s" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.495728 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71b8fa5f-56e9-49ee-b55d-6136c30a460d-combined-ca-bundle\") pod \"keystone-f8d94d56-dsx7s\" (UID: \"71b8fa5f-56e9-49ee-b55d-6136c30a460d\") " pod="openstack/keystone-f8d94d56-dsx7s" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.498151 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71b8fa5f-56e9-49ee-b55d-6136c30a460d-scripts\") pod \"keystone-f8d94d56-dsx7s\" (UID: \"71b8fa5f-56e9-49ee-b55d-6136c30a460d\") " pod="openstack/keystone-f8d94d56-dsx7s" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.510407 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f27q\" (UniqueName: \"kubernetes.io/projected/71b8fa5f-56e9-49ee-b55d-6136c30a460d-kube-api-access-5f27q\") pod \"keystone-f8d94d56-dsx7s\" (UID: \"71b8fa5f-56e9-49ee-b55d-6136c30a460d\") " pod="openstack/keystone-f8d94d56-dsx7s" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.580225 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-logs\") pod \"placement-cbc57b49b-x9wm9\" (UID: \"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed\") " pod="openstack/placement-cbc57b49b-x9wm9" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.580318 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-internal-tls-certs\") pod \"placement-cbc57b49b-x9wm9\" (UID: \"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed\") " pod="openstack/placement-cbc57b49b-x9wm9" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 
09:47:59.580374 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-combined-ca-bundle\") pod \"placement-cbc57b49b-x9wm9\" (UID: \"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed\") " pod="openstack/placement-cbc57b49b-x9wm9" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.580437 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-scripts\") pod \"placement-cbc57b49b-x9wm9\" (UID: \"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed\") " pod="openstack/placement-cbc57b49b-x9wm9" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.580458 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-config-data\") pod \"placement-cbc57b49b-x9wm9\" (UID: \"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed\") " pod="openstack/placement-cbc57b49b-x9wm9" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.580483 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-public-tls-certs\") pod \"placement-cbc57b49b-x9wm9\" (UID: \"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed\") " pod="openstack/placement-cbc57b49b-x9wm9" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.580505 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckdtd\" (UniqueName: \"kubernetes.io/projected/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-kube-api-access-ckdtd\") pod \"placement-cbc57b49b-x9wm9\" (UID: \"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed\") " pod="openstack/placement-cbc57b49b-x9wm9" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.581127 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-logs\") pod \"placement-cbc57b49b-x9wm9\" (UID: \"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed\") " pod="openstack/placement-cbc57b49b-x9wm9" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.582033 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f8d94d56-dsx7s" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.590488 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-internal-tls-certs\") pod \"placement-cbc57b49b-x9wm9\" (UID: \"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed\") " pod="openstack/placement-cbc57b49b-x9wm9" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.598192 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-scripts\") pod \"placement-cbc57b49b-x9wm9\" (UID: \"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed\") " pod="openstack/placement-cbc57b49b-x9wm9" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.607399 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-combined-ca-bundle\") pod \"placement-cbc57b49b-x9wm9\" (UID: \"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed\") " pod="openstack/placement-cbc57b49b-x9wm9" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.612222 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckdtd\" (UniqueName: \"kubernetes.io/projected/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-kube-api-access-ckdtd\") pod \"placement-cbc57b49b-x9wm9\" (UID: \"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed\") " pod="openstack/placement-cbc57b49b-x9wm9" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.612251 4835 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-public-tls-certs\") pod \"placement-cbc57b49b-x9wm9\" (UID: \"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed\") " pod="openstack/placement-cbc57b49b-x9wm9" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.615498 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-config-data\") pod \"placement-cbc57b49b-x9wm9\" (UID: \"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed\") " pod="openstack/placement-cbc57b49b-x9wm9" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.674266 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-cbc57b49b-x9wm9" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.678786 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-74ff56c748-fkbqq"] Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.680637 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-74ff56c748-fkbqq" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.711403 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-74ff56c748-fkbqq"] Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.794873 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c130368-e60e-45c8-ad42-45b89cf92cdb-combined-ca-bundle\") pod \"placement-74ff56c748-fkbqq\" (UID: \"7c130368-e60e-45c8-ad42-45b89cf92cdb\") " pod="openstack/placement-74ff56c748-fkbqq" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.795542 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c130368-e60e-45c8-ad42-45b89cf92cdb-scripts\") pod \"placement-74ff56c748-fkbqq\" (UID: \"7c130368-e60e-45c8-ad42-45b89cf92cdb\") " pod="openstack/placement-74ff56c748-fkbqq" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.795716 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgd79\" (UniqueName: \"kubernetes.io/projected/7c130368-e60e-45c8-ad42-45b89cf92cdb-kube-api-access-jgd79\") pod \"placement-74ff56c748-fkbqq\" (UID: \"7c130368-e60e-45c8-ad42-45b89cf92cdb\") " pod="openstack/placement-74ff56c748-fkbqq" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.796108 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c130368-e60e-45c8-ad42-45b89cf92cdb-config-data\") pod \"placement-74ff56c748-fkbqq\" (UID: \"7c130368-e60e-45c8-ad42-45b89cf92cdb\") " pod="openstack/placement-74ff56c748-fkbqq" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.796208 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c130368-e60e-45c8-ad42-45b89cf92cdb-internal-tls-certs\") pod \"placement-74ff56c748-fkbqq\" (UID: \"7c130368-e60e-45c8-ad42-45b89cf92cdb\") " pod="openstack/placement-74ff56c748-fkbqq" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.796283 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c130368-e60e-45c8-ad42-45b89cf92cdb-logs\") pod \"placement-74ff56c748-fkbqq\" (UID: \"7c130368-e60e-45c8-ad42-45b89cf92cdb\") " pod="openstack/placement-74ff56c748-fkbqq" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.796400 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c130368-e60e-45c8-ad42-45b89cf92cdb-public-tls-certs\") pod \"placement-74ff56c748-fkbqq\" (UID: \"7c130368-e60e-45c8-ad42-45b89cf92cdb\") " pod="openstack/placement-74ff56c748-fkbqq" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.898550 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c130368-e60e-45c8-ad42-45b89cf92cdb-public-tls-certs\") pod \"placement-74ff56c748-fkbqq\" (UID: \"7c130368-e60e-45c8-ad42-45b89cf92cdb\") " pod="openstack/placement-74ff56c748-fkbqq" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.898664 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c130368-e60e-45c8-ad42-45b89cf92cdb-combined-ca-bundle\") pod \"placement-74ff56c748-fkbqq\" (UID: \"7c130368-e60e-45c8-ad42-45b89cf92cdb\") " pod="openstack/placement-74ff56c748-fkbqq" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.898720 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7c130368-e60e-45c8-ad42-45b89cf92cdb-scripts\") pod \"placement-74ff56c748-fkbqq\" (UID: \"7c130368-e60e-45c8-ad42-45b89cf92cdb\") " pod="openstack/placement-74ff56c748-fkbqq" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.898805 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgd79\" (UniqueName: \"kubernetes.io/projected/7c130368-e60e-45c8-ad42-45b89cf92cdb-kube-api-access-jgd79\") pod \"placement-74ff56c748-fkbqq\" (UID: \"7c130368-e60e-45c8-ad42-45b89cf92cdb\") " pod="openstack/placement-74ff56c748-fkbqq" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.898941 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c130368-e60e-45c8-ad42-45b89cf92cdb-config-data\") pod \"placement-74ff56c748-fkbqq\" (UID: \"7c130368-e60e-45c8-ad42-45b89cf92cdb\") " pod="openstack/placement-74ff56c748-fkbqq" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.898998 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c130368-e60e-45c8-ad42-45b89cf92cdb-internal-tls-certs\") pod \"placement-74ff56c748-fkbqq\" (UID: \"7c130368-e60e-45c8-ad42-45b89cf92cdb\") " pod="openstack/placement-74ff56c748-fkbqq" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.899040 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c130368-e60e-45c8-ad42-45b89cf92cdb-logs\") pod \"placement-74ff56c748-fkbqq\" (UID: \"7c130368-e60e-45c8-ad42-45b89cf92cdb\") " pod="openstack/placement-74ff56c748-fkbqq" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.899573 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c130368-e60e-45c8-ad42-45b89cf92cdb-logs\") pod \"placement-74ff56c748-fkbqq\" (UID: 
\"7c130368-e60e-45c8-ad42-45b89cf92cdb\") " pod="openstack/placement-74ff56c748-fkbqq" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.903163 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.903237 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.903249 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.903259 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.913034 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c130368-e60e-45c8-ad42-45b89cf92cdb-config-data\") pod \"placement-74ff56c748-fkbqq\" (UID: \"7c130368-e60e-45c8-ad42-45b89cf92cdb\") " pod="openstack/placement-74ff56c748-fkbqq" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.914706 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c130368-e60e-45c8-ad42-45b89cf92cdb-public-tls-certs\") pod \"placement-74ff56c748-fkbqq\" (UID: \"7c130368-e60e-45c8-ad42-45b89cf92cdb\") " pod="openstack/placement-74ff56c748-fkbqq" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.915000 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c130368-e60e-45c8-ad42-45b89cf92cdb-scripts\") pod \"placement-74ff56c748-fkbqq\" (UID: \"7c130368-e60e-45c8-ad42-45b89cf92cdb\") " pod="openstack/placement-74ff56c748-fkbqq" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.926269 4835 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c130368-e60e-45c8-ad42-45b89cf92cdb-combined-ca-bundle\") pod \"placement-74ff56c748-fkbqq\" (UID: \"7c130368-e60e-45c8-ad42-45b89cf92cdb\") " pod="openstack/placement-74ff56c748-fkbqq" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.926376 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgd79\" (UniqueName: \"kubernetes.io/projected/7c130368-e60e-45c8-ad42-45b89cf92cdb-kube-api-access-jgd79\") pod \"placement-74ff56c748-fkbqq\" (UID: \"7c130368-e60e-45c8-ad42-45b89cf92cdb\") " pod="openstack/placement-74ff56c748-fkbqq" Mar 19 09:47:59 crc kubenswrapper[4835]: I0319 09:47:59.969995 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c130368-e60e-45c8-ad42-45b89cf92cdb-internal-tls-certs\") pod \"placement-74ff56c748-fkbqq\" (UID: \"7c130368-e60e-45c8-ad42-45b89cf92cdb\") " pod="openstack/placement-74ff56c748-fkbqq" Mar 19 09:48:00 crc kubenswrapper[4835]: I0319 09:48:00.007125 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 19 09:48:00 crc kubenswrapper[4835]: I0319 09:48:00.019790 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 19 09:48:00 crc kubenswrapper[4835]: I0319 09:48:00.037718 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-74ff56c748-fkbqq" Mar 19 09:48:00 crc kubenswrapper[4835]: I0319 09:48:00.177708 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565228-zvnlw"] Mar 19 09:48:00 crc kubenswrapper[4835]: I0319 09:48:00.179991 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565228-zvnlw" Mar 19 09:48:00 crc kubenswrapper[4835]: I0319 09:48:00.182414 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 09:48:00 crc kubenswrapper[4835]: I0319 09:48:00.184811 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 09:48:00 crc kubenswrapper[4835]: I0319 09:48:00.185156 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 09:48:00 crc kubenswrapper[4835]: I0319 09:48:00.187266 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565228-zvnlw"] Mar 19 09:48:00 crc kubenswrapper[4835]: I0319 09:48:00.307305 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkvwz\" (UniqueName: \"kubernetes.io/projected/c5787acc-561d-4cca-b822-bc7ae5cde5ea-kube-api-access-lkvwz\") pod \"auto-csr-approver-29565228-zvnlw\" (UID: \"c5787acc-561d-4cca-b822-bc7ae5cde5ea\") " pod="openshift-infra/auto-csr-approver-29565228-zvnlw" Mar 19 09:48:00 crc kubenswrapper[4835]: I0319 09:48:00.408871 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkvwz\" (UniqueName: \"kubernetes.io/projected/c5787acc-561d-4cca-b822-bc7ae5cde5ea-kube-api-access-lkvwz\") pod \"auto-csr-approver-29565228-zvnlw\" (UID: \"c5787acc-561d-4cca-b822-bc7ae5cde5ea\") " pod="openshift-infra/auto-csr-approver-29565228-zvnlw" Mar 19 09:48:00 crc kubenswrapper[4835]: I0319 09:48:00.441667 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkvwz\" (UniqueName: \"kubernetes.io/projected/c5787acc-561d-4cca-b822-bc7ae5cde5ea-kube-api-access-lkvwz\") pod \"auto-csr-approver-29565228-zvnlw\" (UID: \"c5787acc-561d-4cca-b822-bc7ae5cde5ea\") " 
pod="openshift-infra/auto-csr-approver-29565228-zvnlw" Mar 19 09:48:00 crc kubenswrapper[4835]: I0319 09:48:00.499524 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565228-zvnlw" Mar 19 09:48:01 crc kubenswrapper[4835]: I0319 09:48:01.698701 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 19 09:48:01 crc kubenswrapper[4835]: I0319 09:48:01.699335 4835 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 09:48:01 crc kubenswrapper[4835]: I0319 09:48:01.710278 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 19 09:48:02 crc kubenswrapper[4835]: I0319 09:48:02.660603 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 19 09:48:02 crc kubenswrapper[4835]: I0319 09:48:02.661074 4835 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 09:48:02 crc kubenswrapper[4835]: I0319 09:48:02.743563 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 19 09:48:03 crc kubenswrapper[4835]: I0319 09:48:03.617446 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-sjjm8" Mar 19 09:48:03 crc kubenswrapper[4835]: I0319 09:48:03.713208 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-lwf7t"] Mar 19 09:48:03 crc kubenswrapper[4835]: I0319 09:48:03.713444 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-lwf7t" podUID="fe4973bb-df39-4221-bbd0-f637a10dd5b0" containerName="dnsmasq-dns" containerID="cri-o://ed6522297857d17fb514001daaa8e6e2772e749cc5a3b4d0dddca4097ce0aaf1" gracePeriod=10 Mar 19 09:48:04 crc 
kubenswrapper[4835]: I0319 09:48:04.160202 4835 generic.go:334] "Generic (PLEG): container finished" podID="fe4973bb-df39-4221-bbd0-f637a10dd5b0" containerID="ed6522297857d17fb514001daaa8e6e2772e749cc5a3b4d0dddca4097ce0aaf1" exitCode=0 Mar 19 09:48:04 crc kubenswrapper[4835]: I0319 09:48:04.160256 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-lwf7t" event={"ID":"fe4973bb-df39-4221-bbd0-f637a10dd5b0","Type":"ContainerDied","Data":"ed6522297857d17fb514001daaa8e6e2772e749cc5a3b4d0dddca4097ce0aaf1"} Mar 19 09:48:05 crc kubenswrapper[4835]: I0319 09:48:05.838185 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-lwf7t" Mar 19 09:48:05 crc kubenswrapper[4835]: I0319 09:48:05.993649 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe4973bb-df39-4221-bbd0-f637a10dd5b0-ovsdbserver-nb\") pod \"fe4973bb-df39-4221-bbd0-f637a10dd5b0\" (UID: \"fe4973bb-df39-4221-bbd0-f637a10dd5b0\") " Mar 19 09:48:05 crc kubenswrapper[4835]: I0319 09:48:05.993996 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe4973bb-df39-4221-bbd0-f637a10dd5b0-ovsdbserver-sb\") pod \"fe4973bb-df39-4221-bbd0-f637a10dd5b0\" (UID: \"fe4973bb-df39-4221-bbd0-f637a10dd5b0\") " Mar 19 09:48:05 crc kubenswrapper[4835]: I0319 09:48:05.994042 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe4973bb-df39-4221-bbd0-f637a10dd5b0-dns-svc\") pod \"fe4973bb-df39-4221-bbd0-f637a10dd5b0\" (UID: \"fe4973bb-df39-4221-bbd0-f637a10dd5b0\") " Mar 19 09:48:05 crc kubenswrapper[4835]: I0319 09:48:05.994081 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/fe4973bb-df39-4221-bbd0-f637a10dd5b0-config\") pod \"fe4973bb-df39-4221-bbd0-f637a10dd5b0\" (UID: \"fe4973bb-df39-4221-bbd0-f637a10dd5b0\") " Mar 19 09:48:05 crc kubenswrapper[4835]: I0319 09:48:05.994125 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4wng\" (UniqueName: \"kubernetes.io/projected/fe4973bb-df39-4221-bbd0-f637a10dd5b0-kube-api-access-f4wng\") pod \"fe4973bb-df39-4221-bbd0-f637a10dd5b0\" (UID: \"fe4973bb-df39-4221-bbd0-f637a10dd5b0\") " Mar 19 09:48:05 crc kubenswrapper[4835]: I0319 09:48:05.994175 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe4973bb-df39-4221-bbd0-f637a10dd5b0-dns-swift-storage-0\") pod \"fe4973bb-df39-4221-bbd0-f637a10dd5b0\" (UID: \"fe4973bb-df39-4221-bbd0-f637a10dd5b0\") " Mar 19 09:48:06 crc kubenswrapper[4835]: I0319 09:48:06.035244 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe4973bb-df39-4221-bbd0-f637a10dd5b0-kube-api-access-f4wng" (OuterVolumeSpecName: "kube-api-access-f4wng") pod "fe4973bb-df39-4221-bbd0-f637a10dd5b0" (UID: "fe4973bb-df39-4221-bbd0-f637a10dd5b0"). InnerVolumeSpecName "kube-api-access-f4wng". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:48:06 crc kubenswrapper[4835]: I0319 09:48:06.097610 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4wng\" (UniqueName: \"kubernetes.io/projected/fe4973bb-df39-4221-bbd0-f637a10dd5b0-kube-api-access-f4wng\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:06 crc kubenswrapper[4835]: I0319 09:48:06.192902 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-74ff56c748-fkbqq"] Mar 19 09:48:06 crc kubenswrapper[4835]: I0319 09:48:06.202453 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-lwf7t" event={"ID":"fe4973bb-df39-4221-bbd0-f637a10dd5b0","Type":"ContainerDied","Data":"1dc58df72f68eeb61c007fc15e07f18577ce9c8e16ce4d2db078538f01b1b602"} Mar 19 09:48:06 crc kubenswrapper[4835]: I0319 09:48:06.202581 4835 scope.go:117] "RemoveContainer" containerID="ed6522297857d17fb514001daaa8e6e2772e749cc5a3b4d0dddca4097ce0aaf1" Mar 19 09:48:06 crc kubenswrapper[4835]: I0319 09:48:06.202536 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-lwf7t" Mar 19 09:48:06 crc kubenswrapper[4835]: I0319 09:48:06.241223 4835 scope.go:117] "RemoveContainer" containerID="940fc3ad402a4fa86a009f90dfa613197e63ae02d9eae1b3fe25fbe07494cc54" Mar 19 09:48:06 crc kubenswrapper[4835]: I0319 09:48:06.286714 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565228-zvnlw"] Mar 19 09:48:06 crc kubenswrapper[4835]: I0319 09:48:06.305238 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-f8d94d56-dsx7s"] Mar 19 09:48:06 crc kubenswrapper[4835]: I0319 09:48:06.396139 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-cbc57b49b-x9wm9"] Mar 19 09:48:06 crc kubenswrapper[4835]: I0319 09:48:06.422319 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 09:48:06 crc kubenswrapper[4835]: I0319 09:48:06.422560 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 09:48:06 crc kubenswrapper[4835]: I0319 09:48:06.437390 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe4973bb-df39-4221-bbd0-f637a10dd5b0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fe4973bb-df39-4221-bbd0-f637a10dd5b0" (UID: "fe4973bb-df39-4221-bbd0-f637a10dd5b0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:48:06 crc kubenswrapper[4835]: I0319 09:48:06.453296 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe4973bb-df39-4221-bbd0-f637a10dd5b0-config" (OuterVolumeSpecName: "config") pod "fe4973bb-df39-4221-bbd0-f637a10dd5b0" (UID: "fe4973bb-df39-4221-bbd0-f637a10dd5b0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:48:06 crc kubenswrapper[4835]: I0319 09:48:06.470193 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe4973bb-df39-4221-bbd0-f637a10dd5b0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fe4973bb-df39-4221-bbd0-f637a10dd5b0" (UID: "fe4973bb-df39-4221-bbd0-f637a10dd5b0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:48:06 crc kubenswrapper[4835]: I0319 09:48:06.471863 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe4973bb-df39-4221-bbd0-f637a10dd5b0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fe4973bb-df39-4221-bbd0-f637a10dd5b0" (UID: "fe4973bb-df39-4221-bbd0-f637a10dd5b0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:48:06 crc kubenswrapper[4835]: I0319 09:48:06.486016 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe4973bb-df39-4221-bbd0-f637a10dd5b0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fe4973bb-df39-4221-bbd0-f637a10dd5b0" (UID: "fe4973bb-df39-4221-bbd0-f637a10dd5b0"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:48:06 crc kubenswrapper[4835]: I0319 09:48:06.511474 4835 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe4973bb-df39-4221-bbd0-f637a10dd5b0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:06 crc kubenswrapper[4835]: I0319 09:48:06.511504 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe4973bb-df39-4221-bbd0-f637a10dd5b0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:06 crc kubenswrapper[4835]: I0319 09:48:06.511517 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe4973bb-df39-4221-bbd0-f637a10dd5b0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:06 crc kubenswrapper[4835]: I0319 09:48:06.511528 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe4973bb-df39-4221-bbd0-f637a10dd5b0-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:06 crc kubenswrapper[4835]: I0319 09:48:06.511541 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe4973bb-df39-4221-bbd0-f637a10dd5b0-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:06 crc kubenswrapper[4835]: I0319 09:48:06.725399 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-lwf7t"] Mar 19 09:48:06 crc kubenswrapper[4835]: I0319 09:48:06.745764 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-lwf7t"] Mar 19 09:48:07 crc kubenswrapper[4835]: I0319 09:48:07.217383 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f8d94d56-dsx7s" 
event={"ID":"71b8fa5f-56e9-49ee-b55d-6136c30a460d","Type":"ContainerStarted","Data":"c6e9bf3762721e0ef3d0c9dd0b5442f9673f05535d2290b37a59f25f3e968dcf"} Mar 19 09:48:07 crc kubenswrapper[4835]: I0319 09:48:07.217721 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f8d94d56-dsx7s" event={"ID":"71b8fa5f-56e9-49ee-b55d-6136c30a460d","Type":"ContainerStarted","Data":"315926520dc9310ba6d13e014d6ae3107b64d3c0f8839940897bef2931ca0e31"} Mar 19 09:48:07 crc kubenswrapper[4835]: I0319 09:48:07.218248 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-f8d94d56-dsx7s" Mar 19 09:48:07 crc kubenswrapper[4835]: I0319 09:48:07.219045 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-4jmpv" event={"ID":"89dd21d1-34a1-4d91-a7cb-32840eda818e","Type":"ContainerStarted","Data":"a9e9091d411b21811d51abd7cb26a4f3593bc327ae9bf9f264c76ea0b5104af7"} Mar 19 09:48:07 crc kubenswrapper[4835]: I0319 09:48:07.221635 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24665d64-cd24-4bb5-a1e4-48c734e7c525","Type":"ContainerStarted","Data":"e3b748a14f24225a9186588b569348543138f7e531869e7354ef71d69a4b25b3"} Mar 19 09:48:07 crc kubenswrapper[4835]: I0319 09:48:07.224117 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565228-zvnlw" event={"ID":"c5787acc-561d-4cca-b822-bc7ae5cde5ea","Type":"ContainerStarted","Data":"97a9f72cc73d16782965ee31c8e1c0fbcbe86fec2a18af94df522b71e9f5a411"} Mar 19 09:48:07 crc kubenswrapper[4835]: I0319 09:48:07.227095 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mw6s2" event={"ID":"410b03bf-d2bc-4992-b467-db7947c52078","Type":"ContainerStarted","Data":"dfcd6ca68762255b373479b75d2123ff0712b7d734ddffde0d6af98361a5b1b8"} Mar 19 09:48:07 crc kubenswrapper[4835]: I0319 09:48:07.229730 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-74ff56c748-fkbqq" event={"ID":"7c130368-e60e-45c8-ad42-45b89cf92cdb","Type":"ContainerStarted","Data":"850e756fed6cece275d15361cdf5eed66794c5c192cf6988f17adec41342383c"} Mar 19 09:48:07 crc kubenswrapper[4835]: I0319 09:48:07.229846 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-74ff56c748-fkbqq" event={"ID":"7c130368-e60e-45c8-ad42-45b89cf92cdb","Type":"ContainerStarted","Data":"44c7c6308a279f405f7e0f156e66569fab71c299f46a0ff370374d4eb6251095"} Mar 19 09:48:07 crc kubenswrapper[4835]: I0319 09:48:07.229858 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-74ff56c748-fkbqq" event={"ID":"7c130368-e60e-45c8-ad42-45b89cf92cdb","Type":"ContainerStarted","Data":"8da697f95a4412d6514ca1f93d66beed5ca983b8a2a166d7f2fb7b66d8dc942b"} Mar 19 09:48:07 crc kubenswrapper[4835]: I0319 09:48:07.230527 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-74ff56c748-fkbqq" Mar 19 09:48:07 crc kubenswrapper[4835]: I0319 09:48:07.230561 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-74ff56c748-fkbqq" Mar 19 09:48:07 crc kubenswrapper[4835]: I0319 09:48:07.232579 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mkr8b" event={"ID":"2161ebbe-ed84-49a9-b0b1-b74304fd8b28","Type":"ContainerStarted","Data":"b7b8402d3c5a091035650dff3c5aa76455642f20a9ee98016450c1f7bbf0fa07"} Mar 19 09:48:07 crc kubenswrapper[4835]: I0319 09:48:07.235073 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-cbc57b49b-x9wm9" event={"ID":"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed","Type":"ContainerStarted","Data":"966fdc1b648f492ef2f8c8a23310243a1a650bd7c102ab114949cf3db6294c66"} Mar 19 09:48:07 crc kubenswrapper[4835]: I0319 09:48:07.235110 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-cbc57b49b-x9wm9" 
event={"ID":"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed","Type":"ContainerStarted","Data":"e209a3265c7448cd5c4f37948a8d87e968509ae27d3bc659f8c1edf93ba1b1cd"} Mar 19 09:48:07 crc kubenswrapper[4835]: I0319 09:48:07.235126 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-cbc57b49b-x9wm9" event={"ID":"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed","Type":"ContainerStarted","Data":"e14bf0100a40e1a9dbbcb93b30f85a3b810c595a758b2ba4447151cb6d5c7681"} Mar 19 09:48:07 crc kubenswrapper[4835]: I0319 09:48:07.235384 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-cbc57b49b-x9wm9" Mar 19 09:48:07 crc kubenswrapper[4835]: I0319 09:48:07.235421 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-cbc57b49b-x9wm9" Mar 19 09:48:07 crc kubenswrapper[4835]: I0319 09:48:07.339984 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-74ff56c748-fkbqq" podStartSLOduration=8.339959182 podStartE2EDuration="8.339959182s" podCreationTimestamp="2026-03-19 09:47:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:48:07.331523557 +0000 UTC m=+1542.180122144" watchObservedRunningTime="2026-03-19 09:48:07.339959182 +0000 UTC m=+1542.188557769" Mar 19 09:48:07 crc kubenswrapper[4835]: I0319 09:48:07.345845 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-f8d94d56-dsx7s" podStartSLOduration=8.345825968 podStartE2EDuration="8.345825968s" podCreationTimestamp="2026-03-19 09:47:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:48:07.271136199 +0000 UTC m=+1542.119734786" watchObservedRunningTime="2026-03-19 09:48:07.345825968 +0000 UTC m=+1542.194424565" Mar 19 09:48:07 crc kubenswrapper[4835]: I0319 
09:48:07.424966 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-cbc57b49b-x9wm9" podStartSLOduration=8.424945814 podStartE2EDuration="8.424945814s" podCreationTimestamp="2026-03-19 09:47:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:48:07.358600377 +0000 UTC m=+1542.207198994" watchObservedRunningTime="2026-03-19 09:48:07.424945814 +0000 UTC m=+1542.273544401" Mar 19 09:48:07 crc kubenswrapper[4835]: I0319 09:48:07.442270 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-mkr8b" podStartSLOduration=3.650573567 podStartE2EDuration="51.442243864s" podCreationTimestamp="2026-03-19 09:47:16 +0000 UTC" firstStartedPulling="2026-03-19 09:47:18.057396315 +0000 UTC m=+1492.905994902" lastFinishedPulling="2026-03-19 09:48:05.849066612 +0000 UTC m=+1540.697665199" observedRunningTime="2026-03-19 09:48:07.389749787 +0000 UTC m=+1542.238348374" watchObservedRunningTime="2026-03-19 09:48:07.442243864 +0000 UTC m=+1542.290842451" Mar 19 09:48:07 crc kubenswrapper[4835]: I0319 09:48:07.449939 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-4jmpv" podStartSLOduration=3.442863298 podStartE2EDuration="51.449916749s" podCreationTimestamp="2026-03-19 09:47:16 +0000 UTC" firstStartedPulling="2026-03-19 09:47:17.836874224 +0000 UTC m=+1492.685472811" lastFinishedPulling="2026-03-19 09:48:05.843927675 +0000 UTC m=+1540.692526262" observedRunningTime="2026-03-19 09:48:07.41502246 +0000 UTC m=+1542.263621067" watchObservedRunningTime="2026-03-19 09:48:07.449916749 +0000 UTC m=+1542.298515346" Mar 19 09:48:07 crc kubenswrapper[4835]: I0319 09:48:07.462104 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-mw6s2" podStartSLOduration=4.098947383 podStartE2EDuration="51.462081312s" 
podCreationTimestamp="2026-03-19 09:47:16 +0000 UTC" firstStartedPulling="2026-03-19 09:47:18.44002821 +0000 UTC m=+1493.288626797" lastFinishedPulling="2026-03-19 09:48:05.803162139 +0000 UTC m=+1540.651760726" observedRunningTime="2026-03-19 09:48:07.440525439 +0000 UTC m=+1542.289124026" watchObservedRunningTime="2026-03-19 09:48:07.462081312 +0000 UTC m=+1542.310679899" Mar 19 09:48:08 crc kubenswrapper[4835]: I0319 09:48:08.249912 4835 generic.go:334] "Generic (PLEG): container finished" podID="c5787acc-561d-4cca-b822-bc7ae5cde5ea" containerID="40ef1bdb4eda883d80badcdbbccd9dd1fa6754f0f6149ea68c7af0947900f14a" exitCode=0 Mar 19 09:48:08 crc kubenswrapper[4835]: I0319 09:48:08.249971 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565228-zvnlw" event={"ID":"c5787acc-561d-4cca-b822-bc7ae5cde5ea","Type":"ContainerDied","Data":"40ef1bdb4eda883d80badcdbbccd9dd1fa6754f0f6149ea68c7af0947900f14a"} Mar 19 09:48:08 crc kubenswrapper[4835]: I0319 09:48:08.417678 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe4973bb-df39-4221-bbd0-f637a10dd5b0" path="/var/lib/kubelet/pods/fe4973bb-df39-4221-bbd0-f637a10dd5b0/volumes" Mar 19 09:48:09 crc kubenswrapper[4835]: I0319 09:48:09.744444 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565228-zvnlw" Mar 19 09:48:09 crc kubenswrapper[4835]: I0319 09:48:09.786903 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkvwz\" (UniqueName: \"kubernetes.io/projected/c5787acc-561d-4cca-b822-bc7ae5cde5ea-kube-api-access-lkvwz\") pod \"c5787acc-561d-4cca-b822-bc7ae5cde5ea\" (UID: \"c5787acc-561d-4cca-b822-bc7ae5cde5ea\") " Mar 19 09:48:09 crc kubenswrapper[4835]: I0319 09:48:09.799072 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5787acc-561d-4cca-b822-bc7ae5cde5ea-kube-api-access-lkvwz" (OuterVolumeSpecName: "kube-api-access-lkvwz") pod "c5787acc-561d-4cca-b822-bc7ae5cde5ea" (UID: "c5787acc-561d-4cca-b822-bc7ae5cde5ea"). InnerVolumeSpecName "kube-api-access-lkvwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:48:09 crc kubenswrapper[4835]: I0319 09:48:09.889349 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkvwz\" (UniqueName: \"kubernetes.io/projected/c5787acc-561d-4cca-b822-bc7ae5cde5ea-kube-api-access-lkvwz\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:10 crc kubenswrapper[4835]: I0319 09:48:10.275627 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565228-zvnlw" event={"ID":"c5787acc-561d-4cca-b822-bc7ae5cde5ea","Type":"ContainerDied","Data":"97a9f72cc73d16782965ee31c8e1c0fbcbe86fec2a18af94df522b71e9f5a411"} Mar 19 09:48:10 crc kubenswrapper[4835]: I0319 09:48:10.275984 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97a9f72cc73d16782965ee31c8e1c0fbcbe86fec2a18af94df522b71e9f5a411" Mar 19 09:48:10 crc kubenswrapper[4835]: I0319 09:48:10.275709 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565228-zvnlw" Mar 19 09:48:10 crc kubenswrapper[4835]: I0319 09:48:10.825106 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565222-bxrtd"] Mar 19 09:48:10 crc kubenswrapper[4835]: I0319 09:48:10.837423 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565222-bxrtd"] Mar 19 09:48:12 crc kubenswrapper[4835]: I0319 09:48:12.297106 4835 generic.go:334] "Generic (PLEG): container finished" podID="410b03bf-d2bc-4992-b467-db7947c52078" containerID="dfcd6ca68762255b373479b75d2123ff0712b7d734ddffde0d6af98361a5b1b8" exitCode=0 Mar 19 09:48:12 crc kubenswrapper[4835]: I0319 09:48:12.297220 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mw6s2" event={"ID":"410b03bf-d2bc-4992-b467-db7947c52078","Type":"ContainerDied","Data":"dfcd6ca68762255b373479b75d2123ff0712b7d734ddffde0d6af98361a5b1b8"} Mar 19 09:48:12 crc kubenswrapper[4835]: I0319 09:48:12.417304 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="151e4145-cf02-4148-ad6e-a7cebba6e4f0" path="/var/lib/kubelet/pods/151e4145-cf02-4148-ad6e-a7cebba6e4f0/volumes" Mar 19 09:48:12 crc kubenswrapper[4835]: I0319 09:48:12.814355 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xf267"] Mar 19 09:48:12 crc kubenswrapper[4835]: E0319 09:48:12.824544 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5787acc-561d-4cca-b822-bc7ae5cde5ea" containerName="oc" Mar 19 09:48:12 crc kubenswrapper[4835]: I0319 09:48:12.824632 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5787acc-561d-4cca-b822-bc7ae5cde5ea" containerName="oc" Mar 19 09:48:12 crc kubenswrapper[4835]: E0319 09:48:12.824714 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe4973bb-df39-4221-bbd0-f637a10dd5b0" containerName="dnsmasq-dns" Mar 19 09:48:12 crc 
kubenswrapper[4835]: I0319 09:48:12.824793 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe4973bb-df39-4221-bbd0-f637a10dd5b0" containerName="dnsmasq-dns" Mar 19 09:48:12 crc kubenswrapper[4835]: E0319 09:48:12.824865 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe4973bb-df39-4221-bbd0-f637a10dd5b0" containerName="init" Mar 19 09:48:12 crc kubenswrapper[4835]: I0319 09:48:12.824922 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe4973bb-df39-4221-bbd0-f637a10dd5b0" containerName="init" Mar 19 09:48:12 crc kubenswrapper[4835]: I0319 09:48:12.825170 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5787acc-561d-4cca-b822-bc7ae5cde5ea" containerName="oc" Mar 19 09:48:12 crc kubenswrapper[4835]: I0319 09:48:12.825283 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe4973bb-df39-4221-bbd0-f637a10dd5b0" containerName="dnsmasq-dns" Mar 19 09:48:12 crc kubenswrapper[4835]: I0319 09:48:12.827088 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xf267" Mar 19 09:48:12 crc kubenswrapper[4835]: I0319 09:48:12.832592 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xf267"] Mar 19 09:48:12 crc kubenswrapper[4835]: I0319 09:48:12.869351 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj9qr\" (UniqueName: \"kubernetes.io/projected/14dd8f89-f4d8-4618-b417-f1a802f3517d-kube-api-access-pj9qr\") pod \"redhat-operators-xf267\" (UID: \"14dd8f89-f4d8-4618-b417-f1a802f3517d\") " pod="openshift-marketplace/redhat-operators-xf267" Mar 19 09:48:12 crc kubenswrapper[4835]: I0319 09:48:12.869832 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14dd8f89-f4d8-4618-b417-f1a802f3517d-utilities\") pod \"redhat-operators-xf267\" (UID: \"14dd8f89-f4d8-4618-b417-f1a802f3517d\") " pod="openshift-marketplace/redhat-operators-xf267" Mar 19 09:48:12 crc kubenswrapper[4835]: I0319 09:48:12.869863 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14dd8f89-f4d8-4618-b417-f1a802f3517d-catalog-content\") pod \"redhat-operators-xf267\" (UID: \"14dd8f89-f4d8-4618-b417-f1a802f3517d\") " pod="openshift-marketplace/redhat-operators-xf267" Mar 19 09:48:12 crc kubenswrapper[4835]: I0319 09:48:12.972611 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14dd8f89-f4d8-4618-b417-f1a802f3517d-utilities\") pod \"redhat-operators-xf267\" (UID: \"14dd8f89-f4d8-4618-b417-f1a802f3517d\") " pod="openshift-marketplace/redhat-operators-xf267" Mar 19 09:48:12 crc kubenswrapper[4835]: I0319 09:48:12.972669 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14dd8f89-f4d8-4618-b417-f1a802f3517d-catalog-content\") pod \"redhat-operators-xf267\" (UID: \"14dd8f89-f4d8-4618-b417-f1a802f3517d\") " pod="openshift-marketplace/redhat-operators-xf267" Mar 19 09:48:12 crc kubenswrapper[4835]: I0319 09:48:12.972906 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj9qr\" (UniqueName: \"kubernetes.io/projected/14dd8f89-f4d8-4618-b417-f1a802f3517d-kube-api-access-pj9qr\") pod \"redhat-operators-xf267\" (UID: \"14dd8f89-f4d8-4618-b417-f1a802f3517d\") " pod="openshift-marketplace/redhat-operators-xf267" Mar 19 09:48:12 crc kubenswrapper[4835]: I0319 09:48:12.973474 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14dd8f89-f4d8-4618-b417-f1a802f3517d-utilities\") pod \"redhat-operators-xf267\" (UID: \"14dd8f89-f4d8-4618-b417-f1a802f3517d\") " pod="openshift-marketplace/redhat-operators-xf267" Mar 19 09:48:12 crc kubenswrapper[4835]: I0319 09:48:12.973776 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14dd8f89-f4d8-4618-b417-f1a802f3517d-catalog-content\") pod \"redhat-operators-xf267\" (UID: \"14dd8f89-f4d8-4618-b417-f1a802f3517d\") " pod="openshift-marketplace/redhat-operators-xf267" Mar 19 09:48:13 crc kubenswrapper[4835]: I0319 09:48:13.032091 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj9qr\" (UniqueName: \"kubernetes.io/projected/14dd8f89-f4d8-4618-b417-f1a802f3517d-kube-api-access-pj9qr\") pod \"redhat-operators-xf267\" (UID: \"14dd8f89-f4d8-4618-b417-f1a802f3517d\") " pod="openshift-marketplace/redhat-operators-xf267" Mar 19 09:48:13 crc kubenswrapper[4835]: I0319 09:48:13.150288 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xf267" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.040134 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-mw6s2" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.130137 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/410b03bf-d2bc-4992-b467-db7947c52078-db-sync-config-data\") pod \"410b03bf-d2bc-4992-b467-db7947c52078\" (UID: \"410b03bf-d2bc-4992-b467-db7947c52078\") " Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.130241 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4s5np\" (UniqueName: \"kubernetes.io/projected/410b03bf-d2bc-4992-b467-db7947c52078-kube-api-access-4s5np\") pod \"410b03bf-d2bc-4992-b467-db7947c52078\" (UID: \"410b03bf-d2bc-4992-b467-db7947c52078\") " Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.130473 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/410b03bf-d2bc-4992-b467-db7947c52078-combined-ca-bundle\") pod \"410b03bf-d2bc-4992-b467-db7947c52078\" (UID: \"410b03bf-d2bc-4992-b467-db7947c52078\") " Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.153475 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/410b03bf-d2bc-4992-b467-db7947c52078-kube-api-access-4s5np" (OuterVolumeSpecName: "kube-api-access-4s5np") pod "410b03bf-d2bc-4992-b467-db7947c52078" (UID: "410b03bf-d2bc-4992-b467-db7947c52078"). InnerVolumeSpecName "kube-api-access-4s5np". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.179508 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/410b03bf-d2bc-4992-b467-db7947c52078-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "410b03bf-d2bc-4992-b467-db7947c52078" (UID: "410b03bf-d2bc-4992-b467-db7947c52078"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.218850 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/410b03bf-d2bc-4992-b467-db7947c52078-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "410b03bf-d2bc-4992-b467-db7947c52078" (UID: "410b03bf-d2bc-4992-b467-db7947c52078"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.252176 4835 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/410b03bf-d2bc-4992-b467-db7947c52078-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.252254 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4s5np\" (UniqueName: \"kubernetes.io/projected/410b03bf-d2bc-4992-b467-db7947c52078-kube-api-access-4s5np\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.252273 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/410b03bf-d2bc-4992-b467-db7947c52078-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.338221 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-mw6s2" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.339351 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mw6s2" event={"ID":"410b03bf-d2bc-4992-b467-db7947c52078","Type":"ContainerDied","Data":"546b9836cae5779b5bd8f750cd87dcef230bd6c28b9bb41f37badde20b15dc83"} Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.339385 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="546b9836cae5779b5bd8f750cd87dcef230bd6c28b9bb41f37badde20b15dc83" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.671166 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-88c674dc-srwpw"] Mar 19 09:48:14 crc kubenswrapper[4835]: E0319 09:48:14.672564 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="410b03bf-d2bc-4992-b467-db7947c52078" containerName="barbican-db-sync" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.672588 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="410b03bf-d2bc-4992-b467-db7947c52078" containerName="barbican-db-sync" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.672853 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="410b03bf-d2bc-4992-b467-db7947c52078" containerName="barbican-db-sync" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.674020 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-88c674dc-srwpw" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.680506 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-966xp" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.680708 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.688114 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-768fcfbbc8-k7gmg"] Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.690041 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-768fcfbbc8-k7gmg" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.693989 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.702029 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-88c674dc-srwpw"] Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.711948 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.736350 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-768fcfbbc8-k7gmg"] Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.762513 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f532ad6-0a68-4c59-93b7-5e393908c008-config-data\") pod \"barbican-keystone-listener-768fcfbbc8-k7gmg\" (UID: \"0f532ad6-0a68-4c59-93b7-5e393908c008\") " pod="openstack/barbican-keystone-listener-768fcfbbc8-k7gmg" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.762567 4835 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52481530-0f3c-48ac-abfe-2ca7b35d8b07-config-data-custom\") pod \"barbican-worker-88c674dc-srwpw\" (UID: \"52481530-0f3c-48ac-abfe-2ca7b35d8b07\") " pod="openstack/barbican-worker-88c674dc-srwpw" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.762654 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52481530-0f3c-48ac-abfe-2ca7b35d8b07-combined-ca-bundle\") pod \"barbican-worker-88c674dc-srwpw\" (UID: \"52481530-0f3c-48ac-abfe-2ca7b35d8b07\") " pod="openstack/barbican-worker-88c674dc-srwpw" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.762700 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z4lt\" (UniqueName: \"kubernetes.io/projected/0f532ad6-0a68-4c59-93b7-5e393908c008-kube-api-access-8z4lt\") pod \"barbican-keystone-listener-768fcfbbc8-k7gmg\" (UID: \"0f532ad6-0a68-4c59-93b7-5e393908c008\") " pod="openstack/barbican-keystone-listener-768fcfbbc8-k7gmg" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.762773 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f532ad6-0a68-4c59-93b7-5e393908c008-logs\") pod \"barbican-keystone-listener-768fcfbbc8-k7gmg\" (UID: \"0f532ad6-0a68-4c59-93b7-5e393908c008\") " pod="openstack/barbican-keystone-listener-768fcfbbc8-k7gmg" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.762806 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjsj8\" (UniqueName: \"kubernetes.io/projected/52481530-0f3c-48ac-abfe-2ca7b35d8b07-kube-api-access-vjsj8\") pod \"barbican-worker-88c674dc-srwpw\" (UID: 
\"52481530-0f3c-48ac-abfe-2ca7b35d8b07\") " pod="openstack/barbican-worker-88c674dc-srwpw" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.762829 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52481530-0f3c-48ac-abfe-2ca7b35d8b07-config-data\") pod \"barbican-worker-88c674dc-srwpw\" (UID: \"52481530-0f3c-48ac-abfe-2ca7b35d8b07\") " pod="openstack/barbican-worker-88c674dc-srwpw" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.762904 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52481530-0f3c-48ac-abfe-2ca7b35d8b07-logs\") pod \"barbican-worker-88c674dc-srwpw\" (UID: \"52481530-0f3c-48ac-abfe-2ca7b35d8b07\") " pod="openstack/barbican-worker-88c674dc-srwpw" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.763112 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f532ad6-0a68-4c59-93b7-5e393908c008-combined-ca-bundle\") pod \"barbican-keystone-listener-768fcfbbc8-k7gmg\" (UID: \"0f532ad6-0a68-4c59-93b7-5e393908c008\") " pod="openstack/barbican-keystone-listener-768fcfbbc8-k7gmg" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.763191 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f532ad6-0a68-4c59-93b7-5e393908c008-config-data-custom\") pod \"barbican-keystone-listener-768fcfbbc8-k7gmg\" (UID: \"0f532ad6-0a68-4c59-93b7-5e393908c008\") " pod="openstack/barbican-keystone-listener-768fcfbbc8-k7gmg" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.865422 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/0f532ad6-0a68-4c59-93b7-5e393908c008-logs\") pod \"barbican-keystone-listener-768fcfbbc8-k7gmg\" (UID: \"0f532ad6-0a68-4c59-93b7-5e393908c008\") " pod="openstack/barbican-keystone-listener-768fcfbbc8-k7gmg" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.865474 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjsj8\" (UniqueName: \"kubernetes.io/projected/52481530-0f3c-48ac-abfe-2ca7b35d8b07-kube-api-access-vjsj8\") pod \"barbican-worker-88c674dc-srwpw\" (UID: \"52481530-0f3c-48ac-abfe-2ca7b35d8b07\") " pod="openstack/barbican-worker-88c674dc-srwpw" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.865495 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52481530-0f3c-48ac-abfe-2ca7b35d8b07-config-data\") pod \"barbican-worker-88c674dc-srwpw\" (UID: \"52481530-0f3c-48ac-abfe-2ca7b35d8b07\") " pod="openstack/barbican-worker-88c674dc-srwpw" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.865543 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52481530-0f3c-48ac-abfe-2ca7b35d8b07-logs\") pod \"barbican-worker-88c674dc-srwpw\" (UID: \"52481530-0f3c-48ac-abfe-2ca7b35d8b07\") " pod="openstack/barbican-worker-88c674dc-srwpw" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.865577 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f532ad6-0a68-4c59-93b7-5e393908c008-combined-ca-bundle\") pod \"barbican-keystone-listener-768fcfbbc8-k7gmg\" (UID: \"0f532ad6-0a68-4c59-93b7-5e393908c008\") " pod="openstack/barbican-keystone-listener-768fcfbbc8-k7gmg" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.865594 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/0f532ad6-0a68-4c59-93b7-5e393908c008-config-data-custom\") pod \"barbican-keystone-listener-768fcfbbc8-k7gmg\" (UID: \"0f532ad6-0a68-4c59-93b7-5e393908c008\") " pod="openstack/barbican-keystone-listener-768fcfbbc8-k7gmg" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.865672 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f532ad6-0a68-4c59-93b7-5e393908c008-config-data\") pod \"barbican-keystone-listener-768fcfbbc8-k7gmg\" (UID: \"0f532ad6-0a68-4c59-93b7-5e393908c008\") " pod="openstack/barbican-keystone-listener-768fcfbbc8-k7gmg" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.865689 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52481530-0f3c-48ac-abfe-2ca7b35d8b07-config-data-custom\") pod \"barbican-worker-88c674dc-srwpw\" (UID: \"52481530-0f3c-48ac-abfe-2ca7b35d8b07\") " pod="openstack/barbican-worker-88c674dc-srwpw" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.865785 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52481530-0f3c-48ac-abfe-2ca7b35d8b07-combined-ca-bundle\") pod \"barbican-worker-88c674dc-srwpw\" (UID: \"52481530-0f3c-48ac-abfe-2ca7b35d8b07\") " pod="openstack/barbican-worker-88c674dc-srwpw" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.865824 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z4lt\" (UniqueName: \"kubernetes.io/projected/0f532ad6-0a68-4c59-93b7-5e393908c008-kube-api-access-8z4lt\") pod \"barbican-keystone-listener-768fcfbbc8-k7gmg\" (UID: \"0f532ad6-0a68-4c59-93b7-5e393908c008\") " pod="openstack/barbican-keystone-listener-768fcfbbc8-k7gmg" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.866451 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f532ad6-0a68-4c59-93b7-5e393908c008-logs\") pod \"barbican-keystone-listener-768fcfbbc8-k7gmg\" (UID: \"0f532ad6-0a68-4c59-93b7-5e393908c008\") " pod="openstack/barbican-keystone-listener-768fcfbbc8-k7gmg" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.879523 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52481530-0f3c-48ac-abfe-2ca7b35d8b07-logs\") pod \"barbican-worker-88c674dc-srwpw\" (UID: \"52481530-0f3c-48ac-abfe-2ca7b35d8b07\") " pod="openstack/barbican-worker-88c674dc-srwpw" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.888697 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52481530-0f3c-48ac-abfe-2ca7b35d8b07-config-data-custom\") pod \"barbican-worker-88c674dc-srwpw\" (UID: \"52481530-0f3c-48ac-abfe-2ca7b35d8b07\") " pod="openstack/barbican-worker-88c674dc-srwpw" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.891493 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f532ad6-0a68-4c59-93b7-5e393908c008-config-data-custom\") pod \"barbican-keystone-listener-768fcfbbc8-k7gmg\" (UID: \"0f532ad6-0a68-4c59-93b7-5e393908c008\") " pod="openstack/barbican-keystone-listener-768fcfbbc8-k7gmg" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.898211 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52481530-0f3c-48ac-abfe-2ca7b35d8b07-combined-ca-bundle\") pod \"barbican-worker-88c674dc-srwpw\" (UID: \"52481530-0f3c-48ac-abfe-2ca7b35d8b07\") " pod="openstack/barbican-worker-88c674dc-srwpw" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.905271 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f532ad6-0a68-4c59-93b7-5e393908c008-combined-ca-bundle\") pod \"barbican-keystone-listener-768fcfbbc8-k7gmg\" (UID: \"0f532ad6-0a68-4c59-93b7-5e393908c008\") " pod="openstack/barbican-keystone-listener-768fcfbbc8-k7gmg" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.905646 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f532ad6-0a68-4c59-93b7-5e393908c008-config-data\") pod \"barbican-keystone-listener-768fcfbbc8-k7gmg\" (UID: \"0f532ad6-0a68-4c59-93b7-5e393908c008\") " pod="openstack/barbican-keystone-listener-768fcfbbc8-k7gmg" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.906304 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52481530-0f3c-48ac-abfe-2ca7b35d8b07-config-data\") pod \"barbican-worker-88c674dc-srwpw\" (UID: \"52481530-0f3c-48ac-abfe-2ca7b35d8b07\") " pod="openstack/barbican-worker-88c674dc-srwpw" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.915608 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjsj8\" (UniqueName: \"kubernetes.io/projected/52481530-0f3c-48ac-abfe-2ca7b35d8b07-kube-api-access-vjsj8\") pod \"barbican-worker-88c674dc-srwpw\" (UID: \"52481530-0f3c-48ac-abfe-2ca7b35d8b07\") " pod="openstack/barbican-worker-88c674dc-srwpw" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.932448 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z4lt\" (UniqueName: \"kubernetes.io/projected/0f532ad6-0a68-4c59-93b7-5e393908c008-kube-api-access-8z4lt\") pod \"barbican-keystone-listener-768fcfbbc8-k7gmg\" (UID: \"0f532ad6-0a68-4c59-93b7-5e393908c008\") " pod="openstack/barbican-keystone-listener-768fcfbbc8-k7gmg" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.942206 4835 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-85ff748b95-fsxdb"] Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.944571 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-fsxdb" Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.986215 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-fsxdb"] Mar 19 09:48:14 crc kubenswrapper[4835]: I0319 09:48:14.998174 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-88c674dc-srwpw" Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.012440 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-768fcfbbc8-k7gmg" Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.075239 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d663402-04a2-42df-b860-d5c3568971d8-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-fsxdb\" (UID: \"1d663402-04a2-42df-b860-d5c3568971d8\") " pod="openstack/dnsmasq-dns-85ff748b95-fsxdb" Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.075294 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d663402-04a2-42df-b860-d5c3568971d8-config\") pod \"dnsmasq-dns-85ff748b95-fsxdb\" (UID: \"1d663402-04a2-42df-b860-d5c3568971d8\") " pod="openstack/dnsmasq-dns-85ff748b95-fsxdb" Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.075346 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d663402-04a2-42df-b860-d5c3568971d8-dns-svc\") pod \"dnsmasq-dns-85ff748b95-fsxdb\" (UID: \"1d663402-04a2-42df-b860-d5c3568971d8\") " pod="openstack/dnsmasq-dns-85ff748b95-fsxdb" Mar 19 09:48:15 crc 
kubenswrapper[4835]: I0319 09:48:15.075379 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1d663402-04a2-42df-b860-d5c3568971d8-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-fsxdb\" (UID: \"1d663402-04a2-42df-b860-d5c3568971d8\") " pod="openstack/dnsmasq-dns-85ff748b95-fsxdb" Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.075409 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d663402-04a2-42df-b860-d5c3568971d8-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-fsxdb\" (UID: \"1d663402-04a2-42df-b860-d5c3568971d8\") " pod="openstack/dnsmasq-dns-85ff748b95-fsxdb" Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.075442 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfbpc\" (UniqueName: \"kubernetes.io/projected/1d663402-04a2-42df-b860-d5c3568971d8-kube-api-access-cfbpc\") pod \"dnsmasq-dns-85ff748b95-fsxdb\" (UID: \"1d663402-04a2-42df-b860-d5c3568971d8\") " pod="openstack/dnsmasq-dns-85ff748b95-fsxdb" Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.079216 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6fcd6bfc54-kgphr"] Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.081284 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6fcd6bfc54-kgphr" Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.084474 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.095536 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6fcd6bfc54-kgphr"] Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.178120 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1d663402-04a2-42df-b860-d5c3568971d8-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-fsxdb\" (UID: \"1d663402-04a2-42df-b860-d5c3568971d8\") " pod="openstack/dnsmasq-dns-85ff748b95-fsxdb" Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.178190 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d663402-04a2-42df-b860-d5c3568971d8-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-fsxdb\" (UID: \"1d663402-04a2-42df-b860-d5c3568971d8\") " pod="openstack/dnsmasq-dns-85ff748b95-fsxdb" Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.178241 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfbpc\" (UniqueName: \"kubernetes.io/projected/1d663402-04a2-42df-b860-d5c3568971d8-kube-api-access-cfbpc\") pod \"dnsmasq-dns-85ff748b95-fsxdb\" (UID: \"1d663402-04a2-42df-b860-d5c3568971d8\") " pod="openstack/dnsmasq-dns-85ff748b95-fsxdb" Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.178312 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/590489a5-8799-4ae0-8f08-d2d1fc0480f4-config-data-custom\") pod \"barbican-api-6fcd6bfc54-kgphr\" (UID: \"590489a5-8799-4ae0-8f08-d2d1fc0480f4\") " 
pod="openstack/barbican-api-6fcd6bfc54-kgphr" Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.178335 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/590489a5-8799-4ae0-8f08-d2d1fc0480f4-combined-ca-bundle\") pod \"barbican-api-6fcd6bfc54-kgphr\" (UID: \"590489a5-8799-4ae0-8f08-d2d1fc0480f4\") " pod="openstack/barbican-api-6fcd6bfc54-kgphr" Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.178508 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/590489a5-8799-4ae0-8f08-d2d1fc0480f4-logs\") pod \"barbican-api-6fcd6bfc54-kgphr\" (UID: \"590489a5-8799-4ae0-8f08-d2d1fc0480f4\") " pod="openstack/barbican-api-6fcd6bfc54-kgphr" Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.178756 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/590489a5-8799-4ae0-8f08-d2d1fc0480f4-config-data\") pod \"barbican-api-6fcd6bfc54-kgphr\" (UID: \"590489a5-8799-4ae0-8f08-d2d1fc0480f4\") " pod="openstack/barbican-api-6fcd6bfc54-kgphr" Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.178795 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d663402-04a2-42df-b860-d5c3568971d8-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-fsxdb\" (UID: \"1d663402-04a2-42df-b860-d5c3568971d8\") " pod="openstack/dnsmasq-dns-85ff748b95-fsxdb" Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.179031 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d663402-04a2-42df-b860-d5c3568971d8-config\") pod \"dnsmasq-dns-85ff748b95-fsxdb\" (UID: \"1d663402-04a2-42df-b860-d5c3568971d8\") " 
pod="openstack/dnsmasq-dns-85ff748b95-fsxdb" Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.179099 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp29f\" (UniqueName: \"kubernetes.io/projected/590489a5-8799-4ae0-8f08-d2d1fc0480f4-kube-api-access-xp29f\") pod \"barbican-api-6fcd6bfc54-kgphr\" (UID: \"590489a5-8799-4ae0-8f08-d2d1fc0480f4\") " pod="openstack/barbican-api-6fcd6bfc54-kgphr" Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.179152 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d663402-04a2-42df-b860-d5c3568971d8-dns-svc\") pod \"dnsmasq-dns-85ff748b95-fsxdb\" (UID: \"1d663402-04a2-42df-b860-d5c3568971d8\") " pod="openstack/dnsmasq-dns-85ff748b95-fsxdb" Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.179173 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1d663402-04a2-42df-b860-d5c3568971d8-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-fsxdb\" (UID: \"1d663402-04a2-42df-b860-d5c3568971d8\") " pod="openstack/dnsmasq-dns-85ff748b95-fsxdb" Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.179398 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d663402-04a2-42df-b860-d5c3568971d8-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-fsxdb\" (UID: \"1d663402-04a2-42df-b860-d5c3568971d8\") " pod="openstack/dnsmasq-dns-85ff748b95-fsxdb" Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.179978 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d663402-04a2-42df-b860-d5c3568971d8-dns-svc\") pod \"dnsmasq-dns-85ff748b95-fsxdb\" (UID: \"1d663402-04a2-42df-b860-d5c3568971d8\") " pod="openstack/dnsmasq-dns-85ff748b95-fsxdb" Mar 19 09:48:15 
crc kubenswrapper[4835]: I0319 09:48:15.180040 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d663402-04a2-42df-b860-d5c3568971d8-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-fsxdb\" (UID: \"1d663402-04a2-42df-b860-d5c3568971d8\") " pod="openstack/dnsmasq-dns-85ff748b95-fsxdb" Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.180439 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d663402-04a2-42df-b860-d5c3568971d8-config\") pod \"dnsmasq-dns-85ff748b95-fsxdb\" (UID: \"1d663402-04a2-42df-b860-d5c3568971d8\") " pod="openstack/dnsmasq-dns-85ff748b95-fsxdb" Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.201706 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfbpc\" (UniqueName: \"kubernetes.io/projected/1d663402-04a2-42df-b860-d5c3568971d8-kube-api-access-cfbpc\") pod \"dnsmasq-dns-85ff748b95-fsxdb\" (UID: \"1d663402-04a2-42df-b860-d5c3568971d8\") " pod="openstack/dnsmasq-dns-85ff748b95-fsxdb" Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.284769 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/590489a5-8799-4ae0-8f08-d2d1fc0480f4-config-data-custom\") pod \"barbican-api-6fcd6bfc54-kgphr\" (UID: \"590489a5-8799-4ae0-8f08-d2d1fc0480f4\") " pod="openstack/barbican-api-6fcd6bfc54-kgphr" Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.284848 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/590489a5-8799-4ae0-8f08-d2d1fc0480f4-combined-ca-bundle\") pod \"barbican-api-6fcd6bfc54-kgphr\" (UID: \"590489a5-8799-4ae0-8f08-d2d1fc0480f4\") " pod="openstack/barbican-api-6fcd6bfc54-kgphr" Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.284995 4835 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/590489a5-8799-4ae0-8f08-d2d1fc0480f4-logs\") pod \"barbican-api-6fcd6bfc54-kgphr\" (UID: \"590489a5-8799-4ae0-8f08-d2d1fc0480f4\") " pod="openstack/barbican-api-6fcd6bfc54-kgphr" Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.285162 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/590489a5-8799-4ae0-8f08-d2d1fc0480f4-config-data\") pod \"barbican-api-6fcd6bfc54-kgphr\" (UID: \"590489a5-8799-4ae0-8f08-d2d1fc0480f4\") " pod="openstack/barbican-api-6fcd6bfc54-kgphr" Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.285273 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp29f\" (UniqueName: \"kubernetes.io/projected/590489a5-8799-4ae0-8f08-d2d1fc0480f4-kube-api-access-xp29f\") pod \"barbican-api-6fcd6bfc54-kgphr\" (UID: \"590489a5-8799-4ae0-8f08-d2d1fc0480f4\") " pod="openstack/barbican-api-6fcd6bfc54-kgphr" Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.285686 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/590489a5-8799-4ae0-8f08-d2d1fc0480f4-logs\") pod \"barbican-api-6fcd6bfc54-kgphr\" (UID: \"590489a5-8799-4ae0-8f08-d2d1fc0480f4\") " pod="openstack/barbican-api-6fcd6bfc54-kgphr" Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.288449 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/590489a5-8799-4ae0-8f08-d2d1fc0480f4-config-data-custom\") pod \"barbican-api-6fcd6bfc54-kgphr\" (UID: \"590489a5-8799-4ae0-8f08-d2d1fc0480f4\") " pod="openstack/barbican-api-6fcd6bfc54-kgphr" Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.289053 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/590489a5-8799-4ae0-8f08-d2d1fc0480f4-combined-ca-bundle\") pod \"barbican-api-6fcd6bfc54-kgphr\" (UID: \"590489a5-8799-4ae0-8f08-d2d1fc0480f4\") " pod="openstack/barbican-api-6fcd6bfc54-kgphr" Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.300974 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/590489a5-8799-4ae0-8f08-d2d1fc0480f4-config-data\") pod \"barbican-api-6fcd6bfc54-kgphr\" (UID: \"590489a5-8799-4ae0-8f08-d2d1fc0480f4\") " pod="openstack/barbican-api-6fcd6bfc54-kgphr" Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.321271 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp29f\" (UniqueName: \"kubernetes.io/projected/590489a5-8799-4ae0-8f08-d2d1fc0480f4-kube-api-access-xp29f\") pod \"barbican-api-6fcd6bfc54-kgphr\" (UID: \"590489a5-8799-4ae0-8f08-d2d1fc0480f4\") " pod="openstack/barbican-api-6fcd6bfc54-kgphr" Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.361319 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-fsxdb" Mar 19 09:48:15 crc kubenswrapper[4835]: I0319 09:48:15.416718 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6fcd6bfc54-kgphr" Mar 19 09:48:17 crc kubenswrapper[4835]: I0319 09:48:17.384683 4835 generic.go:334] "Generic (PLEG): container finished" podID="89dd21d1-34a1-4d91-a7cb-32840eda818e" containerID="a9e9091d411b21811d51abd7cb26a4f3593bc327ae9bf9f264c76ea0b5104af7" exitCode=0 Mar 19 09:48:17 crc kubenswrapper[4835]: I0319 09:48:17.384957 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-4jmpv" event={"ID":"89dd21d1-34a1-4d91-a7cb-32840eda818e","Type":"ContainerDied","Data":"a9e9091d411b21811d51abd7cb26a4f3593bc327ae9bf9f264c76ea0b5104af7"} Mar 19 09:48:17 crc kubenswrapper[4835]: I0319 09:48:17.629465 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-88c674dc-srwpw"] Mar 19 09:48:17 crc kubenswrapper[4835]: W0319 09:48:17.644713 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52481530_0f3c_48ac_abfe_2ca7b35d8b07.slice/crio-67b9cd96a758ebc0156c13dbb9afff178bad3b4d220d69526c21c62933deddc7 WatchSource:0}: Error finding container 67b9cd96a758ebc0156c13dbb9afff178bad3b4d220d69526c21c62933deddc7: Status 404 returned error can't find the container with id 67b9cd96a758ebc0156c13dbb9afff178bad3b4d220d69526c21c62933deddc7 Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.020705 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xf267"] Mar 19 09:48:18 crc kubenswrapper[4835]: W0319 09:48:18.040287 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14dd8f89_f4d8_4618_b417_f1a802f3517d.slice/crio-c01fa43644fe9c912dc990d213caf7cee69a076315a384ef3be92ad3525d60fe WatchSource:0}: Error finding container c01fa43644fe9c912dc990d213caf7cee69a076315a384ef3be92ad3525d60fe: Status 404 returned error can't find the container with id 
c01fa43644fe9c912dc990d213caf7cee69a076315a384ef3be92ad3525d60fe Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.066889 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7b499b5bb-ft9bz"] Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.069043 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7b499b5bb-ft9bz" Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.075369 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.075891 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.080013 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7b499b5bb-ft9bz"] Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.090339 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-fsxdb"] Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.103579 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6fcd6bfc54-kgphr"] Mar 19 09:48:18 crc kubenswrapper[4835]: W0319 09:48:18.105777 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f532ad6_0a68_4c59_93b7_5e393908c008.slice/crio-acccd5b75495a7cf0dff434e6bb51ded3e5dcdde5c7bf63118dc7c3d6f5098e3 WatchSource:0}: Error finding container acccd5b75495a7cf0dff434e6bb51ded3e5dcdde5c7bf63118dc7c3d6f5098e3: Status 404 returned error can't find the container with id acccd5b75495a7cf0dff434e6bb51ded3e5dcdde5c7bf63118dc7c3d6f5098e3 Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.115165 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-768fcfbbc8-k7gmg"] Mar 19 09:48:18 crc 
kubenswrapper[4835]: I0319 09:48:18.156506 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s72x6\" (UniqueName: \"kubernetes.io/projected/509b4973-14f1-43af-8a83-b4de55a65f5f-kube-api-access-s72x6\") pod \"barbican-api-7b499b5bb-ft9bz\" (UID: \"509b4973-14f1-43af-8a83-b4de55a65f5f\") " pod="openstack/barbican-api-7b499b5bb-ft9bz" Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.156580 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/509b4973-14f1-43af-8a83-b4de55a65f5f-config-data-custom\") pod \"barbican-api-7b499b5bb-ft9bz\" (UID: \"509b4973-14f1-43af-8a83-b4de55a65f5f\") " pod="openstack/barbican-api-7b499b5bb-ft9bz" Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.156631 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/509b4973-14f1-43af-8a83-b4de55a65f5f-config-data\") pod \"barbican-api-7b499b5bb-ft9bz\" (UID: \"509b4973-14f1-43af-8a83-b4de55a65f5f\") " pod="openstack/barbican-api-7b499b5bb-ft9bz" Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.156668 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/509b4973-14f1-43af-8a83-b4de55a65f5f-combined-ca-bundle\") pod \"barbican-api-7b499b5bb-ft9bz\" (UID: \"509b4973-14f1-43af-8a83-b4de55a65f5f\") " pod="openstack/barbican-api-7b499b5bb-ft9bz" Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.156712 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/509b4973-14f1-43af-8a83-b4de55a65f5f-public-tls-certs\") pod \"barbican-api-7b499b5bb-ft9bz\" (UID: \"509b4973-14f1-43af-8a83-b4de55a65f5f\") " 
pod="openstack/barbican-api-7b499b5bb-ft9bz" Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.156760 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/509b4973-14f1-43af-8a83-b4de55a65f5f-internal-tls-certs\") pod \"barbican-api-7b499b5bb-ft9bz\" (UID: \"509b4973-14f1-43af-8a83-b4de55a65f5f\") " pod="openstack/barbican-api-7b499b5bb-ft9bz" Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.156810 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/509b4973-14f1-43af-8a83-b4de55a65f5f-logs\") pod \"barbican-api-7b499b5bb-ft9bz\" (UID: \"509b4973-14f1-43af-8a83-b4de55a65f5f\") " pod="openstack/barbican-api-7b499b5bb-ft9bz" Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.259047 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/509b4973-14f1-43af-8a83-b4de55a65f5f-combined-ca-bundle\") pod \"barbican-api-7b499b5bb-ft9bz\" (UID: \"509b4973-14f1-43af-8a83-b4de55a65f5f\") " pod="openstack/barbican-api-7b499b5bb-ft9bz" Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.259407 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/509b4973-14f1-43af-8a83-b4de55a65f5f-public-tls-certs\") pod \"barbican-api-7b499b5bb-ft9bz\" (UID: \"509b4973-14f1-43af-8a83-b4de55a65f5f\") " pod="openstack/barbican-api-7b499b5bb-ft9bz" Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.259454 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/509b4973-14f1-43af-8a83-b4de55a65f5f-internal-tls-certs\") pod \"barbican-api-7b499b5bb-ft9bz\" (UID: \"509b4973-14f1-43af-8a83-b4de55a65f5f\") " 
pod="openstack/barbican-api-7b499b5bb-ft9bz" Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.259511 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/509b4973-14f1-43af-8a83-b4de55a65f5f-logs\") pod \"barbican-api-7b499b5bb-ft9bz\" (UID: \"509b4973-14f1-43af-8a83-b4de55a65f5f\") " pod="openstack/barbican-api-7b499b5bb-ft9bz" Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.259571 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s72x6\" (UniqueName: \"kubernetes.io/projected/509b4973-14f1-43af-8a83-b4de55a65f5f-kube-api-access-s72x6\") pod \"barbican-api-7b499b5bb-ft9bz\" (UID: \"509b4973-14f1-43af-8a83-b4de55a65f5f\") " pod="openstack/barbican-api-7b499b5bb-ft9bz" Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.259626 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/509b4973-14f1-43af-8a83-b4de55a65f5f-config-data-custom\") pod \"barbican-api-7b499b5bb-ft9bz\" (UID: \"509b4973-14f1-43af-8a83-b4de55a65f5f\") " pod="openstack/barbican-api-7b499b5bb-ft9bz" Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.259672 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/509b4973-14f1-43af-8a83-b4de55a65f5f-config-data\") pod \"barbican-api-7b499b5bb-ft9bz\" (UID: \"509b4973-14f1-43af-8a83-b4de55a65f5f\") " pod="openstack/barbican-api-7b499b5bb-ft9bz" Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.260136 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/509b4973-14f1-43af-8a83-b4de55a65f5f-logs\") pod \"barbican-api-7b499b5bb-ft9bz\" (UID: \"509b4973-14f1-43af-8a83-b4de55a65f5f\") " pod="openstack/barbican-api-7b499b5bb-ft9bz" Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 
09:48:18.263427 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/509b4973-14f1-43af-8a83-b4de55a65f5f-public-tls-certs\") pod \"barbican-api-7b499b5bb-ft9bz\" (UID: \"509b4973-14f1-43af-8a83-b4de55a65f5f\") " pod="openstack/barbican-api-7b499b5bb-ft9bz" Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.263552 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/509b4973-14f1-43af-8a83-b4de55a65f5f-internal-tls-certs\") pod \"barbican-api-7b499b5bb-ft9bz\" (UID: \"509b4973-14f1-43af-8a83-b4de55a65f5f\") " pod="openstack/barbican-api-7b499b5bb-ft9bz" Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.264214 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/509b4973-14f1-43af-8a83-b4de55a65f5f-config-data-custom\") pod \"barbican-api-7b499b5bb-ft9bz\" (UID: \"509b4973-14f1-43af-8a83-b4de55a65f5f\") " pod="openstack/barbican-api-7b499b5bb-ft9bz" Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.264548 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/509b4973-14f1-43af-8a83-b4de55a65f5f-combined-ca-bundle\") pod \"barbican-api-7b499b5bb-ft9bz\" (UID: \"509b4973-14f1-43af-8a83-b4de55a65f5f\") " pod="openstack/barbican-api-7b499b5bb-ft9bz" Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.264667 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/509b4973-14f1-43af-8a83-b4de55a65f5f-config-data\") pod \"barbican-api-7b499b5bb-ft9bz\" (UID: \"509b4973-14f1-43af-8a83-b4de55a65f5f\") " pod="openstack/barbican-api-7b499b5bb-ft9bz" Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.279948 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-s72x6\" (UniqueName: \"kubernetes.io/projected/509b4973-14f1-43af-8a83-b4de55a65f5f-kube-api-access-s72x6\") pod \"barbican-api-7b499b5bb-ft9bz\" (UID: \"509b4973-14f1-43af-8a83-b4de55a65f5f\") " pod="openstack/barbican-api-7b499b5bb-ft9bz" Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.428692 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-88c674dc-srwpw" event={"ID":"52481530-0f3c-48ac-abfe-2ca7b35d8b07","Type":"ContainerStarted","Data":"67b9cd96a758ebc0156c13dbb9afff178bad3b4d220d69526c21c62933deddc7"} Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.448251 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-768fcfbbc8-k7gmg" event={"ID":"0f532ad6-0a68-4c59-93b7-5e393908c008","Type":"ContainerStarted","Data":"acccd5b75495a7cf0dff434e6bb51ded3e5dcdde5c7bf63118dc7c3d6f5098e3"} Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.451362 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xf267" event={"ID":"14dd8f89-f4d8-4618-b417-f1a802f3517d","Type":"ContainerStarted","Data":"c01fa43644fe9c912dc990d213caf7cee69a076315a384ef3be92ad3525d60fe"} Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.455627 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fcd6bfc54-kgphr" event={"ID":"590489a5-8799-4ae0-8f08-d2d1fc0480f4","Type":"ContainerStarted","Data":"5d174a91e3cb7194296c19fd288717dc0b90734ae7160787a5dc7568c62e69d5"} Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.460417 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24665d64-cd24-4bb5-a1e4-48c734e7c525","Type":"ContainerStarted","Data":"08054f07c39e4eef4dfc41ccda244fedb3295aad1a63ac833dece582fccea380"} Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.460599 4835 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="24665d64-cd24-4bb5-a1e4-48c734e7c525" containerName="ceilometer-central-agent" containerID="cri-o://184d89364695c9a53b6bd8b307402f0d9347bb174b02395966e18a44e6fe8a2a" gracePeriod=30 Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.460691 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.460734 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="24665d64-cd24-4bb5-a1e4-48c734e7c525" containerName="proxy-httpd" containerID="cri-o://08054f07c39e4eef4dfc41ccda244fedb3295aad1a63ac833dece582fccea380" gracePeriod=30 Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.460815 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="24665d64-cd24-4bb5-a1e4-48c734e7c525" containerName="sg-core" containerID="cri-o://e3b748a14f24225a9186588b569348543138f7e531869e7354ef71d69a4b25b3" gracePeriod=30 Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.460850 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="24665d64-cd24-4bb5-a1e4-48c734e7c525" containerName="ceilometer-notification-agent" containerID="cri-o://e35ff45d123572ad5f48d800141817f914920bd04a77a91bb7fd659079dec340" gracePeriod=30 Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.463587 4835 generic.go:334] "Generic (PLEG): container finished" podID="2161ebbe-ed84-49a9-b0b1-b74304fd8b28" containerID="b7b8402d3c5a091035650dff3c5aa76455642f20a9ee98016450c1f7bbf0fa07" exitCode=0 Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.463648 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mkr8b" event={"ID":"2161ebbe-ed84-49a9-b0b1-b74304fd8b28","Type":"ContainerDied","Data":"b7b8402d3c5a091035650dff3c5aa76455642f20a9ee98016450c1f7bbf0fa07"} Mar 19 09:48:18 
crc kubenswrapper[4835]: I0319 09:48:18.467622 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-fsxdb" event={"ID":"1d663402-04a2-42df-b860-d5c3568971d8","Type":"ContainerStarted","Data":"cd23a391f6f1bcb16f050a5e007a33911751b10c309a4fa14e19cb11d71fe375"} Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.479473 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7b499b5bb-ft9bz" Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.497712 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.509964276 podStartE2EDuration="1m2.497685119s" podCreationTimestamp="2026-03-19 09:47:16 +0000 UTC" firstStartedPulling="2026-03-19 09:47:18.954325372 +0000 UTC m=+1493.802923959" lastFinishedPulling="2026-03-19 09:48:16.942046215 +0000 UTC m=+1551.790644802" observedRunningTime="2026-03-19 09:48:18.495542842 +0000 UTC m=+1553.344141429" watchObservedRunningTime="2026-03-19 09:48:18.497685119 +0000 UTC m=+1553.346283706" Mar 19 09:48:18 crc kubenswrapper[4835]: I0319 09:48:18.886197 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-b7bcd9fc8-zjbvb" Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.216797 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5c75cdcf85-fhsqj"] Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.217497 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5c75cdcf85-fhsqj" podUID="77af32b3-205b-4928-bed1-719937fee8fa" containerName="neutron-api" containerID="cri-o://913f1f440bdd8254f62a7f864525ce98361ace610a879f24685813adab987e7e" gracePeriod=30 Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.217685 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5c75cdcf85-fhsqj" 
podUID="77af32b3-205b-4928-bed1-719937fee8fa" containerName="neutron-httpd" containerID="cri-o://d0b2bb052eea070f42b7e38055bcfa0b4a4bde4272a400cfe87afd8664681ca2" gracePeriod=30 Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.241581 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5d99fb9659-b272b"] Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.243913 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5d99fb9659-b272b" Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.291999 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5d99fb9659-b272b"] Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.300690 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/decfbe60-e04e-46cc-9228-33bb1c831849-internal-tls-certs\") pod \"neutron-5d99fb9659-b272b\" (UID: \"decfbe60-e04e-46cc-9228-33bb1c831849\") " pod="openstack/neutron-5d99fb9659-b272b" Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.300968 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/decfbe60-e04e-46cc-9228-33bb1c831849-config\") pod \"neutron-5d99fb9659-b272b\" (UID: \"decfbe60-e04e-46cc-9228-33bb1c831849\") " pod="openstack/neutron-5d99fb9659-b272b" Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.301092 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/decfbe60-e04e-46cc-9228-33bb1c831849-ovndb-tls-certs\") pod \"neutron-5d99fb9659-b272b\" (UID: \"decfbe60-e04e-46cc-9228-33bb1c831849\") " pod="openstack/neutron-5d99fb9659-b272b" Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.301260 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/decfbe60-e04e-46cc-9228-33bb1c831849-public-tls-certs\") pod \"neutron-5d99fb9659-b272b\" (UID: \"decfbe60-e04e-46cc-9228-33bb1c831849\") " pod="openstack/neutron-5d99fb9659-b272b" Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.301483 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/decfbe60-e04e-46cc-9228-33bb1c831849-combined-ca-bundle\") pod \"neutron-5d99fb9659-b272b\" (UID: \"decfbe60-e04e-46cc-9228-33bb1c831849\") " pod="openstack/neutron-5d99fb9659-b272b" Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.301584 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/decfbe60-e04e-46cc-9228-33bb1c831849-httpd-config\") pod \"neutron-5d99fb9659-b272b\" (UID: \"decfbe60-e04e-46cc-9228-33bb1c831849\") " pod="openstack/neutron-5d99fb9659-b272b" Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.301764 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d75h7\" (UniqueName: \"kubernetes.io/projected/decfbe60-e04e-46cc-9228-33bb1c831849-kube-api-access-d75h7\") pod \"neutron-5d99fb9659-b272b\" (UID: \"decfbe60-e04e-46cc-9228-33bb1c831849\") " pod="openstack/neutron-5d99fb9659-b272b" Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.359355 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7b499b5bb-ft9bz"] Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.370305 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-5c75cdcf85-fhsqj" podUID="77af32b3-205b-4928-bed1-719937fee8fa" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.201:9696/\": read tcp 
10.217.0.2:60838->10.217.0.201:9696: read: connection reset by peer" Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.404118 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/decfbe60-e04e-46cc-9228-33bb1c831849-combined-ca-bundle\") pod \"neutron-5d99fb9659-b272b\" (UID: \"decfbe60-e04e-46cc-9228-33bb1c831849\") " pod="openstack/neutron-5d99fb9659-b272b" Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.404162 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/decfbe60-e04e-46cc-9228-33bb1c831849-httpd-config\") pod \"neutron-5d99fb9659-b272b\" (UID: \"decfbe60-e04e-46cc-9228-33bb1c831849\") " pod="openstack/neutron-5d99fb9659-b272b" Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.404202 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d75h7\" (UniqueName: \"kubernetes.io/projected/decfbe60-e04e-46cc-9228-33bb1c831849-kube-api-access-d75h7\") pod \"neutron-5d99fb9659-b272b\" (UID: \"decfbe60-e04e-46cc-9228-33bb1c831849\") " pod="openstack/neutron-5d99fb9659-b272b" Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.404306 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/decfbe60-e04e-46cc-9228-33bb1c831849-internal-tls-certs\") pod \"neutron-5d99fb9659-b272b\" (UID: \"decfbe60-e04e-46cc-9228-33bb1c831849\") " pod="openstack/neutron-5d99fb9659-b272b" Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.404335 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/decfbe60-e04e-46cc-9228-33bb1c831849-config\") pod \"neutron-5d99fb9659-b272b\" (UID: \"decfbe60-e04e-46cc-9228-33bb1c831849\") " pod="openstack/neutron-5d99fb9659-b272b" Mar 19 09:48:19 crc 
kubenswrapper[4835]: I0319 09:48:19.404368 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/decfbe60-e04e-46cc-9228-33bb1c831849-ovndb-tls-certs\") pod \"neutron-5d99fb9659-b272b\" (UID: \"decfbe60-e04e-46cc-9228-33bb1c831849\") " pod="openstack/neutron-5d99fb9659-b272b" Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.404388 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/decfbe60-e04e-46cc-9228-33bb1c831849-public-tls-certs\") pod \"neutron-5d99fb9659-b272b\" (UID: \"decfbe60-e04e-46cc-9228-33bb1c831849\") " pod="openstack/neutron-5d99fb9659-b272b" Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.411035 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/decfbe60-e04e-46cc-9228-33bb1c831849-ovndb-tls-certs\") pod \"neutron-5d99fb9659-b272b\" (UID: \"decfbe60-e04e-46cc-9228-33bb1c831849\") " pod="openstack/neutron-5d99fb9659-b272b" Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.411059 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/decfbe60-e04e-46cc-9228-33bb1c831849-public-tls-certs\") pod \"neutron-5d99fb9659-b272b\" (UID: \"decfbe60-e04e-46cc-9228-33bb1c831849\") " pod="openstack/neutron-5d99fb9659-b272b" Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.411898 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/decfbe60-e04e-46cc-9228-33bb1c831849-combined-ca-bundle\") pod \"neutron-5d99fb9659-b272b\" (UID: \"decfbe60-e04e-46cc-9228-33bb1c831849\") " pod="openstack/neutron-5d99fb9659-b272b" Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.417018 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/decfbe60-e04e-46cc-9228-33bb1c831849-internal-tls-certs\") pod \"neutron-5d99fb9659-b272b\" (UID: \"decfbe60-e04e-46cc-9228-33bb1c831849\") " pod="openstack/neutron-5d99fb9659-b272b" Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.417775 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/decfbe60-e04e-46cc-9228-33bb1c831849-httpd-config\") pod \"neutron-5d99fb9659-b272b\" (UID: \"decfbe60-e04e-46cc-9228-33bb1c831849\") " pod="openstack/neutron-5d99fb9659-b272b" Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.426476 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d75h7\" (UniqueName: \"kubernetes.io/projected/decfbe60-e04e-46cc-9228-33bb1c831849-kube-api-access-d75h7\") pod \"neutron-5d99fb9659-b272b\" (UID: \"decfbe60-e04e-46cc-9228-33bb1c831849\") " pod="openstack/neutron-5d99fb9659-b272b" Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.431634 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/decfbe60-e04e-46cc-9228-33bb1c831849-config\") pod \"neutron-5d99fb9659-b272b\" (UID: \"decfbe60-e04e-46cc-9228-33bb1c831849\") " pod="openstack/neutron-5d99fb9659-b272b" Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.483223 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fcd6bfc54-kgphr" event={"ID":"590489a5-8799-4ae0-8f08-d2d1fc0480f4","Type":"ContainerStarted","Data":"542bfdbad4e6ae6ac640534950a273115c1e601a5733933772a0f6544b09a1b8"} Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.487705 4835 generic.go:334] "Generic (PLEG): container finished" podID="24665d64-cd24-4bb5-a1e4-48c734e7c525" containerID="08054f07c39e4eef4dfc41ccda244fedb3295aad1a63ac833dece582fccea380" exitCode=0 Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.487733 4835 
generic.go:334] "Generic (PLEG): container finished" podID="24665d64-cd24-4bb5-a1e4-48c734e7c525" containerID="e3b748a14f24225a9186588b569348543138f7e531869e7354ef71d69a4b25b3" exitCode=2 Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.487769 4835 generic.go:334] "Generic (PLEG): container finished" podID="24665d64-cd24-4bb5-a1e4-48c734e7c525" containerID="184d89364695c9a53b6bd8b307402f0d9347bb174b02395966e18a44e6fe8a2a" exitCode=0 Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.488049 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24665d64-cd24-4bb5-a1e4-48c734e7c525","Type":"ContainerDied","Data":"08054f07c39e4eef4dfc41ccda244fedb3295aad1a63ac833dece582fccea380"} Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.488170 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24665d64-cd24-4bb5-a1e4-48c734e7c525","Type":"ContainerDied","Data":"e3b748a14f24225a9186588b569348543138f7e531869e7354ef71d69a4b25b3"} Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.488190 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24665d64-cd24-4bb5-a1e4-48c734e7c525","Type":"ContainerDied","Data":"184d89364695c9a53b6bd8b307402f0d9347bb174b02395966e18a44e6fe8a2a"} Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.496587 4835 generic.go:334] "Generic (PLEG): container finished" podID="77af32b3-205b-4928-bed1-719937fee8fa" containerID="d0b2bb052eea070f42b7e38055bcfa0b4a4bde4272a400cfe87afd8664681ca2" exitCode=0 Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.496656 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c75cdcf85-fhsqj" event={"ID":"77af32b3-205b-4928-bed1-719937fee8fa","Type":"ContainerDied","Data":"d0b2bb052eea070f42b7e38055bcfa0b4a4bde4272a400cfe87afd8664681ca2"} Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.500340 4835 generic.go:334] "Generic (PLEG): 
container finished" podID="1d663402-04a2-42df-b860-d5c3568971d8" containerID="03ea1528931aa591c354c454985e3d886b88d000748920e9848aa73bd72f3344" exitCode=0 Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.500530 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-fsxdb" event={"ID":"1d663402-04a2-42df-b860-d5c3568971d8","Type":"ContainerDied","Data":"03ea1528931aa591c354c454985e3d886b88d000748920e9848aa73bd72f3344"} Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.505118 4835 generic.go:334] "Generic (PLEG): container finished" podID="14dd8f89-f4d8-4618-b417-f1a802f3517d" containerID="245978a12c18c0619eaf73e3d094eea099dad2716fdc8e8fb3d5e3a0dffe57a8" exitCode=0 Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.505211 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xf267" event={"ID":"14dd8f89-f4d8-4618-b417-f1a802f3517d","Type":"ContainerDied","Data":"245978a12c18c0619eaf73e3d094eea099dad2716fdc8e8fb3d5e3a0dffe57a8"} Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.568106 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5d99fb9659-b272b" Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.859287 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-4jmpv" Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.914900 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89dd21d1-34a1-4d91-a7cb-32840eda818e-config-data\") pod \"89dd21d1-34a1-4d91-a7cb-32840eda818e\" (UID: \"89dd21d1-34a1-4d91-a7cb-32840eda818e\") " Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.915172 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgbf2\" (UniqueName: \"kubernetes.io/projected/89dd21d1-34a1-4d91-a7cb-32840eda818e-kube-api-access-qgbf2\") pod \"89dd21d1-34a1-4d91-a7cb-32840eda818e\" (UID: \"89dd21d1-34a1-4d91-a7cb-32840eda818e\") " Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.915248 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89dd21d1-34a1-4d91-a7cb-32840eda818e-combined-ca-bundle\") pod \"89dd21d1-34a1-4d91-a7cb-32840eda818e\" (UID: \"89dd21d1-34a1-4d91-a7cb-32840eda818e\") " Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.919886 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89dd21d1-34a1-4d91-a7cb-32840eda818e-kube-api-access-qgbf2" (OuterVolumeSpecName: "kube-api-access-qgbf2") pod "89dd21d1-34a1-4d91-a7cb-32840eda818e" (UID: "89dd21d1-34a1-4d91-a7cb-32840eda818e"). InnerVolumeSpecName "kube-api-access-qgbf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:48:19 crc kubenswrapper[4835]: I0319 09:48:19.967651 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89dd21d1-34a1-4d91-a7cb-32840eda818e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "89dd21d1-34a1-4d91-a7cb-32840eda818e" (UID: "89dd21d1-34a1-4d91-a7cb-32840eda818e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.018536 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgbf2\" (UniqueName: \"kubernetes.io/projected/89dd21d1-34a1-4d91-a7cb-32840eda818e-kube-api-access-qgbf2\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.018810 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89dd21d1-34a1-4d91-a7cb-32840eda818e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.024046 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89dd21d1-34a1-4d91-a7cb-32840eda818e-config-data" (OuterVolumeSpecName: "config-data") pod "89dd21d1-34a1-4d91-a7cb-32840eda818e" (UID: "89dd21d1-34a1-4d91-a7cb-32840eda818e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.121690 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89dd21d1-34a1-4d91-a7cb-32840eda818e-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.281307 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-mkr8b" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.326082 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jsvr\" (UniqueName: \"kubernetes.io/projected/2161ebbe-ed84-49a9-b0b1-b74304fd8b28-kube-api-access-8jsvr\") pod \"2161ebbe-ed84-49a9-b0b1-b74304fd8b28\" (UID: \"2161ebbe-ed84-49a9-b0b1-b74304fd8b28\") " Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.326161 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2161ebbe-ed84-49a9-b0b1-b74304fd8b28-db-sync-config-data\") pod \"2161ebbe-ed84-49a9-b0b1-b74304fd8b28\" (UID: \"2161ebbe-ed84-49a9-b0b1-b74304fd8b28\") " Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.326184 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2161ebbe-ed84-49a9-b0b1-b74304fd8b28-config-data\") pod \"2161ebbe-ed84-49a9-b0b1-b74304fd8b28\" (UID: \"2161ebbe-ed84-49a9-b0b1-b74304fd8b28\") " Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.326628 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2161ebbe-ed84-49a9-b0b1-b74304fd8b28-scripts\") pod \"2161ebbe-ed84-49a9-b0b1-b74304fd8b28\" (UID: \"2161ebbe-ed84-49a9-b0b1-b74304fd8b28\") " Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.327311 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2161ebbe-ed84-49a9-b0b1-b74304fd8b28-etc-machine-id\") pod \"2161ebbe-ed84-49a9-b0b1-b74304fd8b28\" (UID: \"2161ebbe-ed84-49a9-b0b1-b74304fd8b28\") " Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.327355 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/2161ebbe-ed84-49a9-b0b1-b74304fd8b28-combined-ca-bundle\") pod \"2161ebbe-ed84-49a9-b0b1-b74304fd8b28\" (UID: \"2161ebbe-ed84-49a9-b0b1-b74304fd8b28\") " Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.327415 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2161ebbe-ed84-49a9-b0b1-b74304fd8b28-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2161ebbe-ed84-49a9-b0b1-b74304fd8b28" (UID: "2161ebbe-ed84-49a9-b0b1-b74304fd8b28"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.328383 4835 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2161ebbe-ed84-49a9-b0b1-b74304fd8b28-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.331058 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2161ebbe-ed84-49a9-b0b1-b74304fd8b28-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2161ebbe-ed84-49a9-b0b1-b74304fd8b28" (UID: "2161ebbe-ed84-49a9-b0b1-b74304fd8b28"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.331840 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2161ebbe-ed84-49a9-b0b1-b74304fd8b28-scripts" (OuterVolumeSpecName: "scripts") pod "2161ebbe-ed84-49a9-b0b1-b74304fd8b28" (UID: "2161ebbe-ed84-49a9-b0b1-b74304fd8b28"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.333920 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2161ebbe-ed84-49a9-b0b1-b74304fd8b28-kube-api-access-8jsvr" (OuterVolumeSpecName: "kube-api-access-8jsvr") pod "2161ebbe-ed84-49a9-b0b1-b74304fd8b28" (UID: "2161ebbe-ed84-49a9-b0b1-b74304fd8b28"). InnerVolumeSpecName "kube-api-access-8jsvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.377539 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2161ebbe-ed84-49a9-b0b1-b74304fd8b28-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2161ebbe-ed84-49a9-b0b1-b74304fd8b28" (UID: "2161ebbe-ed84-49a9-b0b1-b74304fd8b28"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.432917 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jsvr\" (UniqueName: \"kubernetes.io/projected/2161ebbe-ed84-49a9-b0b1-b74304fd8b28-kube-api-access-8jsvr\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.432950 4835 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2161ebbe-ed84-49a9-b0b1-b74304fd8b28-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.432963 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2161ebbe-ed84-49a9-b0b1-b74304fd8b28-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.432975 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2161ebbe-ed84-49a9-b0b1-b74304fd8b28-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.442352 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2161ebbe-ed84-49a9-b0b1-b74304fd8b28-config-data" (OuterVolumeSpecName: "config-data") pod "2161ebbe-ed84-49a9-b0b1-b74304fd8b28" (UID: "2161ebbe-ed84-49a9-b0b1-b74304fd8b28"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.534575 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2161ebbe-ed84-49a9-b0b1-b74304fd8b28-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.559351 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-4jmpv" event={"ID":"89dd21d1-34a1-4d91-a7cb-32840eda818e","Type":"ContainerDied","Data":"9e0a80a50e439143f00c83889f37a8cc4235fd62787eea1fef3ff29f8168e834"} Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.559390 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e0a80a50e439143f00c83889f37a8cc4235fd62787eea1fef3ff29f8168e834" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.559451 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-4jmpv" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.574772 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b499b5bb-ft9bz" event={"ID":"509b4973-14f1-43af-8a83-b4de55a65f5f","Type":"ContainerStarted","Data":"68defb58b343bc945d4420fce6333aaec2803dd62b9e9b43e579c155af2fcfa8"} Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.577464 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mkr8b" event={"ID":"2161ebbe-ed84-49a9-b0b1-b74304fd8b28","Type":"ContainerDied","Data":"cb53017e265d6e0ef886b00800733b13bf896f0bfdfdb47fa62e9940428c2938"} Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.577496 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb53017e265d6e0ef886b00800733b13bf896f0bfdfdb47fa62e9940428c2938" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.577867 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-mkr8b" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.792801 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 09:48:20 crc kubenswrapper[4835]: E0319 09:48:20.793361 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89dd21d1-34a1-4d91-a7cb-32840eda818e" containerName="heat-db-sync" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.793385 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="89dd21d1-34a1-4d91-a7cb-32840eda818e" containerName="heat-db-sync" Mar 19 09:48:20 crc kubenswrapper[4835]: E0319 09:48:20.793406 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2161ebbe-ed84-49a9-b0b1-b74304fd8b28" containerName="cinder-db-sync" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.793413 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2161ebbe-ed84-49a9-b0b1-b74304fd8b28" containerName="cinder-db-sync" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.793662 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="2161ebbe-ed84-49a9-b0b1-b74304fd8b28" containerName="cinder-db-sync" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.793704 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="89dd21d1-34a1-4d91-a7cb-32840eda818e" containerName="heat-db-sync" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.818838 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.840198 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-z9xjq" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.840819 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.841065 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.841853 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.855678 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2535c732-f740-4456-90e6-243f2218cd56-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2535c732-f740-4456-90e6-243f2218cd56\") " pod="openstack/cinder-scheduler-0" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.862444 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2535c732-f740-4456-90e6-243f2218cd56-config-data\") pod \"cinder-scheduler-0\" (UID: \"2535c732-f740-4456-90e6-243f2218cd56\") " pod="openstack/cinder-scheduler-0" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.862572 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2535c732-f740-4456-90e6-243f2218cd56-scripts\") pod \"cinder-scheduler-0\" (UID: \"2535c732-f740-4456-90e6-243f2218cd56\") " pod="openstack/cinder-scheduler-0" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.862846 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2535c732-f740-4456-90e6-243f2218cd56-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2535c732-f740-4456-90e6-243f2218cd56\") " pod="openstack/cinder-scheduler-0" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.862993 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2535c732-f740-4456-90e6-243f2218cd56-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2535c732-f740-4456-90e6-243f2218cd56\") " pod="openstack/cinder-scheduler-0" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.863033 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-979bt\" (UniqueName: \"kubernetes.io/projected/2535c732-f740-4456-90e6-243f2218cd56-kube-api-access-979bt\") pod \"cinder-scheduler-0\" (UID: \"2535c732-f740-4456-90e6-243f2218cd56\") " pod="openstack/cinder-scheduler-0" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.972239 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2535c732-f740-4456-90e6-243f2218cd56-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2535c732-f740-4456-90e6-243f2218cd56\") " pod="openstack/cinder-scheduler-0" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.972357 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2535c732-f740-4456-90e6-243f2218cd56-config-data\") pod \"cinder-scheduler-0\" (UID: \"2535c732-f740-4456-90e6-243f2218cd56\") " pod="openstack/cinder-scheduler-0" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.972414 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2535c732-f740-4456-90e6-243f2218cd56-scripts\") pod \"cinder-scheduler-0\" (UID: \"2535c732-f740-4456-90e6-243f2218cd56\") " pod="openstack/cinder-scheduler-0" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.972459 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2535c732-f740-4456-90e6-243f2218cd56-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2535c732-f740-4456-90e6-243f2218cd56\") " pod="openstack/cinder-scheduler-0" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.972507 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2535c732-f740-4456-90e6-243f2218cd56-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2535c732-f740-4456-90e6-243f2218cd56\") " pod="openstack/cinder-scheduler-0" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.972528 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-979bt\" (UniqueName: \"kubernetes.io/projected/2535c732-f740-4456-90e6-243f2218cd56-kube-api-access-979bt\") pod \"cinder-scheduler-0\" (UID: \"2535c732-f740-4456-90e6-243f2218cd56\") " pod="openstack/cinder-scheduler-0" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.975609 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2535c732-f740-4456-90e6-243f2218cd56-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2535c732-f740-4456-90e6-243f2218cd56\") " pod="openstack/cinder-scheduler-0" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.982537 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2535c732-f740-4456-90e6-243f2218cd56-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"2535c732-f740-4456-90e6-243f2218cd56\") " pod="openstack/cinder-scheduler-0" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.996218 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2535c732-f740-4456-90e6-243f2218cd56-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2535c732-f740-4456-90e6-243f2218cd56\") " pod="openstack/cinder-scheduler-0" Mar 19 09:48:20 crc kubenswrapper[4835]: I0319 09:48:20.996838 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2535c732-f740-4456-90e6-243f2218cd56-config-data\") pod \"cinder-scheduler-0\" (UID: \"2535c732-f740-4456-90e6-243f2218cd56\") " pod="openstack/cinder-scheduler-0" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.021192 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2535c732-f740-4456-90e6-243f2218cd56-scripts\") pod \"cinder-scheduler-0\" (UID: \"2535c732-f740-4456-90e6-243f2218cd56\") " pod="openstack/cinder-scheduler-0" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.043898 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-979bt\" (UniqueName: \"kubernetes.io/projected/2535c732-f740-4456-90e6-243f2218cd56-kube-api-access-979bt\") pod \"cinder-scheduler-0\" (UID: \"2535c732-f740-4456-90e6-243f2218cd56\") " pod="openstack/cinder-scheduler-0" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.058144 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.160310 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-fsxdb"] Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.233488 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-jc254"] Mar 
19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.241453 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.246530 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-jc254" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.259531 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-jc254"] Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.285619 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.288852 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.293098 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce-config\") pod \"dnsmasq-dns-5c9776ccc5-jc254\" (UID: \"0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jc254" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.293165 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjnfl\" (UniqueName: \"kubernetes.io/projected/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce-kube-api-access-kjnfl\") pod \"dnsmasq-dns-5c9776ccc5-jc254\" (UID: \"0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jc254" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.293247 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-jc254\" (UID: 
\"0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jc254" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.293277 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-jc254\" (UID: \"0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jc254" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.293337 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-jc254\" (UID: \"0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jc254" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.293364 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-jc254\" (UID: \"0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jc254" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.297340 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.339792 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.382509 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5d99fb9659-b272b"] Mar 19 09:48:21 crc kubenswrapper[4835]: W0319 09:48:21.413064 4835 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddecfbe60_e04e_46cc_9228_33bb1c831849.slice/crio-bf376e17b9a56eb6419d585bead819f71e02af45c3824a72212055fb9bb7c2b9 WatchSource:0}: Error finding container bf376e17b9a56eb6419d585bead819f71e02af45c3824a72212055fb9bb7c2b9: Status 404 returned error can't find the container with id bf376e17b9a56eb6419d585bead819f71e02af45c3824a72212055fb9bb7c2b9 Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.414821 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46107793-2582-4626-bf6e-c3e990e07ee4-scripts\") pod \"cinder-api-0\" (UID: \"46107793-2582-4626-bf6e-c3e990e07ee4\") " pod="openstack/cinder-api-0" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.414855 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46107793-2582-4626-bf6e-c3e990e07ee4-config-data-custom\") pod \"cinder-api-0\" (UID: \"46107793-2582-4626-bf6e-c3e990e07ee4\") " pod="openstack/cinder-api-0" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.414899 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-jc254\" (UID: \"0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jc254" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.414926 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46107793-2582-4626-bf6e-c3e990e07ee4-config-data\") pod \"cinder-api-0\" (UID: \"46107793-2582-4626-bf6e-c3e990e07ee4\") " pod="openstack/cinder-api-0" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.415015 4835 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-jc254\" (UID: \"0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jc254" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.415128 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvvz4\" (UniqueName: \"kubernetes.io/projected/46107793-2582-4626-bf6e-c3e990e07ee4-kube-api-access-nvvz4\") pod \"cinder-api-0\" (UID: \"46107793-2582-4626-bf6e-c3e990e07ee4\") " pod="openstack/cinder-api-0" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.415162 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-jc254\" (UID: \"0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jc254" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.415206 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/46107793-2582-4626-bf6e-c3e990e07ee4-etc-machine-id\") pod \"cinder-api-0\" (UID: \"46107793-2582-4626-bf6e-c3e990e07ee4\") " pod="openstack/cinder-api-0" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.415234 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-jc254\" (UID: \"0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jc254" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.415346 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46107793-2582-4626-bf6e-c3e990e07ee4-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"46107793-2582-4626-bf6e-c3e990e07ee4\") " pod="openstack/cinder-api-0" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.415620 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce-config\") pod \"dnsmasq-dns-5c9776ccc5-jc254\" (UID: \"0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jc254" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.415681 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjnfl\" (UniqueName: \"kubernetes.io/projected/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce-kube-api-access-kjnfl\") pod \"dnsmasq-dns-5c9776ccc5-jc254\" (UID: \"0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jc254" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.415706 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46107793-2582-4626-bf6e-c3e990e07ee4-logs\") pod \"cinder-api-0\" (UID: \"46107793-2582-4626-bf6e-c3e990e07ee4\") " pod="openstack/cinder-api-0" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.415713 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-jc254\" (UID: \"0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jc254" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.416810 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-jc254\" (UID: \"0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jc254" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.418092 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-jc254\" (UID: \"0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jc254" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.418317 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce-config\") pod \"dnsmasq-dns-5c9776ccc5-jc254\" (UID: \"0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jc254" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.439526 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-jc254\" (UID: \"0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jc254" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.480767 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjnfl\" (UniqueName: \"kubernetes.io/projected/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce-kube-api-access-kjnfl\") pod \"dnsmasq-dns-5c9776ccc5-jc254\" (UID: \"0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce\") " pod="openstack/dnsmasq-dns-5c9776ccc5-jc254" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.519514 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46107793-2582-4626-bf6e-c3e990e07ee4-logs\") pod \"cinder-api-0\" (UID: 
\"46107793-2582-4626-bf6e-c3e990e07ee4\") " pod="openstack/cinder-api-0" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.519566 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46107793-2582-4626-bf6e-c3e990e07ee4-scripts\") pod \"cinder-api-0\" (UID: \"46107793-2582-4626-bf6e-c3e990e07ee4\") " pod="openstack/cinder-api-0" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.519580 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46107793-2582-4626-bf6e-c3e990e07ee4-config-data-custom\") pod \"cinder-api-0\" (UID: \"46107793-2582-4626-bf6e-c3e990e07ee4\") " pod="openstack/cinder-api-0" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.519623 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46107793-2582-4626-bf6e-c3e990e07ee4-config-data\") pod \"cinder-api-0\" (UID: \"46107793-2582-4626-bf6e-c3e990e07ee4\") " pod="openstack/cinder-api-0" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.519692 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvvz4\" (UniqueName: \"kubernetes.io/projected/46107793-2582-4626-bf6e-c3e990e07ee4-kube-api-access-nvvz4\") pod \"cinder-api-0\" (UID: \"46107793-2582-4626-bf6e-c3e990e07ee4\") " pod="openstack/cinder-api-0" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.519718 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/46107793-2582-4626-bf6e-c3e990e07ee4-etc-machine-id\") pod \"cinder-api-0\" (UID: \"46107793-2582-4626-bf6e-c3e990e07ee4\") " pod="openstack/cinder-api-0" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.519789 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46107793-2582-4626-bf6e-c3e990e07ee4-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"46107793-2582-4626-bf6e-c3e990e07ee4\") " pod="openstack/cinder-api-0" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.520530 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46107793-2582-4626-bf6e-c3e990e07ee4-logs\") pod \"cinder-api-0\" (UID: \"46107793-2582-4626-bf6e-c3e990e07ee4\") " pod="openstack/cinder-api-0" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.520577 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/46107793-2582-4626-bf6e-c3e990e07ee4-etc-machine-id\") pod \"cinder-api-0\" (UID: \"46107793-2582-4626-bf6e-c3e990e07ee4\") " pod="openstack/cinder-api-0" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.528633 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46107793-2582-4626-bf6e-c3e990e07ee4-scripts\") pod \"cinder-api-0\" (UID: \"46107793-2582-4626-bf6e-c3e990e07ee4\") " pod="openstack/cinder-api-0" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.529450 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46107793-2582-4626-bf6e-c3e990e07ee4-config-data-custom\") pod \"cinder-api-0\" (UID: \"46107793-2582-4626-bf6e-c3e990e07ee4\") " pod="openstack/cinder-api-0" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.534236 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46107793-2582-4626-bf6e-c3e990e07ee4-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"46107793-2582-4626-bf6e-c3e990e07ee4\") " pod="openstack/cinder-api-0" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.557918 
4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46107793-2582-4626-bf6e-c3e990e07ee4-config-data\") pod \"cinder-api-0\" (UID: \"46107793-2582-4626-bf6e-c3e990e07ee4\") " pod="openstack/cinder-api-0" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.596354 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvvz4\" (UniqueName: \"kubernetes.io/projected/46107793-2582-4626-bf6e-c3e990e07ee4-kube-api-access-nvvz4\") pod \"cinder-api-0\" (UID: \"46107793-2582-4626-bf6e-c3e990e07ee4\") " pod="openstack/cinder-api-0" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.607232 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d99fb9659-b272b" event={"ID":"decfbe60-e04e-46cc-9228-33bb1c831849","Type":"ContainerStarted","Data":"bf376e17b9a56eb6419d585bead819f71e02af45c3824a72212055fb9bb7c2b9"} Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.626331 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xf267" event={"ID":"14dd8f89-f4d8-4618-b417-f1a802f3517d","Type":"ContainerStarted","Data":"faf638b3da1d3aa319ffc3728b067d666da4fc5bbf3b3708dab6100b717cd82b"} Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.627854 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-jc254" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.647010 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fcd6bfc54-kgphr" event={"ID":"590489a5-8799-4ae0-8f08-d2d1fc0480f4","Type":"ContainerStarted","Data":"b609a8f7c4da2efdabb4bbef3d49b0da202614154807da7d91dc0e8e09b11725"} Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.647204 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6fcd6bfc54-kgphr" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.647258 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6fcd6bfc54-kgphr" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.647941 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.681573 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b499b5bb-ft9bz" event={"ID":"509b4973-14f1-43af-8a83-b4de55a65f5f","Type":"ContainerStarted","Data":"f89db329c6ad853729faab44996b748ffa23f6dc6af0abb6186e48a713cec778"} Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.681614 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b499b5bb-ft9bz" event={"ID":"509b4973-14f1-43af-8a83-b4de55a65f5f","Type":"ContainerStarted","Data":"fbd40ede5d146aaa5ef2b03923326d1878b61ed8da36652ab72943be83f684ed"} Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.681652 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7b499b5bb-ft9bz" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.682777 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7b499b5bb-ft9bz" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.709336 4835 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-85ff748b95-fsxdb" event={"ID":"1d663402-04a2-42df-b860-d5c3568971d8","Type":"ContainerStarted","Data":"87135838e4ceefdcd115401467838135533c00e10f68b6aff2354e35d13fadbc"} Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.709569 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-fsxdb" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.757455 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6fcd6bfc54-kgphr" podStartSLOduration=6.757429809 podStartE2EDuration="6.757429809s" podCreationTimestamp="2026-03-19 09:48:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:48:21.742569414 +0000 UTC m=+1556.591168001" watchObservedRunningTime="2026-03-19 09:48:21.757429809 +0000 UTC m=+1556.606028406" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.769286 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-88c674dc-srwpw" event={"ID":"52481530-0f3c-48ac-abfe-2ca7b35d8b07","Type":"ContainerStarted","Data":"4ea5b69fe754d2caeca9dac8aec52b3b8a4b84fa610bb00579fa2ef13bae938f"} Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.769951 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-5c75cdcf85-fhsqj" podUID="77af32b3-205b-4928-bed1-719937fee8fa" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.201:9696/\": dial tcp 10.217.0.201:9696: connect: connection refused" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.783005 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-768fcfbbc8-k7gmg" event={"ID":"0f532ad6-0a68-4c59-93b7-5e393908c008","Type":"ContainerStarted","Data":"494a945c15ceb3ec4d0e68c1b4695202a03899fd9cbd3127dfd59656dd68ba85"} Mar 19 09:48:21 crc 
kubenswrapper[4835]: I0319 09:48:21.793084 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7b499b5bb-ft9bz" podStartSLOduration=4.793067758 podStartE2EDuration="4.793067758s" podCreationTimestamp="2026-03-19 09:48:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:48:21.773928898 +0000 UTC m=+1556.622527485" watchObservedRunningTime="2026-03-19 09:48:21.793067758 +0000 UTC m=+1556.641666345" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.825497 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-fsxdb" podStartSLOduration=7.825477581 podStartE2EDuration="7.825477581s" podCreationTimestamp="2026-03-19 09:48:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:48:21.814127559 +0000 UTC m=+1556.662726146" watchObservedRunningTime="2026-03-19 09:48:21.825477581 +0000 UTC m=+1556.674076168" Mar 19 09:48:21 crc kubenswrapper[4835]: I0319 09:48:21.884505 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-768fcfbbc8-k7gmg" podStartSLOduration=5.67454334 podStartE2EDuration="7.884482812s" podCreationTimestamp="2026-03-19 09:48:14 +0000 UTC" firstStartedPulling="2026-03-19 09:48:18.112992498 +0000 UTC m=+1552.961591085" lastFinishedPulling="2026-03-19 09:48:20.32293198 +0000 UTC m=+1555.171530557" observedRunningTime="2026-03-19 09:48:21.832047406 +0000 UTC m=+1556.680645993" watchObservedRunningTime="2026-03-19 09:48:21.884482812 +0000 UTC m=+1556.733081399" Mar 19 09:48:22 crc kubenswrapper[4835]: I0319 09:48:22.293837 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 09:48:22 crc kubenswrapper[4835]: W0319 09:48:22.327606 4835 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2535c732_f740_4456_90e6_243f2218cd56.slice/crio-fea47deaf19752ea59d7d1c38d02b17c91388e4d934e33d2fa99b5cc1b495774 WatchSource:0}: Error finding container fea47deaf19752ea59d7d1c38d02b17c91388e4d934e33d2fa99b5cc1b495774: Status 404 returned error can't find the container with id fea47deaf19752ea59d7d1c38d02b17c91388e4d934e33d2fa99b5cc1b495774 Mar 19 09:48:22 crc kubenswrapper[4835]: I0319 09:48:22.815608 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2535c732-f740-4456-90e6-243f2218cd56","Type":"ContainerStarted","Data":"fea47deaf19752ea59d7d1c38d02b17c91388e4d934e33d2fa99b5cc1b495774"} Mar 19 09:48:22 crc kubenswrapper[4835]: I0319 09:48:22.823159 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-88c674dc-srwpw" event={"ID":"52481530-0f3c-48ac-abfe-2ca7b35d8b07","Type":"ContainerStarted","Data":"30fdce5460d124e891721ab268e909ff4ee1d8758f9eaa494a6ae01b0c722c04"} Mar 19 09:48:22 crc kubenswrapper[4835]: I0319 09:48:22.827666 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-768fcfbbc8-k7gmg" event={"ID":"0f532ad6-0a68-4c59-93b7-5e393908c008","Type":"ContainerStarted","Data":"30e98320a1c7841e4454cc21ae513bdc1525ae31e8f411de1b272669bc5d18e0"} Mar 19 09:48:22 crc kubenswrapper[4835]: I0319 09:48:22.835454 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d99fb9659-b272b" event={"ID":"decfbe60-e04e-46cc-9228-33bb1c831849","Type":"ContainerStarted","Data":"9c92b95f63cdac22c5e4941d4a3b19eecc1fef5a94ef96feae872131334e1da0"} Mar 19 09:48:22 crc kubenswrapper[4835]: I0319 09:48:22.835503 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d99fb9659-b272b" 
event={"ID":"decfbe60-e04e-46cc-9228-33bb1c831849","Type":"ContainerStarted","Data":"08e8c65c12b3441e1b5760194d32e03377f21efb09b75458051c3bf2507809be"} Mar 19 09:48:22 crc kubenswrapper[4835]: I0319 09:48:22.835662 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5d99fb9659-b272b" Mar 19 09:48:22 crc kubenswrapper[4835]: I0319 09:48:22.848529 4835 generic.go:334] "Generic (PLEG): container finished" podID="14dd8f89-f4d8-4618-b417-f1a802f3517d" containerID="faf638b3da1d3aa319ffc3728b067d666da4fc5bbf3b3708dab6100b717cd82b" exitCode=0 Mar 19 09:48:22 crc kubenswrapper[4835]: I0319 09:48:22.848591 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xf267" event={"ID":"14dd8f89-f4d8-4618-b417-f1a802f3517d","Type":"ContainerDied","Data":"faf638b3da1d3aa319ffc3728b067d666da4fc5bbf3b3708dab6100b717cd82b"} Mar 19 09:48:22 crc kubenswrapper[4835]: I0319 09:48:22.864897 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-jc254"] Mar 19 09:48:22 crc kubenswrapper[4835]: I0319 09:48:22.865631 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-88c674dc-srwpw" podStartSLOduration=6.155821772 podStartE2EDuration="8.865613651s" podCreationTimestamp="2026-03-19 09:48:14 +0000 UTC" firstStartedPulling="2026-03-19 09:48:17.64727708 +0000 UTC m=+1552.495875677" lastFinishedPulling="2026-03-19 09:48:20.357068969 +0000 UTC m=+1555.205667556" observedRunningTime="2026-03-19 09:48:22.860235818 +0000 UTC m=+1557.708834405" watchObservedRunningTime="2026-03-19 09:48:22.865613651 +0000 UTC m=+1557.714212238" Mar 19 09:48:22 crc kubenswrapper[4835]: W0319 09:48:22.867242 4835 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a9665c4_9a2a_4bac_83d4_d0dfa748f4ce.slice/crio-e33802d1a08a5e3c12b0ca4e040a3cea987dc1f9226d2dd4d8e91d008ae991a7 WatchSource:0}: Error finding container e33802d1a08a5e3c12b0ca4e040a3cea987dc1f9226d2dd4d8e91d008ae991a7: Status 404 returned error can't find the container with id e33802d1a08a5e3c12b0ca4e040a3cea987dc1f9226d2dd4d8e91d008ae991a7 Mar 19 09:48:22 crc kubenswrapper[4835]: I0319 09:48:22.872077 4835 generic.go:334] "Generic (PLEG): container finished" podID="24665d64-cd24-4bb5-a1e4-48c734e7c525" containerID="e35ff45d123572ad5f48d800141817f914920bd04a77a91bb7fd659079dec340" exitCode=0 Mar 19 09:48:22 crc kubenswrapper[4835]: I0319 09:48:22.872272 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85ff748b95-fsxdb" podUID="1d663402-04a2-42df-b860-d5c3568971d8" containerName="dnsmasq-dns" containerID="cri-o://87135838e4ceefdcd115401467838135533c00e10f68b6aff2354e35d13fadbc" gracePeriod=10 Mar 19 09:48:22 crc kubenswrapper[4835]: I0319 09:48:22.873776 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24665d64-cd24-4bb5-a1e4-48c734e7c525","Type":"ContainerDied","Data":"e35ff45d123572ad5f48d800141817f914920bd04a77a91bb7fd659079dec340"} Mar 19 09:48:22 crc kubenswrapper[4835]: I0319 09:48:22.964702 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5d99fb9659-b272b" podStartSLOduration=3.9646822779999997 podStartE2EDuration="3.964682278s" podCreationTimestamp="2026-03-19 09:48:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:48:22.885294375 +0000 UTC m=+1557.733892962" watchObservedRunningTime="2026-03-19 09:48:22.964682278 +0000 UTC m=+1557.813280865" Mar 19 09:48:23 crc kubenswrapper[4835]: I0319 09:48:23.028962 4835 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/cinder-api-0"] Mar 19 09:48:23 crc kubenswrapper[4835]: I0319 09:48:23.099166 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 09:48:23 crc kubenswrapper[4835]: I0319 09:48:23.259982 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5q4d\" (UniqueName: \"kubernetes.io/projected/24665d64-cd24-4bb5-a1e4-48c734e7c525-kube-api-access-v5q4d\") pod \"24665d64-cd24-4bb5-a1e4-48c734e7c525\" (UID: \"24665d64-cd24-4bb5-a1e4-48c734e7c525\") " Mar 19 09:48:23 crc kubenswrapper[4835]: I0319 09:48:23.260324 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/24665d64-cd24-4bb5-a1e4-48c734e7c525-sg-core-conf-yaml\") pod \"24665d64-cd24-4bb5-a1e4-48c734e7c525\" (UID: \"24665d64-cd24-4bb5-a1e4-48c734e7c525\") " Mar 19 09:48:23 crc kubenswrapper[4835]: I0319 09:48:23.260410 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24665d64-cd24-4bb5-a1e4-48c734e7c525-scripts\") pod \"24665d64-cd24-4bb5-a1e4-48c734e7c525\" (UID: \"24665d64-cd24-4bb5-a1e4-48c734e7c525\") " Mar 19 09:48:23 crc kubenswrapper[4835]: I0319 09:48:23.260441 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24665d64-cd24-4bb5-a1e4-48c734e7c525-run-httpd\") pod \"24665d64-cd24-4bb5-a1e4-48c734e7c525\" (UID: \"24665d64-cd24-4bb5-a1e4-48c734e7c525\") " Mar 19 09:48:23 crc kubenswrapper[4835]: I0319 09:48:23.260558 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24665d64-cd24-4bb5-a1e4-48c734e7c525-combined-ca-bundle\") pod \"24665d64-cd24-4bb5-a1e4-48c734e7c525\" (UID: \"24665d64-cd24-4bb5-a1e4-48c734e7c525\") " Mar 19 09:48:23 
crc kubenswrapper[4835]: I0319 09:48:23.260605 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24665d64-cd24-4bb5-a1e4-48c734e7c525-log-httpd\") pod \"24665d64-cd24-4bb5-a1e4-48c734e7c525\" (UID: \"24665d64-cd24-4bb5-a1e4-48c734e7c525\") " Mar 19 09:48:23 crc kubenswrapper[4835]: I0319 09:48:23.260690 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24665d64-cd24-4bb5-a1e4-48c734e7c525-config-data\") pod \"24665d64-cd24-4bb5-a1e4-48c734e7c525\" (UID: \"24665d64-cd24-4bb5-a1e4-48c734e7c525\") " Mar 19 09:48:23 crc kubenswrapper[4835]: I0319 09:48:23.260784 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24665d64-cd24-4bb5-a1e4-48c734e7c525-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "24665d64-cd24-4bb5-a1e4-48c734e7c525" (UID: "24665d64-cd24-4bb5-a1e4-48c734e7c525"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:48:23 crc kubenswrapper[4835]: I0319 09:48:23.261034 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24665d64-cd24-4bb5-a1e4-48c734e7c525-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "24665d64-cd24-4bb5-a1e4-48c734e7c525" (UID: "24665d64-cd24-4bb5-a1e4-48c734e7c525"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:48:23 crc kubenswrapper[4835]: I0319 09:48:23.262025 4835 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24665d64-cd24-4bb5-a1e4-48c734e7c525-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:23 crc kubenswrapper[4835]: I0319 09:48:23.262050 4835 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24665d64-cd24-4bb5-a1e4-48c734e7c525-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:23 crc kubenswrapper[4835]: I0319 09:48:23.264958 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24665d64-cd24-4bb5-a1e4-48c734e7c525-kube-api-access-v5q4d" (OuterVolumeSpecName: "kube-api-access-v5q4d") pod "24665d64-cd24-4bb5-a1e4-48c734e7c525" (UID: "24665d64-cd24-4bb5-a1e4-48c734e7c525"). InnerVolumeSpecName "kube-api-access-v5q4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:48:23 crc kubenswrapper[4835]: I0319 09:48:23.265330 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24665d64-cd24-4bb5-a1e4-48c734e7c525-scripts" (OuterVolumeSpecName: "scripts") pod "24665d64-cd24-4bb5-a1e4-48c734e7c525" (UID: "24665d64-cd24-4bb5-a1e4-48c734e7c525"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:48:23 crc kubenswrapper[4835]: I0319 09:48:23.365209 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24665d64-cd24-4bb5-a1e4-48c734e7c525-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:23 crc kubenswrapper[4835]: I0319 09:48:23.365251 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5q4d\" (UniqueName: \"kubernetes.io/projected/24665d64-cd24-4bb5-a1e4-48c734e7c525-kube-api-access-v5q4d\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:23 crc kubenswrapper[4835]: I0319 09:48:23.449788 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 19 09:48:23 crc kubenswrapper[4835]: I0319 09:48:23.539944 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24665d64-cd24-4bb5-a1e4-48c734e7c525-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "24665d64-cd24-4bb5-a1e4-48c734e7c525" (UID: "24665d64-cd24-4bb5-a1e4-48c734e7c525"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:48:23 crc kubenswrapper[4835]: I0319 09:48:23.569591 4835 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/24665d64-cd24-4bb5-a1e4-48c734e7c525-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:23 crc kubenswrapper[4835]: I0319 09:48:23.575824 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24665d64-cd24-4bb5-a1e4-48c734e7c525-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24665d64-cd24-4bb5-a1e4-48c734e7c525" (UID: "24665d64-cd24-4bb5-a1e4-48c734e7c525"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:48:23 crc kubenswrapper[4835]: I0319 09:48:23.622569 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24665d64-cd24-4bb5-a1e4-48c734e7c525-config-data" (OuterVolumeSpecName: "config-data") pod "24665d64-cd24-4bb5-a1e4-48c734e7c525" (UID: "24665d64-cd24-4bb5-a1e4-48c734e7c525"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:48:23 crc kubenswrapper[4835]: I0319 09:48:23.671724 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24665d64-cd24-4bb5-a1e4-48c734e7c525-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:23 crc kubenswrapper[4835]: I0319 09:48:23.671832 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24665d64-cd24-4bb5-a1e4-48c734e7c525-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:23 crc kubenswrapper[4835]: I0319 09:48:23.899085 4835 generic.go:334] "Generic (PLEG): container finished" podID="1d663402-04a2-42df-b860-d5c3568971d8" containerID="87135838e4ceefdcd115401467838135533c00e10f68b6aff2354e35d13fadbc" exitCode=0 Mar 19 09:48:23 crc kubenswrapper[4835]: I0319 09:48:23.899150 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-fsxdb" event={"ID":"1d663402-04a2-42df-b860-d5c3568971d8","Type":"ContainerDied","Data":"87135838e4ceefdcd115401467838135533c00e10f68b6aff2354e35d13fadbc"} Mar 19 09:48:23 crc kubenswrapper[4835]: I0319 09:48:23.901251 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"46107793-2582-4626-bf6e-c3e990e07ee4","Type":"ContainerStarted","Data":"9b8794d9a228a998a09b1331c85aab45494f29dc9474a3b28ac602997956de22"} Mar 19 09:48:23 crc kubenswrapper[4835]: I0319 09:48:23.902911 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5c9776ccc5-jc254" event={"ID":"0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce","Type":"ContainerStarted","Data":"e33802d1a08a5e3c12b0ca4e040a3cea987dc1f9226d2dd4d8e91d008ae991a7"} Mar 19 09:48:23 crc kubenswrapper[4835]: I0319 09:48:23.906999 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24665d64-cd24-4bb5-a1e4-48c734e7c525","Type":"ContainerDied","Data":"c10ef31bd99a4afa96bd1f7ae669d1f9d52c4cef490465b3e965864f39207d3f"} Mar 19 09:48:23 crc kubenswrapper[4835]: I0319 09:48:23.907087 4835 scope.go:117] "RemoveContainer" containerID="08054f07c39e4eef4dfc41ccda244fedb3295aad1a63ac833dece582fccea380" Mar 19 09:48:23 crc kubenswrapper[4835]: I0319 09:48:23.907165 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 09:48:23 crc kubenswrapper[4835]: I0319 09:48:23.990067 4835 scope.go:117] "RemoveContainer" containerID="e3b748a14f24225a9186588b569348543138f7e531869e7354ef71d69a4b25b3" Mar 19 09:48:23 crc kubenswrapper[4835]: I0319 09:48:23.995362 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.022820 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.044904 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:48:24 crc kubenswrapper[4835]: E0319 09:48:24.045460 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24665d64-cd24-4bb5-a1e4-48c734e7c525" containerName="ceilometer-central-agent" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.045477 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="24665d64-cd24-4bb5-a1e4-48c734e7c525" containerName="ceilometer-central-agent" Mar 19 09:48:24 crc kubenswrapper[4835]: E0319 09:48:24.045493 4835 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="24665d64-cd24-4bb5-a1e4-48c734e7c525" containerName="sg-core" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.045500 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="24665d64-cd24-4bb5-a1e4-48c734e7c525" containerName="sg-core" Mar 19 09:48:24 crc kubenswrapper[4835]: E0319 09:48:24.045512 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24665d64-cd24-4bb5-a1e4-48c734e7c525" containerName="proxy-httpd" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.045519 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="24665d64-cd24-4bb5-a1e4-48c734e7c525" containerName="proxy-httpd" Mar 19 09:48:24 crc kubenswrapper[4835]: E0319 09:48:24.045538 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24665d64-cd24-4bb5-a1e4-48c734e7c525" containerName="ceilometer-notification-agent" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.045549 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="24665d64-cd24-4bb5-a1e4-48c734e7c525" containerName="ceilometer-notification-agent" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.045823 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="24665d64-cd24-4bb5-a1e4-48c734e7c525" containerName="ceilometer-central-agent" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.045849 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="24665d64-cd24-4bb5-a1e4-48c734e7c525" containerName="sg-core" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.045862 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="24665d64-cd24-4bb5-a1e4-48c734e7c525" containerName="proxy-httpd" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.045873 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="24665d64-cd24-4bb5-a1e4-48c734e7c525" containerName="ceilometer-notification-agent" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.047875 4835 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.051506 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.051718 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.068441 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.148905 4835 scope.go:117] "RemoveContainer" containerID="e35ff45d123572ad5f48d800141817f914920bd04a77a91bb7fd659079dec340" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.182196 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07ebf419-c848-489f-9a01-c601c1c477f7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"07ebf419-c848-489f-9a01-c601c1c477f7\") " pod="openstack/ceilometer-0" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.182293 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6w9n\" (UniqueName: \"kubernetes.io/projected/07ebf419-c848-489f-9a01-c601c1c477f7-kube-api-access-h6w9n\") pod \"ceilometer-0\" (UID: \"07ebf419-c848-489f-9a01-c601c1c477f7\") " pod="openstack/ceilometer-0" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.182335 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07ebf419-c848-489f-9a01-c601c1c477f7-log-httpd\") pod \"ceilometer-0\" (UID: \"07ebf419-c848-489f-9a01-c601c1c477f7\") " pod="openstack/ceilometer-0" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.182375 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07ebf419-c848-489f-9a01-c601c1c477f7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"07ebf419-c848-489f-9a01-c601c1c477f7\") " pod="openstack/ceilometer-0" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.182399 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07ebf419-c848-489f-9a01-c601c1c477f7-run-httpd\") pod \"ceilometer-0\" (UID: \"07ebf419-c848-489f-9a01-c601c1c477f7\") " pod="openstack/ceilometer-0" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.182460 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07ebf419-c848-489f-9a01-c601c1c477f7-config-data\") pod \"ceilometer-0\" (UID: \"07ebf419-c848-489f-9a01-c601c1c477f7\") " pod="openstack/ceilometer-0" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.182503 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07ebf419-c848-489f-9a01-c601c1c477f7-scripts\") pod \"ceilometer-0\" (UID: \"07ebf419-c848-489f-9a01-c601c1c477f7\") " pod="openstack/ceilometer-0" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.249384 4835 scope.go:117] "RemoveContainer" containerID="184d89364695c9a53b6bd8b307402f0d9347bb174b02395966e18a44e6fe8a2a" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.284641 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07ebf419-c848-489f-9a01-c601c1c477f7-log-httpd\") pod \"ceilometer-0\" (UID: \"07ebf419-c848-489f-9a01-c601c1c477f7\") " pod="openstack/ceilometer-0" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.284710 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07ebf419-c848-489f-9a01-c601c1c477f7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"07ebf419-c848-489f-9a01-c601c1c477f7\") " pod="openstack/ceilometer-0" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.284751 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07ebf419-c848-489f-9a01-c601c1c477f7-run-httpd\") pod \"ceilometer-0\" (UID: \"07ebf419-c848-489f-9a01-c601c1c477f7\") " pod="openstack/ceilometer-0" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.284823 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07ebf419-c848-489f-9a01-c601c1c477f7-config-data\") pod \"ceilometer-0\" (UID: \"07ebf419-c848-489f-9a01-c601c1c477f7\") " pod="openstack/ceilometer-0" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.284866 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07ebf419-c848-489f-9a01-c601c1c477f7-scripts\") pod \"ceilometer-0\" (UID: \"07ebf419-c848-489f-9a01-c601c1c477f7\") " pod="openstack/ceilometer-0" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.284897 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07ebf419-c848-489f-9a01-c601c1c477f7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"07ebf419-c848-489f-9a01-c601c1c477f7\") " pod="openstack/ceilometer-0" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.284963 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6w9n\" (UniqueName: \"kubernetes.io/projected/07ebf419-c848-489f-9a01-c601c1c477f7-kube-api-access-h6w9n\") pod \"ceilometer-0\" (UID: 
\"07ebf419-c848-489f-9a01-c601c1c477f7\") " pod="openstack/ceilometer-0" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.285580 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07ebf419-c848-489f-9a01-c601c1c477f7-log-httpd\") pod \"ceilometer-0\" (UID: \"07ebf419-c848-489f-9a01-c601c1c477f7\") " pod="openstack/ceilometer-0" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.286485 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07ebf419-c848-489f-9a01-c601c1c477f7-run-httpd\") pod \"ceilometer-0\" (UID: \"07ebf419-c848-489f-9a01-c601c1c477f7\") " pod="openstack/ceilometer-0" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.291386 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07ebf419-c848-489f-9a01-c601c1c477f7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"07ebf419-c848-489f-9a01-c601c1c477f7\") " pod="openstack/ceilometer-0" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.300324 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07ebf419-c848-489f-9a01-c601c1c477f7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"07ebf419-c848-489f-9a01-c601c1c477f7\") " pod="openstack/ceilometer-0" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.301647 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07ebf419-c848-489f-9a01-c601c1c477f7-scripts\") pod \"ceilometer-0\" (UID: \"07ebf419-c848-489f-9a01-c601c1c477f7\") " pod="openstack/ceilometer-0" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.305409 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6w9n\" (UniqueName: 
\"kubernetes.io/projected/07ebf419-c848-489f-9a01-c601c1c477f7-kube-api-access-h6w9n\") pod \"ceilometer-0\" (UID: \"07ebf419-c848-489f-9a01-c601c1c477f7\") " pod="openstack/ceilometer-0" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.307674 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07ebf419-c848-489f-9a01-c601c1c477f7-config-data\") pod \"ceilometer-0\" (UID: \"07ebf419-c848-489f-9a01-c601c1c477f7\") " pod="openstack/ceilometer-0" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.388107 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.426825 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24665d64-cd24-4bb5-a1e4-48c734e7c525" path="/var/lib/kubelet/pods/24665d64-cd24-4bb5-a1e4-48c734e7c525/volumes" Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.935929 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"46107793-2582-4626-bf6e-c3e990e07ee4","Type":"ContainerStarted","Data":"ab60b8fb20355180de32580a4d58699478d0b3b1ddf393c22973d0393f3ae459"} Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.943151 4835 generic.go:334] "Generic (PLEG): container finished" podID="0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce" containerID="de4348dd2fff7fd16cfb75ef1e4aa8b57dda45252f7dcaef86a6a6d7f36becbc" exitCode=0 Mar 19 09:48:24 crc kubenswrapper[4835]: I0319 09:48:24.943314 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-jc254" event={"ID":"0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce","Type":"ContainerDied","Data":"de4348dd2fff7fd16cfb75ef1e4aa8b57dda45252f7dcaef86a6a6d7f36becbc"} Mar 19 09:48:25 crc kubenswrapper[4835]: I0319 09:48:25.004880 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:48:25 crc 
kubenswrapper[4835]: I0319 09:48:25.287923 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-fsxdb" Mar 19 09:48:25 crc kubenswrapper[4835]: I0319 09:48:25.414759 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d663402-04a2-42df-b860-d5c3568971d8-ovsdbserver-sb\") pod \"1d663402-04a2-42df-b860-d5c3568971d8\" (UID: \"1d663402-04a2-42df-b860-d5c3568971d8\") " Mar 19 09:48:25 crc kubenswrapper[4835]: I0319 09:48:25.414810 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d663402-04a2-42df-b860-d5c3568971d8-ovsdbserver-nb\") pod \"1d663402-04a2-42df-b860-d5c3568971d8\" (UID: \"1d663402-04a2-42df-b860-d5c3568971d8\") " Mar 19 09:48:25 crc kubenswrapper[4835]: I0319 09:48:25.414837 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d663402-04a2-42df-b860-d5c3568971d8-config\") pod \"1d663402-04a2-42df-b860-d5c3568971d8\" (UID: \"1d663402-04a2-42df-b860-d5c3568971d8\") " Mar 19 09:48:25 crc kubenswrapper[4835]: I0319 09:48:25.414894 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d663402-04a2-42df-b860-d5c3568971d8-dns-svc\") pod \"1d663402-04a2-42df-b860-d5c3568971d8\" (UID: \"1d663402-04a2-42df-b860-d5c3568971d8\") " Mar 19 09:48:25 crc kubenswrapper[4835]: I0319 09:48:25.414946 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbpc\" (UniqueName: \"kubernetes.io/projected/1d663402-04a2-42df-b860-d5c3568971d8-kube-api-access-cfbpc\") pod \"1d663402-04a2-42df-b860-d5c3568971d8\" (UID: \"1d663402-04a2-42df-b860-d5c3568971d8\") " Mar 19 09:48:25 crc kubenswrapper[4835]: I0319 09:48:25.415062 4835 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1d663402-04a2-42df-b860-d5c3568971d8-dns-swift-storage-0\") pod \"1d663402-04a2-42df-b860-d5c3568971d8\" (UID: \"1d663402-04a2-42df-b860-d5c3568971d8\") " Mar 19 09:48:25 crc kubenswrapper[4835]: I0319 09:48:25.429153 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d663402-04a2-42df-b860-d5c3568971d8-kube-api-access-cfbpc" (OuterVolumeSpecName: "kube-api-access-cfbpc") pod "1d663402-04a2-42df-b860-d5c3568971d8" (UID: "1d663402-04a2-42df-b860-d5c3568971d8"). InnerVolumeSpecName "kube-api-access-cfbpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:48:25 crc kubenswrapper[4835]: I0319 09:48:25.508524 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d663402-04a2-42df-b860-d5c3568971d8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1d663402-04a2-42df-b860-d5c3568971d8" (UID: "1d663402-04a2-42df-b860-d5c3568971d8"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:48:25 crc kubenswrapper[4835]: I0319 09:48:25.523137 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbpc\" (UniqueName: \"kubernetes.io/projected/1d663402-04a2-42df-b860-d5c3568971d8-kube-api-access-cfbpc\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:25 crc kubenswrapper[4835]: I0319 09:48:25.523172 4835 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1d663402-04a2-42df-b860-d5c3568971d8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:25 crc kubenswrapper[4835]: I0319 09:48:25.523188 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d663402-04a2-42df-b860-d5c3568971d8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1d663402-04a2-42df-b860-d5c3568971d8" (UID: "1d663402-04a2-42df-b860-d5c3568971d8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:48:25 crc kubenswrapper[4835]: I0319 09:48:25.533529 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d663402-04a2-42df-b860-d5c3568971d8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1d663402-04a2-42df-b860-d5c3568971d8" (UID: "1d663402-04a2-42df-b860-d5c3568971d8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:48:25 crc kubenswrapper[4835]: I0319 09:48:25.578325 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d663402-04a2-42df-b860-d5c3568971d8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1d663402-04a2-42df-b860-d5c3568971d8" (UID: "1d663402-04a2-42df-b860-d5c3568971d8"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:48:25 crc kubenswrapper[4835]: I0319 09:48:25.579288 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d663402-04a2-42df-b860-d5c3568971d8-config" (OuterVolumeSpecName: "config") pod "1d663402-04a2-42df-b860-d5c3568971d8" (UID: "1d663402-04a2-42df-b860-d5c3568971d8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:48:25 crc kubenswrapper[4835]: I0319 09:48:25.625569 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d663402-04a2-42df-b860-d5c3568971d8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:25 crc kubenswrapper[4835]: I0319 09:48:25.625602 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d663402-04a2-42df-b860-d5c3568971d8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:25 crc kubenswrapper[4835]: I0319 09:48:25.625612 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d663402-04a2-42df-b860-d5c3568971d8-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:25 crc kubenswrapper[4835]: I0319 09:48:25.625620 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d663402-04a2-42df-b860-d5c3568971d8-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.020932 4835 generic.go:334] "Generic (PLEG): container finished" podID="77af32b3-205b-4928-bed1-719937fee8fa" containerID="913f1f440bdd8254f62a7f864525ce98361ace610a879f24685813adab987e7e" exitCode=0 Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.022314 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c75cdcf85-fhsqj" 
event={"ID":"77af32b3-205b-4928-bed1-719937fee8fa","Type":"ContainerDied","Data":"913f1f440bdd8254f62a7f864525ce98361ace610a879f24685813adab987e7e"} Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.048214 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07ebf419-c848-489f-9a01-c601c1c477f7","Type":"ContainerStarted","Data":"628321c6eb0e02c3aa64600eae70dcfe1e19ecd140b7dbc25b47f586c1d18635"} Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.048257 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07ebf419-c848-489f-9a01-c601c1c477f7","Type":"ContainerStarted","Data":"9f2cf572a2a930b36b442d667f2d103da5b8bf5f1127e5fd2cba0771f3882ef1"} Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.056009 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-fsxdb" event={"ID":"1d663402-04a2-42df-b860-d5c3568971d8","Type":"ContainerDied","Data":"cd23a391f6f1bcb16f050a5e007a33911751b10c309a4fa14e19cb11d71fe375"} Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.056068 4835 scope.go:117] "RemoveContainer" containerID="87135838e4ceefdcd115401467838135533c00e10f68b6aff2354e35d13fadbc" Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.056212 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-fsxdb" Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.091933 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"46107793-2582-4626-bf6e-c3e990e07ee4","Type":"ContainerStarted","Data":"8cf685ea5da00d9509cf3f90a909cc8bfd06aac59bd4c388f84d3926a255f510"} Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.092116 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="46107793-2582-4626-bf6e-c3e990e07ee4" containerName="cinder-api-log" containerID="cri-o://ab60b8fb20355180de32580a4d58699478d0b3b1ddf393c22973d0393f3ae459" gracePeriod=30 Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.092460 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.092496 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="46107793-2582-4626-bf6e-c3e990e07ee4" containerName="cinder-api" containerID="cri-o://8cf685ea5da00d9509cf3f90a909cc8bfd06aac59bd4c388f84d3926a255f510" gracePeriod=30 Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.122566 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xf267" event={"ID":"14dd8f89-f4d8-4618-b417-f1a802f3517d","Type":"ContainerStarted","Data":"093a0e0f3ac88cc43e95efed6152e26f264f80e51097c1ced1604bf15f67eea3"} Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.134061 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-fsxdb"] Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.146188 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-fsxdb"] Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.164698 4835 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/cinder-api-0" podStartSLOduration=5.164679518 podStartE2EDuration="5.164679518s" podCreationTimestamp="2026-03-19 09:48:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:48:26.141372187 +0000 UTC m=+1560.989970774" watchObservedRunningTime="2026-03-19 09:48:26.164679518 +0000 UTC m=+1561.013278105" Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.198117 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xf267" podStartSLOduration=8.347526016 podStartE2EDuration="14.198096617s" podCreationTimestamp="2026-03-19 09:48:12 +0000 UTC" firstStartedPulling="2026-03-19 09:48:19.66328411 +0000 UTC m=+1554.511882697" lastFinishedPulling="2026-03-19 09:48:25.513854711 +0000 UTC m=+1560.362453298" observedRunningTime="2026-03-19 09:48:26.168273344 +0000 UTC m=+1561.016871961" watchObservedRunningTime="2026-03-19 09:48:26.198096617 +0000 UTC m=+1561.046695204" Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.209110 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-jc254" event={"ID":"0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce","Type":"ContainerStarted","Data":"c4f71fcf5f1297660aeb3e4dd2aaf4bc8dd2376fe215c7df2e5be800dc5ca8c8"} Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.210470 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-jc254" Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.243371 4835 scope.go:117] "RemoveContainer" containerID="03ea1528931aa591c354c454985e3d886b88d000748920e9848aa73bd72f3344" Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.243722 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-jc254" podStartSLOduration=6.243705882 podStartE2EDuration="6.243705882s" 
podCreationTimestamp="2026-03-19 09:48:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:48:26.236653454 +0000 UTC m=+1561.085252041" watchObservedRunningTime="2026-03-19 09:48:26.243705882 +0000 UTC m=+1561.092304469" Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.460034 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d663402-04a2-42df-b860-d5c3568971d8" path="/var/lib/kubelet/pods/1d663402-04a2-42df-b860-d5c3568971d8/volumes" Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.502919 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5c75cdcf85-fhsqj" Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.657418 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/77af32b3-205b-4928-bed1-719937fee8fa-config\") pod \"77af32b3-205b-4928-bed1-719937fee8fa\" (UID: \"77af32b3-205b-4928-bed1-719937fee8fa\") " Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.657554 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/77af32b3-205b-4928-bed1-719937fee8fa-ovndb-tls-certs\") pod \"77af32b3-205b-4928-bed1-719937fee8fa\" (UID: \"77af32b3-205b-4928-bed1-719937fee8fa\") " Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.657605 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77af32b3-205b-4928-bed1-719937fee8fa-internal-tls-certs\") pod \"77af32b3-205b-4928-bed1-719937fee8fa\" (UID: \"77af32b3-205b-4928-bed1-719937fee8fa\") " Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.658392 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/77af32b3-205b-4928-bed1-719937fee8fa-public-tls-certs\") pod \"77af32b3-205b-4928-bed1-719937fee8fa\" (UID: \"77af32b3-205b-4928-bed1-719937fee8fa\") " Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.658536 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzr8n\" (UniqueName: \"kubernetes.io/projected/77af32b3-205b-4928-bed1-719937fee8fa-kube-api-access-gzr8n\") pod \"77af32b3-205b-4928-bed1-719937fee8fa\" (UID: \"77af32b3-205b-4928-bed1-719937fee8fa\") " Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.658635 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/77af32b3-205b-4928-bed1-719937fee8fa-httpd-config\") pod \"77af32b3-205b-4928-bed1-719937fee8fa\" (UID: \"77af32b3-205b-4928-bed1-719937fee8fa\") " Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.658724 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77af32b3-205b-4928-bed1-719937fee8fa-combined-ca-bundle\") pod \"77af32b3-205b-4928-bed1-719937fee8fa\" (UID: \"77af32b3-205b-4928-bed1-719937fee8fa\") " Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.664340 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77af32b3-205b-4928-bed1-719937fee8fa-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "77af32b3-205b-4928-bed1-719937fee8fa" (UID: "77af32b3-205b-4928-bed1-719937fee8fa"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.664564 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77af32b3-205b-4928-bed1-719937fee8fa-kube-api-access-gzr8n" (OuterVolumeSpecName: "kube-api-access-gzr8n") pod "77af32b3-205b-4928-bed1-719937fee8fa" (UID: "77af32b3-205b-4928-bed1-719937fee8fa"). InnerVolumeSpecName "kube-api-access-gzr8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.763583 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzr8n\" (UniqueName: \"kubernetes.io/projected/77af32b3-205b-4928-bed1-719937fee8fa-kube-api-access-gzr8n\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.763627 4835 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/77af32b3-205b-4928-bed1-719937fee8fa-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.803860 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77af32b3-205b-4928-bed1-719937fee8fa-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "77af32b3-205b-4928-bed1-719937fee8fa" (UID: "77af32b3-205b-4928-bed1-719937fee8fa"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.819072 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77af32b3-205b-4928-bed1-719937fee8fa-config" (OuterVolumeSpecName: "config") pod "77af32b3-205b-4928-bed1-719937fee8fa" (UID: "77af32b3-205b-4928-bed1-719937fee8fa"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.836873 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77af32b3-205b-4928-bed1-719937fee8fa-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "77af32b3-205b-4928-bed1-719937fee8fa" (UID: "77af32b3-205b-4928-bed1-719937fee8fa"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.851037 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77af32b3-205b-4928-bed1-719937fee8fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "77af32b3-205b-4928-bed1-719937fee8fa" (UID: "77af32b3-205b-4928-bed1-719937fee8fa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.865499 4835 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/77af32b3-205b-4928-bed1-719937fee8fa-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.865537 4835 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/77af32b3-205b-4928-bed1-719937fee8fa-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.865546 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77af32b3-205b-4928-bed1-719937fee8fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.865556 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/77af32b3-205b-4928-bed1-719937fee8fa-config\") on node \"crc\" 
DevicePath \"\"" Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.901563 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77af32b3-205b-4928-bed1-719937fee8fa-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "77af32b3-205b-4928-bed1-719937fee8fa" (UID: "77af32b3-205b-4928-bed1-719937fee8fa"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:48:26 crc kubenswrapper[4835]: I0319 09:48:26.967800 4835 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/77af32b3-205b-4928-bed1-719937fee8fa-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:27 crc kubenswrapper[4835]: I0319 09:48:27.244902 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c75cdcf85-fhsqj" event={"ID":"77af32b3-205b-4928-bed1-719937fee8fa","Type":"ContainerDied","Data":"0606f1448058cf48375352f314d9b06cc0102936f45cfd61d0ebf7c2a253d270"} Mar 19 09:48:27 crc kubenswrapper[4835]: I0319 09:48:27.244952 4835 scope.go:117] "RemoveContainer" containerID="d0b2bb052eea070f42b7e38055bcfa0b4a4bde4272a400cfe87afd8664681ca2" Mar 19 09:48:27 crc kubenswrapper[4835]: I0319 09:48:27.245078 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5c75cdcf85-fhsqj" Mar 19 09:48:27 crc kubenswrapper[4835]: I0319 09:48:27.287969 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07ebf419-c848-489f-9a01-c601c1c477f7","Type":"ContainerStarted","Data":"a76a6605b4db84470118e8550bc4c8fc33fe8305f5158fd08147bbf2a29cd729"} Mar 19 09:48:27 crc kubenswrapper[4835]: I0319 09:48:27.315828 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5c75cdcf85-fhsqj"] Mar 19 09:48:27 crc kubenswrapper[4835]: I0319 09:48:27.326321 4835 generic.go:334] "Generic (PLEG): container finished" podID="46107793-2582-4626-bf6e-c3e990e07ee4" containerID="ab60b8fb20355180de32580a4d58699478d0b3b1ddf393c22973d0393f3ae459" exitCode=143 Mar 19 09:48:27 crc kubenswrapper[4835]: I0319 09:48:27.326436 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"46107793-2582-4626-bf6e-c3e990e07ee4","Type":"ContainerDied","Data":"ab60b8fb20355180de32580a4d58699478d0b3b1ddf393c22973d0393f3ae459"} Mar 19 09:48:27 crc kubenswrapper[4835]: I0319 09:48:27.361330 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2535c732-f740-4456-90e6-243f2218cd56","Type":"ContainerStarted","Data":"ce8b3547a80cc5435ebb09a62931bc5be08ac5e85c5a73c3dcbde272dfdec36a"} Mar 19 09:48:27 crc kubenswrapper[4835]: I0319 09:48:27.365854 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5c75cdcf85-fhsqj"] Mar 19 09:48:27 crc kubenswrapper[4835]: I0319 09:48:27.375918 4835 scope.go:117] "RemoveContainer" containerID="913f1f440bdd8254f62a7f864525ce98361ace610a879f24685813adab987e7e" Mar 19 09:48:28 crc kubenswrapper[4835]: I0319 09:48:28.378758 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"07ebf419-c848-489f-9a01-c601c1c477f7","Type":"ContainerStarted","Data":"8dc8b89ec0fcc49d37428707e46224d86a7aa8c2fcaf6784ff82dcc0cb91d8c4"} Mar 19 09:48:28 crc kubenswrapper[4835]: I0319 09:48:28.382690 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2535c732-f740-4456-90e6-243f2218cd56","Type":"ContainerStarted","Data":"00265516cf0b9e7c3498d1933224895db6359af341b8f991b5262069df248ee2"} Mar 19 09:48:28 crc kubenswrapper[4835]: I0319 09:48:28.414923 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.421222327 podStartE2EDuration="8.414901734s" podCreationTimestamp="2026-03-19 09:48:20 +0000 UTC" firstStartedPulling="2026-03-19 09:48:22.38928217 +0000 UTC m=+1557.237880757" lastFinishedPulling="2026-03-19 09:48:25.382961557 +0000 UTC m=+1560.231560164" observedRunningTime="2026-03-19 09:48:28.406377857 +0000 UTC m=+1563.254976454" watchObservedRunningTime="2026-03-19 09:48:28.414901734 +0000 UTC m=+1563.263500321" Mar 19 09:48:28 crc kubenswrapper[4835]: I0319 09:48:28.419247 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77af32b3-205b-4928-bed1-719937fee8fa" path="/var/lib/kubelet/pods/77af32b3-205b-4928-bed1-719937fee8fa/volumes" Mar 19 09:48:28 crc kubenswrapper[4835]: I0319 09:48:28.509216 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6fcd6bfc54-kgphr" Mar 19 09:48:28 crc kubenswrapper[4835]: I0319 09:48:28.820120 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6fcd6bfc54-kgphr" Mar 19 09:48:30 crc kubenswrapper[4835]: I0319 09:48:30.415635 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07ebf419-c848-489f-9a01-c601c1c477f7","Type":"ContainerStarted","Data":"76f533719e9001d24ce795d3d831709204f9acbfc7549959c7e00c841773c597"} Mar 19 
09:48:30 crc kubenswrapper[4835]: I0319 09:48:30.416202 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 09:48:30 crc kubenswrapper[4835]: I0319 09:48:30.438274 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.9634188200000002 podStartE2EDuration="7.438249459s" podCreationTimestamp="2026-03-19 09:48:23 +0000 UTC" firstStartedPulling="2026-03-19 09:48:24.99619225 +0000 UTC m=+1559.844790837" lastFinishedPulling="2026-03-19 09:48:29.471022889 +0000 UTC m=+1564.319621476" observedRunningTime="2026-03-19 09:48:30.435459354 +0000 UTC m=+1565.284057951" watchObservedRunningTime="2026-03-19 09:48:30.438249459 +0000 UTC m=+1565.286848056" Mar 19 09:48:30 crc kubenswrapper[4835]: I0319 09:48:30.737792 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7b499b5bb-ft9bz" Mar 19 09:48:31 crc kubenswrapper[4835]: I0319 09:48:31.243129 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 19 09:48:31 crc kubenswrapper[4835]: I0319 09:48:31.244400 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="2535c732-f740-4456-90e6-243f2218cd56" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.213:8080/\": dial tcp 10.217.0.213:8080: connect: connection refused" Mar 19 09:48:31 crc kubenswrapper[4835]: I0319 09:48:31.629920 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-jc254" Mar 19 09:48:31 crc kubenswrapper[4835]: I0319 09:48:31.700800 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-sjjm8"] Mar 19 09:48:31 crc kubenswrapper[4835]: I0319 09:48:31.701075 4835 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-55f844cf75-sjjm8" podUID="39529821-be7f-4bb6-9e90-3b6ba425e639" containerName="dnsmasq-dns" containerID="cri-o://2e5bea20da4fe0d37d396c723aae58e858ade156b04de961ee99399f03ea7842" gracePeriod=10 Mar 19 09:48:32 crc kubenswrapper[4835]: I0319 09:48:32.438869 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7b499b5bb-ft9bz" Mar 19 09:48:32 crc kubenswrapper[4835]: I0319 09:48:32.443995 4835 generic.go:334] "Generic (PLEG): container finished" podID="39529821-be7f-4bb6-9e90-3b6ba425e639" containerID="2e5bea20da4fe0d37d396c723aae58e858ade156b04de961ee99399f03ea7842" exitCode=0 Mar 19 09:48:32 crc kubenswrapper[4835]: I0319 09:48:32.444035 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-sjjm8" event={"ID":"39529821-be7f-4bb6-9e90-3b6ba425e639","Type":"ContainerDied","Data":"2e5bea20da4fe0d37d396c723aae58e858ade156b04de961ee99399f03ea7842"} Mar 19 09:48:32 crc kubenswrapper[4835]: I0319 09:48:32.444060 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-sjjm8" event={"ID":"39529821-be7f-4bb6-9e90-3b6ba425e639","Type":"ContainerDied","Data":"fbcdc183374d0c00688ab5d93fc32abb97157a1e691d27d397a91be1b21f9240"} Mar 19 09:48:32 crc kubenswrapper[4835]: I0319 09:48:32.444070 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbcdc183374d0c00688ab5d93fc32abb97157a1e691d27d397a91be1b21f9240" Mar 19 09:48:32 crc kubenswrapper[4835]: I0319 09:48:32.536303 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6fcd6bfc54-kgphr"] Mar 19 09:48:32 crc kubenswrapper[4835]: I0319 09:48:32.536799 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6fcd6bfc54-kgphr" podUID="590489a5-8799-4ae0-8f08-d2d1fc0480f4" containerName="barbican-api-log" 
containerID="cri-o://542bfdbad4e6ae6ac640534950a273115c1e601a5733933772a0f6544b09a1b8" gracePeriod=30 Mar 19 09:48:32 crc kubenswrapper[4835]: I0319 09:48:32.537233 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6fcd6bfc54-kgphr" podUID="590489a5-8799-4ae0-8f08-d2d1fc0480f4" containerName="barbican-api" containerID="cri-o://b609a8f7c4da2efdabb4bbef3d49b0da202614154807da7d91dc0e8e09b11725" gracePeriod=30 Mar 19 09:48:32 crc kubenswrapper[4835]: I0319 09:48:32.570158 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6fcd6bfc54-kgphr" podUID="590489a5-8799-4ae0-8f08-d2d1fc0480f4" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.210:9311/healthcheck\": EOF" Mar 19 09:48:32 crc kubenswrapper[4835]: I0319 09:48:32.570564 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6fcd6bfc54-kgphr" podUID="590489a5-8799-4ae0-8f08-d2d1fc0480f4" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.210:9311/healthcheck\": EOF" Mar 19 09:48:32 crc kubenswrapper[4835]: I0319 09:48:32.570682 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6fcd6bfc54-kgphr" podUID="590489a5-8799-4ae0-8f08-d2d1fc0480f4" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.210:9311/healthcheck\": EOF" Mar 19 09:48:32 crc kubenswrapper[4835]: I0319 09:48:32.596139 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6fcd6bfc54-kgphr" podUID="590489a5-8799-4ae0-8f08-d2d1fc0480f4" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.210:9311/healthcheck\": EOF" Mar 19 09:48:32 crc kubenswrapper[4835]: I0319 09:48:32.601015 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6fcd6bfc54-kgphr" podUID="590489a5-8799-4ae0-8f08-d2d1fc0480f4" 
containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.210:9311/healthcheck\": EOF" Mar 19 09:48:32 crc kubenswrapper[4835]: I0319 09:48:32.609459 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-sjjm8" Mar 19 09:48:32 crc kubenswrapper[4835]: I0319 09:48:32.673632 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39529821-be7f-4bb6-9e90-3b6ba425e639-config\") pod \"39529821-be7f-4bb6-9e90-3b6ba425e639\" (UID: \"39529821-be7f-4bb6-9e90-3b6ba425e639\") " Mar 19 09:48:32 crc kubenswrapper[4835]: I0319 09:48:32.673896 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39529821-be7f-4bb6-9e90-3b6ba425e639-ovsdbserver-nb\") pod \"39529821-be7f-4bb6-9e90-3b6ba425e639\" (UID: \"39529821-be7f-4bb6-9e90-3b6ba425e639\") " Mar 19 09:48:32 crc kubenswrapper[4835]: I0319 09:48:32.673926 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzrbk\" (UniqueName: \"kubernetes.io/projected/39529821-be7f-4bb6-9e90-3b6ba425e639-kube-api-access-fzrbk\") pod \"39529821-be7f-4bb6-9e90-3b6ba425e639\" (UID: \"39529821-be7f-4bb6-9e90-3b6ba425e639\") " Mar 19 09:48:32 crc kubenswrapper[4835]: I0319 09:48:32.673950 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39529821-be7f-4bb6-9e90-3b6ba425e639-ovsdbserver-sb\") pod \"39529821-be7f-4bb6-9e90-3b6ba425e639\" (UID: \"39529821-be7f-4bb6-9e90-3b6ba425e639\") " Mar 19 09:48:32 crc kubenswrapper[4835]: I0319 09:48:32.674002 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39529821-be7f-4bb6-9e90-3b6ba425e639-dns-svc\") pod 
\"39529821-be7f-4bb6-9e90-3b6ba425e639\" (UID: \"39529821-be7f-4bb6-9e90-3b6ba425e639\") " Mar 19 09:48:32 crc kubenswrapper[4835]: I0319 09:48:32.674084 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39529821-be7f-4bb6-9e90-3b6ba425e639-dns-swift-storage-0\") pod \"39529821-be7f-4bb6-9e90-3b6ba425e639\" (UID: \"39529821-be7f-4bb6-9e90-3b6ba425e639\") " Mar 19 09:48:32 crc kubenswrapper[4835]: I0319 09:48:32.739961 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39529821-be7f-4bb6-9e90-3b6ba425e639-kube-api-access-fzrbk" (OuterVolumeSpecName: "kube-api-access-fzrbk") pod "39529821-be7f-4bb6-9e90-3b6ba425e639" (UID: "39529821-be7f-4bb6-9e90-3b6ba425e639"). InnerVolumeSpecName "kube-api-access-fzrbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:48:32 crc kubenswrapper[4835]: I0319 09:48:32.787732 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzrbk\" (UniqueName: \"kubernetes.io/projected/39529821-be7f-4bb6-9e90-3b6ba425e639-kube-api-access-fzrbk\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:32 crc kubenswrapper[4835]: I0319 09:48:32.814061 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-cbc57b49b-x9wm9" Mar 19 09:48:32 crc kubenswrapper[4835]: I0319 09:48:32.834584 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-74ff56c748-fkbqq" Mar 19 09:48:32 crc kubenswrapper[4835]: I0319 09:48:32.884586 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39529821-be7f-4bb6-9e90-3b6ba425e639-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "39529821-be7f-4bb6-9e90-3b6ba425e639" (UID: "39529821-be7f-4bb6-9e90-3b6ba425e639"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:48:32 crc kubenswrapper[4835]: I0319 09:48:32.896863 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39529821-be7f-4bb6-9e90-3b6ba425e639-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:32 crc kubenswrapper[4835]: I0319 09:48:32.937190 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39529821-be7f-4bb6-9e90-3b6ba425e639-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "39529821-be7f-4bb6-9e90-3b6ba425e639" (UID: "39529821-be7f-4bb6-9e90-3b6ba425e639"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:48:32 crc kubenswrapper[4835]: I0319 09:48:32.978322 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39529821-be7f-4bb6-9e90-3b6ba425e639-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "39529821-be7f-4bb6-9e90-3b6ba425e639" (UID: "39529821-be7f-4bb6-9e90-3b6ba425e639"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:48:32 crc kubenswrapper[4835]: I0319 09:48:32.982598 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39529821-be7f-4bb6-9e90-3b6ba425e639-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "39529821-be7f-4bb6-9e90-3b6ba425e639" (UID: "39529821-be7f-4bb6-9e90-3b6ba425e639"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:48:32 crc kubenswrapper[4835]: I0319 09:48:32.988207 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39529821-be7f-4bb6-9e90-3b6ba425e639-config" (OuterVolumeSpecName: "config") pod "39529821-be7f-4bb6-9e90-3b6ba425e639" (UID: "39529821-be7f-4bb6-9e90-3b6ba425e639"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:48:33 crc kubenswrapper[4835]: I0319 09:48:33.007871 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-74ff56c748-fkbqq" Mar 19 09:48:33 crc kubenswrapper[4835]: I0319 09:48:33.010035 4835 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39529821-be7f-4bb6-9e90-3b6ba425e639-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:33 crc kubenswrapper[4835]: I0319 09:48:33.010059 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39529821-be7f-4bb6-9e90-3b6ba425e639-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:33 crc kubenswrapper[4835]: I0319 09:48:33.010069 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39529821-be7f-4bb6-9e90-3b6ba425e639-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:33 crc kubenswrapper[4835]: I0319 09:48:33.010080 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39529821-be7f-4bb6-9e90-3b6ba425e639-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:33 crc kubenswrapper[4835]: I0319 09:48:33.118774 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-cbc57b49b-x9wm9"] Mar 19 09:48:33 crc kubenswrapper[4835]: I0319 09:48:33.152704 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xf267" Mar 19 09:48:33 crc kubenswrapper[4835]: I0319 09:48:33.152812 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xf267" Mar 19 09:48:33 crc kubenswrapper[4835]: I0319 09:48:33.474505 4835 generic.go:334] "Generic (PLEG): container finished" 
podID="590489a5-8799-4ae0-8f08-d2d1fc0480f4" containerID="542bfdbad4e6ae6ac640534950a273115c1e601a5733933772a0f6544b09a1b8" exitCode=143 Mar 19 09:48:33 crc kubenswrapper[4835]: I0319 09:48:33.474596 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-sjjm8" Mar 19 09:48:33 crc kubenswrapper[4835]: I0319 09:48:33.477158 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fcd6bfc54-kgphr" event={"ID":"590489a5-8799-4ae0-8f08-d2d1fc0480f4","Type":"ContainerDied","Data":"542bfdbad4e6ae6ac640534950a273115c1e601a5733933772a0f6544b09a1b8"} Mar 19 09:48:33 crc kubenswrapper[4835]: I0319 09:48:33.477241 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-cbc57b49b-x9wm9" Mar 19 09:48:33 crc kubenswrapper[4835]: I0319 09:48:33.477236 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-cbc57b49b-x9wm9" podUID="5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed" containerName="placement-log" containerID="cri-o://e209a3265c7448cd5c4f37948a8d87e968509ae27d3bc659f8c1edf93ba1b1cd" gracePeriod=30 Mar 19 09:48:33 crc kubenswrapper[4835]: I0319 09:48:33.477341 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-cbc57b49b-x9wm9" podUID="5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed" containerName="placement-api" containerID="cri-o://966fdc1b648f492ef2f8c8a23310243a1a650bd7c102ab114949cf3db6294c66" gracePeriod=30 Mar 19 09:48:33 crc kubenswrapper[4835]: I0319 09:48:33.491820 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-cbc57b49b-x9wm9" podUID="5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.203:8778/\": EOF" Mar 19 09:48:33 crc kubenswrapper[4835]: I0319 09:48:33.553834 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-55f844cf75-sjjm8"] Mar 19 09:48:33 crc kubenswrapper[4835]: I0319 09:48:33.573812 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-sjjm8"] Mar 19 09:48:34 crc kubenswrapper[4835]: I0319 09:48:34.358982 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xf267" podUID="14dd8f89-f4d8-4618-b417-f1a802f3517d" containerName="registry-server" probeResult="failure" output=< Mar 19 09:48:34 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 09:48:34 crc kubenswrapper[4835]: > Mar 19 09:48:34 crc kubenswrapper[4835]: I0319 09:48:34.415162 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39529821-be7f-4bb6-9e90-3b6ba425e639" path="/var/lib/kubelet/pods/39529821-be7f-4bb6-9e90-3b6ba425e639/volumes" Mar 19 09:48:34 crc kubenswrapper[4835]: I0319 09:48:34.486509 4835 generic.go:334] "Generic (PLEG): container finished" podID="5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed" containerID="e209a3265c7448cd5c4f37948a8d87e968509ae27d3bc659f8c1edf93ba1b1cd" exitCode=143 Mar 19 09:48:34 crc kubenswrapper[4835]: I0319 09:48:34.486555 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-cbc57b49b-x9wm9" event={"ID":"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed","Type":"ContainerDied","Data":"e209a3265c7448cd5c4f37948a8d87e968509ae27d3bc659f8c1edf93ba1b1cd"} Mar 19 09:48:34 crc kubenswrapper[4835]: I0319 09:48:34.831266 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-f8d94d56-dsx7s" Mar 19 09:48:35 crc kubenswrapper[4835]: I0319 09:48:35.485232 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 19 09:48:36 crc kubenswrapper[4835]: I0319 09:48:36.422753 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 09:48:36 crc kubenswrapper[4835]: I0319 09:48:36.423324 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 09:48:36 crc kubenswrapper[4835]: I0319 09:48:36.505241 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 19 09:48:36 crc kubenswrapper[4835]: I0319 09:48:36.554911 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 09:48:37 crc kubenswrapper[4835]: I0319 09:48:37.542898 4835 generic.go:334] "Generic (PLEG): container finished" podID="5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed" containerID="966fdc1b648f492ef2f8c8a23310243a1a650bd7c102ab114949cf3db6294c66" exitCode=0 Mar 19 09:48:37 crc kubenswrapper[4835]: I0319 09:48:37.543065 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-cbc57b49b-x9wm9" event={"ID":"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed","Type":"ContainerDied","Data":"966fdc1b648f492ef2f8c8a23310243a1a650bd7c102ab114949cf3db6294c66"} Mar 19 09:48:37 crc kubenswrapper[4835]: I0319 09:48:37.543332 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="2535c732-f740-4456-90e6-243f2218cd56" containerName="cinder-scheduler" containerID="cri-o://ce8b3547a80cc5435ebb09a62931bc5be08ac5e85c5a73c3dcbde272dfdec36a" gracePeriod=30 Mar 19 09:48:37 crc kubenswrapper[4835]: I0319 09:48:37.543405 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" 
podUID="2535c732-f740-4456-90e6-243f2218cd56" containerName="probe" containerID="cri-o://00265516cf0b9e7c3498d1933224895db6359af341b8f991b5262069df248ee2" gracePeriod=30 Mar 19 09:48:37 crc kubenswrapper[4835]: I0319 09:48:37.779074 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-cbc57b49b-x9wm9" Mar 19 09:48:37 crc kubenswrapper[4835]: I0319 09:48:37.945059 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-internal-tls-certs\") pod \"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed\" (UID: \"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed\") " Mar 19 09:48:37 crc kubenswrapper[4835]: I0319 09:48:37.945149 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-scripts\") pod \"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed\" (UID: \"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed\") " Mar 19 09:48:37 crc kubenswrapper[4835]: I0319 09:48:37.945184 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-combined-ca-bundle\") pod \"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed\" (UID: \"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed\") " Mar 19 09:48:37 crc kubenswrapper[4835]: I0319 09:48:37.945283 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckdtd\" (UniqueName: \"kubernetes.io/projected/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-kube-api-access-ckdtd\") pod \"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed\" (UID: \"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed\") " Mar 19 09:48:37 crc kubenswrapper[4835]: I0319 09:48:37.945327 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-public-tls-certs\") pod \"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed\" (UID: \"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed\") " Mar 19 09:48:37 crc kubenswrapper[4835]: I0319 09:48:37.945382 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-logs\") pod \"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed\" (UID: \"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed\") " Mar 19 09:48:37 crc kubenswrapper[4835]: I0319 09:48:37.945444 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-config-data\") pod \"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed\" (UID: \"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed\") " Mar 19 09:48:37 crc kubenswrapper[4835]: I0319 09:48:37.946077 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-logs" (OuterVolumeSpecName: "logs") pod "5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed" (UID: "5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:48:37 crc kubenswrapper[4835]: I0319 09:48:37.957970 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-kube-api-access-ckdtd" (OuterVolumeSpecName: "kube-api-access-ckdtd") pod "5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed" (UID: "5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed"). InnerVolumeSpecName "kube-api-access-ckdtd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:48:37 crc kubenswrapper[4835]: I0319 09:48:37.961375 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-scripts" (OuterVolumeSpecName: "scripts") pod "5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed" (UID: "5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.049397 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.049431 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckdtd\" (UniqueName: \"kubernetes.io/projected/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-kube-api-access-ckdtd\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.049442 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-logs\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.068898 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-config-data" (OuterVolumeSpecName: "config-data") pod "5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed" (UID: "5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.086908 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed" (UID: "5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.139382 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6fcd6bfc54-kgphr" podUID="590489a5-8799-4ae0-8f08-d2d1fc0480f4" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.210:9311/healthcheck\": read tcp 10.217.0.2:55750->10.217.0.210:9311: read: connection reset by peer" Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.139432 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6fcd6bfc54-kgphr" podUID="590489a5-8799-4ae0-8f08-d2d1fc0480f4" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.210:9311/healthcheck\": read tcp 10.217.0.2:55734->10.217.0.210:9311: read: connection reset by peer" Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.139856 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6fcd6bfc54-kgphr" Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.143542 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed" (UID: "5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.151339 4835 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.151372 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.151384 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.152881 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed" (UID: "5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.261887 4835 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.585057 4835 generic.go:334] "Generic (PLEG): container finished" podID="590489a5-8799-4ae0-8f08-d2d1fc0480f4" containerID="b609a8f7c4da2efdabb4bbef3d49b0da202614154807da7d91dc0e8e09b11725" exitCode=0 Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.585868 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fcd6bfc54-kgphr" event={"ID":"590489a5-8799-4ae0-8f08-d2d1fc0480f4","Type":"ContainerDied","Data":"b609a8f7c4da2efdabb4bbef3d49b0da202614154807da7d91dc0e8e09b11725"} Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.588335 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6fcd6bfc54-kgphr" event={"ID":"590489a5-8799-4ae0-8f08-d2d1fc0480f4","Type":"ContainerDied","Data":"5d174a91e3cb7194296c19fd288717dc0b90734ae7160787a5dc7568c62e69d5"} Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.588633 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d174a91e3cb7194296c19fd288717dc0b90734ae7160787a5dc7568c62e69d5" Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.621588 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-cbc57b49b-x9wm9" event={"ID":"5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed","Type":"ContainerDied","Data":"e14bf0100a40e1a9dbbcb93b30f85a3b810c595a758b2ba4447151cb6d5c7681"} Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.621641 4835 scope.go:117] "RemoveContainer" containerID="966fdc1b648f492ef2f8c8a23310243a1a650bd7c102ab114949cf3db6294c66" Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.621831 4835 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/placement-cbc57b49b-x9wm9" Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.743061 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6fcd6bfc54-kgphr" Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.762259 4835 scope.go:117] "RemoveContainer" containerID="e209a3265c7448cd5c4f37948a8d87e968509ae27d3bc659f8c1edf93ba1b1cd" Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.773319 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-cbc57b49b-x9wm9"] Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.799102 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-cbc57b49b-x9wm9"] Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.892860 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xp29f\" (UniqueName: \"kubernetes.io/projected/590489a5-8799-4ae0-8f08-d2d1fc0480f4-kube-api-access-xp29f\") pod \"590489a5-8799-4ae0-8f08-d2d1fc0480f4\" (UID: \"590489a5-8799-4ae0-8f08-d2d1fc0480f4\") " Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.893015 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/590489a5-8799-4ae0-8f08-d2d1fc0480f4-config-data\") pod \"590489a5-8799-4ae0-8f08-d2d1fc0480f4\" (UID: \"590489a5-8799-4ae0-8f08-d2d1fc0480f4\") " Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.893183 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/590489a5-8799-4ae0-8f08-d2d1fc0480f4-combined-ca-bundle\") pod \"590489a5-8799-4ae0-8f08-d2d1fc0480f4\" (UID: \"590489a5-8799-4ae0-8f08-d2d1fc0480f4\") " Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.893311 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/590489a5-8799-4ae0-8f08-d2d1fc0480f4-logs\") pod \"590489a5-8799-4ae0-8f08-d2d1fc0480f4\" (UID: \"590489a5-8799-4ae0-8f08-d2d1fc0480f4\") " Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.893387 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/590489a5-8799-4ae0-8f08-d2d1fc0480f4-config-data-custom\") pod \"590489a5-8799-4ae0-8f08-d2d1fc0480f4\" (UID: \"590489a5-8799-4ae0-8f08-d2d1fc0480f4\") " Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.895243 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/590489a5-8799-4ae0-8f08-d2d1fc0480f4-logs" (OuterVolumeSpecName: "logs") pod "590489a5-8799-4ae0-8f08-d2d1fc0480f4" (UID: "590489a5-8799-4ae0-8f08-d2d1fc0480f4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.901190 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/590489a5-8799-4ae0-8f08-d2d1fc0480f4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "590489a5-8799-4ae0-8f08-d2d1fc0480f4" (UID: "590489a5-8799-4ae0-8f08-d2d1fc0480f4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.903945 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/590489a5-8799-4ae0-8f08-d2d1fc0480f4-kube-api-access-xp29f" (OuterVolumeSpecName: "kube-api-access-xp29f") pod "590489a5-8799-4ae0-8f08-d2d1fc0480f4" (UID: "590489a5-8799-4ae0-8f08-d2d1fc0480f4"). InnerVolumeSpecName "kube-api-access-xp29f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.954879 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/590489a5-8799-4ae0-8f08-d2d1fc0480f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "590489a5-8799-4ae0-8f08-d2d1fc0480f4" (UID: "590489a5-8799-4ae0-8f08-d2d1fc0480f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.984882 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/590489a5-8799-4ae0-8f08-d2d1fc0480f4-config-data" (OuterVolumeSpecName: "config-data") pod "590489a5-8799-4ae0-8f08-d2d1fc0480f4" (UID: "590489a5-8799-4ae0-8f08-d2d1fc0480f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.996329 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/590489a5-8799-4ae0-8f08-d2d1fc0480f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.996365 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/590489a5-8799-4ae0-8f08-d2d1fc0480f4-logs\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.996378 4835 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/590489a5-8799-4ae0-8f08-d2d1fc0480f4-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.996390 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xp29f\" (UniqueName: \"kubernetes.io/projected/590489a5-8799-4ae0-8f08-d2d1fc0480f4-kube-api-access-xp29f\") on node \"crc\" 
DevicePath \"\"" Mar 19 09:48:38 crc kubenswrapper[4835]: I0319 09:48:38.996402 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/590489a5-8799-4ae0-8f08-d2d1fc0480f4-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.440114 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 19 09:48:39 crc kubenswrapper[4835]: E0319 09:48:39.440857 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39529821-be7f-4bb6-9e90-3b6ba425e639" containerName="init" Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.440874 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="39529821-be7f-4bb6-9e90-3b6ba425e639" containerName="init" Mar 19 09:48:39 crc kubenswrapper[4835]: E0319 09:48:39.440899 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="590489a5-8799-4ae0-8f08-d2d1fc0480f4" containerName="barbican-api" Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.440906 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="590489a5-8799-4ae0-8f08-d2d1fc0480f4" containerName="barbican-api" Mar 19 09:48:39 crc kubenswrapper[4835]: E0319 09:48:39.440921 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77af32b3-205b-4928-bed1-719937fee8fa" containerName="neutron-api" Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.440927 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="77af32b3-205b-4928-bed1-719937fee8fa" containerName="neutron-api" Mar 19 09:48:39 crc kubenswrapper[4835]: E0319 09:48:39.440951 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39529821-be7f-4bb6-9e90-3b6ba425e639" containerName="dnsmasq-dns" Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.440957 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="39529821-be7f-4bb6-9e90-3b6ba425e639" containerName="dnsmasq-dns" Mar 19 09:48:39 crc 
kubenswrapper[4835]: E0319 09:48:39.440973 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77af32b3-205b-4928-bed1-719937fee8fa" containerName="neutron-httpd" Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.440980 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="77af32b3-205b-4928-bed1-719937fee8fa" containerName="neutron-httpd" Mar 19 09:48:39 crc kubenswrapper[4835]: E0319 09:48:39.440993 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d663402-04a2-42df-b860-d5c3568971d8" containerName="dnsmasq-dns" Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.440998 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d663402-04a2-42df-b860-d5c3568971d8" containerName="dnsmasq-dns" Mar 19 09:48:39 crc kubenswrapper[4835]: E0319 09:48:39.441018 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed" containerName="placement-api" Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.441025 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed" containerName="placement-api" Mar 19 09:48:39 crc kubenswrapper[4835]: E0319 09:48:39.441037 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d663402-04a2-42df-b860-d5c3568971d8" containerName="init" Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.441042 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d663402-04a2-42df-b860-d5c3568971d8" containerName="init" Mar 19 09:48:39 crc kubenswrapper[4835]: E0319 09:48:39.441057 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="590489a5-8799-4ae0-8f08-d2d1fc0480f4" containerName="barbican-api-log" Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.441063 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="590489a5-8799-4ae0-8f08-d2d1fc0480f4" containerName="barbican-api-log" Mar 19 09:48:39 crc kubenswrapper[4835]: E0319 09:48:39.441073 4835 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed" containerName="placement-log" Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.441079 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed" containerName="placement-log" Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.441296 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="590489a5-8799-4ae0-8f08-d2d1fc0480f4" containerName="barbican-api" Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.441312 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d663402-04a2-42df-b860-d5c3568971d8" containerName="dnsmasq-dns" Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.441321 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="39529821-be7f-4bb6-9e90-3b6ba425e639" containerName="dnsmasq-dns" Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.441338 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="590489a5-8799-4ae0-8f08-d2d1fc0480f4" containerName="barbican-api-log" Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.441352 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="77af32b3-205b-4928-bed1-719937fee8fa" containerName="neutron-httpd" Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.441364 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed" containerName="placement-api" Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.441408 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="77af32b3-205b-4928-bed1-719937fee8fa" containerName="neutron-api" Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.441420 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed" containerName="placement-log" Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.442305 4835 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.444919 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.445234 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-hc7dh" Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.445543 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.457303 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.612427 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/636dd5f8-5cc2-46f4-84ac-a094b6881a4f-openstack-config-secret\") pod \"openstackclient\" (UID: \"636dd5f8-5cc2-46f4-84ac-a094b6881a4f\") " pod="openstack/openstackclient" Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.612622 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/636dd5f8-5cc2-46f4-84ac-a094b6881a4f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"636dd5f8-5cc2-46f4-84ac-a094b6881a4f\") " pod="openstack/openstackclient" Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.612721 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f69jz\" (UniqueName: \"kubernetes.io/projected/636dd5f8-5cc2-46f4-84ac-a094b6881a4f-kube-api-access-f69jz\") pod \"openstackclient\" (UID: \"636dd5f8-5cc2-46f4-84ac-a094b6881a4f\") " pod="openstack/openstackclient" Mar 19 09:48:39 crc kubenswrapper[4835]: 
I0319 09:48:39.612886 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/636dd5f8-5cc2-46f4-84ac-a094b6881a4f-openstack-config\") pod \"openstackclient\" (UID: \"636dd5f8-5cc2-46f4-84ac-a094b6881a4f\") " pod="openstack/openstackclient" Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.634724 4835 generic.go:334] "Generic (PLEG): container finished" podID="2535c732-f740-4456-90e6-243f2218cd56" containerID="00265516cf0b9e7c3498d1933224895db6359af341b8f991b5262069df248ee2" exitCode=0 Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.634777 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2535c732-f740-4456-90e6-243f2218cd56","Type":"ContainerDied","Data":"00265516cf0b9e7c3498d1933224895db6359af341b8f991b5262069df248ee2"} Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.636938 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6fcd6bfc54-kgphr" Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.679981 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6fcd6bfc54-kgphr"] Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.692970 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6fcd6bfc54-kgphr"] Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.714885 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/636dd5f8-5cc2-46f4-84ac-a094b6881a4f-openstack-config-secret\") pod \"openstackclient\" (UID: \"636dd5f8-5cc2-46f4-84ac-a094b6881a4f\") " pod="openstack/openstackclient" Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.714982 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/636dd5f8-5cc2-46f4-84ac-a094b6881a4f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"636dd5f8-5cc2-46f4-84ac-a094b6881a4f\") " pod="openstack/openstackclient" Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.715036 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f69jz\" (UniqueName: \"kubernetes.io/projected/636dd5f8-5cc2-46f4-84ac-a094b6881a4f-kube-api-access-f69jz\") pod \"openstackclient\" (UID: \"636dd5f8-5cc2-46f4-84ac-a094b6881a4f\") " pod="openstack/openstackclient" Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.715075 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/636dd5f8-5cc2-46f4-84ac-a094b6881a4f-openstack-config\") pod \"openstackclient\" (UID: \"636dd5f8-5cc2-46f4-84ac-a094b6881a4f\") " pod="openstack/openstackclient" Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.715951 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/636dd5f8-5cc2-46f4-84ac-a094b6881a4f-openstack-config\") pod \"openstackclient\" (UID: \"636dd5f8-5cc2-46f4-84ac-a094b6881a4f\") " pod="openstack/openstackclient" Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.720430 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/636dd5f8-5cc2-46f4-84ac-a094b6881a4f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"636dd5f8-5cc2-46f4-84ac-a094b6881a4f\") " pod="openstack/openstackclient" Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.721416 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/636dd5f8-5cc2-46f4-84ac-a094b6881a4f-openstack-config-secret\") pod \"openstackclient\" (UID: \"636dd5f8-5cc2-46f4-84ac-a094b6881a4f\") " pod="openstack/openstackclient" Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.748735 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f69jz\" (UniqueName: \"kubernetes.io/projected/636dd5f8-5cc2-46f4-84ac-a094b6881a4f-kube-api-access-f69jz\") pod \"openstackclient\" (UID: \"636dd5f8-5cc2-46f4-84ac-a094b6881a4f\") " pod="openstack/openstackclient" Mar 19 09:48:39 crc kubenswrapper[4835]: I0319 09:48:39.776565 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.345872 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 19 09:48:40 crc kubenswrapper[4835]: W0319 09:48:40.346557 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod636dd5f8_5cc2_46f4_84ac_a094b6881a4f.slice/crio-19eb561d1d1c0f7cc13a95c0ec4672cd320a190ff370b0cd684537f16b3e389f WatchSource:0}: Error finding container 19eb561d1d1c0f7cc13a95c0ec4672cd320a190ff370b0cd684537f16b3e389f: Status 404 returned error can't find the container with id 19eb561d1d1c0f7cc13a95c0ec4672cd320a190ff370b0cd684537f16b3e389f Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.428833 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="590489a5-8799-4ae0-8f08-d2d1fc0480f4" path="/var/lib/kubelet/pods/590489a5-8799-4ae0-8f08-d2d1fc0480f4/volumes" Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.430854 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed" path="/var/lib/kubelet/pods/5d7a5b26-9f5d-4d6e-a4fe-b48ae37ab5ed/volumes" Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.620272 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.651476 4835 generic.go:334] "Generic (PLEG): container finished" podID="2535c732-f740-4456-90e6-243f2218cd56" containerID="ce8b3547a80cc5435ebb09a62931bc5be08ac5e85c5a73c3dcbde272dfdec36a" exitCode=0 Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.651554 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2535c732-f740-4456-90e6-243f2218cd56","Type":"ContainerDied","Data":"ce8b3547a80cc5435ebb09a62931bc5be08ac5e85c5a73c3dcbde272dfdec36a"} Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.651576 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.651595 4835 scope.go:117] "RemoveContainer" containerID="00265516cf0b9e7c3498d1933224895db6359af341b8f991b5262069df248ee2" Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.651585 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2535c732-f740-4456-90e6-243f2218cd56","Type":"ContainerDied","Data":"fea47deaf19752ea59d7d1c38d02b17c91388e4d934e33d2fa99b5cc1b495774"} Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.654551 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"636dd5f8-5cc2-46f4-84ac-a094b6881a4f","Type":"ContainerStarted","Data":"19eb561d1d1c0f7cc13a95c0ec4672cd320a190ff370b0cd684537f16b3e389f"} Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.682994 4835 scope.go:117] "RemoveContainer" containerID="ce8b3547a80cc5435ebb09a62931bc5be08ac5e85c5a73c3dcbde272dfdec36a" Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.711027 4835 scope.go:117] "RemoveContainer" containerID="00265516cf0b9e7c3498d1933224895db6359af341b8f991b5262069df248ee2" Mar 19 09:48:40 crc kubenswrapper[4835]: E0319 
09:48:40.711860 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00265516cf0b9e7c3498d1933224895db6359af341b8f991b5262069df248ee2\": container with ID starting with 00265516cf0b9e7c3498d1933224895db6359af341b8f991b5262069df248ee2 not found: ID does not exist" containerID="00265516cf0b9e7c3498d1933224895db6359af341b8f991b5262069df248ee2" Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.711893 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00265516cf0b9e7c3498d1933224895db6359af341b8f991b5262069df248ee2"} err="failed to get container status \"00265516cf0b9e7c3498d1933224895db6359af341b8f991b5262069df248ee2\": rpc error: code = NotFound desc = could not find container \"00265516cf0b9e7c3498d1933224895db6359af341b8f991b5262069df248ee2\": container with ID starting with 00265516cf0b9e7c3498d1933224895db6359af341b8f991b5262069df248ee2 not found: ID does not exist" Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.711913 4835 scope.go:117] "RemoveContainer" containerID="ce8b3547a80cc5435ebb09a62931bc5be08ac5e85c5a73c3dcbde272dfdec36a" Mar 19 09:48:40 crc kubenswrapper[4835]: E0319 09:48:40.712838 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce8b3547a80cc5435ebb09a62931bc5be08ac5e85c5a73c3dcbde272dfdec36a\": container with ID starting with ce8b3547a80cc5435ebb09a62931bc5be08ac5e85c5a73c3dcbde272dfdec36a not found: ID does not exist" containerID="ce8b3547a80cc5435ebb09a62931bc5be08ac5e85c5a73c3dcbde272dfdec36a" Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.712865 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce8b3547a80cc5435ebb09a62931bc5be08ac5e85c5a73c3dcbde272dfdec36a"} err="failed to get container status \"ce8b3547a80cc5435ebb09a62931bc5be08ac5e85c5a73c3dcbde272dfdec36a\": rpc 
error: code = NotFound desc = could not find container \"ce8b3547a80cc5435ebb09a62931bc5be08ac5e85c5a73c3dcbde272dfdec36a\": container with ID starting with ce8b3547a80cc5435ebb09a62931bc5be08ac5e85c5a73c3dcbde272dfdec36a not found: ID does not exist" Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.776557 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2535c732-f740-4456-90e6-243f2218cd56-etc-machine-id\") pod \"2535c732-f740-4456-90e6-243f2218cd56\" (UID: \"2535c732-f740-4456-90e6-243f2218cd56\") " Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.776698 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2535c732-f740-4456-90e6-243f2218cd56-scripts\") pod \"2535c732-f740-4456-90e6-243f2218cd56\" (UID: \"2535c732-f740-4456-90e6-243f2218cd56\") " Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.776767 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2535c732-f740-4456-90e6-243f2218cd56-combined-ca-bundle\") pod \"2535c732-f740-4456-90e6-243f2218cd56\" (UID: \"2535c732-f740-4456-90e6-243f2218cd56\") " Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.776798 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2535c732-f740-4456-90e6-243f2218cd56-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2535c732-f740-4456-90e6-243f2218cd56" (UID: "2535c732-f740-4456-90e6-243f2218cd56"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.776819 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2535c732-f740-4456-90e6-243f2218cd56-config-data-custom\") pod \"2535c732-f740-4456-90e6-243f2218cd56\" (UID: \"2535c732-f740-4456-90e6-243f2218cd56\") " Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.776852 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2535c732-f740-4456-90e6-243f2218cd56-config-data\") pod \"2535c732-f740-4456-90e6-243f2218cd56\" (UID: \"2535c732-f740-4456-90e6-243f2218cd56\") " Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.776981 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-979bt\" (UniqueName: \"kubernetes.io/projected/2535c732-f740-4456-90e6-243f2218cd56-kube-api-access-979bt\") pod \"2535c732-f740-4456-90e6-243f2218cd56\" (UID: \"2535c732-f740-4456-90e6-243f2218cd56\") " Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.777494 4835 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2535c732-f740-4456-90e6-243f2218cd56-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.785447 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2535c732-f740-4456-90e6-243f2218cd56-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2535c732-f740-4456-90e6-243f2218cd56" (UID: "2535c732-f740-4456-90e6-243f2218cd56"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.786083 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2535c732-f740-4456-90e6-243f2218cd56-kube-api-access-979bt" (OuterVolumeSpecName: "kube-api-access-979bt") pod "2535c732-f740-4456-90e6-243f2218cd56" (UID: "2535c732-f740-4456-90e6-243f2218cd56"). InnerVolumeSpecName "kube-api-access-979bt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.801800 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2535c732-f740-4456-90e6-243f2218cd56-scripts" (OuterVolumeSpecName: "scripts") pod "2535c732-f740-4456-90e6-243f2218cd56" (UID: "2535c732-f740-4456-90e6-243f2218cd56"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.879246 4835 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2535c732-f740-4456-90e6-243f2218cd56-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.879287 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-979bt\" (UniqueName: \"kubernetes.io/projected/2535c732-f740-4456-90e6-243f2218cd56-kube-api-access-979bt\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.879300 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2535c732-f740-4456-90e6-243f2218cd56-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.882712 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5f95f9444c-phdqp"] Mar 19 09:48:40 crc kubenswrapper[4835]: E0319 09:48:40.883557 4835 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="2535c732-f740-4456-90e6-243f2218cd56" containerName="probe" Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.883577 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2535c732-f740-4456-90e6-243f2218cd56" containerName="probe" Mar 19 09:48:40 crc kubenswrapper[4835]: E0319 09:48:40.883623 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2535c732-f740-4456-90e6-243f2218cd56" containerName="cinder-scheduler" Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.883630 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2535c732-f740-4456-90e6-243f2218cd56" containerName="cinder-scheduler" Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.884327 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="2535c732-f740-4456-90e6-243f2218cd56" containerName="probe" Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.884352 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="2535c732-f740-4456-90e6-243f2218cd56" containerName="cinder-scheduler" Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.886105 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5f95f9444c-phdqp" Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.887853 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2535c732-f740-4456-90e6-243f2218cd56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2535c732-f740-4456-90e6-243f2218cd56" (UID: "2535c732-f740-4456-90e6-243f2218cd56"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.889259 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.889609 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.891668 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.946635 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5f95f9444c-phdqp"] Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.965097 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2535c732-f740-4456-90e6-243f2218cd56-config-data" (OuterVolumeSpecName: "config-data") pod "2535c732-f740-4456-90e6-243f2218cd56" (UID: "2535c732-f740-4456-90e6-243f2218cd56"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.987032 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2535c732-f740-4456-90e6-243f2218cd56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:40 crc kubenswrapper[4835]: I0319 09:48:40.987066 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2535c732-f740-4456-90e6-243f2218cd56-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.089300 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/844aa06e-23ea-4908-9a95-c3bda3620d5c-combined-ca-bundle\") pod \"swift-proxy-5f95f9444c-phdqp\" (UID: \"844aa06e-23ea-4908-9a95-c3bda3620d5c\") " pod="openstack/swift-proxy-5f95f9444c-phdqp" Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.089799 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mmgk\" (UniqueName: \"kubernetes.io/projected/844aa06e-23ea-4908-9a95-c3bda3620d5c-kube-api-access-2mmgk\") pod \"swift-proxy-5f95f9444c-phdqp\" (UID: \"844aa06e-23ea-4908-9a95-c3bda3620d5c\") " pod="openstack/swift-proxy-5f95f9444c-phdqp" Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.089913 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/844aa06e-23ea-4908-9a95-c3bda3620d5c-public-tls-certs\") pod \"swift-proxy-5f95f9444c-phdqp\" (UID: \"844aa06e-23ea-4908-9a95-c3bda3620d5c\") " pod="openstack/swift-proxy-5f95f9444c-phdqp" Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.090017 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/844aa06e-23ea-4908-9a95-c3bda3620d5c-run-httpd\") pod \"swift-proxy-5f95f9444c-phdqp\" (UID: \"844aa06e-23ea-4908-9a95-c3bda3620d5c\") " pod="openstack/swift-proxy-5f95f9444c-phdqp"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.090107 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/844aa06e-23ea-4908-9a95-c3bda3620d5c-internal-tls-certs\") pod \"swift-proxy-5f95f9444c-phdqp\" (UID: \"844aa06e-23ea-4908-9a95-c3bda3620d5c\") " pod="openstack/swift-proxy-5f95f9444c-phdqp"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.090232 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/844aa06e-23ea-4908-9a95-c3bda3620d5c-log-httpd\") pod \"swift-proxy-5f95f9444c-phdqp\" (UID: \"844aa06e-23ea-4908-9a95-c3bda3620d5c\") " pod="openstack/swift-proxy-5f95f9444c-phdqp"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.090310 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/844aa06e-23ea-4908-9a95-c3bda3620d5c-etc-swift\") pod \"swift-proxy-5f95f9444c-phdqp\" (UID: \"844aa06e-23ea-4908-9a95-c3bda3620d5c\") " pod="openstack/swift-proxy-5f95f9444c-phdqp"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.090398 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/844aa06e-23ea-4908-9a95-c3bda3620d5c-config-data\") pod \"swift-proxy-5f95f9444c-phdqp\" (UID: \"844aa06e-23ea-4908-9a95-c3bda3620d5c\") " pod="openstack/swift-proxy-5f95f9444c-phdqp"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.193712 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/844aa06e-23ea-4908-9a95-c3bda3620d5c-log-httpd\") pod \"swift-proxy-5f95f9444c-phdqp\" (UID: \"844aa06e-23ea-4908-9a95-c3bda3620d5c\") " pod="openstack/swift-proxy-5f95f9444c-phdqp"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.193769 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/844aa06e-23ea-4908-9a95-c3bda3620d5c-etc-swift\") pod \"swift-proxy-5f95f9444c-phdqp\" (UID: \"844aa06e-23ea-4908-9a95-c3bda3620d5c\") " pod="openstack/swift-proxy-5f95f9444c-phdqp"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.193800 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/844aa06e-23ea-4908-9a95-c3bda3620d5c-config-data\") pod \"swift-proxy-5f95f9444c-phdqp\" (UID: \"844aa06e-23ea-4908-9a95-c3bda3620d5c\") " pod="openstack/swift-proxy-5f95f9444c-phdqp"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.193885 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/844aa06e-23ea-4908-9a95-c3bda3620d5c-combined-ca-bundle\") pod \"swift-proxy-5f95f9444c-phdqp\" (UID: \"844aa06e-23ea-4908-9a95-c3bda3620d5c\") " pod="openstack/swift-proxy-5f95f9444c-phdqp"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.193914 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mmgk\" (UniqueName: \"kubernetes.io/projected/844aa06e-23ea-4908-9a95-c3bda3620d5c-kube-api-access-2mmgk\") pod \"swift-proxy-5f95f9444c-phdqp\" (UID: \"844aa06e-23ea-4908-9a95-c3bda3620d5c\") " pod="openstack/swift-proxy-5f95f9444c-phdqp"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.193959 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/844aa06e-23ea-4908-9a95-c3bda3620d5c-public-tls-certs\") pod \"swift-proxy-5f95f9444c-phdqp\" (UID: \"844aa06e-23ea-4908-9a95-c3bda3620d5c\") " pod="openstack/swift-proxy-5f95f9444c-phdqp"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.193976 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/844aa06e-23ea-4908-9a95-c3bda3620d5c-run-httpd\") pod \"swift-proxy-5f95f9444c-phdqp\" (UID: \"844aa06e-23ea-4908-9a95-c3bda3620d5c\") " pod="openstack/swift-proxy-5f95f9444c-phdqp"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.194005 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/844aa06e-23ea-4908-9a95-c3bda3620d5c-internal-tls-certs\") pod \"swift-proxy-5f95f9444c-phdqp\" (UID: \"844aa06e-23ea-4908-9a95-c3bda3620d5c\") " pod="openstack/swift-proxy-5f95f9444c-phdqp"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.199108 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/844aa06e-23ea-4908-9a95-c3bda3620d5c-internal-tls-certs\") pod \"swift-proxy-5f95f9444c-phdqp\" (UID: \"844aa06e-23ea-4908-9a95-c3bda3620d5c\") " pod="openstack/swift-proxy-5f95f9444c-phdqp"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.200513 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/844aa06e-23ea-4908-9a95-c3bda3620d5c-combined-ca-bundle\") pod \"swift-proxy-5f95f9444c-phdqp\" (UID: \"844aa06e-23ea-4908-9a95-c3bda3620d5c\") " pod="openstack/swift-proxy-5f95f9444c-phdqp"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.201209 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/844aa06e-23ea-4908-9a95-c3bda3620d5c-run-httpd\") pod \"swift-proxy-5f95f9444c-phdqp\" (UID: \"844aa06e-23ea-4908-9a95-c3bda3620d5c\") " pod="openstack/swift-proxy-5f95f9444c-phdqp"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.202699 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/844aa06e-23ea-4908-9a95-c3bda3620d5c-log-httpd\") pod \"swift-proxy-5f95f9444c-phdqp\" (UID: \"844aa06e-23ea-4908-9a95-c3bda3620d5c\") " pod="openstack/swift-proxy-5f95f9444c-phdqp"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.207696 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/844aa06e-23ea-4908-9a95-c3bda3620d5c-public-tls-certs\") pod \"swift-proxy-5f95f9444c-phdqp\" (UID: \"844aa06e-23ea-4908-9a95-c3bda3620d5c\") " pod="openstack/swift-proxy-5f95f9444c-phdqp"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.208180 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/844aa06e-23ea-4908-9a95-c3bda3620d5c-etc-swift\") pod \"swift-proxy-5f95f9444c-phdqp\" (UID: \"844aa06e-23ea-4908-9a95-c3bda3620d5c\") " pod="openstack/swift-proxy-5f95f9444c-phdqp"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.217839 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mmgk\" (UniqueName: \"kubernetes.io/projected/844aa06e-23ea-4908-9a95-c3bda3620d5c-kube-api-access-2mmgk\") pod \"swift-proxy-5f95f9444c-phdqp\" (UID: \"844aa06e-23ea-4908-9a95-c3bda3620d5c\") " pod="openstack/swift-proxy-5f95f9444c-phdqp"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.218405 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/844aa06e-23ea-4908-9a95-c3bda3620d5c-config-data\") pod \"swift-proxy-5f95f9444c-phdqp\" (UID: \"844aa06e-23ea-4908-9a95-c3bda3620d5c\") " pod="openstack/swift-proxy-5f95f9444c-phdqp"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.229795 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5f95f9444c-phdqp"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.302497 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.328991 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.347915 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.350259 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.358629 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.392518 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.402393 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc-config-data\") pod \"cinder-scheduler-0\" (UID: \"a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc\") " pod="openstack/cinder-scheduler-0"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.402462 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc-scripts\") pod \"cinder-scheduler-0\" (UID: \"a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc\") " pod="openstack/cinder-scheduler-0"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.402536 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc\") " pod="openstack/cinder-scheduler-0"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.402579 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrrkb\" (UniqueName: \"kubernetes.io/projected/a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc-kube-api-access-lrrkb\") pod \"cinder-scheduler-0\" (UID: \"a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc\") " pod="openstack/cinder-scheduler-0"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.402638 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc\") " pod="openstack/cinder-scheduler-0"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.402658 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc\") " pod="openstack/cinder-scheduler-0"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.504472 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc\") " pod="openstack/cinder-scheduler-0"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.504878 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrrkb\" (UniqueName: \"kubernetes.io/projected/a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc-kube-api-access-lrrkb\") pod \"cinder-scheduler-0\" (UID: \"a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc\") " pod="openstack/cinder-scheduler-0"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.504986 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc\") " pod="openstack/cinder-scheduler-0"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.505009 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc\") " pod="openstack/cinder-scheduler-0"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.505094 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc-config-data\") pod \"cinder-scheduler-0\" (UID: \"a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc\") " pod="openstack/cinder-scheduler-0"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.505147 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc-scripts\") pod \"cinder-scheduler-0\" (UID: \"a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc\") " pod="openstack/cinder-scheduler-0"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.507219 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc\") " pod="openstack/cinder-scheduler-0"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.510660 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc\") " pod="openstack/cinder-scheduler-0"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.511639 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc\") " pod="openstack/cinder-scheduler-0"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.516942 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc-config-data\") pod \"cinder-scheduler-0\" (UID: \"a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc\") " pod="openstack/cinder-scheduler-0"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.533903 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc-scripts\") pod \"cinder-scheduler-0\" (UID: \"a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc\") " pod="openstack/cinder-scheduler-0"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.547844 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrrkb\" (UniqueName: \"kubernetes.io/projected/a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc-kube-api-access-lrrkb\") pod \"cinder-scheduler-0\" (UID: \"a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc\") " pod="openstack/cinder-scheduler-0"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.731133 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 19 09:48:41 crc kubenswrapper[4835]: I0319 09:48:41.942923 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5f95f9444c-phdqp"]
Mar 19 09:48:42 crc kubenswrapper[4835]: I0319 09:48:42.356369 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 19 09:48:42 crc kubenswrapper[4835]: W0319 09:48:42.377720 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda83f49e0_e79c_4a80_ad8b_b86b10c3f4dc.slice/crio-a6c02211665c9493bb7b4938246f6ca6c50b769a1ec11aa7e9fe306ceb717e95 WatchSource:0}: Error finding container a6c02211665c9493bb7b4938246f6ca6c50b769a1ec11aa7e9fe306ceb717e95: Status 404 returned error can't find the container with id a6c02211665c9493bb7b4938246f6ca6c50b769a1ec11aa7e9fe306ceb717e95
Mar 19 09:48:42 crc kubenswrapper[4835]: I0319 09:48:42.420251 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2535c732-f740-4456-90e6-243f2218cd56" path="/var/lib/kubelet/pods/2535c732-f740-4456-90e6-243f2218cd56/volumes"
Mar 19 09:48:42 crc kubenswrapper[4835]: I0319 09:48:42.700875 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc","Type":"ContainerStarted","Data":"a6c02211665c9493bb7b4938246f6ca6c50b769a1ec11aa7e9fe306ceb717e95"}
Mar 19 09:48:42 crc kubenswrapper[4835]: I0319 09:48:42.729492 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5f95f9444c-phdqp" event={"ID":"844aa06e-23ea-4908-9a95-c3bda3620d5c","Type":"ContainerStarted","Data":"c64f36e06a4bc47e27a2d0d80b5389bff882c972045e1d6363aa7b410169e549"}
Mar 19 09:48:42 crc kubenswrapper[4835]: I0319 09:48:42.729544 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5f95f9444c-phdqp" event={"ID":"844aa06e-23ea-4908-9a95-c3bda3620d5c","Type":"ContainerStarted","Data":"bb3d18543336dbc888a9614b4b4434f8d88b6d0bd150378095ff57c5dcc7e2bf"}
Mar 19 09:48:42 crc kubenswrapper[4835]: I0319 09:48:42.729553 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5f95f9444c-phdqp" event={"ID":"844aa06e-23ea-4908-9a95-c3bda3620d5c","Type":"ContainerStarted","Data":"5cd9c4616b0786cc2fc717c2dac397d04fb719f3fdcc48107c123d99842d8a89"}
Mar 19 09:48:42 crc kubenswrapper[4835]: I0319 09:48:42.729946 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5f95f9444c-phdqp"
Mar 19 09:48:42 crc kubenswrapper[4835]: I0319 09:48:42.729996 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5f95f9444c-phdqp"
Mar 19 09:48:42 crc kubenswrapper[4835]: I0319 09:48:42.764979 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5f95f9444c-phdqp" podStartSLOduration=2.764959347 podStartE2EDuration="2.764959347s" podCreationTimestamp="2026-03-19 09:48:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:48:42.762621274 +0000 UTC m=+1577.611219861" watchObservedRunningTime="2026-03-19 09:48:42.764959347 +0000 UTC m=+1577.613557934"
Mar 19 09:48:43 crc kubenswrapper[4835]: I0319 09:48:43.068005 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 19 09:48:43 crc kubenswrapper[4835]: I0319 09:48:43.068411 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="07ebf419-c848-489f-9a01-c601c1c477f7" containerName="ceilometer-central-agent" containerID="cri-o://628321c6eb0e02c3aa64600eae70dcfe1e19ecd140b7dbc25b47f586c1d18635" gracePeriod=30
Mar 19 09:48:43 crc kubenswrapper[4835]: I0319 09:48:43.068934 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="07ebf419-c848-489f-9a01-c601c1c477f7" containerName="sg-core" containerID="cri-o://8dc8b89ec0fcc49d37428707e46224d86a7aa8c2fcaf6784ff82dcc0cb91d8c4" gracePeriod=30
Mar 19 09:48:43 crc kubenswrapper[4835]: I0319 09:48:43.069114 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="07ebf419-c848-489f-9a01-c601c1c477f7" containerName="ceilometer-notification-agent" containerID="cri-o://a76a6605b4db84470118e8550bc4c8fc33fe8305f5158fd08147bbf2a29cd729" gracePeriod=30
Mar 19 09:48:43 crc kubenswrapper[4835]: I0319 09:48:43.069235 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="07ebf419-c848-489f-9a01-c601c1c477f7" containerName="proxy-httpd" containerID="cri-o://76f533719e9001d24ce795d3d831709204f9acbfc7549959c7e00c841773c597" gracePeriod=30
Mar 19 09:48:43 crc kubenswrapper[4835]: I0319 09:48:43.095179 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="07ebf419-c848-489f-9a01-c601c1c477f7" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.216:3000/\": EOF"
Mar 19 09:48:43 crc kubenswrapper[4835]: I0319 09:48:43.755237 4835 generic.go:334] "Generic (PLEG): container finished" podID="07ebf419-c848-489f-9a01-c601c1c477f7" containerID="76f533719e9001d24ce795d3d831709204f9acbfc7549959c7e00c841773c597" exitCode=0
Mar 19 09:48:43 crc kubenswrapper[4835]: I0319 09:48:43.755762 4835 generic.go:334] "Generic (PLEG): container finished" podID="07ebf419-c848-489f-9a01-c601c1c477f7" containerID="8dc8b89ec0fcc49d37428707e46224d86a7aa8c2fcaf6784ff82dcc0cb91d8c4" exitCode=2
Mar 19 09:48:43 crc kubenswrapper[4835]: I0319 09:48:43.755836 4835 generic.go:334] "Generic (PLEG): container finished" podID="07ebf419-c848-489f-9a01-c601c1c477f7" containerID="628321c6eb0e02c3aa64600eae70dcfe1e19ecd140b7dbc25b47f586c1d18635" exitCode=0
Mar 19 09:48:43 crc kubenswrapper[4835]: I0319 09:48:43.755315 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07ebf419-c848-489f-9a01-c601c1c477f7","Type":"ContainerDied","Data":"76f533719e9001d24ce795d3d831709204f9acbfc7549959c7e00c841773c597"}
Mar 19 09:48:43 crc kubenswrapper[4835]: I0319 09:48:43.755907 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07ebf419-c848-489f-9a01-c601c1c477f7","Type":"ContainerDied","Data":"8dc8b89ec0fcc49d37428707e46224d86a7aa8c2fcaf6784ff82dcc0cb91d8c4"}
Mar 19 09:48:43 crc kubenswrapper[4835]: I0319 09:48:43.755921 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07ebf419-c848-489f-9a01-c601c1c477f7","Type":"ContainerDied","Data":"628321c6eb0e02c3aa64600eae70dcfe1e19ecd140b7dbc25b47f586c1d18635"}
Mar 19 09:48:43 crc kubenswrapper[4835]: I0319 09:48:43.758598 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc","Type":"ContainerStarted","Data":"68fb76fa4e06c2d826d44492db5604412b6ed8bafa74a0b0bb5679b189c1d3a3"}
Mar 19 09:48:44 crc kubenswrapper[4835]: I0319 09:48:44.221553 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xf267" podUID="14dd8f89-f4d8-4618-b417-f1a802f3517d" containerName="registry-server" probeResult="failure" output=<
Mar 19 09:48:44 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s
Mar 19 09:48:44 crc kubenswrapper[4835]: >
Mar 19 09:48:44 crc kubenswrapper[4835]: I0319 09:48:44.772725 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc","Type":"ContainerStarted","Data":"a434af4bf6330b601f38d6b4d0c15d3bb6284efce7269fabd22939e635ab84a7"}
Mar 19 09:48:44 crc kubenswrapper[4835]: I0319 09:48:44.807642 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.807617576 podStartE2EDuration="3.807617576s" podCreationTimestamp="2026-03-19 09:48:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:48:44.793182421 +0000 UTC m=+1579.641781008" watchObservedRunningTime="2026-03-19 09:48:44.807617576 +0000 UTC m=+1579.656216163"
Mar 19 09:48:45 crc kubenswrapper[4835]: I0319 09:48:45.756369 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 19 09:48:45 crc kubenswrapper[4835]: I0319 09:48:45.839308 4835 generic.go:334] "Generic (PLEG): container finished" podID="07ebf419-c848-489f-9a01-c601c1c477f7" containerID="a76a6605b4db84470118e8550bc4c8fc33fe8305f5158fd08147bbf2a29cd729" exitCode=0
Mar 19 09:48:45 crc kubenswrapper[4835]: I0319 09:48:45.839414 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 19 09:48:45 crc kubenswrapper[4835]: I0319 09:48:45.839472 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07ebf419-c848-489f-9a01-c601c1c477f7","Type":"ContainerDied","Data":"a76a6605b4db84470118e8550bc4c8fc33fe8305f5158fd08147bbf2a29cd729"}
Mar 19 09:48:45 crc kubenswrapper[4835]: I0319 09:48:45.839511 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"07ebf419-c848-489f-9a01-c601c1c477f7","Type":"ContainerDied","Data":"9f2cf572a2a930b36b442d667f2d103da5b8bf5f1127e5fd2cba0771f3882ef1"}
Mar 19 09:48:45 crc kubenswrapper[4835]: I0319 09:48:45.839530 4835 scope.go:117] "RemoveContainer" containerID="76f533719e9001d24ce795d3d831709204f9acbfc7549959c7e00c841773c597"
Mar 19 09:48:45 crc kubenswrapper[4835]: I0319 09:48:45.931055 4835 scope.go:117] "RemoveContainer" containerID="8dc8b89ec0fcc49d37428707e46224d86a7aa8c2fcaf6784ff82dcc0cb91d8c4"
Mar 19 09:48:45 crc kubenswrapper[4835]: I0319 09:48:45.935354 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07ebf419-c848-489f-9a01-c601c1c477f7-log-httpd\") pod \"07ebf419-c848-489f-9a01-c601c1c477f7\" (UID: \"07ebf419-c848-489f-9a01-c601c1c477f7\") "
Mar 19 09:48:45 crc kubenswrapper[4835]: I0319 09:48:45.935485 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6w9n\" (UniqueName: \"kubernetes.io/projected/07ebf419-c848-489f-9a01-c601c1c477f7-kube-api-access-h6w9n\") pod \"07ebf419-c848-489f-9a01-c601c1c477f7\" (UID: \"07ebf419-c848-489f-9a01-c601c1c477f7\") "
Mar 19 09:48:45 crc kubenswrapper[4835]: I0319 09:48:45.935539 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07ebf419-c848-489f-9a01-c601c1c477f7-config-data\") pod \"07ebf419-c848-489f-9a01-c601c1c477f7\" (UID: \"07ebf419-c848-489f-9a01-c601c1c477f7\") "
Mar 19 09:48:45 crc kubenswrapper[4835]: I0319 09:48:45.935584 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07ebf419-c848-489f-9a01-c601c1c477f7-combined-ca-bundle\") pod \"07ebf419-c848-489f-9a01-c601c1c477f7\" (UID: \"07ebf419-c848-489f-9a01-c601c1c477f7\") "
Mar 19 09:48:45 crc kubenswrapper[4835]: I0319 09:48:45.935670 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07ebf419-c848-489f-9a01-c601c1c477f7-run-httpd\") pod \"07ebf419-c848-489f-9a01-c601c1c477f7\" (UID: \"07ebf419-c848-489f-9a01-c601c1c477f7\") "
Mar 19 09:48:45 crc kubenswrapper[4835]: I0319 09:48:45.935728 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07ebf419-c848-489f-9a01-c601c1c477f7-sg-core-conf-yaml\") pod \"07ebf419-c848-489f-9a01-c601c1c477f7\" (UID: \"07ebf419-c848-489f-9a01-c601c1c477f7\") "
Mar 19 09:48:45 crc kubenswrapper[4835]: I0319 09:48:45.935786 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07ebf419-c848-489f-9a01-c601c1c477f7-scripts\") pod \"07ebf419-c848-489f-9a01-c601c1c477f7\" (UID: \"07ebf419-c848-489f-9a01-c601c1c477f7\") "
Mar 19 09:48:45 crc kubenswrapper[4835]: I0319 09:48:45.936203 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07ebf419-c848-489f-9a01-c601c1c477f7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "07ebf419-c848-489f-9a01-c601c1c477f7" (UID: "07ebf419-c848-489f-9a01-c601c1c477f7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 09:48:45 crc kubenswrapper[4835]: I0319 09:48:45.936353 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07ebf419-c848-489f-9a01-c601c1c477f7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "07ebf419-c848-489f-9a01-c601c1c477f7" (UID: "07ebf419-c848-489f-9a01-c601c1c477f7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 09:48:45 crc kubenswrapper[4835]: I0319 09:48:45.947925 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07ebf419-c848-489f-9a01-c601c1c477f7-scripts" (OuterVolumeSpecName: "scripts") pod "07ebf419-c848-489f-9a01-c601c1c477f7" (UID: "07ebf419-c848-489f-9a01-c601c1c477f7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:48:45 crc kubenswrapper[4835]: I0319 09:48:45.961007 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07ebf419-c848-489f-9a01-c601c1c477f7-kube-api-access-h6w9n" (OuterVolumeSpecName: "kube-api-access-h6w9n") pod "07ebf419-c848-489f-9a01-c601c1c477f7" (UID: "07ebf419-c848-489f-9a01-c601c1c477f7"). InnerVolumeSpecName "kube-api-access-h6w9n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.040557 4835 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07ebf419-c848-489f-9a01-c601c1c477f7-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.040593 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07ebf419-c848-489f-9a01-c601c1c477f7-scripts\") on node \"crc\" DevicePath \"\""
Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.040602 4835 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/07ebf419-c848-489f-9a01-c601c1c477f7-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.040613 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6w9n\" (UniqueName: \"kubernetes.io/projected/07ebf419-c848-489f-9a01-c601c1c477f7-kube-api-access-h6w9n\") on node \"crc\" DevicePath \"\""
Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.050888 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07ebf419-c848-489f-9a01-c601c1c477f7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "07ebf419-c848-489f-9a01-c601c1c477f7" (UID: "07ebf419-c848-489f-9a01-c601c1c477f7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.104729 4835 scope.go:117] "RemoveContainer" containerID="a76a6605b4db84470118e8550bc4c8fc33fe8305f5158fd08147bbf2a29cd729"
Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.118967 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07ebf419-c848-489f-9a01-c601c1c477f7-config-data" (OuterVolumeSpecName: "config-data") pod "07ebf419-c848-489f-9a01-c601c1c477f7" (UID: "07ebf419-c848-489f-9a01-c601c1c477f7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.143027 4835 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/07ebf419-c848-489f-9a01-c601c1c477f7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.143080 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07ebf419-c848-489f-9a01-c601c1c477f7-config-data\") on node \"crc\" DevicePath \"\""
Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.149827 4835 scope.go:117] "RemoveContainer" containerID="628321c6eb0e02c3aa64600eae70dcfe1e19ecd140b7dbc25b47f586c1d18635"
Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.163048 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07ebf419-c848-489f-9a01-c601c1c477f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07ebf419-c848-489f-9a01-c601c1c477f7" (UID: "07ebf419-c848-489f-9a01-c601c1c477f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.174443 4835 scope.go:117] "RemoveContainer" containerID="76f533719e9001d24ce795d3d831709204f9acbfc7549959c7e00c841773c597"
Mar 19 09:48:46 crc kubenswrapper[4835]: E0319 09:48:46.176974 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76f533719e9001d24ce795d3d831709204f9acbfc7549959c7e00c841773c597\": container with ID starting with 76f533719e9001d24ce795d3d831709204f9acbfc7549959c7e00c841773c597 not found: ID does not exist" containerID="76f533719e9001d24ce795d3d831709204f9acbfc7549959c7e00c841773c597"
Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.177019 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76f533719e9001d24ce795d3d831709204f9acbfc7549959c7e00c841773c597"} err="failed to get container status \"76f533719e9001d24ce795d3d831709204f9acbfc7549959c7e00c841773c597\": rpc error: code = NotFound desc = could not find container \"76f533719e9001d24ce795d3d831709204f9acbfc7549959c7e00c841773c597\": container with ID starting with 76f533719e9001d24ce795d3d831709204f9acbfc7549959c7e00c841773c597 not found: ID does not exist"
Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.177055 4835 scope.go:117] "RemoveContainer" containerID="8dc8b89ec0fcc49d37428707e46224d86a7aa8c2fcaf6784ff82dcc0cb91d8c4"
Mar 19 09:48:46 crc kubenswrapper[4835]: E0319 09:48:46.177643 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dc8b89ec0fcc49d37428707e46224d86a7aa8c2fcaf6784ff82dcc0cb91d8c4\": container with ID starting with 8dc8b89ec0fcc49d37428707e46224d86a7aa8c2fcaf6784ff82dcc0cb91d8c4 not found: ID does not exist" containerID="8dc8b89ec0fcc49d37428707e46224d86a7aa8c2fcaf6784ff82dcc0cb91d8c4"
Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.177673 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dc8b89ec0fcc49d37428707e46224d86a7aa8c2fcaf6784ff82dcc0cb91d8c4"} err="failed to get container status \"8dc8b89ec0fcc49d37428707e46224d86a7aa8c2fcaf6784ff82dcc0cb91d8c4\": rpc error: code = NotFound desc = could not find container \"8dc8b89ec0fcc49d37428707e46224d86a7aa8c2fcaf6784ff82dcc0cb91d8c4\": container with ID starting with 8dc8b89ec0fcc49d37428707e46224d86a7aa8c2fcaf6784ff82dcc0cb91d8c4 not found: ID does not exist"
Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.177691 4835 scope.go:117] "RemoveContainer" containerID="a76a6605b4db84470118e8550bc4c8fc33fe8305f5158fd08147bbf2a29cd729"
Mar 19 09:48:46 crc kubenswrapper[4835]: E0319 09:48:46.177972 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a76a6605b4db84470118e8550bc4c8fc33fe8305f5158fd08147bbf2a29cd729\": container with ID starting with a76a6605b4db84470118e8550bc4c8fc33fe8305f5158fd08147bbf2a29cd729 not found: ID does not exist" containerID="a76a6605b4db84470118e8550bc4c8fc33fe8305f5158fd08147bbf2a29cd729"
Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.178005 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a76a6605b4db84470118e8550bc4c8fc33fe8305f5158fd08147bbf2a29cd729"} err="failed to get container status \"a76a6605b4db84470118e8550bc4c8fc33fe8305f5158fd08147bbf2a29cd729\": rpc error: code = NotFound desc = could not find container \"a76a6605b4db84470118e8550bc4c8fc33fe8305f5158fd08147bbf2a29cd729\": container with ID starting with a76a6605b4db84470118e8550bc4c8fc33fe8305f5158fd08147bbf2a29cd729 not found: ID does not exist"
Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.178018 4835 scope.go:117] "RemoveContainer" containerID="628321c6eb0e02c3aa64600eae70dcfe1e19ecd140b7dbc25b47f586c1d18635"
Mar 19 09:48:46 crc kubenswrapper[4835]: E0319 09:48:46.178290 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"628321c6eb0e02c3aa64600eae70dcfe1e19ecd140b7dbc25b47f586c1d18635\": container with ID starting with 628321c6eb0e02c3aa64600eae70dcfe1e19ecd140b7dbc25b47f586c1d18635 not found: ID does not exist" containerID="628321c6eb0e02c3aa64600eae70dcfe1e19ecd140b7dbc25b47f586c1d18635"
Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.178315 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"628321c6eb0e02c3aa64600eae70dcfe1e19ecd140b7dbc25b47f586c1d18635"} err="failed to get container status \"628321c6eb0e02c3aa64600eae70dcfe1e19ecd140b7dbc25b47f586c1d18635\": rpc error: code = NotFound desc = could not find container \"628321c6eb0e02c3aa64600eae70dcfe1e19ecd140b7dbc25b47f586c1d18635\": container with ID starting with 628321c6eb0e02c3aa64600eae70dcfe1e19ecd140b7dbc25b47f586c1d18635 not found: ID does not exist"
Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.245444 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07ebf419-c848-489f-9a01-c601c1c477f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.478112 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.493079 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.521847 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 19 09:48:46 crc kubenswrapper[4835]: E0319 09:48:46.528517 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07ebf419-c848-489f-9a01-c601c1c477f7" containerName="sg-core"
Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.528555 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="07ebf419-c848-489f-9a01-c601c1c477f7" containerName="sg-core"
Mar 19 09:48:46 crc kubenswrapper[4835]: E0319 09:48:46.529116 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07ebf419-c848-489f-9a01-c601c1c477f7" containerName="ceilometer-notification-agent"
Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.529131 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="07ebf419-c848-489f-9a01-c601c1c477f7" containerName="ceilometer-notification-agent"
Mar 19 09:48:46 crc kubenswrapper[4835]: E0319 09:48:46.529281 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07ebf419-c848-489f-9a01-c601c1c477f7" containerName="ceilometer-central-agent"
Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.529941 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="07ebf419-c848-489f-9a01-c601c1c477f7" containerName="ceilometer-central-agent"
Mar 19 09:48:46 crc kubenswrapper[4835]: E0319 09:48:46.529990 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07ebf419-c848-489f-9a01-c601c1c477f7" containerName="proxy-httpd"
Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.529999 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="07ebf419-c848-489f-9a01-c601c1c477f7" containerName="proxy-httpd"
Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.531642 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="07ebf419-c848-489f-9a01-c601c1c477f7" containerName="proxy-httpd"
Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.531673 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="07ebf419-c848-489f-9a01-c601c1c477f7" containerName="ceilometer-central-agent"
Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.531697 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="07ebf419-c848-489f-9a01-c601c1c477f7" containerName="ceilometer-notification-agent"
Mar 19 09:48:46 crc 
kubenswrapper[4835]: I0319 09:48:46.531718 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="07ebf419-c848-489f-9a01-c601c1c477f7" containerName="sg-core" Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.552479 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.552602 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.557713 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.558227 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.658806 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15d728ef-66c1-4e0e-9461-f8fb26f62e41-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"15d728ef-66c1-4e0e-9461-f8fb26f62e41\") " pod="openstack/ceilometer-0" Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.658980 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15d728ef-66c1-4e0e-9461-f8fb26f62e41-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"15d728ef-66c1-4e0e-9461-f8fb26f62e41\") " pod="openstack/ceilometer-0" Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.659030 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15d728ef-66c1-4e0e-9461-f8fb26f62e41-run-httpd\") pod \"ceilometer-0\" (UID: \"15d728ef-66c1-4e0e-9461-f8fb26f62e41\") " pod="openstack/ceilometer-0" Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 
09:48:46.659122 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwlln\" (UniqueName: \"kubernetes.io/projected/15d728ef-66c1-4e0e-9461-f8fb26f62e41-kube-api-access-xwlln\") pod \"ceilometer-0\" (UID: \"15d728ef-66c1-4e0e-9461-f8fb26f62e41\") " pod="openstack/ceilometer-0" Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.659165 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15d728ef-66c1-4e0e-9461-f8fb26f62e41-log-httpd\") pod \"ceilometer-0\" (UID: \"15d728ef-66c1-4e0e-9461-f8fb26f62e41\") " pod="openstack/ceilometer-0" Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.659324 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15d728ef-66c1-4e0e-9461-f8fb26f62e41-scripts\") pod \"ceilometer-0\" (UID: \"15d728ef-66c1-4e0e-9461-f8fb26f62e41\") " pod="openstack/ceilometer-0" Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.659470 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15d728ef-66c1-4e0e-9461-f8fb26f62e41-config-data\") pod \"ceilometer-0\" (UID: \"15d728ef-66c1-4e0e-9461-f8fb26f62e41\") " pod="openstack/ceilometer-0" Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.731881 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 19 09:48:46 crc kubenswrapper[4835]: E0319 09:48:46.758653 4835 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07ebf419_c848_489f_9a01_c601c1c477f7.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07ebf419_c848_489f_9a01_c601c1c477f7.slice/crio-9f2cf572a2a930b36b442d667f2d103da5b8bf5f1127e5fd2cba0771f3882ef1\": RecentStats: unable to find data in memory cache]" Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.760987 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15d728ef-66c1-4e0e-9461-f8fb26f62e41-scripts\") pod \"ceilometer-0\" (UID: \"15d728ef-66c1-4e0e-9461-f8fb26f62e41\") " pod="openstack/ceilometer-0" Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.761102 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15d728ef-66c1-4e0e-9461-f8fb26f62e41-config-data\") pod \"ceilometer-0\" (UID: \"15d728ef-66c1-4e0e-9461-f8fb26f62e41\") " pod="openstack/ceilometer-0" Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.761135 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15d728ef-66c1-4e0e-9461-f8fb26f62e41-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"15d728ef-66c1-4e0e-9461-f8fb26f62e41\") " pod="openstack/ceilometer-0" Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.761187 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15d728ef-66c1-4e0e-9461-f8fb26f62e41-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"15d728ef-66c1-4e0e-9461-f8fb26f62e41\") " pod="openstack/ceilometer-0" Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.761216 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15d728ef-66c1-4e0e-9461-f8fb26f62e41-run-httpd\") pod \"ceilometer-0\" (UID: \"15d728ef-66c1-4e0e-9461-f8fb26f62e41\") " pod="openstack/ceilometer-0" Mar 19 
09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.761256 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwlln\" (UniqueName: \"kubernetes.io/projected/15d728ef-66c1-4e0e-9461-f8fb26f62e41-kube-api-access-xwlln\") pod \"ceilometer-0\" (UID: \"15d728ef-66c1-4e0e-9461-f8fb26f62e41\") " pod="openstack/ceilometer-0" Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.761274 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15d728ef-66c1-4e0e-9461-f8fb26f62e41-log-httpd\") pod \"ceilometer-0\" (UID: \"15d728ef-66c1-4e0e-9461-f8fb26f62e41\") " pod="openstack/ceilometer-0" Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.761747 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15d728ef-66c1-4e0e-9461-f8fb26f62e41-log-httpd\") pod \"ceilometer-0\" (UID: \"15d728ef-66c1-4e0e-9461-f8fb26f62e41\") " pod="openstack/ceilometer-0" Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.763246 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15d728ef-66c1-4e0e-9461-f8fb26f62e41-run-httpd\") pod \"ceilometer-0\" (UID: \"15d728ef-66c1-4e0e-9461-f8fb26f62e41\") " pod="openstack/ceilometer-0" Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.767731 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15d728ef-66c1-4e0e-9461-f8fb26f62e41-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"15d728ef-66c1-4e0e-9461-f8fb26f62e41\") " pod="openstack/ceilometer-0" Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.767930 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15d728ef-66c1-4e0e-9461-f8fb26f62e41-config-data\") pod 
\"ceilometer-0\" (UID: \"15d728ef-66c1-4e0e-9461-f8fb26f62e41\") " pod="openstack/ceilometer-0" Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.781502 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15d728ef-66c1-4e0e-9461-f8fb26f62e41-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"15d728ef-66c1-4e0e-9461-f8fb26f62e41\") " pod="openstack/ceilometer-0" Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.792398 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwlln\" (UniqueName: \"kubernetes.io/projected/15d728ef-66c1-4e0e-9461-f8fb26f62e41-kube-api-access-xwlln\") pod \"ceilometer-0\" (UID: \"15d728ef-66c1-4e0e-9461-f8fb26f62e41\") " pod="openstack/ceilometer-0" Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.793803 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15d728ef-66c1-4e0e-9461-f8fb26f62e41-scripts\") pod \"ceilometer-0\" (UID: \"15d728ef-66c1-4e0e-9461-f8fb26f62e41\") " pod="openstack/ceilometer-0" Mar 19 09:48:46 crc kubenswrapper[4835]: I0319 09:48:46.878265 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.034988 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-6ffb669df8-qcq6h"] Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.036727 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-6ffb669df8-qcq6h" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.041264 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.041368 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.046226 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-4xtcm" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.075196 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6ffb669df8-qcq6h"] Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.170729 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1342aee7-66dc-4df2-b44a-970b332ffc8d-combined-ca-bundle\") pod \"heat-engine-6ffb669df8-qcq6h\" (UID: \"1342aee7-66dc-4df2-b44a-970b332ffc8d\") " pod="openstack/heat-engine-6ffb669df8-qcq6h" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.172723 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1342aee7-66dc-4df2-b44a-970b332ffc8d-config-data-custom\") pod \"heat-engine-6ffb669df8-qcq6h\" (UID: \"1342aee7-66dc-4df2-b44a-970b332ffc8d\") " pod="openstack/heat-engine-6ffb669df8-qcq6h" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.173037 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1342aee7-66dc-4df2-b44a-970b332ffc8d-config-data\") pod \"heat-engine-6ffb669df8-qcq6h\" (UID: \"1342aee7-66dc-4df2-b44a-970b332ffc8d\") " pod="openstack/heat-engine-6ffb669df8-qcq6h" Mar 19 09:48:47 crc kubenswrapper[4835]: 
I0319 09:48:47.173083 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8q5g\" (UniqueName: \"kubernetes.io/projected/1342aee7-66dc-4df2-b44a-970b332ffc8d-kube-api-access-n8q5g\") pod \"heat-engine-6ffb669df8-qcq6h\" (UID: \"1342aee7-66dc-4df2-b44a-970b332ffc8d\") " pod="openstack/heat-engine-6ffb669df8-qcq6h" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.184924 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-9b8ks"] Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.187505 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-9b8ks" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.195981 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-9b8ks"] Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.275027 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1342aee7-66dc-4df2-b44a-970b332ffc8d-config-data\") pod \"heat-engine-6ffb669df8-qcq6h\" (UID: \"1342aee7-66dc-4df2-b44a-970b332ffc8d\") " pod="openstack/heat-engine-6ffb669df8-qcq6h" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.275087 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8q5g\" (UniqueName: \"kubernetes.io/projected/1342aee7-66dc-4df2-b44a-970b332ffc8d-kube-api-access-n8q5g\") pod \"heat-engine-6ffb669df8-qcq6h\" (UID: \"1342aee7-66dc-4df2-b44a-970b332ffc8d\") " pod="openstack/heat-engine-6ffb669df8-qcq6h" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.275163 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1342aee7-66dc-4df2-b44a-970b332ffc8d-combined-ca-bundle\") pod \"heat-engine-6ffb669df8-qcq6h\" (UID: 
\"1342aee7-66dc-4df2-b44a-970b332ffc8d\") " pod="openstack/heat-engine-6ffb669df8-qcq6h" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.275216 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1342aee7-66dc-4df2-b44a-970b332ffc8d-config-data-custom\") pod \"heat-engine-6ffb669df8-qcq6h\" (UID: \"1342aee7-66dc-4df2-b44a-970b332ffc8d\") " pod="openstack/heat-engine-6ffb669df8-qcq6h" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.289747 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-57559f6c85-kg92b"] Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.291289 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1342aee7-66dc-4df2-b44a-970b332ffc8d-config-data-custom\") pod \"heat-engine-6ffb669df8-qcq6h\" (UID: \"1342aee7-66dc-4df2-b44a-970b332ffc8d\") " pod="openstack/heat-engine-6ffb669df8-qcq6h" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.299190 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-57559f6c85-kg92b" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.306956 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1342aee7-66dc-4df2-b44a-970b332ffc8d-config-data\") pod \"heat-engine-6ffb669df8-qcq6h\" (UID: \"1342aee7-66dc-4df2-b44a-970b332ffc8d\") " pod="openstack/heat-engine-6ffb669df8-qcq6h" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.307259 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.310299 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8q5g\" (UniqueName: \"kubernetes.io/projected/1342aee7-66dc-4df2-b44a-970b332ffc8d-kube-api-access-n8q5g\") pod \"heat-engine-6ffb669df8-qcq6h\" (UID: \"1342aee7-66dc-4df2-b44a-970b332ffc8d\") " pod="openstack/heat-engine-6ffb669df8-qcq6h" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.320820 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-57559f6c85-kg92b"] Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.328826 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1342aee7-66dc-4df2-b44a-970b332ffc8d-combined-ca-bundle\") pod \"heat-engine-6ffb669df8-qcq6h\" (UID: \"1342aee7-66dc-4df2-b44a-970b332ffc8d\") " pod="openstack/heat-engine-6ffb669df8-qcq6h" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.338129 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-66b45b64d-bv78b"] Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.339512 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-66b45b64d-bv78b" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.341152 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.353259 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-66b45b64d-bv78b"] Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.380672 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6ffb669df8-qcq6h" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.381355 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cd860306-3e91-4d1d-ae71-cb67e787d22a-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-9b8ks\" (UID: \"cd860306-3e91-4d1d-ae71-cb67e787d22a\") " pod="openstack/dnsmasq-dns-7756b9d78c-9b8ks" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.381499 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r8j8\" (UniqueName: \"kubernetes.io/projected/cd860306-3e91-4d1d-ae71-cb67e787d22a-kube-api-access-8r8j8\") pod \"dnsmasq-dns-7756b9d78c-9b8ks\" (UID: \"cd860306-3e91-4d1d-ae71-cb67e787d22a\") " pod="openstack/dnsmasq-dns-7756b9d78c-9b8ks" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.381543 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd860306-3e91-4d1d-ae71-cb67e787d22a-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-9b8ks\" (UID: \"cd860306-3e91-4d1d-ae71-cb67e787d22a\") " pod="openstack/dnsmasq-dns-7756b9d78c-9b8ks" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.381572 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd860306-3e91-4d1d-ae71-cb67e787d22a-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-9b8ks\" (UID: \"cd860306-3e91-4d1d-ae71-cb67e787d22a\") " pod="openstack/dnsmasq-dns-7756b9d78c-9b8ks" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.381593 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd860306-3e91-4d1d-ae71-cb67e787d22a-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-9b8ks\" (UID: \"cd860306-3e91-4d1d-ae71-cb67e787d22a\") " pod="openstack/dnsmasq-dns-7756b9d78c-9b8ks" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.381615 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd860306-3e91-4d1d-ae71-cb67e787d22a-config\") pod \"dnsmasq-dns-7756b9d78c-9b8ks\" (UID: \"cd860306-3e91-4d1d-ae71-cb67e787d22a\") " pod="openstack/dnsmasq-dns-7756b9d78c-9b8ks" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.486800 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00ce99fa-0f33-465a-9610-1a935668ec05-config-data\") pod \"heat-cfnapi-57559f6c85-kg92b\" (UID: \"00ce99fa-0f33-465a-9610-1a935668ec05\") " pod="openstack/heat-cfnapi-57559f6c85-kg92b" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.486884 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00ce99fa-0f33-465a-9610-1a935668ec05-config-data-custom\") pod \"heat-cfnapi-57559f6c85-kg92b\" (UID: \"00ce99fa-0f33-465a-9610-1a935668ec05\") " pod="openstack/heat-cfnapi-57559f6c85-kg92b" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.486936 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8r8j8\" (UniqueName: \"kubernetes.io/projected/cd860306-3e91-4d1d-ae71-cb67e787d22a-kube-api-access-8r8j8\") pod \"dnsmasq-dns-7756b9d78c-9b8ks\" (UID: \"cd860306-3e91-4d1d-ae71-cb67e787d22a\") " pod="openstack/dnsmasq-dns-7756b9d78c-9b8ks" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.486985 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00ce99fa-0f33-465a-9610-1a935668ec05-combined-ca-bundle\") pod \"heat-cfnapi-57559f6c85-kg92b\" (UID: \"00ce99fa-0f33-465a-9610-1a935668ec05\") " pod="openstack/heat-cfnapi-57559f6c85-kg92b" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.487002 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmr6v\" (UniqueName: \"kubernetes.io/projected/00ce99fa-0f33-465a-9610-1a935668ec05-kube-api-access-cmr6v\") pod \"heat-cfnapi-57559f6c85-kg92b\" (UID: \"00ce99fa-0f33-465a-9610-1a935668ec05\") " pod="openstack/heat-cfnapi-57559f6c85-kg92b" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.487021 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd860306-3e91-4d1d-ae71-cb67e787d22a-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-9b8ks\" (UID: \"cd860306-3e91-4d1d-ae71-cb67e787d22a\") " pod="openstack/dnsmasq-dns-7756b9d78c-9b8ks" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.487101 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd860306-3e91-4d1d-ae71-cb67e787d22a-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-9b8ks\" (UID: \"cd860306-3e91-4d1d-ae71-cb67e787d22a\") " pod="openstack/dnsmasq-dns-7756b9d78c-9b8ks" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.487127 4835 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd860306-3e91-4d1d-ae71-cb67e787d22a-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-9b8ks\" (UID: \"cd860306-3e91-4d1d-ae71-cb67e787d22a\") " pod="openstack/dnsmasq-dns-7756b9d78c-9b8ks" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.487151 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd860306-3e91-4d1d-ae71-cb67e787d22a-config\") pod \"dnsmasq-dns-7756b9d78c-9b8ks\" (UID: \"cd860306-3e91-4d1d-ae71-cb67e787d22a\") " pod="openstack/dnsmasq-dns-7756b9d78c-9b8ks" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.487181 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cd860306-3e91-4d1d-ae71-cb67e787d22a-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-9b8ks\" (UID: \"cd860306-3e91-4d1d-ae71-cb67e787d22a\") " pod="openstack/dnsmasq-dns-7756b9d78c-9b8ks" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.487216 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5ee8b70-a6eb-4d4e-8615-4f122543ba5a-combined-ca-bundle\") pod \"heat-api-66b45b64d-bv78b\" (UID: \"c5ee8b70-a6eb-4d4e-8615-4f122543ba5a\") " pod="openstack/heat-api-66b45b64d-bv78b" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.487247 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5ee8b70-a6eb-4d4e-8615-4f122543ba5a-config-data\") pod \"heat-api-66b45b64d-bv78b\" (UID: \"c5ee8b70-a6eb-4d4e-8615-4f122543ba5a\") " pod="openstack/heat-api-66b45b64d-bv78b" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.487264 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5ee8b70-a6eb-4d4e-8615-4f122543ba5a-config-data-custom\") pod \"heat-api-66b45b64d-bv78b\" (UID: \"c5ee8b70-a6eb-4d4e-8615-4f122543ba5a\") " pod="openstack/heat-api-66b45b64d-bv78b" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.487306 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m97pb\" (UniqueName: \"kubernetes.io/projected/c5ee8b70-a6eb-4d4e-8615-4f122543ba5a-kube-api-access-m97pb\") pod \"heat-api-66b45b64d-bv78b\" (UID: \"c5ee8b70-a6eb-4d4e-8615-4f122543ba5a\") " pod="openstack/heat-api-66b45b64d-bv78b" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.489697 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cd860306-3e91-4d1d-ae71-cb67e787d22a-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-9b8ks\" (UID: \"cd860306-3e91-4d1d-ae71-cb67e787d22a\") " pod="openstack/dnsmasq-dns-7756b9d78c-9b8ks" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.491483 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd860306-3e91-4d1d-ae71-cb67e787d22a-config\") pod \"dnsmasq-dns-7756b9d78c-9b8ks\" (UID: \"cd860306-3e91-4d1d-ae71-cb67e787d22a\") " pod="openstack/dnsmasq-dns-7756b9d78c-9b8ks" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.492660 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd860306-3e91-4d1d-ae71-cb67e787d22a-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-9b8ks\" (UID: \"cd860306-3e91-4d1d-ae71-cb67e787d22a\") " pod="openstack/dnsmasq-dns-7756b9d78c-9b8ks" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.492666 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/cd860306-3e91-4d1d-ae71-cb67e787d22a-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-9b8ks\" (UID: \"cd860306-3e91-4d1d-ae71-cb67e787d22a\") " pod="openstack/dnsmasq-dns-7756b9d78c-9b8ks" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.493001 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd860306-3e91-4d1d-ae71-cb67e787d22a-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-9b8ks\" (UID: \"cd860306-3e91-4d1d-ae71-cb67e787d22a\") " pod="openstack/dnsmasq-dns-7756b9d78c-9b8ks" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.507477 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r8j8\" (UniqueName: \"kubernetes.io/projected/cd860306-3e91-4d1d-ae71-cb67e787d22a-kube-api-access-8r8j8\") pod \"dnsmasq-dns-7756b9d78c-9b8ks\" (UID: \"cd860306-3e91-4d1d-ae71-cb67e787d22a\") " pod="openstack/dnsmasq-dns-7756b9d78c-9b8ks" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.530106 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-9b8ks" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.590874 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00ce99fa-0f33-465a-9610-1a935668ec05-combined-ca-bundle\") pod \"heat-cfnapi-57559f6c85-kg92b\" (UID: \"00ce99fa-0f33-465a-9610-1a935668ec05\") " pod="openstack/heat-cfnapi-57559f6c85-kg92b" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.591209 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmr6v\" (UniqueName: \"kubernetes.io/projected/00ce99fa-0f33-465a-9610-1a935668ec05-kube-api-access-cmr6v\") pod \"heat-cfnapi-57559f6c85-kg92b\" (UID: \"00ce99fa-0f33-465a-9610-1a935668ec05\") " pod="openstack/heat-cfnapi-57559f6c85-kg92b" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.591331 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5ee8b70-a6eb-4d4e-8615-4f122543ba5a-combined-ca-bundle\") pod \"heat-api-66b45b64d-bv78b\" (UID: \"c5ee8b70-a6eb-4d4e-8615-4f122543ba5a\") " pod="openstack/heat-api-66b45b64d-bv78b" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.591362 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5ee8b70-a6eb-4d4e-8615-4f122543ba5a-config-data\") pod \"heat-api-66b45b64d-bv78b\" (UID: \"c5ee8b70-a6eb-4d4e-8615-4f122543ba5a\") " pod="openstack/heat-api-66b45b64d-bv78b" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.591376 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5ee8b70-a6eb-4d4e-8615-4f122543ba5a-config-data-custom\") pod \"heat-api-66b45b64d-bv78b\" (UID: \"c5ee8b70-a6eb-4d4e-8615-4f122543ba5a\") " 
pod="openstack/heat-api-66b45b64d-bv78b" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.600083 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m97pb\" (UniqueName: \"kubernetes.io/projected/c5ee8b70-a6eb-4d4e-8615-4f122543ba5a-kube-api-access-m97pb\") pod \"heat-api-66b45b64d-bv78b\" (UID: \"c5ee8b70-a6eb-4d4e-8615-4f122543ba5a\") " pod="openstack/heat-api-66b45b64d-bv78b" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.600257 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00ce99fa-0f33-465a-9610-1a935668ec05-config-data\") pod \"heat-cfnapi-57559f6c85-kg92b\" (UID: \"00ce99fa-0f33-465a-9610-1a935668ec05\") " pod="openstack/heat-cfnapi-57559f6c85-kg92b" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.600414 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00ce99fa-0f33-465a-9610-1a935668ec05-config-data-custom\") pod \"heat-cfnapi-57559f6c85-kg92b\" (UID: \"00ce99fa-0f33-465a-9610-1a935668ec05\") " pod="openstack/heat-cfnapi-57559f6c85-kg92b" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.613368 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00ce99fa-0f33-465a-9610-1a935668ec05-config-data\") pod \"heat-cfnapi-57559f6c85-kg92b\" (UID: \"00ce99fa-0f33-465a-9610-1a935668ec05\") " pod="openstack/heat-cfnapi-57559f6c85-kg92b" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.613614 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5ee8b70-a6eb-4d4e-8615-4f122543ba5a-config-data-custom\") pod \"heat-api-66b45b64d-bv78b\" (UID: \"c5ee8b70-a6eb-4d4e-8615-4f122543ba5a\") " pod="openstack/heat-api-66b45b64d-bv78b" Mar 19 09:48:47 crc 
kubenswrapper[4835]: I0319 09:48:47.614182 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5ee8b70-a6eb-4d4e-8615-4f122543ba5a-combined-ca-bundle\") pod \"heat-api-66b45b64d-bv78b\" (UID: \"c5ee8b70-a6eb-4d4e-8615-4f122543ba5a\") " pod="openstack/heat-api-66b45b64d-bv78b" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.617632 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5ee8b70-a6eb-4d4e-8615-4f122543ba5a-config-data\") pod \"heat-api-66b45b64d-bv78b\" (UID: \"c5ee8b70-a6eb-4d4e-8615-4f122543ba5a\") " pod="openstack/heat-api-66b45b64d-bv78b" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.618048 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmr6v\" (UniqueName: \"kubernetes.io/projected/00ce99fa-0f33-465a-9610-1a935668ec05-kube-api-access-cmr6v\") pod \"heat-cfnapi-57559f6c85-kg92b\" (UID: \"00ce99fa-0f33-465a-9610-1a935668ec05\") " pod="openstack/heat-cfnapi-57559f6c85-kg92b" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.633796 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m97pb\" (UniqueName: \"kubernetes.io/projected/c5ee8b70-a6eb-4d4e-8615-4f122543ba5a-kube-api-access-m97pb\") pod \"heat-api-66b45b64d-bv78b\" (UID: \"c5ee8b70-a6eb-4d4e-8615-4f122543ba5a\") " pod="openstack/heat-api-66b45b64d-bv78b" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.639216 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00ce99fa-0f33-465a-9610-1a935668ec05-config-data-custom\") pod \"heat-cfnapi-57559f6c85-kg92b\" (UID: \"00ce99fa-0f33-465a-9610-1a935668ec05\") " pod="openstack/heat-cfnapi-57559f6c85-kg92b" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.667312 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00ce99fa-0f33-465a-9610-1a935668ec05-combined-ca-bundle\") pod \"heat-cfnapi-57559f6c85-kg92b\" (UID: \"00ce99fa-0f33-465a-9610-1a935668ec05\") " pod="openstack/heat-cfnapi-57559f6c85-kg92b" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.678138 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.717702 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-57559f6c85-kg92b" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.875945 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-66b45b64d-bv78b" Mar 19 09:48:47 crc kubenswrapper[4835]: I0319 09:48:47.900663 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15d728ef-66c1-4e0e-9461-f8fb26f62e41","Type":"ContainerStarted","Data":"1b6a8863e1b335f06af5b7ab55d3ceb610941866e78a11acc6bae5ce7f968011"} Mar 19 09:48:48 crc kubenswrapper[4835]: I0319 09:48:48.089952 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6ffb669df8-qcq6h"] Mar 19 09:48:48 crc kubenswrapper[4835]: I0319 09:48:48.363423 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-9b8ks"] Mar 19 09:48:48 crc kubenswrapper[4835]: I0319 09:48:48.438348 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07ebf419-c848-489f-9a01-c601c1c477f7" path="/var/lib/kubelet/pods/07ebf419-c848-489f-9a01-c601c1c477f7/volumes" Mar 19 09:48:48 crc kubenswrapper[4835]: I0319 09:48:48.587171 4835 scope.go:117] "RemoveContainer" containerID="2e4dbacc1b3de03cda9ae2dac8484d6d3e2da91b2b19ea75d9f2dfb06e36203e" Mar 19 09:48:48 crc kubenswrapper[4835]: I0319 09:48:48.814265 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/heat-api-66b45b64d-bv78b"] Mar 19 09:48:48 crc kubenswrapper[4835]: I0319 09:48:48.927542 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-66b45b64d-bv78b" event={"ID":"c5ee8b70-a6eb-4d4e-8615-4f122543ba5a","Type":"ContainerStarted","Data":"3ff477b46ea1fec286f64f187acb58d4a26b34ed602327bfe7a8ded6d8bda9f9"} Mar 19 09:48:48 crc kubenswrapper[4835]: I0319 09:48:48.933126 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-9b8ks" event={"ID":"cd860306-3e91-4d1d-ae71-cb67e787d22a","Type":"ContainerStarted","Data":"21781bc77fb5fba98c89dd052659f5fe5fd3dee4355c07044538e34155320acc"} Mar 19 09:48:48 crc kubenswrapper[4835]: I0319 09:48:48.939952 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6ffb669df8-qcq6h" event={"ID":"1342aee7-66dc-4df2-b44a-970b332ffc8d","Type":"ContainerStarted","Data":"38534877503fd88b2a46dea46dd31fb7133bf40e642174cb24bfe770fdcbd001"} Mar 19 09:48:49 crc kubenswrapper[4835]: I0319 09:48:49.162795 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-57559f6c85-kg92b"] Mar 19 09:48:49 crc kubenswrapper[4835]: I0319 09:48:49.641225 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5d99fb9659-b272b" Mar 19 09:48:49 crc kubenswrapper[4835]: I0319 09:48:49.740175 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b7bcd9fc8-zjbvb"] Mar 19 09:48:49 crc kubenswrapper[4835]: I0319 09:48:49.740693 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-b7bcd9fc8-zjbvb" podUID="e3f9f7c2-7672-4af7-8857-76a91f8c8ce8" containerName="neutron-api" containerID="cri-o://d6970c892339bf0566200f3dc9127a92fedb590a2694385e9d175311825617b0" gracePeriod=30 Mar 19 09:48:49 crc kubenswrapper[4835]: I0319 09:48:49.741004 4835 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/neutron-b7bcd9fc8-zjbvb" podUID="e3f9f7c2-7672-4af7-8857-76a91f8c8ce8" containerName="neutron-httpd" containerID="cri-o://78a3c42296f55f6367bb88e0e1ed4ef75f39fcbe1fbacc010f3b939394f468ea" gracePeriod=30 Mar 19 09:48:49 crc kubenswrapper[4835]: I0319 09:48:49.958366 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-57559f6c85-kg92b" event={"ID":"00ce99fa-0f33-465a-9610-1a935668ec05","Type":"ContainerStarted","Data":"8e14061cfb061ed2020afcfde1e898867f7cd35d714e5fb78aae8e030945dedc"} Mar 19 09:48:49 crc kubenswrapper[4835]: I0319 09:48:49.964073 4835 generic.go:334] "Generic (PLEG): container finished" podID="e3f9f7c2-7672-4af7-8857-76a91f8c8ce8" containerID="78a3c42296f55f6367bb88e0e1ed4ef75f39fcbe1fbacc010f3b939394f468ea" exitCode=0 Mar 19 09:48:49 crc kubenswrapper[4835]: I0319 09:48:49.964165 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b7bcd9fc8-zjbvb" event={"ID":"e3f9f7c2-7672-4af7-8857-76a91f8c8ce8","Type":"ContainerDied","Data":"78a3c42296f55f6367bb88e0e1ed4ef75f39fcbe1fbacc010f3b939394f468ea"} Mar 19 09:48:49 crc kubenswrapper[4835]: I0319 09:48:49.965914 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15d728ef-66c1-4e0e-9461-f8fb26f62e41","Type":"ContainerStarted","Data":"b35ecc59e6adf4a4afd17bb7f1c2518b1996fb4f5ea531da5bf6153c7b1ef29c"} Mar 19 09:48:49 crc kubenswrapper[4835]: I0319 09:48:49.968180 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6ffb669df8-qcq6h" event={"ID":"1342aee7-66dc-4df2-b44a-970b332ffc8d","Type":"ContainerStarted","Data":"a9d7b18f48ece6410e4b81f72a2d34c9b47a755bf156c0def1a5db05c2152b5d"} Mar 19 09:48:49 crc kubenswrapper[4835]: I0319 09:48:49.968276 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-6ffb669df8-qcq6h" Mar 19 09:48:49 crc kubenswrapper[4835]: I0319 09:48:49.972972 4835 generic.go:334] "Generic (PLEG): container 
finished" podID="cd860306-3e91-4d1d-ae71-cb67e787d22a" containerID="c0ea60bf11bf096f46163378aa34625e8cc0835312390359536ccc50967e573c" exitCode=0 Mar 19 09:48:49 crc kubenswrapper[4835]: I0319 09:48:49.973007 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-9b8ks" event={"ID":"cd860306-3e91-4d1d-ae71-cb67e787d22a","Type":"ContainerDied","Data":"c0ea60bf11bf096f46163378aa34625e8cc0835312390359536ccc50967e573c"} Mar 19 09:48:49 crc kubenswrapper[4835]: I0319 09:48:49.989040 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-6ffb669df8-qcq6h" podStartSLOduration=3.989022124 podStartE2EDuration="3.989022124s" podCreationTimestamp="2026-03-19 09:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:48:49.988707246 +0000 UTC m=+1584.837305833" watchObservedRunningTime="2026-03-19 09:48:49.989022124 +0000 UTC m=+1584.837620711" Mar 19 09:48:50 crc kubenswrapper[4835]: I0319 09:48:50.993639 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-9b8ks" event={"ID":"cd860306-3e91-4d1d-ae71-cb67e787d22a","Type":"ContainerStarted","Data":"a5a235769b9aae535cffdbfea40679b6c14b38ffc77ae5005851256d1d79503e"} Mar 19 09:48:50 crc kubenswrapper[4835]: I0319 09:48:50.994318 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7756b9d78c-9b8ks" Mar 19 09:48:51 crc kubenswrapper[4835]: I0319 09:48:51.000277 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15d728ef-66c1-4e0e-9461-f8fb26f62e41","Type":"ContainerStarted","Data":"dcae2148f34099380d84252df61ed326c02d0f29ecaccddf6d51087c96ddcad1"} Mar 19 09:48:51 crc kubenswrapper[4835]: I0319 09:48:51.016262 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7756b9d78c-9b8ks" 
podStartSLOduration=4.01623035 podStartE2EDuration="4.01623035s" podCreationTimestamp="2026-03-19 09:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:48:51.013487477 +0000 UTC m=+1585.862086064" watchObservedRunningTime="2026-03-19 09:48:51.01623035 +0000 UTC m=+1585.864828937" Mar 19 09:48:51 crc kubenswrapper[4835]: I0319 09:48:51.242650 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5f95f9444c-phdqp" Mar 19 09:48:51 crc kubenswrapper[4835]: I0319 09:48:51.248528 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5f95f9444c-phdqp" Mar 19 09:48:52 crc kubenswrapper[4835]: I0319 09:48:52.052410 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 19 09:48:52 crc kubenswrapper[4835]: I0319 09:48:52.292440 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:48:53 crc kubenswrapper[4835]: I0319 09:48:53.044154 4835 generic.go:334] "Generic (PLEG): container finished" podID="e3f9f7c2-7672-4af7-8857-76a91f8c8ce8" containerID="d6970c892339bf0566200f3dc9127a92fedb590a2694385e9d175311825617b0" exitCode=0 Mar 19 09:48:53 crc kubenswrapper[4835]: I0319 09:48:53.044658 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b7bcd9fc8-zjbvb" event={"ID":"e3f9f7c2-7672-4af7-8857-76a91f8c8ce8","Type":"ContainerDied","Data":"d6970c892339bf0566200f3dc9127a92fedb590a2694385e9d175311825617b0"} Mar 19 09:48:53 crc kubenswrapper[4835]: I0319 09:48:53.836907 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5757d74498-tjqm4"] Mar 19 09:48:53 crc kubenswrapper[4835]: I0319 09:48:53.838800 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5757d74498-tjqm4" Mar 19 09:48:53 crc kubenswrapper[4835]: I0319 09:48:53.847825 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-956b854d8-kqpb6"] Mar 19 09:48:53 crc kubenswrapper[4835]: I0319 09:48:53.849370 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-956b854d8-kqpb6" Mar 19 09:48:53 crc kubenswrapper[4835]: I0319 09:48:53.879513 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-74f47f4bf4-6gctg"] Mar 19 09:48:53 crc kubenswrapper[4835]: I0319 09:48:53.881082 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-74f47f4bf4-6gctg" Mar 19 09:48:53 crc kubenswrapper[4835]: I0319 09:48:53.938975 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d139802-4569-4073-b80c-4807478db41a-config-data-custom\") pod \"heat-api-5757d74498-tjqm4\" (UID: \"3d139802-4569-4073-b80c-4807478db41a\") " pod="openstack/heat-api-5757d74498-tjqm4" Mar 19 09:48:53 crc kubenswrapper[4835]: I0319 09:48:53.939280 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d139802-4569-4073-b80c-4807478db41a-combined-ca-bundle\") pod \"heat-api-5757d74498-tjqm4\" (UID: \"3d139802-4569-4073-b80c-4807478db41a\") " pod="openstack/heat-api-5757d74498-tjqm4" Mar 19 09:48:53 crc kubenswrapper[4835]: I0319 09:48:53.939300 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02257820-75fc-4616-82d0-5c1186508556-combined-ca-bundle\") pod \"heat-cfnapi-74f47f4bf4-6gctg\" (UID: \"02257820-75fc-4616-82d0-5c1186508556\") " pod="openstack/heat-cfnapi-74f47f4bf4-6gctg" Mar 19 09:48:53 
crc kubenswrapper[4835]: I0319 09:48:53.939320 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thbl9\" (UniqueName: \"kubernetes.io/projected/be54447e-7aac-4db8-94fc-f10aa4661921-kube-api-access-thbl9\") pod \"heat-engine-956b854d8-kqpb6\" (UID: \"be54447e-7aac-4db8-94fc-f10aa4661921\") " pod="openstack/heat-engine-956b854d8-kqpb6" Mar 19 09:48:53 crc kubenswrapper[4835]: I0319 09:48:53.939381 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrpf8\" (UniqueName: \"kubernetes.io/projected/02257820-75fc-4616-82d0-5c1186508556-kube-api-access-hrpf8\") pod \"heat-cfnapi-74f47f4bf4-6gctg\" (UID: \"02257820-75fc-4616-82d0-5c1186508556\") " pod="openstack/heat-cfnapi-74f47f4bf4-6gctg" Mar 19 09:48:53 crc kubenswrapper[4835]: I0319 09:48:53.939404 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d139802-4569-4073-b80c-4807478db41a-config-data\") pod \"heat-api-5757d74498-tjqm4\" (UID: \"3d139802-4569-4073-b80c-4807478db41a\") " pod="openstack/heat-api-5757d74498-tjqm4" Mar 19 09:48:53 crc kubenswrapper[4835]: I0319 09:48:53.939422 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be54447e-7aac-4db8-94fc-f10aa4661921-config-data-custom\") pod \"heat-engine-956b854d8-kqpb6\" (UID: \"be54447e-7aac-4db8-94fc-f10aa4661921\") " pod="openstack/heat-engine-956b854d8-kqpb6" Mar 19 09:48:53 crc kubenswrapper[4835]: I0319 09:48:53.939453 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be54447e-7aac-4db8-94fc-f10aa4661921-config-data\") pod \"heat-engine-956b854d8-kqpb6\" (UID: \"be54447e-7aac-4db8-94fc-f10aa4661921\") " 
pod="openstack/heat-engine-956b854d8-kqpb6" Mar 19 09:48:53 crc kubenswrapper[4835]: I0319 09:48:53.939501 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02257820-75fc-4616-82d0-5c1186508556-config-data\") pod \"heat-cfnapi-74f47f4bf4-6gctg\" (UID: \"02257820-75fc-4616-82d0-5c1186508556\") " pod="openstack/heat-cfnapi-74f47f4bf4-6gctg" Mar 19 09:48:53 crc kubenswrapper[4835]: I0319 09:48:53.939529 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02257820-75fc-4616-82d0-5c1186508556-config-data-custom\") pod \"heat-cfnapi-74f47f4bf4-6gctg\" (UID: \"02257820-75fc-4616-82d0-5c1186508556\") " pod="openstack/heat-cfnapi-74f47f4bf4-6gctg" Mar 19 09:48:53 crc kubenswrapper[4835]: I0319 09:48:53.939566 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be54447e-7aac-4db8-94fc-f10aa4661921-combined-ca-bundle\") pod \"heat-engine-956b854d8-kqpb6\" (UID: \"be54447e-7aac-4db8-94fc-f10aa4661921\") " pod="openstack/heat-engine-956b854d8-kqpb6" Mar 19 09:48:53 crc kubenswrapper[4835]: I0319 09:48:53.939596 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsfvn\" (UniqueName: \"kubernetes.io/projected/3d139802-4569-4073-b80c-4807478db41a-kube-api-access-tsfvn\") pod \"heat-api-5757d74498-tjqm4\" (UID: \"3d139802-4569-4073-b80c-4807478db41a\") " pod="openstack/heat-api-5757d74498-tjqm4" Mar 19 09:48:53 crc kubenswrapper[4835]: I0319 09:48:53.972592 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-956b854d8-kqpb6"] Mar 19 09:48:54 crc kubenswrapper[4835]: I0319 09:48:54.038471 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/heat-api-5757d74498-tjqm4"] Mar 19 09:48:54 crc kubenswrapper[4835]: I0319 09:48:54.051811 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d139802-4569-4073-b80c-4807478db41a-combined-ca-bundle\") pod \"heat-api-5757d74498-tjqm4\" (UID: \"3d139802-4569-4073-b80c-4807478db41a\") " pod="openstack/heat-api-5757d74498-tjqm4" Mar 19 09:48:54 crc kubenswrapper[4835]: I0319 09:48:54.051883 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02257820-75fc-4616-82d0-5c1186508556-combined-ca-bundle\") pod \"heat-cfnapi-74f47f4bf4-6gctg\" (UID: \"02257820-75fc-4616-82d0-5c1186508556\") " pod="openstack/heat-cfnapi-74f47f4bf4-6gctg" Mar 19 09:48:54 crc kubenswrapper[4835]: I0319 09:48:54.051910 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thbl9\" (UniqueName: \"kubernetes.io/projected/be54447e-7aac-4db8-94fc-f10aa4661921-kube-api-access-thbl9\") pod \"heat-engine-956b854d8-kqpb6\" (UID: \"be54447e-7aac-4db8-94fc-f10aa4661921\") " pod="openstack/heat-engine-956b854d8-kqpb6" Mar 19 09:48:54 crc kubenswrapper[4835]: I0319 09:48:54.052070 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrpf8\" (UniqueName: \"kubernetes.io/projected/02257820-75fc-4616-82d0-5c1186508556-kube-api-access-hrpf8\") pod \"heat-cfnapi-74f47f4bf4-6gctg\" (UID: \"02257820-75fc-4616-82d0-5c1186508556\") " pod="openstack/heat-cfnapi-74f47f4bf4-6gctg" Mar 19 09:48:54 crc kubenswrapper[4835]: I0319 09:48:54.052106 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d139802-4569-4073-b80c-4807478db41a-config-data\") pod \"heat-api-5757d74498-tjqm4\" (UID: \"3d139802-4569-4073-b80c-4807478db41a\") " 
pod="openstack/heat-api-5757d74498-tjqm4" Mar 19 09:48:54 crc kubenswrapper[4835]: I0319 09:48:54.052126 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be54447e-7aac-4db8-94fc-f10aa4661921-config-data-custom\") pod \"heat-engine-956b854d8-kqpb6\" (UID: \"be54447e-7aac-4db8-94fc-f10aa4661921\") " pod="openstack/heat-engine-956b854d8-kqpb6" Mar 19 09:48:54 crc kubenswrapper[4835]: I0319 09:48:54.052180 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be54447e-7aac-4db8-94fc-f10aa4661921-config-data\") pod \"heat-engine-956b854d8-kqpb6\" (UID: \"be54447e-7aac-4db8-94fc-f10aa4661921\") " pod="openstack/heat-engine-956b854d8-kqpb6" Mar 19 09:48:54 crc kubenswrapper[4835]: I0319 09:48:54.052268 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02257820-75fc-4616-82d0-5c1186508556-config-data\") pod \"heat-cfnapi-74f47f4bf4-6gctg\" (UID: \"02257820-75fc-4616-82d0-5c1186508556\") " pod="openstack/heat-cfnapi-74f47f4bf4-6gctg" Mar 19 09:48:54 crc kubenswrapper[4835]: I0319 09:48:54.052313 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02257820-75fc-4616-82d0-5c1186508556-config-data-custom\") pod \"heat-cfnapi-74f47f4bf4-6gctg\" (UID: \"02257820-75fc-4616-82d0-5c1186508556\") " pod="openstack/heat-cfnapi-74f47f4bf4-6gctg" Mar 19 09:48:54 crc kubenswrapper[4835]: I0319 09:48:54.052379 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be54447e-7aac-4db8-94fc-f10aa4661921-combined-ca-bundle\") pod \"heat-engine-956b854d8-kqpb6\" (UID: \"be54447e-7aac-4db8-94fc-f10aa4661921\") " pod="openstack/heat-engine-956b854d8-kqpb6" Mar 19 09:48:54 crc 
kubenswrapper[4835]: I0319 09:48:54.052427 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsfvn\" (UniqueName: \"kubernetes.io/projected/3d139802-4569-4073-b80c-4807478db41a-kube-api-access-tsfvn\") pod \"heat-api-5757d74498-tjqm4\" (UID: \"3d139802-4569-4073-b80c-4807478db41a\") " pod="openstack/heat-api-5757d74498-tjqm4" Mar 19 09:48:54 crc kubenswrapper[4835]: I0319 09:48:54.052504 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d139802-4569-4073-b80c-4807478db41a-config-data-custom\") pod \"heat-api-5757d74498-tjqm4\" (UID: \"3d139802-4569-4073-b80c-4807478db41a\") " pod="openstack/heat-api-5757d74498-tjqm4" Mar 19 09:48:54 crc kubenswrapper[4835]: I0319 09:48:54.078860 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-74f47f4bf4-6gctg"] Mar 19 09:48:54 crc kubenswrapper[4835]: I0319 09:48:54.080878 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be54447e-7aac-4db8-94fc-f10aa4661921-combined-ca-bundle\") pod \"heat-engine-956b854d8-kqpb6\" (UID: \"be54447e-7aac-4db8-94fc-f10aa4661921\") " pod="openstack/heat-engine-956b854d8-kqpb6" Mar 19 09:48:54 crc kubenswrapper[4835]: I0319 09:48:54.081519 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be54447e-7aac-4db8-94fc-f10aa4661921-config-data-custom\") pod \"heat-engine-956b854d8-kqpb6\" (UID: \"be54447e-7aac-4db8-94fc-f10aa4661921\") " pod="openstack/heat-engine-956b854d8-kqpb6" Mar 19 09:48:54 crc kubenswrapper[4835]: I0319 09:48:54.082918 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d139802-4569-4073-b80c-4807478db41a-config-data-custom\") pod \"heat-api-5757d74498-tjqm4\" (UID: 
\"3d139802-4569-4073-b80c-4807478db41a\") " pod="openstack/heat-api-5757d74498-tjqm4" Mar 19 09:48:54 crc kubenswrapper[4835]: I0319 09:48:54.083032 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02257820-75fc-4616-82d0-5c1186508556-config-data\") pod \"heat-cfnapi-74f47f4bf4-6gctg\" (UID: \"02257820-75fc-4616-82d0-5c1186508556\") " pod="openstack/heat-cfnapi-74f47f4bf4-6gctg" Mar 19 09:48:54 crc kubenswrapper[4835]: I0319 09:48:54.085650 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d139802-4569-4073-b80c-4807478db41a-combined-ca-bundle\") pod \"heat-api-5757d74498-tjqm4\" (UID: \"3d139802-4569-4073-b80c-4807478db41a\") " pod="openstack/heat-api-5757d74498-tjqm4" Mar 19 09:48:54 crc kubenswrapper[4835]: I0319 09:48:54.090305 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02257820-75fc-4616-82d0-5c1186508556-combined-ca-bundle\") pod \"heat-cfnapi-74f47f4bf4-6gctg\" (UID: \"02257820-75fc-4616-82d0-5c1186508556\") " pod="openstack/heat-cfnapi-74f47f4bf4-6gctg" Mar 19 09:48:54 crc kubenswrapper[4835]: I0319 09:48:54.091853 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02257820-75fc-4616-82d0-5c1186508556-config-data-custom\") pod \"heat-cfnapi-74f47f4bf4-6gctg\" (UID: \"02257820-75fc-4616-82d0-5c1186508556\") " pod="openstack/heat-cfnapi-74f47f4bf4-6gctg" Mar 19 09:48:54 crc kubenswrapper[4835]: I0319 09:48:54.093632 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be54447e-7aac-4db8-94fc-f10aa4661921-config-data\") pod \"heat-engine-956b854d8-kqpb6\" (UID: \"be54447e-7aac-4db8-94fc-f10aa4661921\") " pod="openstack/heat-engine-956b854d8-kqpb6" Mar 19 09:48:54 crc 
kubenswrapper[4835]: I0319 09:48:54.094652 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thbl9\" (UniqueName: \"kubernetes.io/projected/be54447e-7aac-4db8-94fc-f10aa4661921-kube-api-access-thbl9\") pod \"heat-engine-956b854d8-kqpb6\" (UID: \"be54447e-7aac-4db8-94fc-f10aa4661921\") " pod="openstack/heat-engine-956b854d8-kqpb6" Mar 19 09:48:54 crc kubenswrapper[4835]: I0319 09:48:54.102945 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d139802-4569-4073-b80c-4807478db41a-config-data\") pod \"heat-api-5757d74498-tjqm4\" (UID: \"3d139802-4569-4073-b80c-4807478db41a\") " pod="openstack/heat-api-5757d74498-tjqm4" Mar 19 09:48:54 crc kubenswrapper[4835]: I0319 09:48:54.134984 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsfvn\" (UniqueName: \"kubernetes.io/projected/3d139802-4569-4073-b80c-4807478db41a-kube-api-access-tsfvn\") pod \"heat-api-5757d74498-tjqm4\" (UID: \"3d139802-4569-4073-b80c-4807478db41a\") " pod="openstack/heat-api-5757d74498-tjqm4" Mar 19 09:48:54 crc kubenswrapper[4835]: I0319 09:48:54.137891 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrpf8\" (UniqueName: \"kubernetes.io/projected/02257820-75fc-4616-82d0-5c1186508556-kube-api-access-hrpf8\") pod \"heat-cfnapi-74f47f4bf4-6gctg\" (UID: \"02257820-75fc-4616-82d0-5c1186508556\") " pod="openstack/heat-cfnapi-74f47f4bf4-6gctg" Mar 19 09:48:54 crc kubenswrapper[4835]: I0319 09:48:54.173445 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5757d74498-tjqm4" Mar 19 09:48:54 crc kubenswrapper[4835]: I0319 09:48:54.188292 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-956b854d8-kqpb6" Mar 19 09:48:54 crc kubenswrapper[4835]: I0319 09:48:54.220290 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-74f47f4bf4-6gctg" Mar 19 09:48:54 crc kubenswrapper[4835]: I0319 09:48:54.241251 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xf267" podUID="14dd8f89-f4d8-4618-b417-f1a802f3517d" containerName="registry-server" probeResult="failure" output=< Mar 19 09:48:54 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 09:48:54 crc kubenswrapper[4835]: > Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.612896 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-57559f6c85-kg92b"] Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.624567 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-66b45b64d-bv78b"] Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.648567 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="46107793-2582-4626-bf6e-c3e990e07ee4" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.215:8776/healthcheck\": dial tcp 10.217.0.215:8776: connect: connection refused" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.657530 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-69459bdfc9-h9kps"] Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.659179 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-69459bdfc9-h9kps" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.668081 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.668255 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.690030 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-755fb9d4bd-8w5k4"] Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.691667 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-755fb9d4bd-8w5k4" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.697607 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.709379 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.718364 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-69459bdfc9-h9kps"] Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.734912 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4384c6fc-0c55-44d3-98b9-b373a66050ee-config-data-custom\") pod \"heat-cfnapi-69459bdfc9-h9kps\" (UID: \"4384c6fc-0c55-44d3-98b9-b373a66050ee\") " pod="openstack/heat-cfnapi-69459bdfc9-h9kps" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.734952 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4384c6fc-0c55-44d3-98b9-b373a66050ee-public-tls-certs\") pod \"heat-cfnapi-69459bdfc9-h9kps\" (UID: 
\"4384c6fc-0c55-44d3-98b9-b373a66050ee\") " pod="openstack/heat-cfnapi-69459bdfc9-h9kps" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.734973 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4384c6fc-0c55-44d3-98b9-b373a66050ee-combined-ca-bundle\") pod \"heat-cfnapi-69459bdfc9-h9kps\" (UID: \"4384c6fc-0c55-44d3-98b9-b373a66050ee\") " pod="openstack/heat-cfnapi-69459bdfc9-h9kps" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.735006 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v8ml\" (UniqueName: \"kubernetes.io/projected/4384c6fc-0c55-44d3-98b9-b373a66050ee-kube-api-access-9v8ml\") pod \"heat-cfnapi-69459bdfc9-h9kps\" (UID: \"4384c6fc-0c55-44d3-98b9-b373a66050ee\") " pod="openstack/heat-cfnapi-69459bdfc9-h9kps" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.735063 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4384c6fc-0c55-44d3-98b9-b373a66050ee-config-data\") pod \"heat-cfnapi-69459bdfc9-h9kps\" (UID: \"4384c6fc-0c55-44d3-98b9-b373a66050ee\") " pod="openstack/heat-cfnapi-69459bdfc9-h9kps" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.735128 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4384c6fc-0c55-44d3-98b9-b373a66050ee-internal-tls-certs\") pod \"heat-cfnapi-69459bdfc9-h9kps\" (UID: \"4384c6fc-0c55-44d3-98b9-b373a66050ee\") " pod="openstack/heat-cfnapi-69459bdfc9-h9kps" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.778804 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-755fb9d4bd-8w5k4"] Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.854244 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e53a7466-d485-484b-aed3-0ce3b20b27ce-config-data\") pod \"heat-api-755fb9d4bd-8w5k4\" (UID: \"e53a7466-d485-484b-aed3-0ce3b20b27ce\") " pod="openstack/heat-api-755fb9d4bd-8w5k4" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.854307 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e53a7466-d485-484b-aed3-0ce3b20b27ce-internal-tls-certs\") pod \"heat-api-755fb9d4bd-8w5k4\" (UID: \"e53a7466-d485-484b-aed3-0ce3b20b27ce\") " pod="openstack/heat-api-755fb9d4bd-8w5k4" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.854339 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e53a7466-d485-484b-aed3-0ce3b20b27ce-config-data-custom\") pod \"heat-api-755fb9d4bd-8w5k4\" (UID: \"e53a7466-d485-484b-aed3-0ce3b20b27ce\") " pod="openstack/heat-api-755fb9d4bd-8w5k4" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.854386 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4384c6fc-0c55-44d3-98b9-b373a66050ee-config-data-custom\") pod \"heat-cfnapi-69459bdfc9-h9kps\" (UID: \"4384c6fc-0c55-44d3-98b9-b373a66050ee\") " pod="openstack/heat-cfnapi-69459bdfc9-h9kps" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.854411 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4384c6fc-0c55-44d3-98b9-b373a66050ee-public-tls-certs\") pod \"heat-cfnapi-69459bdfc9-h9kps\" (UID: \"4384c6fc-0c55-44d3-98b9-b373a66050ee\") " pod="openstack/heat-cfnapi-69459bdfc9-h9kps" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.854428 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4384c6fc-0c55-44d3-98b9-b373a66050ee-combined-ca-bundle\") pod \"heat-cfnapi-69459bdfc9-h9kps\" (UID: \"4384c6fc-0c55-44d3-98b9-b373a66050ee\") " pod="openstack/heat-cfnapi-69459bdfc9-h9kps" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.854461 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v8ml\" (UniqueName: \"kubernetes.io/projected/4384c6fc-0c55-44d3-98b9-b373a66050ee-kube-api-access-9v8ml\") pod \"heat-cfnapi-69459bdfc9-h9kps\" (UID: \"4384c6fc-0c55-44d3-98b9-b373a66050ee\") " pod="openstack/heat-cfnapi-69459bdfc9-h9kps" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.854521 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwq79\" (UniqueName: \"kubernetes.io/projected/e53a7466-d485-484b-aed3-0ce3b20b27ce-kube-api-access-kwq79\") pod \"heat-api-755fb9d4bd-8w5k4\" (UID: \"e53a7466-d485-484b-aed3-0ce3b20b27ce\") " pod="openstack/heat-api-755fb9d4bd-8w5k4" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.854538 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4384c6fc-0c55-44d3-98b9-b373a66050ee-config-data\") pod \"heat-cfnapi-69459bdfc9-h9kps\" (UID: \"4384c6fc-0c55-44d3-98b9-b373a66050ee\") " pod="openstack/heat-cfnapi-69459bdfc9-h9kps" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.854567 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e53a7466-d485-484b-aed3-0ce3b20b27ce-combined-ca-bundle\") pod \"heat-api-755fb9d4bd-8w5k4\" (UID: \"e53a7466-d485-484b-aed3-0ce3b20b27ce\") " pod="openstack/heat-api-755fb9d4bd-8w5k4" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.854623 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4384c6fc-0c55-44d3-98b9-b373a66050ee-internal-tls-certs\") pod \"heat-cfnapi-69459bdfc9-h9kps\" (UID: \"4384c6fc-0c55-44d3-98b9-b373a66050ee\") " pod="openstack/heat-cfnapi-69459bdfc9-h9kps" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.854654 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e53a7466-d485-484b-aed3-0ce3b20b27ce-public-tls-certs\") pod \"heat-api-755fb9d4bd-8w5k4\" (UID: \"e53a7466-d485-484b-aed3-0ce3b20b27ce\") " pod="openstack/heat-api-755fb9d4bd-8w5k4" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.875195 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4384c6fc-0c55-44d3-98b9-b373a66050ee-combined-ca-bundle\") pod \"heat-cfnapi-69459bdfc9-h9kps\" (UID: \"4384c6fc-0c55-44d3-98b9-b373a66050ee\") " pod="openstack/heat-cfnapi-69459bdfc9-h9kps" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.875777 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4384c6fc-0c55-44d3-98b9-b373a66050ee-config-data-custom\") pod \"heat-cfnapi-69459bdfc9-h9kps\" (UID: \"4384c6fc-0c55-44d3-98b9-b373a66050ee\") " pod="openstack/heat-cfnapi-69459bdfc9-h9kps" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.882407 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4384c6fc-0c55-44d3-98b9-b373a66050ee-public-tls-certs\") pod \"heat-cfnapi-69459bdfc9-h9kps\" (UID: \"4384c6fc-0c55-44d3-98b9-b373a66050ee\") " pod="openstack/heat-cfnapi-69459bdfc9-h9kps" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.885968 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/4384c6fc-0c55-44d3-98b9-b373a66050ee-config-data\") pod \"heat-cfnapi-69459bdfc9-h9kps\" (UID: \"4384c6fc-0c55-44d3-98b9-b373a66050ee\") " pod="openstack/heat-cfnapi-69459bdfc9-h9kps" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.890271 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4384c6fc-0c55-44d3-98b9-b373a66050ee-internal-tls-certs\") pod \"heat-cfnapi-69459bdfc9-h9kps\" (UID: \"4384c6fc-0c55-44d3-98b9-b373a66050ee\") " pod="openstack/heat-cfnapi-69459bdfc9-h9kps" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.914605 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v8ml\" (UniqueName: \"kubernetes.io/projected/4384c6fc-0c55-44d3-98b9-b373a66050ee-kube-api-access-9v8ml\") pod \"heat-cfnapi-69459bdfc9-h9kps\" (UID: \"4384c6fc-0c55-44d3-98b9-b373a66050ee\") " pod="openstack/heat-cfnapi-69459bdfc9-h9kps" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.958197 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwq79\" (UniqueName: \"kubernetes.io/projected/e53a7466-d485-484b-aed3-0ce3b20b27ce-kube-api-access-kwq79\") pod \"heat-api-755fb9d4bd-8w5k4\" (UID: \"e53a7466-d485-484b-aed3-0ce3b20b27ce\") " pod="openstack/heat-api-755fb9d4bd-8w5k4" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.958469 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e53a7466-d485-484b-aed3-0ce3b20b27ce-combined-ca-bundle\") pod \"heat-api-755fb9d4bd-8w5k4\" (UID: \"e53a7466-d485-484b-aed3-0ce3b20b27ce\") " pod="openstack/heat-api-755fb9d4bd-8w5k4" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.958549 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e53a7466-d485-484b-aed3-0ce3b20b27ce-public-tls-certs\") pod \"heat-api-755fb9d4bd-8w5k4\" (UID: \"e53a7466-d485-484b-aed3-0ce3b20b27ce\") " pod="openstack/heat-api-755fb9d4bd-8w5k4" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.958585 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e53a7466-d485-484b-aed3-0ce3b20b27ce-config-data\") pod \"heat-api-755fb9d4bd-8w5k4\" (UID: \"e53a7466-d485-484b-aed3-0ce3b20b27ce\") " pod="openstack/heat-api-755fb9d4bd-8w5k4" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.958620 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e53a7466-d485-484b-aed3-0ce3b20b27ce-internal-tls-certs\") pod \"heat-api-755fb9d4bd-8w5k4\" (UID: \"e53a7466-d485-484b-aed3-0ce3b20b27ce\") " pod="openstack/heat-api-755fb9d4bd-8w5k4" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.958647 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e53a7466-d485-484b-aed3-0ce3b20b27ce-config-data-custom\") pod \"heat-api-755fb9d4bd-8w5k4\" (UID: \"e53a7466-d485-484b-aed3-0ce3b20b27ce\") " pod="openstack/heat-api-755fb9d4bd-8w5k4" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.972126 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e53a7466-d485-484b-aed3-0ce3b20b27ce-config-data\") pod \"heat-api-755fb9d4bd-8w5k4\" (UID: \"e53a7466-d485-484b-aed3-0ce3b20b27ce\") " pod="openstack/heat-api-755fb9d4bd-8w5k4" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.972885 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e53a7466-d485-484b-aed3-0ce3b20b27ce-config-data-custom\") pod 
\"heat-api-755fb9d4bd-8w5k4\" (UID: \"e53a7466-d485-484b-aed3-0ce3b20b27ce\") " pod="openstack/heat-api-755fb9d4bd-8w5k4" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.974558 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e53a7466-d485-484b-aed3-0ce3b20b27ce-public-tls-certs\") pod \"heat-api-755fb9d4bd-8w5k4\" (UID: \"e53a7466-d485-484b-aed3-0ce3b20b27ce\") " pod="openstack/heat-api-755fb9d4bd-8w5k4" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.974579 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e53a7466-d485-484b-aed3-0ce3b20b27ce-internal-tls-certs\") pod \"heat-api-755fb9d4bd-8w5k4\" (UID: \"e53a7466-d485-484b-aed3-0ce3b20b27ce\") " pod="openstack/heat-api-755fb9d4bd-8w5k4" Mar 19 09:48:56 crc kubenswrapper[4835]: I0319 09:48:56.995673 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e53a7466-d485-484b-aed3-0ce3b20b27ce-combined-ca-bundle\") pod \"heat-api-755fb9d4bd-8w5k4\" (UID: \"e53a7466-d485-484b-aed3-0ce3b20b27ce\") " pod="openstack/heat-api-755fb9d4bd-8w5k4" Mar 19 09:48:57 crc kubenswrapper[4835]: I0319 09:48:57.011455 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-69459bdfc9-h9kps" Mar 19 09:48:57 crc kubenswrapper[4835]: I0319 09:48:57.023705 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwq79\" (UniqueName: \"kubernetes.io/projected/e53a7466-d485-484b-aed3-0ce3b20b27ce-kube-api-access-kwq79\") pod \"heat-api-755fb9d4bd-8w5k4\" (UID: \"e53a7466-d485-484b-aed3-0ce3b20b27ce\") " pod="openstack/heat-api-755fb9d4bd-8w5k4" Mar 19 09:48:57 crc kubenswrapper[4835]: I0319 09:48:57.051151 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-755fb9d4bd-8w5k4" Mar 19 09:48:57 crc kubenswrapper[4835]: I0319 09:48:57.144987 4835 generic.go:334] "Generic (PLEG): container finished" podID="46107793-2582-4626-bf6e-c3e990e07ee4" containerID="8cf685ea5da00d9509cf3f90a909cc8bfd06aac59bd4c388f84d3926a255f510" exitCode=137 Mar 19 09:48:57 crc kubenswrapper[4835]: I0319 09:48:57.145032 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"46107793-2582-4626-bf6e-c3e990e07ee4","Type":"ContainerDied","Data":"8cf685ea5da00d9509cf3f90a909cc8bfd06aac59bd4c388f84d3926a255f510"} Mar 19 09:48:57 crc kubenswrapper[4835]: I0319 09:48:57.531437 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7756b9d78c-9b8ks" Mar 19 09:48:57 crc kubenswrapper[4835]: I0319 09:48:57.600783 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-jc254"] Mar 19 09:48:57 crc kubenswrapper[4835]: I0319 09:48:57.601394 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-jc254" podUID="0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce" containerName="dnsmasq-dns" containerID="cri-o://c4f71fcf5f1297660aeb3e4dd2aaf4bc8dd2376fe215c7df2e5be800dc5ca8c8" gracePeriod=10 Mar 19 09:48:58 crc kubenswrapper[4835]: I0319 09:48:58.160511 4835 generic.go:334] "Generic (PLEG): container finished" podID="0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce" containerID="c4f71fcf5f1297660aeb3e4dd2aaf4bc8dd2376fe215c7df2e5be800dc5ca8c8" exitCode=0 Mar 19 09:48:58 crc kubenswrapper[4835]: I0319 09:48:58.160570 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-jc254" event={"ID":"0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce","Type":"ContainerDied","Data":"c4f71fcf5f1297660aeb3e4dd2aaf4bc8dd2376fe215c7df2e5be800dc5ca8c8"} Mar 19 09:49:01 crc kubenswrapper[4835]: E0319 09:49:01.462566 4835 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified" Mar 19 09:49:01 crc kubenswrapper[4835]: E0319 09:49:01.463023 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:openstackclient,Image:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,Command:[/bin/sleep],Args:[infinity],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n78hdch6bh58dhfbh677hcbh55fh545h94h5dfh644h696hd5h5fdh5ddh644h574h554h56h7h569h546h5bbh697h5b7h5dch5ffh9h58dh565h687q,ValueFrom:nil,},EnvVar{Name:OS_CLOUD,Value:default,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_CA_CERT,Value:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_HOST,Value:metric-storage-prometheus.openstack.svc,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_PORT,Value:9090,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openstack-config,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/cloudrc,SubPath:cloudrc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f69jz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveR
eadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42401,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42401,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstackclient_openstack(636dd5f8-5cc2-46f4-84ac-a094b6881a4f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 09:49:01 crc kubenswrapper[4835]: E0319 09:49:01.464182 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstackclient" podUID="636dd5f8-5cc2-46f4-84ac-a094b6881a4f" Mar 19 09:49:01 crc kubenswrapper[4835]: I0319 09:49:01.477075 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b7bcd9fc8-zjbvb" Mar 19 09:49:01 crc kubenswrapper[4835]: I0319 09:49:01.593275 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3f9f7c2-7672-4af7-8857-76a91f8c8ce8-combined-ca-bundle\") pod \"e3f9f7c2-7672-4af7-8857-76a91f8c8ce8\" (UID: \"e3f9f7c2-7672-4af7-8857-76a91f8c8ce8\") " Mar 19 09:49:01 crc kubenswrapper[4835]: I0319 09:49:01.593345 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e3f9f7c2-7672-4af7-8857-76a91f8c8ce8-config\") pod \"e3f9f7c2-7672-4af7-8857-76a91f8c8ce8\" (UID: \"e3f9f7c2-7672-4af7-8857-76a91f8c8ce8\") " Mar 19 09:49:01 crc kubenswrapper[4835]: I0319 09:49:01.593573 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h98vw\" (UniqueName: \"kubernetes.io/projected/e3f9f7c2-7672-4af7-8857-76a91f8c8ce8-kube-api-access-h98vw\") pod \"e3f9f7c2-7672-4af7-8857-76a91f8c8ce8\" (UID: \"e3f9f7c2-7672-4af7-8857-76a91f8c8ce8\") " Mar 19 09:49:01 crc kubenswrapper[4835]: I0319 09:49:01.593635 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3f9f7c2-7672-4af7-8857-76a91f8c8ce8-ovndb-tls-certs\") pod \"e3f9f7c2-7672-4af7-8857-76a91f8c8ce8\" (UID: \"e3f9f7c2-7672-4af7-8857-76a91f8c8ce8\") " Mar 19 09:49:01 crc kubenswrapper[4835]: I0319 09:49:01.593657 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e3f9f7c2-7672-4af7-8857-76a91f8c8ce8-httpd-config\") pod \"e3f9f7c2-7672-4af7-8857-76a91f8c8ce8\" (UID: \"e3f9f7c2-7672-4af7-8857-76a91f8c8ce8\") " Mar 19 09:49:01 crc kubenswrapper[4835]: I0319 09:49:01.622999 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/e3f9f7c2-7672-4af7-8857-76a91f8c8ce8-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "e3f9f7c2-7672-4af7-8857-76a91f8c8ce8" (UID: "e3f9f7c2-7672-4af7-8857-76a91f8c8ce8"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:01 crc kubenswrapper[4835]: I0319 09:49:01.631915 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3f9f7c2-7672-4af7-8857-76a91f8c8ce8-kube-api-access-h98vw" (OuterVolumeSpecName: "kube-api-access-h98vw") pod "e3f9f7c2-7672-4af7-8857-76a91f8c8ce8" (UID: "e3f9f7c2-7672-4af7-8857-76a91f8c8ce8"). InnerVolumeSpecName "kube-api-access-h98vw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:49:01 crc kubenswrapper[4835]: I0319 09:49:01.630070 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c9776ccc5-jc254" podUID="0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.214:5353: connect: connection refused" Mar 19 09:49:01 crc kubenswrapper[4835]: I0319 09:49:01.696286 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h98vw\" (UniqueName: \"kubernetes.io/projected/e3f9f7c2-7672-4af7-8857-76a91f8c8ce8-kube-api-access-h98vw\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:01 crc kubenswrapper[4835]: I0319 09:49:01.696323 4835 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e3f9f7c2-7672-4af7-8857-76a91f8c8ce8-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:01 crc kubenswrapper[4835]: I0319 09:49:01.742694 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3f9f7c2-7672-4af7-8857-76a91f8c8ce8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3f9f7c2-7672-4af7-8857-76a91f8c8ce8" (UID: "e3f9f7c2-7672-4af7-8857-76a91f8c8ce8"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:01 crc kubenswrapper[4835]: I0319 09:49:01.746454 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3f9f7c2-7672-4af7-8857-76a91f8c8ce8-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "e3f9f7c2-7672-4af7-8857-76a91f8c8ce8" (UID: "e3f9f7c2-7672-4af7-8857-76a91f8c8ce8"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:01 crc kubenswrapper[4835]: I0319 09:49:01.799427 4835 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3f9f7c2-7672-4af7-8857-76a91f8c8ce8-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:01 crc kubenswrapper[4835]: I0319 09:49:01.799456 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3f9f7c2-7672-4af7-8857-76a91f8c8ce8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:01 crc kubenswrapper[4835]: I0319 09:49:01.815385 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3f9f7c2-7672-4af7-8857-76a91f8c8ce8-config" (OuterVolumeSpecName: "config") pod "e3f9f7c2-7672-4af7-8857-76a91f8c8ce8" (UID: "e3f9f7c2-7672-4af7-8857-76a91f8c8ce8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:01 crc kubenswrapper[4835]: I0319 09:49:01.911229 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e3f9f7c2-7672-4af7-8857-76a91f8c8ce8-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:01 crc kubenswrapper[4835]: I0319 09:49:01.940111 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.012386 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/46107793-2582-4626-bf6e-c3e990e07ee4-etc-machine-id\") pod \"46107793-2582-4626-bf6e-c3e990e07ee4\" (UID: \"46107793-2582-4626-bf6e-c3e990e07ee4\") " Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.012440 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46107793-2582-4626-bf6e-c3e990e07ee4-combined-ca-bundle\") pod \"46107793-2582-4626-bf6e-c3e990e07ee4\" (UID: \"46107793-2582-4626-bf6e-c3e990e07ee4\") " Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.012558 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46107793-2582-4626-bf6e-c3e990e07ee4-config-data-custom\") pod \"46107793-2582-4626-bf6e-c3e990e07ee4\" (UID: \"46107793-2582-4626-bf6e-c3e990e07ee4\") " Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.012583 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46107793-2582-4626-bf6e-c3e990e07ee4-config-data\") pod \"46107793-2582-4626-bf6e-c3e990e07ee4\" (UID: \"46107793-2582-4626-bf6e-c3e990e07ee4\") " Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.012598 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvvz4\" (UniqueName: \"kubernetes.io/projected/46107793-2582-4626-bf6e-c3e990e07ee4-kube-api-access-nvvz4\") pod \"46107793-2582-4626-bf6e-c3e990e07ee4\" (UID: \"46107793-2582-4626-bf6e-c3e990e07ee4\") " Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.012634 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/46107793-2582-4626-bf6e-c3e990e07ee4-scripts\") pod \"46107793-2582-4626-bf6e-c3e990e07ee4\" (UID: \"46107793-2582-4626-bf6e-c3e990e07ee4\") " Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.012828 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46107793-2582-4626-bf6e-c3e990e07ee4-logs\") pod \"46107793-2582-4626-bf6e-c3e990e07ee4\" (UID: \"46107793-2582-4626-bf6e-c3e990e07ee4\") " Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.013810 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46107793-2582-4626-bf6e-c3e990e07ee4-logs" (OuterVolumeSpecName: "logs") pod "46107793-2582-4626-bf6e-c3e990e07ee4" (UID: "46107793-2582-4626-bf6e-c3e990e07ee4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.013845 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46107793-2582-4626-bf6e-c3e990e07ee4-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "46107793-2582-4626-bf6e-c3e990e07ee4" (UID: "46107793-2582-4626-bf6e-c3e990e07ee4"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.030342 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46107793-2582-4626-bf6e-c3e990e07ee4-scripts" (OuterVolumeSpecName: "scripts") pod "46107793-2582-4626-bf6e-c3e990e07ee4" (UID: "46107793-2582-4626-bf6e-c3e990e07ee4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.032869 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46107793-2582-4626-bf6e-c3e990e07ee4-kube-api-access-nvvz4" (OuterVolumeSpecName: "kube-api-access-nvvz4") pod "46107793-2582-4626-bf6e-c3e990e07ee4" (UID: "46107793-2582-4626-bf6e-c3e990e07ee4"). InnerVolumeSpecName "kube-api-access-nvvz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.033818 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46107793-2582-4626-bf6e-c3e990e07ee4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "46107793-2582-4626-bf6e-c3e990e07ee4" (UID: "46107793-2582-4626-bf6e-c3e990e07ee4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.110972 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46107793-2582-4626-bf6e-c3e990e07ee4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46107793-2582-4626-bf6e-c3e990e07ee4" (UID: "46107793-2582-4626-bf6e-c3e990e07ee4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.115894 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46107793-2582-4626-bf6e-c3e990e07ee4-logs\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.115921 4835 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/46107793-2582-4626-bf6e-c3e990e07ee4-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.115932 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46107793-2582-4626-bf6e-c3e990e07ee4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.115941 4835 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46107793-2582-4626-bf6e-c3e990e07ee4-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.115950 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvvz4\" (UniqueName: \"kubernetes.io/projected/46107793-2582-4626-bf6e-c3e990e07ee4-kube-api-access-nvvz4\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.115959 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46107793-2582-4626-bf6e-c3e990e07ee4-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.209985 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-jc254" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.250482 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.251357 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"46107793-2582-4626-bf6e-c3e990e07ee4","Type":"ContainerDied","Data":"9b8794d9a228a998a09b1331c85aab45494f29dc9474a3b28ac602997956de22"} Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.251397 4835 scope.go:117] "RemoveContainer" containerID="8cf685ea5da00d9509cf3f90a909cc8bfd06aac59bd4c388f84d3926a255f510" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.257080 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-jc254" event={"ID":"0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce","Type":"ContainerDied","Data":"e33802d1a08a5e3c12b0ca4e040a3cea987dc1f9226d2dd4d8e91d008ae991a7"} Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.257167 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-jc254" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.264020 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b7bcd9fc8-zjbvb" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.264908 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b7bcd9fc8-zjbvb" event={"ID":"e3f9f7c2-7672-4af7-8857-76a91f8c8ce8","Type":"ContainerDied","Data":"18f6a4bb6986edb573efaa6bfd395b3483d4362efb98960130b22243215b18f6"} Mar 19 09:49:02 crc kubenswrapper[4835]: E0319 09:49:02.265782 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified\\\"\"" pod="openstack/openstackclient" podUID="636dd5f8-5cc2-46f4-84ac-a094b6881a4f" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.284727 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46107793-2582-4626-bf6e-c3e990e07ee4-config-data" (OuterVolumeSpecName: "config-data") pod "46107793-2582-4626-bf6e-c3e990e07ee4" (UID: "46107793-2582-4626-bf6e-c3e990e07ee4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.305193 4835 scope.go:117] "RemoveContainer" containerID="ab60b8fb20355180de32580a4d58699478d0b3b1ddf393c22973d0393f3ae459" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.321770 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b7bcd9fc8-zjbvb"] Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.324548 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjnfl\" (UniqueName: \"kubernetes.io/projected/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce-kube-api-access-kjnfl\") pod \"0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce\" (UID: \"0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce\") " Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.324631 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce-ovsdbserver-nb\") pod \"0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce\" (UID: \"0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce\") " Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.324668 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce-dns-svc\") pod \"0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce\" (UID: \"0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce\") " Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.324749 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce-ovsdbserver-sb\") pod \"0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce\" (UID: \"0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce\") " Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.324822 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce-config\") pod \"0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce\" (UID: \"0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce\") " Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.324916 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce-dns-swift-storage-0\") pod \"0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce\" (UID: \"0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce\") " Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.325472 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46107793-2582-4626-bf6e-c3e990e07ee4-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.332127 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b7bcd9fc8-zjbvb"] Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.332912 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce-kube-api-access-kjnfl" (OuterVolumeSpecName: "kube-api-access-kjnfl") pod "0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce" (UID: "0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce"). InnerVolumeSpecName "kube-api-access-kjnfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.398384 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce" (UID: "0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.425406 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce" (UID: "0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.440138 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3f9f7c2-7672-4af7-8857-76a91f8c8ce8" path="/var/lib/kubelet/pods/e3f9f7c2-7672-4af7-8857-76a91f8c8ce8/volumes" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.442152 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjnfl\" (UniqueName: \"kubernetes.io/projected/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce-kube-api-access-kjnfl\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.442173 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.442181 4835 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.445318 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce" (UID: "0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.464170 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce-config" (OuterVolumeSpecName: "config") pod "0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce" (UID: "0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.487390 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce" (UID: "0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.544626 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.544653 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.544663 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.624656 4835 scope.go:117] "RemoveContainer" containerID="c4f71fcf5f1297660aeb3e4dd2aaf4bc8dd2376fe215c7df2e5be800dc5ca8c8" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.683996 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-5c9776ccc5-jc254"] Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.713690 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-jc254"] Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.718317 4835 scope.go:117] "RemoveContainer" containerID="de4348dd2fff7fd16cfb75ef1e4aa8b57dda45252f7dcaef86a6a6d7f36becbc" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.732255 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.813601 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.857408 4835 scope.go:117] "RemoveContainer" containerID="78a3c42296f55f6367bb88e0e1ed4ef75f39fcbe1fbacc010f3b939394f468ea" Mar 19 09:49:02 crc kubenswrapper[4835]: I0319 09:49:02.953399 4835 scope.go:117] "RemoveContainer" containerID="d6970c892339bf0566200f3dc9127a92fedb590a2694385e9d175311825617b0" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.042409 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 19 09:49:03 crc kubenswrapper[4835]: E0319 09:49:03.044161 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3f9f7c2-7672-4af7-8857-76a91f8c8ce8" containerName="neutron-httpd" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.044181 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3f9f7c2-7672-4af7-8857-76a91f8c8ce8" containerName="neutron-httpd" Mar 19 09:49:03 crc kubenswrapper[4835]: E0319 09:49:03.044194 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3f9f7c2-7672-4af7-8857-76a91f8c8ce8" containerName="neutron-api" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.044200 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3f9f7c2-7672-4af7-8857-76a91f8c8ce8" containerName="neutron-api" Mar 19 09:49:03 crc 
kubenswrapper[4835]: E0319 09:49:03.044213 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce" containerName="init" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.044219 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce" containerName="init" Mar 19 09:49:03 crc kubenswrapper[4835]: E0319 09:49:03.044243 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46107793-2582-4626-bf6e-c3e990e07ee4" containerName="cinder-api" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.044249 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="46107793-2582-4626-bf6e-c3e990e07ee4" containerName="cinder-api" Mar 19 09:49:03 crc kubenswrapper[4835]: E0319 09:49:03.044273 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46107793-2582-4626-bf6e-c3e990e07ee4" containerName="cinder-api-log" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.044279 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="46107793-2582-4626-bf6e-c3e990e07ee4" containerName="cinder-api-log" Mar 19 09:49:03 crc kubenswrapper[4835]: E0319 09:49:03.044298 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce" containerName="dnsmasq-dns" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.044304 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce" containerName="dnsmasq-dns" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.044551 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="46107793-2582-4626-bf6e-c3e990e07ee4" containerName="cinder-api" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.044581 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="46107793-2582-4626-bf6e-c3e990e07ee4" containerName="cinder-api-log" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.044594 4835 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e3f9f7c2-7672-4af7-8857-76a91f8c8ce8" containerName="neutron-httpd" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.044603 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3f9f7c2-7672-4af7-8857-76a91f8c8ce8" containerName="neutron-api" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.044612 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce" containerName="dnsmasq-dns" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.046632 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.050158 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.051132 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.056095 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.076292 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5757d74498-tjqm4"] Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.121074 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.131260 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-74f47f4bf4-6gctg"] Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.142721 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-755fb9d4bd-8w5k4"] Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.151102 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fvp44\" (UniqueName: \"kubernetes.io/projected/d3ca55c1-1697-4203-bcec-3a9e5bd64c59-kube-api-access-fvp44\") pod \"cinder-api-0\" (UID: \"d3ca55c1-1697-4203-bcec-3a9e5bd64c59\") " pod="openstack/cinder-api-0" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.151209 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ca55c1-1697-4203-bcec-3a9e5bd64c59-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d3ca55c1-1697-4203-bcec-3a9e5bd64c59\") " pod="openstack/cinder-api-0" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.151244 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3ca55c1-1697-4203-bcec-3a9e5bd64c59-logs\") pod \"cinder-api-0\" (UID: \"d3ca55c1-1697-4203-bcec-3a9e5bd64c59\") " pod="openstack/cinder-api-0" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.151282 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3ca55c1-1697-4203-bcec-3a9e5bd64c59-scripts\") pod \"cinder-api-0\" (UID: \"d3ca55c1-1697-4203-bcec-3a9e5bd64c59\") " pod="openstack/cinder-api-0" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.151319 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3ca55c1-1697-4203-bcec-3a9e5bd64c59-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d3ca55c1-1697-4203-bcec-3a9e5bd64c59\") " pod="openstack/cinder-api-0" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.151430 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3ca55c1-1697-4203-bcec-3a9e5bd64c59-config-data\") pod \"cinder-api-0\" (UID: 
\"d3ca55c1-1697-4203-bcec-3a9e5bd64c59\") " pod="openstack/cinder-api-0" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.151502 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3ca55c1-1697-4203-bcec-3a9e5bd64c59-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d3ca55c1-1697-4203-bcec-3a9e5bd64c59\") " pod="openstack/cinder-api-0" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.151565 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3ca55c1-1697-4203-bcec-3a9e5bd64c59-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d3ca55c1-1697-4203-bcec-3a9e5bd64c59\") " pod="openstack/cinder-api-0" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.151625 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3ca55c1-1697-4203-bcec-3a9e5bd64c59-config-data-custom\") pod \"cinder-api-0\" (UID: \"d3ca55c1-1697-4203-bcec-3a9e5bd64c59\") " pod="openstack/cinder-api-0" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.157973 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-69459bdfc9-h9kps"] Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.172369 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-956b854d8-kqpb6"] Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.253977 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ca55c1-1697-4203-bcec-3a9e5bd64c59-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d3ca55c1-1697-4203-bcec-3a9e5bd64c59\") " pod="openstack/cinder-api-0" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.254232 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3ca55c1-1697-4203-bcec-3a9e5bd64c59-logs\") pod \"cinder-api-0\" (UID: \"d3ca55c1-1697-4203-bcec-3a9e5bd64c59\") " pod="openstack/cinder-api-0" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.254371 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3ca55c1-1697-4203-bcec-3a9e5bd64c59-scripts\") pod \"cinder-api-0\" (UID: \"d3ca55c1-1697-4203-bcec-3a9e5bd64c59\") " pod="openstack/cinder-api-0" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.254485 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3ca55c1-1697-4203-bcec-3a9e5bd64c59-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d3ca55c1-1697-4203-bcec-3a9e5bd64c59\") " pod="openstack/cinder-api-0" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.254671 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3ca55c1-1697-4203-bcec-3a9e5bd64c59-config-data\") pod \"cinder-api-0\" (UID: \"d3ca55c1-1697-4203-bcec-3a9e5bd64c59\") " pod="openstack/cinder-api-0" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.254811 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3ca55c1-1697-4203-bcec-3a9e5bd64c59-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d3ca55c1-1697-4203-bcec-3a9e5bd64c59\") " pod="openstack/cinder-api-0" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.254933 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3ca55c1-1697-4203-bcec-3a9e5bd64c59-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d3ca55c1-1697-4203-bcec-3a9e5bd64c59\") " 
pod="openstack/cinder-api-0" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.255053 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3ca55c1-1697-4203-bcec-3a9e5bd64c59-config-data-custom\") pod \"cinder-api-0\" (UID: \"d3ca55c1-1697-4203-bcec-3a9e5bd64c59\") " pod="openstack/cinder-api-0" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.255165 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvp44\" (UniqueName: \"kubernetes.io/projected/d3ca55c1-1697-4203-bcec-3a9e5bd64c59-kube-api-access-fvp44\") pod \"cinder-api-0\" (UID: \"d3ca55c1-1697-4203-bcec-3a9e5bd64c59\") " pod="openstack/cinder-api-0" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.267850 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3ca55c1-1697-4203-bcec-3a9e5bd64c59-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d3ca55c1-1697-4203-bcec-3a9e5bd64c59\") " pod="openstack/cinder-api-0" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.274447 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ca55c1-1697-4203-bcec-3a9e5bd64c59-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d3ca55c1-1697-4203-bcec-3a9e5bd64c59\") " pod="openstack/cinder-api-0" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.278777 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3ca55c1-1697-4203-bcec-3a9e5bd64c59-logs\") pod \"cinder-api-0\" (UID: \"d3ca55c1-1697-4203-bcec-3a9e5bd64c59\") " pod="openstack/cinder-api-0" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.281096 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d3ca55c1-1697-4203-bcec-3a9e5bd64c59-config-data\") pod \"cinder-api-0\" (UID: \"d3ca55c1-1697-4203-bcec-3a9e5bd64c59\") " pod="openstack/cinder-api-0" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.317073 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-74f47f4bf4-6gctg" event={"ID":"02257820-75fc-4616-82d0-5c1186508556","Type":"ContainerStarted","Data":"edc593b705ed4b7c8d270efe146bb580995522d3824a7402b39788e11446c3c8"} Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.318432 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3ca55c1-1697-4203-bcec-3a9e5bd64c59-config-data-custom\") pod \"cinder-api-0\" (UID: \"d3ca55c1-1697-4203-bcec-3a9e5bd64c59\") " pod="openstack/cinder-api-0" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.320177 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3ca55c1-1697-4203-bcec-3a9e5bd64c59-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d3ca55c1-1697-4203-bcec-3a9e5bd64c59\") " pod="openstack/cinder-api-0" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.335325 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15d728ef-66c1-4e0e-9461-f8fb26f62e41","Type":"ContainerStarted","Data":"045e31ee25ef1adc5e40d3dbb39baa7cb2a34e32cc5dbda65a35f00fed1ec105"} Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.337397 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-69459bdfc9-h9kps" event={"ID":"4384c6fc-0c55-44d3-98b9-b373a66050ee","Type":"ContainerStarted","Data":"75830308695b4afcc43dbea56b559fd521b3ced17b9087bb0f2095a4e63d74a6"} Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.338789 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvp44\" (UniqueName: 
\"kubernetes.io/projected/d3ca55c1-1697-4203-bcec-3a9e5bd64c59-kube-api-access-fvp44\") pod \"cinder-api-0\" (UID: \"d3ca55c1-1697-4203-bcec-3a9e5bd64c59\") " pod="openstack/cinder-api-0" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.343475 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3ca55c1-1697-4203-bcec-3a9e5bd64c59-scripts\") pod \"cinder-api-0\" (UID: \"d3ca55c1-1697-4203-bcec-3a9e5bd64c59\") " pod="openstack/cinder-api-0" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.350991 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-66b45b64d-bv78b" event={"ID":"c5ee8b70-a6eb-4d4e-8615-4f122543ba5a","Type":"ContainerStarted","Data":"c63cfdcf69da9c499a64a979ba0a625be5450bc89b0742890cbcb982fa9c3f63"} Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.351126 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-66b45b64d-bv78b" podUID="c5ee8b70-a6eb-4d4e-8615-4f122543ba5a" containerName="heat-api" containerID="cri-o://c63cfdcf69da9c499a64a979ba0a625be5450bc89b0742890cbcb982fa9c3f63" gracePeriod=60 Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.351340 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-66b45b64d-bv78b" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.357621 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3ca55c1-1697-4203-bcec-3a9e5bd64c59-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d3ca55c1-1697-4203-bcec-3a9e5bd64c59\") " pod="openstack/cinder-api-0" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.369008 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5757d74498-tjqm4" 
event={"ID":"3d139802-4569-4073-b80c-4807478db41a","Type":"ContainerStarted","Data":"1789009b0c01035fcf09fd2ac5e8649f7318735c7d5e447d25bc67a65f952fa8"} Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.403963 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.436364 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-956b854d8-kqpb6" event={"ID":"be54447e-7aac-4db8-94fc-f10aa4661921","Type":"ContainerStarted","Data":"987539963d6ed4a145e4dae3341910ef0609e1b928191d75c1252637409459f4"} Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.456589 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-57559f6c85-kg92b" event={"ID":"00ce99fa-0f33-465a-9610-1a935668ec05","Type":"ContainerStarted","Data":"a72e732a975c0c81bd7abf6d3e10f3bb6a0f316c5c4dc0fc3d1c12296b85a263"} Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.456786 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-57559f6c85-kg92b" podUID="00ce99fa-0f33-465a-9610-1a935668ec05" containerName="heat-cfnapi" containerID="cri-o://a72e732a975c0c81bd7abf6d3e10f3bb6a0f316c5c4dc0fc3d1c12296b85a263" gracePeriod=60 Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.457084 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-57559f6c85-kg92b" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.484314 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-66b45b64d-bv78b" podStartSLOduration=3.583252073 podStartE2EDuration="16.484177927s" podCreationTimestamp="2026-03-19 09:48:47 +0000 UTC" firstStartedPulling="2026-03-19 09:48:48.868899234 +0000 UTC m=+1583.717497821" lastFinishedPulling="2026-03-19 09:49:01.769825088 +0000 UTC m=+1596.618423675" observedRunningTime="2026-03-19 09:49:03.370695346 +0000 
UTC m=+1598.219293943" watchObservedRunningTime="2026-03-19 09:49:03.484177927 +0000 UTC m=+1598.332776504" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.486874 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-57559f6c85-kg92b" podStartSLOduration=4.210963424 podStartE2EDuration="16.486865389s" podCreationTimestamp="2026-03-19 09:48:47 +0000 UTC" firstStartedPulling="2026-03-19 09:48:49.490902023 +0000 UTC m=+1584.339500610" lastFinishedPulling="2026-03-19 09:49:01.766803988 +0000 UTC m=+1596.615402575" observedRunningTime="2026-03-19 09:49:03.48014377 +0000 UTC m=+1598.328742357" watchObservedRunningTime="2026-03-19 09:49:03.486865389 +0000 UTC m=+1598.335463996" Mar 19 09:49:03 crc kubenswrapper[4835]: I0319 09:49:03.503128 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-755fb9d4bd-8w5k4" event={"ID":"e53a7466-d485-484b-aed3-0ce3b20b27ce","Type":"ContainerStarted","Data":"db80ea38094234fd345a093ee534a843c12225e627eb1b73270f32d94a45db29"} Mar 19 09:49:04 crc kubenswrapper[4835]: I0319 09:49:04.166012 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 19 09:49:04 crc kubenswrapper[4835]: W0319 09:49:04.177183 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3ca55c1_1697_4203_bcec_3a9e5bd64c59.slice/crio-38ecd56dca1ae25bcb8593dfd670a426653968cdcab5cbb21ca26ed523e08bd6 WatchSource:0}: Error finding container 38ecd56dca1ae25bcb8593dfd670a426653968cdcab5cbb21ca26ed523e08bd6: Status 404 returned error can't find the container with id 38ecd56dca1ae25bcb8593dfd670a426653968cdcab5cbb21ca26ed523e08bd6 Mar 19 09:49:04 crc kubenswrapper[4835]: I0319 09:49:04.206639 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xf267" podUID="14dd8f89-f4d8-4618-b417-f1a802f3517d" containerName="registry-server" 
probeResult="failure" output=< Mar 19 09:49:04 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 09:49:04 crc kubenswrapper[4835]: > Mar 19 09:49:04 crc kubenswrapper[4835]: I0319 09:49:04.435622 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce" path="/var/lib/kubelet/pods/0a9665c4-9a2a-4bac-83d4-d0dfa748f4ce/volumes" Mar 19 09:49:04 crc kubenswrapper[4835]: I0319 09:49:04.436397 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46107793-2582-4626-bf6e-c3e990e07ee4" path="/var/lib/kubelet/pods/46107793-2582-4626-bf6e-c3e990e07ee4/volumes" Mar 19 09:49:04 crc kubenswrapper[4835]: I0319 09:49:04.557280 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-956b854d8-kqpb6" event={"ID":"be54447e-7aac-4db8-94fc-f10aa4661921","Type":"ContainerStarted","Data":"3db517c22870cf48fc294f690df1329b56ece79b0130a36b23aa4870ce17121e"} Mar 19 09:49:04 crc kubenswrapper[4835]: I0319 09:49:04.557861 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-956b854d8-kqpb6" Mar 19 09:49:04 crc kubenswrapper[4835]: I0319 09:49:04.566295 4835 generic.go:334] "Generic (PLEG): container finished" podID="00ce99fa-0f33-465a-9610-1a935668ec05" containerID="a72e732a975c0c81bd7abf6d3e10f3bb6a0f316c5c4dc0fc3d1c12296b85a263" exitCode=0 Mar 19 09:49:04 crc kubenswrapper[4835]: I0319 09:49:04.566396 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-57559f6c85-kg92b" event={"ID":"00ce99fa-0f33-465a-9610-1a935668ec05","Type":"ContainerDied","Data":"a72e732a975c0c81bd7abf6d3e10f3bb6a0f316c5c4dc0fc3d1c12296b85a263"} Mar 19 09:49:04 crc kubenswrapper[4835]: I0319 09:49:04.583262 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-956b854d8-kqpb6" podStartSLOduration=11.583241987 podStartE2EDuration="11.583241987s" 
podCreationTimestamp="2026-03-19 09:48:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:49:04.579824255 +0000 UTC m=+1599.428422832" watchObservedRunningTime="2026-03-19 09:49:04.583241987 +0000 UTC m=+1599.431840574" Mar 19 09:49:04 crc kubenswrapper[4835]: I0319 09:49:04.583996 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-69459bdfc9-h9kps" event={"ID":"4384c6fc-0c55-44d3-98b9-b373a66050ee","Type":"ContainerStarted","Data":"71d535ca63306003f9de842e9e7ab4f21245d9aa58264e239d5e78e59c9d9900"} Mar 19 09:49:04 crc kubenswrapper[4835]: I0319 09:49:04.584957 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-69459bdfc9-h9kps" Mar 19 09:49:04 crc kubenswrapper[4835]: I0319 09:49:04.594184 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d3ca55c1-1697-4203-bcec-3a9e5bd64c59","Type":"ContainerStarted","Data":"38ecd56dca1ae25bcb8593dfd670a426653968cdcab5cbb21ca26ed523e08bd6"} Mar 19 09:49:04 crc kubenswrapper[4835]: I0319 09:49:04.603050 4835 generic.go:334] "Generic (PLEG): container finished" podID="c5ee8b70-a6eb-4d4e-8615-4f122543ba5a" containerID="c63cfdcf69da9c499a64a979ba0a625be5450bc89b0742890cbcb982fa9c3f63" exitCode=0 Mar 19 09:49:04 crc kubenswrapper[4835]: I0319 09:49:04.603105 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-66b45b64d-bv78b" event={"ID":"c5ee8b70-a6eb-4d4e-8615-4f122543ba5a","Type":"ContainerDied","Data":"c63cfdcf69da9c499a64a979ba0a625be5450bc89b0742890cbcb982fa9c3f63"} Mar 19 09:49:04 crc kubenswrapper[4835]: I0319 09:49:04.604187 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-755fb9d4bd-8w5k4" event={"ID":"e53a7466-d485-484b-aed3-0ce3b20b27ce","Type":"ContainerStarted","Data":"ec46e09d6987c2a1cfe64f2d24b39aab1fe1eb7df289d317198d33e7835d6c82"} Mar 19 
09:49:04 crc kubenswrapper[4835]: I0319 09:49:04.605244 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-755fb9d4bd-8w5k4" Mar 19 09:49:04 crc kubenswrapper[4835]: I0319 09:49:04.623110 4835 generic.go:334] "Generic (PLEG): container finished" podID="3d139802-4569-4073-b80c-4807478db41a" containerID="f2cfc66806b4a735c350a4033b70edfe3abef8b07ba8e4a89661a826d9f445ad" exitCode=1 Mar 19 09:49:04 crc kubenswrapper[4835]: I0319 09:49:04.623200 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5757d74498-tjqm4" event={"ID":"3d139802-4569-4073-b80c-4807478db41a","Type":"ContainerDied","Data":"f2cfc66806b4a735c350a4033b70edfe3abef8b07ba8e4a89661a826d9f445ad"} Mar 19 09:49:04 crc kubenswrapper[4835]: I0319 09:49:04.623639 4835 scope.go:117] "RemoveContainer" containerID="f2cfc66806b4a735c350a4033b70edfe3abef8b07ba8e4a89661a826d9f445ad" Mar 19 09:49:04 crc kubenswrapper[4835]: I0319 09:49:04.649483 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-69459bdfc9-h9kps" podStartSLOduration=8.649466529 podStartE2EDuration="8.649466529s" podCreationTimestamp="2026-03-19 09:48:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:49:04.614683933 +0000 UTC m=+1599.463282520" watchObservedRunningTime="2026-03-19 09:49:04.649466529 +0000 UTC m=+1599.498065116" Mar 19 09:49:04 crc kubenswrapper[4835]: I0319 09:49:04.658080 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-755fb9d4bd-8w5k4" podStartSLOduration=8.658060198 podStartE2EDuration="8.658060198s" podCreationTimestamp="2026-03-19 09:48:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:49:04.64612047 +0000 UTC m=+1599.494719057" watchObservedRunningTime="2026-03-19 
09:49:04.658060198 +0000 UTC m=+1599.506658785" Mar 19 09:49:04 crc kubenswrapper[4835]: I0319 09:49:04.659311 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-74f47f4bf4-6gctg" event={"ID":"02257820-75fc-4616-82d0-5c1186508556","Type":"ContainerStarted","Data":"ade39813b6c68de2cd278d847ee83012778c78b6fd6412e64673db5ebcb40669"} Mar 19 09:49:04 crc kubenswrapper[4835]: I0319 09:49:04.665257 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-74f47f4bf4-6gctg" Mar 19 09:49:04 crc kubenswrapper[4835]: I0319 09:49:04.730110 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-74f47f4bf4-6gctg" podStartSLOduration=11.730086415 podStartE2EDuration="11.730086415s" podCreationTimestamp="2026-03-19 09:48:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:49:04.706385105 +0000 UTC m=+1599.554983692" watchObservedRunningTime="2026-03-19 09:49:04.730086415 +0000 UTC m=+1599.578685002" Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.187518 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-66b45b64d-bv78b" Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.199155 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-57559f6c85-kg92b" Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.363883 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00ce99fa-0f33-465a-9610-1a935668ec05-config-data\") pod \"00ce99fa-0f33-465a-9610-1a935668ec05\" (UID: \"00ce99fa-0f33-465a-9610-1a935668ec05\") " Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.363949 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmr6v\" (UniqueName: \"kubernetes.io/projected/00ce99fa-0f33-465a-9610-1a935668ec05-kube-api-access-cmr6v\") pod \"00ce99fa-0f33-465a-9610-1a935668ec05\" (UID: \"00ce99fa-0f33-465a-9610-1a935668ec05\") " Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.364085 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00ce99fa-0f33-465a-9610-1a935668ec05-config-data-custom\") pod \"00ce99fa-0f33-465a-9610-1a935668ec05\" (UID: \"00ce99fa-0f33-465a-9610-1a935668ec05\") " Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.364133 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5ee8b70-a6eb-4d4e-8615-4f122543ba5a-config-data-custom\") pod \"c5ee8b70-a6eb-4d4e-8615-4f122543ba5a\" (UID: \"c5ee8b70-a6eb-4d4e-8615-4f122543ba5a\") " Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.364159 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00ce99fa-0f33-465a-9610-1a935668ec05-combined-ca-bundle\") pod \"00ce99fa-0f33-465a-9610-1a935668ec05\" (UID: \"00ce99fa-0f33-465a-9610-1a935668ec05\") " Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.364243 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5ee8b70-a6eb-4d4e-8615-4f122543ba5a-config-data\") pod \"c5ee8b70-a6eb-4d4e-8615-4f122543ba5a\" (UID: \"c5ee8b70-a6eb-4d4e-8615-4f122543ba5a\") " Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.364306 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m97pb\" (UniqueName: \"kubernetes.io/projected/c5ee8b70-a6eb-4d4e-8615-4f122543ba5a-kube-api-access-m97pb\") pod \"c5ee8b70-a6eb-4d4e-8615-4f122543ba5a\" (UID: \"c5ee8b70-a6eb-4d4e-8615-4f122543ba5a\") " Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.364437 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5ee8b70-a6eb-4d4e-8615-4f122543ba5a-combined-ca-bundle\") pod \"c5ee8b70-a6eb-4d4e-8615-4f122543ba5a\" (UID: \"c5ee8b70-a6eb-4d4e-8615-4f122543ba5a\") " Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.373130 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00ce99fa-0f33-465a-9610-1a935668ec05-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "00ce99fa-0f33-465a-9610-1a935668ec05" (UID: "00ce99fa-0f33-465a-9610-1a935668ec05"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.373136 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5ee8b70-a6eb-4d4e-8615-4f122543ba5a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c5ee8b70-a6eb-4d4e-8615-4f122543ba5a" (UID: "c5ee8b70-a6eb-4d4e-8615-4f122543ba5a"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.376964 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00ce99fa-0f33-465a-9610-1a935668ec05-kube-api-access-cmr6v" (OuterVolumeSpecName: "kube-api-access-cmr6v") pod "00ce99fa-0f33-465a-9610-1a935668ec05" (UID: "00ce99fa-0f33-465a-9610-1a935668ec05"). InnerVolumeSpecName "kube-api-access-cmr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.377051 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5ee8b70-a6eb-4d4e-8615-4f122543ba5a-kube-api-access-m97pb" (OuterVolumeSpecName: "kube-api-access-m97pb") pod "c5ee8b70-a6eb-4d4e-8615-4f122543ba5a" (UID: "c5ee8b70-a6eb-4d4e-8615-4f122543ba5a"). InnerVolumeSpecName "kube-api-access-m97pb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.439154 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00ce99fa-0f33-465a-9610-1a935668ec05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00ce99fa-0f33-465a-9610-1a935668ec05" (UID: "00ce99fa-0f33-465a-9610-1a935668ec05"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.468115 4835 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00ce99fa-0f33-465a-9610-1a935668ec05-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.468160 4835 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5ee8b70-a6eb-4d4e-8615-4f122543ba5a-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.468174 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00ce99fa-0f33-465a-9610-1a935668ec05-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.468185 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m97pb\" (UniqueName: \"kubernetes.io/projected/c5ee8b70-a6eb-4d4e-8615-4f122543ba5a-kube-api-access-m97pb\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.468201 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmr6v\" (UniqueName: \"kubernetes.io/projected/00ce99fa-0f33-465a-9610-1a935668ec05-kube-api-access-cmr6v\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.471470 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5ee8b70-a6eb-4d4e-8615-4f122543ba5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5ee8b70-a6eb-4d4e-8615-4f122543ba5a" (UID: "c5ee8b70-a6eb-4d4e-8615-4f122543ba5a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.479163 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00ce99fa-0f33-465a-9610-1a935668ec05-config-data" (OuterVolumeSpecName: "config-data") pod "00ce99fa-0f33-465a-9610-1a935668ec05" (UID: "00ce99fa-0f33-465a-9610-1a935668ec05"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.499436 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5ee8b70-a6eb-4d4e-8615-4f122543ba5a-config-data" (OuterVolumeSpecName: "config-data") pod "c5ee8b70-a6eb-4d4e-8615-4f122543ba5a" (UID: "c5ee8b70-a6eb-4d4e-8615-4f122543ba5a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.570843 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5ee8b70-a6eb-4d4e-8615-4f122543ba5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.571055 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00ce99fa-0f33-465a-9610-1a935668ec05-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.571154 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5ee8b70-a6eb-4d4e-8615-4f122543ba5a-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.696897 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-57559f6c85-kg92b" 
event={"ID":"00ce99fa-0f33-465a-9610-1a935668ec05","Type":"ContainerDied","Data":"8e14061cfb061ed2020afcfde1e898867f7cd35d714e5fb78aae8e030945dedc"} Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.697211 4835 scope.go:117] "RemoveContainer" containerID="a72e732a975c0c81bd7abf6d3e10f3bb6a0f316c5c4dc0fc3d1c12296b85a263" Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.697178 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-57559f6c85-kg92b" Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.709924 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15d728ef-66c1-4e0e-9461-f8fb26f62e41","Type":"ContainerStarted","Data":"8932271d7397e07cf0a4f74f30148586f0936078e93bdb7a59bd50272eae536b"} Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.710304 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.710048 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="15d728ef-66c1-4e0e-9461-f8fb26f62e41" containerName="proxy-httpd" containerID="cri-o://8932271d7397e07cf0a4f74f30148586f0936078e93bdb7a59bd50272eae536b" gracePeriod=30 Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.710064 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="15d728ef-66c1-4e0e-9461-f8fb26f62e41" containerName="sg-core" containerID="cri-o://045e31ee25ef1adc5e40d3dbb39baa7cb2a34e32cc5dbda65a35f00fed1ec105" gracePeriod=30 Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.710080 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="15d728ef-66c1-4e0e-9461-f8fb26f62e41" containerName="ceilometer-notification-agent" 
containerID="cri-o://dcae2148f34099380d84252df61ed326c02d0f29ecaccddf6d51087c96ddcad1" gracePeriod=30 Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.710030 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="15d728ef-66c1-4e0e-9461-f8fb26f62e41" containerName="ceilometer-central-agent" containerID="cri-o://b35ecc59e6adf4a4afd17bb7f1c2518b1996fb4f5ea531da5bf6153c7b1ef29c" gracePeriod=30 Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.726990 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d3ca55c1-1697-4203-bcec-3a9e5bd64c59","Type":"ContainerStarted","Data":"5bfaa43e8d43986b12db2edbea2dea72ccf0b39124d270b4c568e4e047ad2f6b"} Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.742701 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-66b45b64d-bv78b" event={"ID":"c5ee8b70-a6eb-4d4e-8615-4f122543ba5a","Type":"ContainerDied","Data":"3ff477b46ea1fec286f64f187acb58d4a26b34ed602327bfe7a8ded6d8bda9f9"} Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.742909 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-66b45b64d-bv78b" Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.748960 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.137747444 podStartE2EDuration="19.748941061s" podCreationTimestamp="2026-03-19 09:48:46 +0000 UTC" firstStartedPulling="2026-03-19 09:48:47.713187798 +0000 UTC m=+1582.561786385" lastFinishedPulling="2026-03-19 09:49:04.324381415 +0000 UTC m=+1599.172980002" observedRunningTime="2026-03-19 09:49:05.731838175 +0000 UTC m=+1600.580436772" watchObservedRunningTime="2026-03-19 09:49:05.748941061 +0000 UTC m=+1600.597539648" Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.756172 4835 generic.go:334] "Generic (PLEG): container finished" podID="3d139802-4569-4073-b80c-4807478db41a" containerID="78d3673c4a84e212ec1f36f45714937b863839719c664fadb8602ab935c66636" exitCode=1 Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.756234 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5757d74498-tjqm4" event={"ID":"3d139802-4569-4073-b80c-4807478db41a","Type":"ContainerDied","Data":"78d3673c4a84e212ec1f36f45714937b863839719c664fadb8602ab935c66636"} Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.757124 4835 scope.go:117] "RemoveContainer" containerID="c63cfdcf69da9c499a64a979ba0a625be5450bc89b0742890cbcb982fa9c3f63" Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.757214 4835 scope.go:117] "RemoveContainer" containerID="78d3673c4a84e212ec1f36f45714937b863839719c664fadb8602ab935c66636" Mar 19 09:49:05 crc kubenswrapper[4835]: E0319 09:49:05.757984 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-5757d74498-tjqm4_openstack(3d139802-4569-4073-b80c-4807478db41a)\"" pod="openstack/heat-api-5757d74498-tjqm4" 
podUID="3d139802-4569-4073-b80c-4807478db41a" Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.764273 4835 generic.go:334] "Generic (PLEG): container finished" podID="02257820-75fc-4616-82d0-5c1186508556" containerID="ade39813b6c68de2cd278d847ee83012778c78b6fd6412e64673db5ebcb40669" exitCode=1 Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.766205 4835 scope.go:117] "RemoveContainer" containerID="ade39813b6c68de2cd278d847ee83012778c78b6fd6412e64673db5ebcb40669" Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.766684 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-74f47f4bf4-6gctg" event={"ID":"02257820-75fc-4616-82d0-5c1186508556","Type":"ContainerDied","Data":"ade39813b6c68de2cd278d847ee83012778c78b6fd6412e64673db5ebcb40669"} Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.777695 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-57559f6c85-kg92b"] Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.799415 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-57559f6c85-kg92b"] Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.828933 4835 scope.go:117] "RemoveContainer" containerID="f2cfc66806b4a735c350a4033b70edfe3abef8b07ba8e4a89661a826d9f445ad" Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.834028 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-66b45b64d-bv78b"] Mar 19 09:49:05 crc kubenswrapper[4835]: I0319 09:49:05.847073 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-66b45b64d-bv78b"] Mar 19 09:49:06 crc kubenswrapper[4835]: I0319 09:49:06.418435 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00ce99fa-0f33-465a-9610-1a935668ec05" path="/var/lib/kubelet/pods/00ce99fa-0f33-465a-9610-1a935668ec05/volumes" Mar 19 09:49:06 crc kubenswrapper[4835]: I0319 09:49:06.419456 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c5ee8b70-a6eb-4d4e-8615-4f122543ba5a" path="/var/lib/kubelet/pods/c5ee8b70-a6eb-4d4e-8615-4f122543ba5a/volumes" Mar 19 09:49:06 crc kubenswrapper[4835]: I0319 09:49:06.422897 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 09:49:06 crc kubenswrapper[4835]: I0319 09:49:06.422935 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 09:49:06 crc kubenswrapper[4835]: I0319 09:49:06.422971 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" Mar 19 09:49:06 crc kubenswrapper[4835]: I0319 09:49:06.423658 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d93f2f0fef5a3fe52d6e4aab02e5290ac85405643bc520caaef82b7b23fd8ee3"} pod="openshift-machine-config-operator/machine-config-daemon-bk84k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 09:49:06 crc kubenswrapper[4835]: I0319 09:49:06.423706 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" containerID="cri-o://d93f2f0fef5a3fe52d6e4aab02e5290ac85405643bc520caaef82b7b23fd8ee3" gracePeriod=600 Mar 19 09:49:06 crc kubenswrapper[4835]: E0319 09:49:06.581165 4835 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 09:49:06 crc kubenswrapper[4835]: I0319 09:49:06.648956 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="46107793-2582-4626-bf6e-c3e990e07ee4" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.215:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 09:49:06 crc kubenswrapper[4835]: I0319 09:49:06.777586 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d3ca55c1-1697-4203-bcec-3a9e5bd64c59","Type":"ContainerStarted","Data":"a1b15af6fdaff5ace7d86cea9415f2bfff0cb97eb0d2bb2a56f040642d680077"} Mar 19 09:49:06 crc kubenswrapper[4835]: I0319 09:49:06.778997 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 19 09:49:06 crc kubenswrapper[4835]: I0319 09:49:06.783707 4835 scope.go:117] "RemoveContainer" containerID="78d3673c4a84e212ec1f36f45714937b863839719c664fadb8602ab935c66636" Mar 19 09:49:06 crc kubenswrapper[4835]: E0319 09:49:06.784102 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-5757d74498-tjqm4_openstack(3d139802-4569-4073-b80c-4807478db41a)\"" pod="openstack/heat-api-5757d74498-tjqm4" podUID="3d139802-4569-4073-b80c-4807478db41a" Mar 19 09:49:06 crc kubenswrapper[4835]: I0319 09:49:06.785980 4835 generic.go:334] "Generic (PLEG): container finished" podID="02257820-75fc-4616-82d0-5c1186508556" 
containerID="4495f034075000d07357758b454e1b548d9f1a0f69e1108d1500ac6041aee844" exitCode=1 Mar 19 09:49:06 crc kubenswrapper[4835]: I0319 09:49:06.786017 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-74f47f4bf4-6gctg" event={"ID":"02257820-75fc-4616-82d0-5c1186508556","Type":"ContainerDied","Data":"4495f034075000d07357758b454e1b548d9f1a0f69e1108d1500ac6041aee844"} Mar 19 09:49:06 crc kubenswrapper[4835]: I0319 09:49:06.786072 4835 scope.go:117] "RemoveContainer" containerID="ade39813b6c68de2cd278d847ee83012778c78b6fd6412e64673db5ebcb40669" Mar 19 09:49:06 crc kubenswrapper[4835]: I0319 09:49:06.786700 4835 scope.go:117] "RemoveContainer" containerID="4495f034075000d07357758b454e1b548d9f1a0f69e1108d1500ac6041aee844" Mar 19 09:49:06 crc kubenswrapper[4835]: E0319 09:49:06.786995 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-74f47f4bf4-6gctg_openstack(02257820-75fc-4616-82d0-5c1186508556)\"" pod="openstack/heat-cfnapi-74f47f4bf4-6gctg" podUID="02257820-75fc-4616-82d0-5c1186508556" Mar 19 09:49:06 crc kubenswrapper[4835]: I0319 09:49:06.799876 4835 generic.go:334] "Generic (PLEG): container finished" podID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerID="d93f2f0fef5a3fe52d6e4aab02e5290ac85405643bc520caaef82b7b23fd8ee3" exitCode=0 Mar 19 09:49:06 crc kubenswrapper[4835]: I0319 09:49:06.799913 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerDied","Data":"d93f2f0fef5a3fe52d6e4aab02e5290ac85405643bc520caaef82b7b23fd8ee3"} Mar 19 09:49:06 crc kubenswrapper[4835]: I0319 09:49:06.800796 4835 scope.go:117] "RemoveContainer" containerID="d93f2f0fef5a3fe52d6e4aab02e5290ac85405643bc520caaef82b7b23fd8ee3" Mar 19 09:49:06 crc kubenswrapper[4835]: 
E0319 09:49:06.801115 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 09:49:06 crc kubenswrapper[4835]: I0319 09:49:06.803179 4835 generic.go:334] "Generic (PLEG): container finished" podID="15d728ef-66c1-4e0e-9461-f8fb26f62e41" containerID="8932271d7397e07cf0a4f74f30148586f0936078e93bdb7a59bd50272eae536b" exitCode=0 Mar 19 09:49:06 crc kubenswrapper[4835]: I0319 09:49:06.803199 4835 generic.go:334] "Generic (PLEG): container finished" podID="15d728ef-66c1-4e0e-9461-f8fb26f62e41" containerID="045e31ee25ef1adc5e40d3dbb39baa7cb2a34e32cc5dbda65a35f00fed1ec105" exitCode=2 Mar 19 09:49:06 crc kubenswrapper[4835]: I0319 09:49:06.803206 4835 generic.go:334] "Generic (PLEG): container finished" podID="15d728ef-66c1-4e0e-9461-f8fb26f62e41" containerID="b35ecc59e6adf4a4afd17bb7f1c2518b1996fb4f5ea531da5bf6153c7b1ef29c" exitCode=0 Mar 19 09:49:06 crc kubenswrapper[4835]: I0319 09:49:06.803908 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15d728ef-66c1-4e0e-9461-f8fb26f62e41","Type":"ContainerDied","Data":"8932271d7397e07cf0a4f74f30148586f0936078e93bdb7a59bd50272eae536b"} Mar 19 09:49:06 crc kubenswrapper[4835]: I0319 09:49:06.803940 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15d728ef-66c1-4e0e-9461-f8fb26f62e41","Type":"ContainerDied","Data":"045e31ee25ef1adc5e40d3dbb39baa7cb2a34e32cc5dbda65a35f00fed1ec105"} Mar 19 09:49:06 crc kubenswrapper[4835]: I0319 09:49:06.803952 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"15d728ef-66c1-4e0e-9461-f8fb26f62e41","Type":"ContainerDied","Data":"b35ecc59e6adf4a4afd17bb7f1c2518b1996fb4f5ea531da5bf6153c7b1ef29c"} Mar 19 09:49:06 crc kubenswrapper[4835]: I0319 09:49:06.816085 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.816064579 podStartE2EDuration="4.816064579s" podCreationTimestamp="2026-03-19 09:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:49:06.806284899 +0000 UTC m=+1601.654883496" watchObservedRunningTime="2026-03-19 09:49:06.816064579 +0000 UTC m=+1601.664663166" Mar 19 09:49:06 crc kubenswrapper[4835]: I0319 09:49:06.866682 4835 scope.go:117] "RemoveContainer" containerID="22d4a3df3b8da0d3a7089e100b70f5eb6f45762cc59f432226a8ced697a6a8d9" Mar 19 09:49:07 crc kubenswrapper[4835]: I0319 09:49:07.428378 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-6ffb669df8-qcq6h" Mar 19 09:49:07 crc kubenswrapper[4835]: I0319 09:49:07.816450 4835 scope.go:117] "RemoveContainer" containerID="4495f034075000d07357758b454e1b548d9f1a0f69e1108d1500ac6041aee844" Mar 19 09:49:07 crc kubenswrapper[4835]: E0319 09:49:07.817087 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-74f47f4bf4-6gctg_openstack(02257820-75fc-4616-82d0-5c1186508556)\"" pod="openstack/heat-cfnapi-74f47f4bf4-6gctg" podUID="02257820-75fc-4616-82d0-5c1186508556" Mar 19 09:49:09 crc kubenswrapper[4835]: I0319 09:49:09.175109 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-5757d74498-tjqm4" Mar 19 09:49:09 crc kubenswrapper[4835]: I0319 09:49:09.175427 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/heat-api-5757d74498-tjqm4" Mar 19 09:49:09 crc kubenswrapper[4835]: I0319 09:49:09.176267 4835 scope.go:117] "RemoveContainer" containerID="78d3673c4a84e212ec1f36f45714937b863839719c664fadb8602ab935c66636" Mar 19 09:49:09 crc kubenswrapper[4835]: E0319 09:49:09.176503 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-5757d74498-tjqm4_openstack(3d139802-4569-4073-b80c-4807478db41a)\"" pod="openstack/heat-api-5757d74498-tjqm4" podUID="3d139802-4569-4073-b80c-4807478db41a" Mar 19 09:49:09 crc kubenswrapper[4835]: I0319 09:49:09.220807 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-74f47f4bf4-6gctg" Mar 19 09:49:09 crc kubenswrapper[4835]: I0319 09:49:09.220853 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-74f47f4bf4-6gctg" Mar 19 09:49:09 crc kubenswrapper[4835]: I0319 09:49:09.221894 4835 scope.go:117] "RemoveContainer" containerID="4495f034075000d07357758b454e1b548d9f1a0f69e1108d1500ac6041aee844" Mar 19 09:49:09 crc kubenswrapper[4835]: E0319 09:49:09.222212 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-74f47f4bf4-6gctg_openstack(02257820-75fc-4616-82d0-5c1186508556)\"" pod="openstack/heat-cfnapi-74f47f4bf4-6gctg" podUID="02257820-75fc-4616-82d0-5c1186508556" Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.305145 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.500895 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15d728ef-66c1-4e0e-9461-f8fb26f62e41-log-httpd\") pod \"15d728ef-66c1-4e0e-9461-f8fb26f62e41\" (UID: \"15d728ef-66c1-4e0e-9461-f8fb26f62e41\") " Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.501130 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwlln\" (UniqueName: \"kubernetes.io/projected/15d728ef-66c1-4e0e-9461-f8fb26f62e41-kube-api-access-xwlln\") pod \"15d728ef-66c1-4e0e-9461-f8fb26f62e41\" (UID: \"15d728ef-66c1-4e0e-9461-f8fb26f62e41\") " Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.501198 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15d728ef-66c1-4e0e-9461-f8fb26f62e41-sg-core-conf-yaml\") pod \"15d728ef-66c1-4e0e-9461-f8fb26f62e41\" (UID: \"15d728ef-66c1-4e0e-9461-f8fb26f62e41\") " Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.501223 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15d728ef-66c1-4e0e-9461-f8fb26f62e41-scripts\") pod \"15d728ef-66c1-4e0e-9461-f8fb26f62e41\" (UID: \"15d728ef-66c1-4e0e-9461-f8fb26f62e41\") " Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.501338 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15d728ef-66c1-4e0e-9461-f8fb26f62e41-run-httpd\") pod \"15d728ef-66c1-4e0e-9461-f8fb26f62e41\" (UID: \"15d728ef-66c1-4e0e-9461-f8fb26f62e41\") " Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.501376 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/15d728ef-66c1-4e0e-9461-f8fb26f62e41-config-data\") pod \"15d728ef-66c1-4e0e-9461-f8fb26f62e41\" (UID: \"15d728ef-66c1-4e0e-9461-f8fb26f62e41\") " Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.501423 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15d728ef-66c1-4e0e-9461-f8fb26f62e41-combined-ca-bundle\") pod \"15d728ef-66c1-4e0e-9461-f8fb26f62e41\" (UID: \"15d728ef-66c1-4e0e-9461-f8fb26f62e41\") " Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.501606 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15d728ef-66c1-4e0e-9461-f8fb26f62e41-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "15d728ef-66c1-4e0e-9461-f8fb26f62e41" (UID: "15d728ef-66c1-4e0e-9461-f8fb26f62e41"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.501690 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15d728ef-66c1-4e0e-9461-f8fb26f62e41-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "15d728ef-66c1-4e0e-9461-f8fb26f62e41" (UID: "15d728ef-66c1-4e0e-9461-f8fb26f62e41"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.502830 4835 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15d728ef-66c1-4e0e-9461-f8fb26f62e41-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.502859 4835 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15d728ef-66c1-4e0e-9461-f8fb26f62e41-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.513974 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15d728ef-66c1-4e0e-9461-f8fb26f62e41-kube-api-access-xwlln" (OuterVolumeSpecName: "kube-api-access-xwlln") pod "15d728ef-66c1-4e0e-9461-f8fb26f62e41" (UID: "15d728ef-66c1-4e0e-9461-f8fb26f62e41"). InnerVolumeSpecName "kube-api-access-xwlln". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.514651 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15d728ef-66c1-4e0e-9461-f8fb26f62e41-scripts" (OuterVolumeSpecName: "scripts") pod "15d728ef-66c1-4e0e-9461-f8fb26f62e41" (UID: "15d728ef-66c1-4e0e-9461-f8fb26f62e41"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.536827 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15d728ef-66c1-4e0e-9461-f8fb26f62e41-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "15d728ef-66c1-4e0e-9461-f8fb26f62e41" (UID: "15d728ef-66c1-4e0e-9461-f8fb26f62e41"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.606006 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwlln\" (UniqueName: \"kubernetes.io/projected/15d728ef-66c1-4e0e-9461-f8fb26f62e41-kube-api-access-xwlln\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.606408 4835 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15d728ef-66c1-4e0e-9461-f8fb26f62e41-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.606421 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15d728ef-66c1-4e0e-9461-f8fb26f62e41-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.613473 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15d728ef-66c1-4e0e-9461-f8fb26f62e41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15d728ef-66c1-4e0e-9461-f8fb26f62e41" (UID: "15d728ef-66c1-4e0e-9461-f8fb26f62e41"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.631726 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15d728ef-66c1-4e0e-9461-f8fb26f62e41-config-data" (OuterVolumeSpecName: "config-data") pod "15d728ef-66c1-4e0e-9461-f8fb26f62e41" (UID: "15d728ef-66c1-4e0e-9461-f8fb26f62e41"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.708222 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15d728ef-66c1-4e0e-9461-f8fb26f62e41-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.708429 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15d728ef-66c1-4e0e-9461-f8fb26f62e41-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.849149 4835 generic.go:334] "Generic (PLEG): container finished" podID="15d728ef-66c1-4e0e-9461-f8fb26f62e41" containerID="dcae2148f34099380d84252df61ed326c02d0f29ecaccddf6d51087c96ddcad1" exitCode=0 Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.849190 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15d728ef-66c1-4e0e-9461-f8fb26f62e41","Type":"ContainerDied","Data":"dcae2148f34099380d84252df61ed326c02d0f29ecaccddf6d51087c96ddcad1"} Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.849214 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.849226 4835 scope.go:117] "RemoveContainer" containerID="8932271d7397e07cf0a4f74f30148586f0936078e93bdb7a59bd50272eae536b" Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.849216 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15d728ef-66c1-4e0e-9461-f8fb26f62e41","Type":"ContainerDied","Data":"1b6a8863e1b335f06af5b7ab55d3ceb610941866e78a11acc6bae5ce7f968011"} Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.934942 4835 scope.go:117] "RemoveContainer" containerID="045e31ee25ef1adc5e40d3dbb39baa7cb2a34e32cc5dbda65a35f00fed1ec105" Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.945342 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.963395 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.964858 4835 scope.go:117] "RemoveContainer" containerID="dcae2148f34099380d84252df61ed326c02d0f29ecaccddf6d51087c96ddcad1" Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.992851 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:49:10 crc kubenswrapper[4835]: E0319 09:49:10.993342 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15d728ef-66c1-4e0e-9461-f8fb26f62e41" containerName="sg-core" Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.993361 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="15d728ef-66c1-4e0e-9461-f8fb26f62e41" containerName="sg-core" Mar 19 09:49:10 crc kubenswrapper[4835]: E0319 09:49:10.993376 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5ee8b70-a6eb-4d4e-8615-4f122543ba5a" containerName="heat-api" Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.993382 4835 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="c5ee8b70-a6eb-4d4e-8615-4f122543ba5a" containerName="heat-api" Mar 19 09:49:10 crc kubenswrapper[4835]: E0319 09:49:10.993399 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15d728ef-66c1-4e0e-9461-f8fb26f62e41" containerName="ceilometer-central-agent" Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.993407 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="15d728ef-66c1-4e0e-9461-f8fb26f62e41" containerName="ceilometer-central-agent" Mar 19 09:49:10 crc kubenswrapper[4835]: E0319 09:49:10.993429 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00ce99fa-0f33-465a-9610-1a935668ec05" containerName="heat-cfnapi" Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.993435 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="00ce99fa-0f33-465a-9610-1a935668ec05" containerName="heat-cfnapi" Mar 19 09:49:10 crc kubenswrapper[4835]: E0319 09:49:10.993449 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15d728ef-66c1-4e0e-9461-f8fb26f62e41" containerName="proxy-httpd" Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.993454 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="15d728ef-66c1-4e0e-9461-f8fb26f62e41" containerName="proxy-httpd" Mar 19 09:49:10 crc kubenswrapper[4835]: E0319 09:49:10.993475 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15d728ef-66c1-4e0e-9461-f8fb26f62e41" containerName="ceilometer-notification-agent" Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.993481 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="15d728ef-66c1-4e0e-9461-f8fb26f62e41" containerName="ceilometer-notification-agent" Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.993678 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="15d728ef-66c1-4e0e-9461-f8fb26f62e41" containerName="ceilometer-central-agent" Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.993695 4835 
memory_manager.go:354] "RemoveStaleState removing state" podUID="15d728ef-66c1-4e0e-9461-f8fb26f62e41" containerName="ceilometer-notification-agent" Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.993708 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5ee8b70-a6eb-4d4e-8615-4f122543ba5a" containerName="heat-api" Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.993716 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="00ce99fa-0f33-465a-9610-1a935668ec05" containerName="heat-cfnapi" Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.993730 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="15d728ef-66c1-4e0e-9461-f8fb26f62e41" containerName="proxy-httpd" Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.993758 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="15d728ef-66c1-4e0e-9461-f8fb26f62e41" containerName="sg-core" Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.995567 4835 scope.go:117] "RemoveContainer" containerID="b35ecc59e6adf4a4afd17bb7f1c2518b1996fb4f5ea531da5bf6153c7b1ef29c" Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.995625 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 09:49:10 crc kubenswrapper[4835]: I0319 09:49:10.998385 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 09:49:11 crc kubenswrapper[4835]: I0319 09:49:11.002830 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 09:49:11 crc kubenswrapper[4835]: I0319 09:49:11.017442 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:49:11 crc kubenswrapper[4835]: I0319 09:49:11.049659 4835 scope.go:117] "RemoveContainer" containerID="8932271d7397e07cf0a4f74f30148586f0936078e93bdb7a59bd50272eae536b" Mar 19 09:49:11 crc kubenswrapper[4835]: E0319 09:49:11.050171 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8932271d7397e07cf0a4f74f30148586f0936078e93bdb7a59bd50272eae536b\": container with ID starting with 8932271d7397e07cf0a4f74f30148586f0936078e93bdb7a59bd50272eae536b not found: ID does not exist" containerID="8932271d7397e07cf0a4f74f30148586f0936078e93bdb7a59bd50272eae536b" Mar 19 09:49:11 crc kubenswrapper[4835]: I0319 09:49:11.050205 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8932271d7397e07cf0a4f74f30148586f0936078e93bdb7a59bd50272eae536b"} err="failed to get container status \"8932271d7397e07cf0a4f74f30148586f0936078e93bdb7a59bd50272eae536b\": rpc error: code = NotFound desc = could not find container \"8932271d7397e07cf0a4f74f30148586f0936078e93bdb7a59bd50272eae536b\": container with ID starting with 8932271d7397e07cf0a4f74f30148586f0936078e93bdb7a59bd50272eae536b not found: ID does not exist" Mar 19 09:49:11 crc kubenswrapper[4835]: I0319 09:49:11.050225 4835 scope.go:117] "RemoveContainer" containerID="045e31ee25ef1adc5e40d3dbb39baa7cb2a34e32cc5dbda65a35f00fed1ec105" Mar 19 09:49:11 crc kubenswrapper[4835]: E0319 
09:49:11.051902 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"045e31ee25ef1adc5e40d3dbb39baa7cb2a34e32cc5dbda65a35f00fed1ec105\": container with ID starting with 045e31ee25ef1adc5e40d3dbb39baa7cb2a34e32cc5dbda65a35f00fed1ec105 not found: ID does not exist" containerID="045e31ee25ef1adc5e40d3dbb39baa7cb2a34e32cc5dbda65a35f00fed1ec105" Mar 19 09:49:11 crc kubenswrapper[4835]: I0319 09:49:11.051942 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"045e31ee25ef1adc5e40d3dbb39baa7cb2a34e32cc5dbda65a35f00fed1ec105"} err="failed to get container status \"045e31ee25ef1adc5e40d3dbb39baa7cb2a34e32cc5dbda65a35f00fed1ec105\": rpc error: code = NotFound desc = could not find container \"045e31ee25ef1adc5e40d3dbb39baa7cb2a34e32cc5dbda65a35f00fed1ec105\": container with ID starting with 045e31ee25ef1adc5e40d3dbb39baa7cb2a34e32cc5dbda65a35f00fed1ec105 not found: ID does not exist" Mar 19 09:49:11 crc kubenswrapper[4835]: I0319 09:49:11.051964 4835 scope.go:117] "RemoveContainer" containerID="dcae2148f34099380d84252df61ed326c02d0f29ecaccddf6d51087c96ddcad1" Mar 19 09:49:11 crc kubenswrapper[4835]: E0319 09:49:11.052638 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcae2148f34099380d84252df61ed326c02d0f29ecaccddf6d51087c96ddcad1\": container with ID starting with dcae2148f34099380d84252df61ed326c02d0f29ecaccddf6d51087c96ddcad1 not found: ID does not exist" containerID="dcae2148f34099380d84252df61ed326c02d0f29ecaccddf6d51087c96ddcad1" Mar 19 09:49:11 crc kubenswrapper[4835]: I0319 09:49:11.052664 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcae2148f34099380d84252df61ed326c02d0f29ecaccddf6d51087c96ddcad1"} err="failed to get container status \"dcae2148f34099380d84252df61ed326c02d0f29ecaccddf6d51087c96ddcad1\": rpc 
error: code = NotFound desc = could not find container \"dcae2148f34099380d84252df61ed326c02d0f29ecaccddf6d51087c96ddcad1\": container with ID starting with dcae2148f34099380d84252df61ed326c02d0f29ecaccddf6d51087c96ddcad1 not found: ID does not exist" Mar 19 09:49:11 crc kubenswrapper[4835]: I0319 09:49:11.052682 4835 scope.go:117] "RemoveContainer" containerID="b35ecc59e6adf4a4afd17bb7f1c2518b1996fb4f5ea531da5bf6153c7b1ef29c" Mar 19 09:49:11 crc kubenswrapper[4835]: E0319 09:49:11.053277 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b35ecc59e6adf4a4afd17bb7f1c2518b1996fb4f5ea531da5bf6153c7b1ef29c\": container with ID starting with b35ecc59e6adf4a4afd17bb7f1c2518b1996fb4f5ea531da5bf6153c7b1ef29c not found: ID does not exist" containerID="b35ecc59e6adf4a4afd17bb7f1c2518b1996fb4f5ea531da5bf6153c7b1ef29c" Mar 19 09:49:11 crc kubenswrapper[4835]: I0319 09:49:11.053300 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b35ecc59e6adf4a4afd17bb7f1c2518b1996fb4f5ea531da5bf6153c7b1ef29c"} err="failed to get container status \"b35ecc59e6adf4a4afd17bb7f1c2518b1996fb4f5ea531da5bf6153c7b1ef29c\": rpc error: code = NotFound desc = could not find container \"b35ecc59e6adf4a4afd17bb7f1c2518b1996fb4f5ea531da5bf6153c7b1ef29c\": container with ID starting with b35ecc59e6adf4a4afd17bb7f1c2518b1996fb4f5ea531da5bf6153c7b1ef29c not found: ID does not exist" Mar 19 09:49:11 crc kubenswrapper[4835]: I0319 09:49:11.116830 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6a56f8f1-c60a-4b64-a8af-dccdb247c6ee\") " pod="openstack/ceilometer-0" Mar 19 09:49:11 crc kubenswrapper[4835]: I0319 09:49:11.116980 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-log-httpd\") pod \"ceilometer-0\" (UID: \"6a56f8f1-c60a-4b64-a8af-dccdb247c6ee\") " pod="openstack/ceilometer-0" Mar 19 09:49:11 crc kubenswrapper[4835]: I0319 09:49:11.117026 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-run-httpd\") pod \"ceilometer-0\" (UID: \"6a56f8f1-c60a-4b64-a8af-dccdb247c6ee\") " pod="openstack/ceilometer-0" Mar 19 09:49:11 crc kubenswrapper[4835]: I0319 09:49:11.117065 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxswn\" (UniqueName: \"kubernetes.io/projected/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-kube-api-access-cxswn\") pod \"ceilometer-0\" (UID: \"6a56f8f1-c60a-4b64-a8af-dccdb247c6ee\") " pod="openstack/ceilometer-0" Mar 19 09:49:11 crc kubenswrapper[4835]: I0319 09:49:11.117132 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6a56f8f1-c60a-4b64-a8af-dccdb247c6ee\") " pod="openstack/ceilometer-0" Mar 19 09:49:11 crc kubenswrapper[4835]: I0319 09:49:11.117169 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-config-data\") pod \"ceilometer-0\" (UID: \"6a56f8f1-c60a-4b64-a8af-dccdb247c6ee\") " pod="openstack/ceilometer-0" Mar 19 09:49:11 crc kubenswrapper[4835]: I0319 09:49:11.117495 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-scripts\") pod \"ceilometer-0\" (UID: \"6a56f8f1-c60a-4b64-a8af-dccdb247c6ee\") " pod="openstack/ceilometer-0" Mar 19 09:49:11 crc kubenswrapper[4835]: I0319 09:49:11.140815 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:49:11 crc kubenswrapper[4835]: E0319 09:49:11.141757 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-cxswn log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="6a56f8f1-c60a-4b64-a8af-dccdb247c6ee" Mar 19 09:49:11 crc kubenswrapper[4835]: I0319 09:49:11.220116 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-scripts\") pod \"ceilometer-0\" (UID: \"6a56f8f1-c60a-4b64-a8af-dccdb247c6ee\") " pod="openstack/ceilometer-0" Mar 19 09:49:11 crc kubenswrapper[4835]: I0319 09:49:11.220226 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6a56f8f1-c60a-4b64-a8af-dccdb247c6ee\") " pod="openstack/ceilometer-0" Mar 19 09:49:11 crc kubenswrapper[4835]: I0319 09:49:11.220333 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-log-httpd\") pod \"ceilometer-0\" (UID: \"6a56f8f1-c60a-4b64-a8af-dccdb247c6ee\") " pod="openstack/ceilometer-0" Mar 19 09:49:11 crc kubenswrapper[4835]: I0319 09:49:11.220366 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-run-httpd\") pod \"ceilometer-0\" (UID: \"6a56f8f1-c60a-4b64-a8af-dccdb247c6ee\") " pod="openstack/ceilometer-0" Mar 19 09:49:11 crc kubenswrapper[4835]: I0319 09:49:11.220385 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxswn\" (UniqueName: \"kubernetes.io/projected/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-kube-api-access-cxswn\") pod \"ceilometer-0\" (UID: \"6a56f8f1-c60a-4b64-a8af-dccdb247c6ee\") " pod="openstack/ceilometer-0" Mar 19 09:49:11 crc kubenswrapper[4835]: I0319 09:49:11.220458 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6a56f8f1-c60a-4b64-a8af-dccdb247c6ee\") " pod="openstack/ceilometer-0" Mar 19 09:49:11 crc kubenswrapper[4835]: I0319 09:49:11.220790 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-config-data\") pod \"ceilometer-0\" (UID: \"6a56f8f1-c60a-4b64-a8af-dccdb247c6ee\") " pod="openstack/ceilometer-0" Mar 19 09:49:11 crc kubenswrapper[4835]: I0319 09:49:11.220824 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-run-httpd\") pod \"ceilometer-0\" (UID: \"6a56f8f1-c60a-4b64-a8af-dccdb247c6ee\") " pod="openstack/ceilometer-0" Mar 19 09:49:11 crc kubenswrapper[4835]: I0319 09:49:11.220873 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-log-httpd\") pod \"ceilometer-0\" (UID: \"6a56f8f1-c60a-4b64-a8af-dccdb247c6ee\") " pod="openstack/ceilometer-0" Mar 19 09:49:11 crc kubenswrapper[4835]: I0319 09:49:11.225758 
4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6a56f8f1-c60a-4b64-a8af-dccdb247c6ee\") " pod="openstack/ceilometer-0" Mar 19 09:49:11 crc kubenswrapper[4835]: I0319 09:49:11.226618 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-config-data\") pod \"ceilometer-0\" (UID: \"6a56f8f1-c60a-4b64-a8af-dccdb247c6ee\") " pod="openstack/ceilometer-0" Mar 19 09:49:11 crc kubenswrapper[4835]: I0319 09:49:11.227062 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-scripts\") pod \"ceilometer-0\" (UID: \"6a56f8f1-c60a-4b64-a8af-dccdb247c6ee\") " pod="openstack/ceilometer-0" Mar 19 09:49:11 crc kubenswrapper[4835]: I0319 09:49:11.228020 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6a56f8f1-c60a-4b64-a8af-dccdb247c6ee\") " pod="openstack/ceilometer-0" Mar 19 09:49:11 crc kubenswrapper[4835]: I0319 09:49:11.240648 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxswn\" (UniqueName: \"kubernetes.io/projected/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-kube-api-access-cxswn\") pod \"ceilometer-0\" (UID: \"6a56f8f1-c60a-4b64-a8af-dccdb247c6ee\") " pod="openstack/ceilometer-0" Mar 19 09:49:11 crc kubenswrapper[4835]: I0319 09:49:11.862755 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 09:49:11 crc kubenswrapper[4835]: I0319 09:49:11.880324 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 09:49:12 crc kubenswrapper[4835]: I0319 09:49:12.041904 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-config-data\") pod \"6a56f8f1-c60a-4b64-a8af-dccdb247c6ee\" (UID: \"6a56f8f1-c60a-4b64-a8af-dccdb247c6ee\") " Mar 19 09:49:12 crc kubenswrapper[4835]: I0319 09:49:12.042376 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-combined-ca-bundle\") pod \"6a56f8f1-c60a-4b64-a8af-dccdb247c6ee\" (UID: \"6a56f8f1-c60a-4b64-a8af-dccdb247c6ee\") " Mar 19 09:49:12 crc kubenswrapper[4835]: I0319 09:49:12.042439 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-run-httpd\") pod \"6a56f8f1-c60a-4b64-a8af-dccdb247c6ee\" (UID: \"6a56f8f1-c60a-4b64-a8af-dccdb247c6ee\") " Mar 19 09:49:12 crc kubenswrapper[4835]: I0319 09:49:12.042675 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxswn\" (UniqueName: \"kubernetes.io/projected/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-kube-api-access-cxswn\") pod \"6a56f8f1-c60a-4b64-a8af-dccdb247c6ee\" (UID: \"6a56f8f1-c60a-4b64-a8af-dccdb247c6ee\") " Mar 19 09:49:12 crc kubenswrapper[4835]: I0319 09:49:12.042769 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-log-httpd\") pod \"6a56f8f1-c60a-4b64-a8af-dccdb247c6ee\" (UID: \"6a56f8f1-c60a-4b64-a8af-dccdb247c6ee\") " Mar 19 09:49:12 crc kubenswrapper[4835]: I0319 09:49:12.042794 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-scripts\") pod \"6a56f8f1-c60a-4b64-a8af-dccdb247c6ee\" (UID: \"6a56f8f1-c60a-4b64-a8af-dccdb247c6ee\") " Mar 19 09:49:12 crc kubenswrapper[4835]: I0319 09:49:12.042946 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-sg-core-conf-yaml\") pod \"6a56f8f1-c60a-4b64-a8af-dccdb247c6ee\" (UID: \"6a56f8f1-c60a-4b64-a8af-dccdb247c6ee\") " Mar 19 09:49:12 crc kubenswrapper[4835]: I0319 09:49:12.043503 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6a56f8f1-c60a-4b64-a8af-dccdb247c6ee" (UID: "6a56f8f1-c60a-4b64-a8af-dccdb247c6ee"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:49:12 crc kubenswrapper[4835]: I0319 09:49:12.043704 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6a56f8f1-c60a-4b64-a8af-dccdb247c6ee" (UID: "6a56f8f1-c60a-4b64-a8af-dccdb247c6ee"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:49:12 crc kubenswrapper[4835]: I0319 09:49:12.044815 4835 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:12 crc kubenswrapper[4835]: I0319 09:49:12.044855 4835 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:12 crc kubenswrapper[4835]: I0319 09:49:12.046850 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-config-data" (OuterVolumeSpecName: "config-data") pod "6a56f8f1-c60a-4b64-a8af-dccdb247c6ee" (UID: "6a56f8f1-c60a-4b64-a8af-dccdb247c6ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:12 crc kubenswrapper[4835]: I0319 09:49:12.049889 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6a56f8f1-c60a-4b64-a8af-dccdb247c6ee" (UID: "6a56f8f1-c60a-4b64-a8af-dccdb247c6ee"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:12 crc kubenswrapper[4835]: I0319 09:49:12.049998 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a56f8f1-c60a-4b64-a8af-dccdb247c6ee" (UID: "6a56f8f1-c60a-4b64-a8af-dccdb247c6ee"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:12 crc kubenswrapper[4835]: I0319 09:49:12.064613 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-scripts" (OuterVolumeSpecName: "scripts") pod "6a56f8f1-c60a-4b64-a8af-dccdb247c6ee" (UID: "6a56f8f1-c60a-4b64-a8af-dccdb247c6ee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:12 crc kubenswrapper[4835]: I0319 09:49:12.064691 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-kube-api-access-cxswn" (OuterVolumeSpecName: "kube-api-access-cxswn") pod "6a56f8f1-c60a-4b64-a8af-dccdb247c6ee" (UID: "6a56f8f1-c60a-4b64-a8af-dccdb247c6ee"). InnerVolumeSpecName "kube-api-access-cxswn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:49:12 crc kubenswrapper[4835]: I0319 09:49:12.147147 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxswn\" (UniqueName: \"kubernetes.io/projected/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-kube-api-access-cxswn\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:12 crc kubenswrapper[4835]: I0319 09:49:12.147195 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:12 crc kubenswrapper[4835]: I0319 09:49:12.147209 4835 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:12 crc kubenswrapper[4835]: I0319 09:49:12.147219 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-config-data\") on node \"crc\" DevicePath \"\"" Mar 
19 09:49:12 crc kubenswrapper[4835]: I0319 09:49:12.147230 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:12 crc kubenswrapper[4835]: I0319 09:49:12.416390 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15d728ef-66c1-4e0e-9461-f8fb26f62e41" path="/var/lib/kubelet/pods/15d728ef-66c1-4e0e-9461-f8fb26f62e41/volumes" Mar 19 09:49:12 crc kubenswrapper[4835]: I0319 09:49:12.899237 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 09:49:12 crc kubenswrapper[4835]: I0319 09:49:12.960243 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:49:12 crc kubenswrapper[4835]: I0319 09:49:12.978416 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:49:12 crc kubenswrapper[4835]: I0319 09:49:12.993210 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:49:12 crc kubenswrapper[4835]: I0319 09:49:12.996737 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 09:49:13 crc kubenswrapper[4835]: I0319 09:49:13.003492 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 09:49:13 crc kubenswrapper[4835]: I0319 09:49:13.004334 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 09:49:13 crc kubenswrapper[4835]: I0319 09:49:13.030517 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:49:13 crc kubenswrapper[4835]: I0319 09:49:13.066226 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8613286f-5060-408e-bd42-882380ee4531-scripts\") pod \"ceilometer-0\" (UID: \"8613286f-5060-408e-bd42-882380ee4531\") " pod="openstack/ceilometer-0" Mar 19 09:49:13 crc kubenswrapper[4835]: I0319 09:49:13.066317 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8613286f-5060-408e-bd42-882380ee4531-run-httpd\") pod \"ceilometer-0\" (UID: \"8613286f-5060-408e-bd42-882380ee4531\") " pod="openstack/ceilometer-0" Mar 19 09:49:13 crc kubenswrapper[4835]: I0319 09:49:13.066441 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8613286f-5060-408e-bd42-882380ee4531-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8613286f-5060-408e-bd42-882380ee4531\") " pod="openstack/ceilometer-0" Mar 19 09:49:13 crc kubenswrapper[4835]: I0319 09:49:13.066461 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8613286f-5060-408e-bd42-882380ee4531-config-data\") pod \"ceilometer-0\" (UID: \"8613286f-5060-408e-bd42-882380ee4531\") " 
pod="openstack/ceilometer-0" Mar 19 09:49:13 crc kubenswrapper[4835]: I0319 09:49:13.066488 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8613286f-5060-408e-bd42-882380ee4531-log-httpd\") pod \"ceilometer-0\" (UID: \"8613286f-5060-408e-bd42-882380ee4531\") " pod="openstack/ceilometer-0" Mar 19 09:49:13 crc kubenswrapper[4835]: I0319 09:49:13.066527 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w67pt\" (UniqueName: \"kubernetes.io/projected/8613286f-5060-408e-bd42-882380ee4531-kube-api-access-w67pt\") pod \"ceilometer-0\" (UID: \"8613286f-5060-408e-bd42-882380ee4531\") " pod="openstack/ceilometer-0" Mar 19 09:49:13 crc kubenswrapper[4835]: I0319 09:49:13.066565 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8613286f-5060-408e-bd42-882380ee4531-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8613286f-5060-408e-bd42-882380ee4531\") " pod="openstack/ceilometer-0" Mar 19 09:49:13 crc kubenswrapper[4835]: I0319 09:49:13.168108 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8613286f-5060-408e-bd42-882380ee4531-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8613286f-5060-408e-bd42-882380ee4531\") " pod="openstack/ceilometer-0" Mar 19 09:49:13 crc kubenswrapper[4835]: I0319 09:49:13.168165 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8613286f-5060-408e-bd42-882380ee4531-config-data\") pod \"ceilometer-0\" (UID: \"8613286f-5060-408e-bd42-882380ee4531\") " pod="openstack/ceilometer-0" Mar 19 09:49:13 crc kubenswrapper[4835]: I0319 09:49:13.168196 4835 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8613286f-5060-408e-bd42-882380ee4531-log-httpd\") pod \"ceilometer-0\" (UID: \"8613286f-5060-408e-bd42-882380ee4531\") " pod="openstack/ceilometer-0" Mar 19 09:49:13 crc kubenswrapper[4835]: I0319 09:49:13.168239 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w67pt\" (UniqueName: \"kubernetes.io/projected/8613286f-5060-408e-bd42-882380ee4531-kube-api-access-w67pt\") pod \"ceilometer-0\" (UID: \"8613286f-5060-408e-bd42-882380ee4531\") " pod="openstack/ceilometer-0" Mar 19 09:49:13 crc kubenswrapper[4835]: I0319 09:49:13.168279 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8613286f-5060-408e-bd42-882380ee4531-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8613286f-5060-408e-bd42-882380ee4531\") " pod="openstack/ceilometer-0" Mar 19 09:49:13 crc kubenswrapper[4835]: I0319 09:49:13.168310 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8613286f-5060-408e-bd42-882380ee4531-scripts\") pod \"ceilometer-0\" (UID: \"8613286f-5060-408e-bd42-882380ee4531\") " pod="openstack/ceilometer-0" Mar 19 09:49:13 crc kubenswrapper[4835]: I0319 09:49:13.168363 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8613286f-5060-408e-bd42-882380ee4531-run-httpd\") pod \"ceilometer-0\" (UID: \"8613286f-5060-408e-bd42-882380ee4531\") " pod="openstack/ceilometer-0" Mar 19 09:49:13 crc kubenswrapper[4835]: I0319 09:49:13.168809 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8613286f-5060-408e-bd42-882380ee4531-log-httpd\") pod \"ceilometer-0\" (UID: \"8613286f-5060-408e-bd42-882380ee4531\") " pod="openstack/ceilometer-0" Mar 19 09:49:13 crc 
kubenswrapper[4835]: I0319 09:49:13.169024 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8613286f-5060-408e-bd42-882380ee4531-run-httpd\") pod \"ceilometer-0\" (UID: \"8613286f-5060-408e-bd42-882380ee4531\") " pod="openstack/ceilometer-0" Mar 19 09:49:13 crc kubenswrapper[4835]: I0319 09:49:13.175290 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8613286f-5060-408e-bd42-882380ee4531-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8613286f-5060-408e-bd42-882380ee4531\") " pod="openstack/ceilometer-0" Mar 19 09:49:13 crc kubenswrapper[4835]: I0319 09:49:13.175399 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8613286f-5060-408e-bd42-882380ee4531-scripts\") pod \"ceilometer-0\" (UID: \"8613286f-5060-408e-bd42-882380ee4531\") " pod="openstack/ceilometer-0" Mar 19 09:49:13 crc kubenswrapper[4835]: I0319 09:49:13.175605 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8613286f-5060-408e-bd42-882380ee4531-config-data\") pod \"ceilometer-0\" (UID: \"8613286f-5060-408e-bd42-882380ee4531\") " pod="openstack/ceilometer-0" Mar 19 09:49:13 crc kubenswrapper[4835]: I0319 09:49:13.175702 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8613286f-5060-408e-bd42-882380ee4531-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8613286f-5060-408e-bd42-882380ee4531\") " pod="openstack/ceilometer-0" Mar 19 09:49:13 crc kubenswrapper[4835]: I0319 09:49:13.189046 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w67pt\" (UniqueName: \"kubernetes.io/projected/8613286f-5060-408e-bd42-882380ee4531-kube-api-access-w67pt\") pod \"ceilometer-0\" (UID: 
\"8613286f-5060-408e-bd42-882380ee4531\") " pod="openstack/ceilometer-0" Mar 19 09:49:13 crc kubenswrapper[4835]: I0319 09:49:13.334753 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 09:49:13 crc kubenswrapper[4835]: I0319 09:49:13.933107 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:49:14 crc kubenswrapper[4835]: I0319 09:49:14.118612 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:49:14 crc kubenswrapper[4835]: I0319 09:49:14.226609 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-956b854d8-kqpb6" Mar 19 09:49:14 crc kubenswrapper[4835]: I0319 09:49:14.261120 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xf267" podUID="14dd8f89-f4d8-4618-b417-f1a802f3517d" containerName="registry-server" probeResult="failure" output=< Mar 19 09:49:14 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 09:49:14 crc kubenswrapper[4835]: > Mar 19 09:49:14 crc kubenswrapper[4835]: I0319 09:49:14.287221 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-6ffb669df8-qcq6h"] Mar 19 09:49:14 crc kubenswrapper[4835]: I0319 09:49:14.287471 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-6ffb669df8-qcq6h" podUID="1342aee7-66dc-4df2-b44a-970b332ffc8d" containerName="heat-engine" containerID="cri-o://a9d7b18f48ece6410e4b81f72a2d34c9b47a755bf156c0def1a5db05c2152b5d" gracePeriod=60 Mar 19 09:49:14 crc kubenswrapper[4835]: I0319 09:49:14.413530 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a56f8f1-c60a-4b64-a8af-dccdb247c6ee" path="/var/lib/kubelet/pods/6a56f8f1-c60a-4b64-a8af-dccdb247c6ee/volumes" Mar 19 09:49:14 crc kubenswrapper[4835]: I0319 09:49:14.971934 4835 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8613286f-5060-408e-bd42-882380ee4531","Type":"ContainerStarted","Data":"5f24481f59209ed33048f806fbdd65ac9378297e79dce087be9ee0025c771bf7"} Mar 19 09:49:15 crc kubenswrapper[4835]: I0319 09:49:15.303903 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-69459bdfc9-h9kps" Mar 19 09:49:15 crc kubenswrapper[4835]: I0319 09:49:15.356566 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-74f47f4bf4-6gctg"] Mar 19 09:49:15 crc kubenswrapper[4835]: I0319 09:49:15.582833 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-755fb9d4bd-8w5k4" Mar 19 09:49:15 crc kubenswrapper[4835]: I0319 09:49:15.717215 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5757d74498-tjqm4"] Mar 19 09:49:15 crc kubenswrapper[4835]: I0319 09:49:15.994119 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-74f47f4bf4-6gctg" event={"ID":"02257820-75fc-4616-82d0-5c1186508556","Type":"ContainerDied","Data":"edc593b705ed4b7c8d270efe146bb580995522d3824a7402b39788e11446c3c8"} Mar 19 09:49:15 crc kubenswrapper[4835]: I0319 09:49:15.994367 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edc593b705ed4b7c8d270efe146bb580995522d3824a7402b39788e11446c3c8" Mar 19 09:49:16 crc kubenswrapper[4835]: I0319 09:49:16.031780 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8613286f-5060-408e-bd42-882380ee4531","Type":"ContainerStarted","Data":"e8a57d5295d0b06faf4fa322e20bb248db0a809a95e67febff392ee2c29f6787"} Mar 19 09:49:16 crc kubenswrapper[4835]: I0319 09:49:16.036347 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-74f47f4bf4-6gctg" Mar 19 09:49:16 crc kubenswrapper[4835]: I0319 09:49:16.051507 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"636dd5f8-5cc2-46f4-84ac-a094b6881a4f","Type":"ContainerStarted","Data":"c43edb8f33e6553b787a15ef5ccd79e5bb091829f8686be42537bd0a7a64179a"} Mar 19 09:49:16 crc kubenswrapper[4835]: I0319 09:49:16.060298 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02257820-75fc-4616-82d0-5c1186508556-config-data\") pod \"02257820-75fc-4616-82d0-5c1186508556\" (UID: \"02257820-75fc-4616-82d0-5c1186508556\") " Mar 19 09:49:16 crc kubenswrapper[4835]: I0319 09:49:16.060506 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02257820-75fc-4616-82d0-5c1186508556-config-data-custom\") pod \"02257820-75fc-4616-82d0-5c1186508556\" (UID: \"02257820-75fc-4616-82d0-5c1186508556\") " Mar 19 09:49:16 crc kubenswrapper[4835]: I0319 09:49:16.060581 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrpf8\" (UniqueName: \"kubernetes.io/projected/02257820-75fc-4616-82d0-5c1186508556-kube-api-access-hrpf8\") pod \"02257820-75fc-4616-82d0-5c1186508556\" (UID: \"02257820-75fc-4616-82d0-5c1186508556\") " Mar 19 09:49:16 crc kubenswrapper[4835]: I0319 09:49:16.060668 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02257820-75fc-4616-82d0-5c1186508556-combined-ca-bundle\") pod \"02257820-75fc-4616-82d0-5c1186508556\" (UID: \"02257820-75fc-4616-82d0-5c1186508556\") " Mar 19 09:49:16 crc kubenswrapper[4835]: I0319 09:49:16.076733 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/02257820-75fc-4616-82d0-5c1186508556-kube-api-access-hrpf8" (OuterVolumeSpecName: "kube-api-access-hrpf8") pod "02257820-75fc-4616-82d0-5c1186508556" (UID: "02257820-75fc-4616-82d0-5c1186508556"). InnerVolumeSpecName "kube-api-access-hrpf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:49:16 crc kubenswrapper[4835]: I0319 09:49:16.087211 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02257820-75fc-4616-82d0-5c1186508556-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "02257820-75fc-4616-82d0-5c1186508556" (UID: "02257820-75fc-4616-82d0-5c1186508556"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:16 crc kubenswrapper[4835]: I0319 09:49:16.131797 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.702161244 podStartE2EDuration="37.121718582s" podCreationTimestamp="2026-03-19 09:48:39 +0000 UTC" firstStartedPulling="2026-03-19 09:48:40.349828772 +0000 UTC m=+1575.198427359" lastFinishedPulling="2026-03-19 09:49:14.76938612 +0000 UTC m=+1609.617984697" observedRunningTime="2026-03-19 09:49:16.114283234 +0000 UTC m=+1610.962881821" watchObservedRunningTime="2026-03-19 09:49:16.121718582 +0000 UTC m=+1610.970317169" Mar 19 09:49:16 crc kubenswrapper[4835]: I0319 09:49:16.163367 4835 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02257820-75fc-4616-82d0-5c1186508556-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:16 crc kubenswrapper[4835]: I0319 09:49:16.163402 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrpf8\" (UniqueName: \"kubernetes.io/projected/02257820-75fc-4616-82d0-5c1186508556-kube-api-access-hrpf8\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:16 crc kubenswrapper[4835]: I0319 
09:49:16.220001 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02257820-75fc-4616-82d0-5c1186508556-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02257820-75fc-4616-82d0-5c1186508556" (UID: "02257820-75fc-4616-82d0-5c1186508556"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:16 crc kubenswrapper[4835]: I0319 09:49:16.271127 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02257820-75fc-4616-82d0-5c1186508556-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:16 crc kubenswrapper[4835]: I0319 09:49:16.283606 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02257820-75fc-4616-82d0-5c1186508556-config-data" (OuterVolumeSpecName: "config-data") pod "02257820-75fc-4616-82d0-5c1186508556" (UID: "02257820-75fc-4616-82d0-5c1186508556"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:16 crc kubenswrapper[4835]: I0319 09:49:16.307136 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5757d74498-tjqm4" Mar 19 09:49:16 crc kubenswrapper[4835]: I0319 09:49:16.373001 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d139802-4569-4073-b80c-4807478db41a-config-data-custom\") pod \"3d139802-4569-4073-b80c-4807478db41a\" (UID: \"3d139802-4569-4073-b80c-4807478db41a\") " Mar 19 09:49:16 crc kubenswrapper[4835]: I0319 09:49:16.373690 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsfvn\" (UniqueName: \"kubernetes.io/projected/3d139802-4569-4073-b80c-4807478db41a-kube-api-access-tsfvn\") pod \"3d139802-4569-4073-b80c-4807478db41a\" (UID: \"3d139802-4569-4073-b80c-4807478db41a\") " Mar 19 09:49:16 crc kubenswrapper[4835]: I0319 09:49:16.373907 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d139802-4569-4073-b80c-4807478db41a-config-data\") pod \"3d139802-4569-4073-b80c-4807478db41a\" (UID: \"3d139802-4569-4073-b80c-4807478db41a\") " Mar 19 09:49:16 crc kubenswrapper[4835]: I0319 09:49:16.373939 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d139802-4569-4073-b80c-4807478db41a-combined-ca-bundle\") pod \"3d139802-4569-4073-b80c-4807478db41a\" (UID: \"3d139802-4569-4073-b80c-4807478db41a\") " Mar 19 09:49:16 crc kubenswrapper[4835]: I0319 09:49:16.378324 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02257820-75fc-4616-82d0-5c1186508556-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:16 crc kubenswrapper[4835]: I0319 09:49:16.378859 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d139802-4569-4073-b80c-4807478db41a-config-data-custom" (OuterVolumeSpecName: 
"config-data-custom") pod "3d139802-4569-4073-b80c-4807478db41a" (UID: "3d139802-4569-4073-b80c-4807478db41a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:16 crc kubenswrapper[4835]: I0319 09:49:16.391918 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d139802-4569-4073-b80c-4807478db41a-kube-api-access-tsfvn" (OuterVolumeSpecName: "kube-api-access-tsfvn") pod "3d139802-4569-4073-b80c-4807478db41a" (UID: "3d139802-4569-4073-b80c-4807478db41a"). InnerVolumeSpecName "kube-api-access-tsfvn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:49:16 crc kubenswrapper[4835]: I0319 09:49:16.436099 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d139802-4569-4073-b80c-4807478db41a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d139802-4569-4073-b80c-4807478db41a" (UID: "3d139802-4569-4073-b80c-4807478db41a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:16 crc kubenswrapper[4835]: I0319 09:49:16.473954 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d139802-4569-4073-b80c-4807478db41a-config-data" (OuterVolumeSpecName: "config-data") pod "3d139802-4569-4073-b80c-4807478db41a" (UID: "3d139802-4569-4073-b80c-4807478db41a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:16 crc kubenswrapper[4835]: I0319 09:49:16.482169 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d139802-4569-4073-b80c-4807478db41a-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:16 crc kubenswrapper[4835]: I0319 09:49:16.482203 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d139802-4569-4073-b80c-4807478db41a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:16 crc kubenswrapper[4835]: I0319 09:49:16.482218 4835 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d139802-4569-4073-b80c-4807478db41a-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:16 crc kubenswrapper[4835]: I0319 09:49:16.482230 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsfvn\" (UniqueName: \"kubernetes.io/projected/3d139802-4569-4073-b80c-4807478db41a-kube-api-access-tsfvn\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:17 crc kubenswrapper[4835]: I0319 09:49:17.063412 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8613286f-5060-408e-bd42-882380ee4531","Type":"ContainerStarted","Data":"0f7950cc20013a190bd603d82491b3bdbe2b96255f9910fa97820beaf5cd2cd1"} Mar 19 09:49:17 crc kubenswrapper[4835]: I0319 09:49:17.064780 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-74f47f4bf4-6gctg" Mar 19 09:49:17 crc kubenswrapper[4835]: I0319 09:49:17.064836 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5757d74498-tjqm4" Mar 19 09:49:17 crc kubenswrapper[4835]: I0319 09:49:17.064774 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5757d74498-tjqm4" event={"ID":"3d139802-4569-4073-b80c-4807478db41a","Type":"ContainerDied","Data":"1789009b0c01035fcf09fd2ac5e8649f7318735c7d5e447d25bc67a65f952fa8"} Mar 19 09:49:17 crc kubenswrapper[4835]: I0319 09:49:17.064905 4835 scope.go:117] "RemoveContainer" containerID="78d3673c4a84e212ec1f36f45714937b863839719c664fadb8602ab935c66636" Mar 19 09:49:17 crc kubenswrapper[4835]: I0319 09:49:17.125763 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-74f47f4bf4-6gctg"] Mar 19 09:49:17 crc kubenswrapper[4835]: I0319 09:49:17.130036 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-74f47f4bf4-6gctg"] Mar 19 09:49:17 crc kubenswrapper[4835]: I0319 09:49:17.210325 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5757d74498-tjqm4"] Mar 19 09:49:17 crc kubenswrapper[4835]: I0319 09:49:17.244913 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-5757d74498-tjqm4"] Mar 19 09:49:17 crc kubenswrapper[4835]: E0319 09:49:17.396510 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a9d7b18f48ece6410e4b81f72a2d34c9b47a755bf156c0def1a5db05c2152b5d" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 19 09:49:17 crc kubenswrapper[4835]: I0319 09:49:17.412076 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="d3ca55c1-1697-4203-bcec-3a9e5bd64c59" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.230:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:49:17 crc 
kubenswrapper[4835]: E0319 09:49:17.413877 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a9d7b18f48ece6410e4b81f72a2d34c9b47a755bf156c0def1a5db05c2152b5d" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 19 09:49:17 crc kubenswrapper[4835]: E0319 09:49:17.422678 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a9d7b18f48ece6410e4b81f72a2d34c9b47a755bf156c0def1a5db05c2152b5d" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 19 09:49:17 crc kubenswrapper[4835]: E0319 09:49:17.422944 4835 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-6ffb669df8-qcq6h" podUID="1342aee7-66dc-4df2-b44a-970b332ffc8d" containerName="heat-engine" Mar 19 09:49:17 crc kubenswrapper[4835]: I0319 09:49:17.651922 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zbxd8"] Mar 19 09:49:17 crc kubenswrapper[4835]: E0319 09:49:17.654322 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02257820-75fc-4616-82d0-5c1186508556" containerName="heat-cfnapi" Mar 19 09:49:17 crc kubenswrapper[4835]: I0319 09:49:17.654371 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="02257820-75fc-4616-82d0-5c1186508556" containerName="heat-cfnapi" Mar 19 09:49:17 crc kubenswrapper[4835]: E0319 09:49:17.654405 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d139802-4569-4073-b80c-4807478db41a" containerName="heat-api" Mar 19 09:49:17 crc kubenswrapper[4835]: I0319 09:49:17.654415 4835 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="3d139802-4569-4073-b80c-4807478db41a" containerName="heat-api" Mar 19 09:49:17 crc kubenswrapper[4835]: E0319 09:49:17.654465 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d139802-4569-4073-b80c-4807478db41a" containerName="heat-api" Mar 19 09:49:17 crc kubenswrapper[4835]: I0319 09:49:17.654475 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d139802-4569-4073-b80c-4807478db41a" containerName="heat-api" Mar 19 09:49:17 crc kubenswrapper[4835]: I0319 09:49:17.654886 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d139802-4569-4073-b80c-4807478db41a" containerName="heat-api" Mar 19 09:49:17 crc kubenswrapper[4835]: I0319 09:49:17.654908 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="02257820-75fc-4616-82d0-5c1186508556" containerName="heat-cfnapi" Mar 19 09:49:17 crc kubenswrapper[4835]: E0319 09:49:17.655203 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02257820-75fc-4616-82d0-5c1186508556" containerName="heat-cfnapi" Mar 19 09:49:17 crc kubenswrapper[4835]: I0319 09:49:17.655220 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="02257820-75fc-4616-82d0-5c1186508556" containerName="heat-cfnapi" Mar 19 09:49:17 crc kubenswrapper[4835]: I0319 09:49:17.655678 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d139802-4569-4073-b80c-4807478db41a" containerName="heat-api" Mar 19 09:49:17 crc kubenswrapper[4835]: I0319 09:49:17.655702 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="02257820-75fc-4616-82d0-5c1186508556" containerName="heat-cfnapi" Mar 19 09:49:17 crc kubenswrapper[4835]: I0319 09:49:17.657314 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zbxd8" Mar 19 09:49:17 crc kubenswrapper[4835]: I0319 09:49:17.695117 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zbxd8"] Mar 19 09:49:17 crc kubenswrapper[4835]: I0319 09:49:17.725380 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nntb9\" (UniqueName: \"kubernetes.io/projected/0722dbab-2741-445d-8585-04eaf3738c0d-kube-api-access-nntb9\") pod \"redhat-marketplace-zbxd8\" (UID: \"0722dbab-2741-445d-8585-04eaf3738c0d\") " pod="openshift-marketplace/redhat-marketplace-zbxd8" Mar 19 09:49:17 crc kubenswrapper[4835]: I0319 09:49:17.725492 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0722dbab-2741-445d-8585-04eaf3738c0d-catalog-content\") pod \"redhat-marketplace-zbxd8\" (UID: \"0722dbab-2741-445d-8585-04eaf3738c0d\") " pod="openshift-marketplace/redhat-marketplace-zbxd8" Mar 19 09:49:17 crc kubenswrapper[4835]: I0319 09:49:17.725570 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0722dbab-2741-445d-8585-04eaf3738c0d-utilities\") pod \"redhat-marketplace-zbxd8\" (UID: \"0722dbab-2741-445d-8585-04eaf3738c0d\") " pod="openshift-marketplace/redhat-marketplace-zbxd8" Mar 19 09:49:17 crc kubenswrapper[4835]: I0319 09:49:17.827220 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nntb9\" (UniqueName: \"kubernetes.io/projected/0722dbab-2741-445d-8585-04eaf3738c0d-kube-api-access-nntb9\") pod \"redhat-marketplace-zbxd8\" (UID: \"0722dbab-2741-445d-8585-04eaf3738c0d\") " pod="openshift-marketplace/redhat-marketplace-zbxd8" Mar 19 09:49:17 crc kubenswrapper[4835]: I0319 09:49:17.827320 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0722dbab-2741-445d-8585-04eaf3738c0d-catalog-content\") pod \"redhat-marketplace-zbxd8\" (UID: \"0722dbab-2741-445d-8585-04eaf3738c0d\") " pod="openshift-marketplace/redhat-marketplace-zbxd8" Mar 19 09:49:17 crc kubenswrapper[4835]: I0319 09:49:17.827396 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0722dbab-2741-445d-8585-04eaf3738c0d-utilities\") pod \"redhat-marketplace-zbxd8\" (UID: \"0722dbab-2741-445d-8585-04eaf3738c0d\") " pod="openshift-marketplace/redhat-marketplace-zbxd8" Mar 19 09:49:17 crc kubenswrapper[4835]: I0319 09:49:17.827872 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0722dbab-2741-445d-8585-04eaf3738c0d-utilities\") pod \"redhat-marketplace-zbxd8\" (UID: \"0722dbab-2741-445d-8585-04eaf3738c0d\") " pod="openshift-marketplace/redhat-marketplace-zbxd8" Mar 19 09:49:17 crc kubenswrapper[4835]: I0319 09:49:17.829268 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0722dbab-2741-445d-8585-04eaf3738c0d-catalog-content\") pod \"redhat-marketplace-zbxd8\" (UID: \"0722dbab-2741-445d-8585-04eaf3738c0d\") " pod="openshift-marketplace/redhat-marketplace-zbxd8" Mar 19 09:49:17 crc kubenswrapper[4835]: I0319 09:49:17.851389 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nntb9\" (UniqueName: \"kubernetes.io/projected/0722dbab-2741-445d-8585-04eaf3738c0d-kube-api-access-nntb9\") pod \"redhat-marketplace-zbxd8\" (UID: \"0722dbab-2741-445d-8585-04eaf3738c0d\") " pod="openshift-marketplace/redhat-marketplace-zbxd8" Mar 19 09:49:17 crc kubenswrapper[4835]: I0319 09:49:17.987880 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zbxd8" Mar 19 09:49:18 crc kubenswrapper[4835]: I0319 09:49:18.093479 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8613286f-5060-408e-bd42-882380ee4531","Type":"ContainerStarted","Data":"d53e766765cca6556c78ec739eedf608f4f47cd66149a3c21d3d9dc064a880c0"} Mar 19 09:49:18 crc kubenswrapper[4835]: I0319 09:49:18.338853 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 09:49:18 crc kubenswrapper[4835]: I0319 09:49:18.339360 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="bf435b6c-8025-4835-9e66-dad5ca6a95c3" containerName="glance-log" containerID="cri-o://0176626d9fd33b2d9cd7286414558f248abac68798ee6b1d280ebbb62215e2c5" gracePeriod=30 Mar 19 09:49:18 crc kubenswrapper[4835]: I0319 09:49:18.339908 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="bf435b6c-8025-4835-9e66-dad5ca6a95c3" containerName="glance-httpd" containerID="cri-o://ff636b8ace26f3a68c898e60bc6db9609e6fb220bca6425a914cba993210d0b7" gracePeriod=30 Mar 19 09:49:18 crc kubenswrapper[4835]: I0319 09:49:18.409407 4835 scope.go:117] "RemoveContainer" containerID="d93f2f0fef5a3fe52d6e4aab02e5290ac85405643bc520caaef82b7b23fd8ee3" Mar 19 09:49:18 crc kubenswrapper[4835]: E0319 09:49:18.409677 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 09:49:18 crc kubenswrapper[4835]: I0319 09:49:18.422606 
4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="d3ca55c1-1697-4203-bcec-3a9e5bd64c59" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.230:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:49:18 crc kubenswrapper[4835]: I0319 09:49:18.437857 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02257820-75fc-4616-82d0-5c1186508556" path="/var/lib/kubelet/pods/02257820-75fc-4616-82d0-5c1186508556/volumes" Mar 19 09:49:18 crc kubenswrapper[4835]: I0319 09:49:18.438865 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d139802-4569-4073-b80c-4807478db41a" path="/var/lib/kubelet/pods/3d139802-4569-4073-b80c-4807478db41a/volumes" Mar 19 09:49:18 crc kubenswrapper[4835]: W0319 09:49:18.757031 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0722dbab_2741_445d_8585_04eaf3738c0d.slice/crio-1d4e761a01920eae9082e14e20b564baa5580af6ad598017d6ee989319921941 WatchSource:0}: Error finding container 1d4e761a01920eae9082e14e20b564baa5580af6ad598017d6ee989319921941: Status 404 returned error can't find the container with id 1d4e761a01920eae9082e14e20b564baa5580af6ad598017d6ee989319921941 Mar 19 09:49:18 crc kubenswrapper[4835]: I0319 09:49:18.774659 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zbxd8"] Mar 19 09:49:19 crc kubenswrapper[4835]: I0319 09:49:19.119431 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbxd8" event={"ID":"0722dbab-2741-445d-8585-04eaf3738c0d","Type":"ContainerStarted","Data":"a5eec26c4305c6b97108fcf86d873ecc84ebfb5e256db9dfdf33d4091bd0f83f"} Mar 19 09:49:19 crc kubenswrapper[4835]: I0319 09:49:19.119725 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-zbxd8" event={"ID":"0722dbab-2741-445d-8585-04eaf3738c0d","Type":"ContainerStarted","Data":"1d4e761a01920eae9082e14e20b564baa5580af6ad598017d6ee989319921941"} Mar 19 09:49:19 crc kubenswrapper[4835]: I0319 09:49:19.126576 4835 generic.go:334] "Generic (PLEG): container finished" podID="bf435b6c-8025-4835-9e66-dad5ca6a95c3" containerID="0176626d9fd33b2d9cd7286414558f248abac68798ee6b1d280ebbb62215e2c5" exitCode=143 Mar 19 09:49:19 crc kubenswrapper[4835]: I0319 09:49:19.126662 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bf435b6c-8025-4835-9e66-dad5ca6a95c3","Type":"ContainerDied","Data":"0176626d9fd33b2d9cd7286414558f248abac68798ee6b1d280ebbb62215e2c5"} Mar 19 09:49:20 crc kubenswrapper[4835]: I0319 09:49:20.141912 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8613286f-5060-408e-bd42-882380ee4531","Type":"ContainerStarted","Data":"8e215d67922c00be594b2f7a755500093b9ed0d10cf8a7e9a25864736f641da3"} Mar 19 09:49:20 crc kubenswrapper[4835]: I0319 09:49:20.142678 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 09:49:20 crc kubenswrapper[4835]: I0319 09:49:20.142088 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8613286f-5060-408e-bd42-882380ee4531" containerName="sg-core" containerID="cri-o://d53e766765cca6556c78ec739eedf608f4f47cd66149a3c21d3d9dc064a880c0" gracePeriod=30 Mar 19 09:49:20 crc kubenswrapper[4835]: I0319 09:49:20.142144 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8613286f-5060-408e-bd42-882380ee4531" containerName="ceilometer-notification-agent" containerID="cri-o://0f7950cc20013a190bd603d82491b3bdbe2b96255f9910fa97820beaf5cd2cd1" gracePeriod=30 Mar 19 09:49:20 crc kubenswrapper[4835]: I0319 09:49:20.142015 
4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8613286f-5060-408e-bd42-882380ee4531" containerName="ceilometer-central-agent" containerID="cri-o://e8a57d5295d0b06faf4fa322e20bb248db0a809a95e67febff392ee2c29f6787" gracePeriod=30 Mar 19 09:49:20 crc kubenswrapper[4835]: I0319 09:49:20.142105 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8613286f-5060-408e-bd42-882380ee4531" containerName="proxy-httpd" containerID="cri-o://8e215d67922c00be594b2f7a755500093b9ed0d10cf8a7e9a25864736f641da3" gracePeriod=30 Mar 19 09:49:20 crc kubenswrapper[4835]: I0319 09:49:20.144174 4835 generic.go:334] "Generic (PLEG): container finished" podID="0722dbab-2741-445d-8585-04eaf3738c0d" containerID="a5eec26c4305c6b97108fcf86d873ecc84ebfb5e256db9dfdf33d4091bd0f83f" exitCode=0 Mar 19 09:49:20 crc kubenswrapper[4835]: I0319 09:49:20.144225 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbxd8" event={"ID":"0722dbab-2741-445d-8585-04eaf3738c0d","Type":"ContainerDied","Data":"a5eec26c4305c6b97108fcf86d873ecc84ebfb5e256db9dfdf33d4091bd0f83f"} Mar 19 09:49:20 crc kubenswrapper[4835]: I0319 09:49:20.173780 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.1943946739999998 podStartE2EDuration="8.173762314s" podCreationTimestamp="2026-03-19 09:49:12 +0000 UTC" firstStartedPulling="2026-03-19 09:49:13.943661118 +0000 UTC m=+1608.792259705" lastFinishedPulling="2026-03-19 09:49:18.923028758 +0000 UTC m=+1613.771627345" observedRunningTime="2026-03-19 09:49:20.16532089 +0000 UTC m=+1615.013919477" watchObservedRunningTime="2026-03-19 09:49:20.173762314 +0000 UTC m=+1615.022360901" Mar 19 09:49:20 crc kubenswrapper[4835]: I0319 09:49:20.685487 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 
09:49:20 crc kubenswrapper[4835]: I0319 09:49:20.685788 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="236685e9-820f-4a95-85fc-c47b24cc0a73" containerName="glance-log" containerID="cri-o://6145fee7cb692671de67b9f4efa736538c1b33e71134adacfff51490c26d1bb3" gracePeriod=30 Mar 19 09:49:20 crc kubenswrapper[4835]: I0319 09:49:20.685873 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="236685e9-820f-4a95-85fc-c47b24cc0a73" containerName="glance-httpd" containerID="cri-o://0147565355a1e16a24b8881476ad11558f8d50a68c44e87936dd43a70e0530ff" gracePeriod=30 Mar 19 09:49:21 crc kubenswrapper[4835]: I0319 09:49:21.159203 4835 generic.go:334] "Generic (PLEG): container finished" podID="236685e9-820f-4a95-85fc-c47b24cc0a73" containerID="6145fee7cb692671de67b9f4efa736538c1b33e71134adacfff51490c26d1bb3" exitCode=143 Mar 19 09:49:21 crc kubenswrapper[4835]: I0319 09:49:21.159337 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"236685e9-820f-4a95-85fc-c47b24cc0a73","Type":"ContainerDied","Data":"6145fee7cb692671de67b9f4efa736538c1b33e71134adacfff51490c26d1bb3"} Mar 19 09:49:21 crc kubenswrapper[4835]: I0319 09:49:21.162174 4835 generic.go:334] "Generic (PLEG): container finished" podID="8613286f-5060-408e-bd42-882380ee4531" containerID="8e215d67922c00be594b2f7a755500093b9ed0d10cf8a7e9a25864736f641da3" exitCode=0 Mar 19 09:49:21 crc kubenswrapper[4835]: I0319 09:49:21.162203 4835 generic.go:334] "Generic (PLEG): container finished" podID="8613286f-5060-408e-bd42-882380ee4531" containerID="d53e766765cca6556c78ec739eedf608f4f47cd66149a3c21d3d9dc064a880c0" exitCode=2 Mar 19 09:49:21 crc kubenswrapper[4835]: I0319 09:49:21.162214 4835 generic.go:334] "Generic (PLEG): container finished" podID="8613286f-5060-408e-bd42-882380ee4531" 
containerID="0f7950cc20013a190bd603d82491b3bdbe2b96255f9910fa97820beaf5cd2cd1" exitCode=0 Mar 19 09:49:21 crc kubenswrapper[4835]: I0319 09:49:21.162235 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8613286f-5060-408e-bd42-882380ee4531","Type":"ContainerDied","Data":"8e215d67922c00be594b2f7a755500093b9ed0d10cf8a7e9a25864736f641da3"} Mar 19 09:49:21 crc kubenswrapper[4835]: I0319 09:49:21.162263 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8613286f-5060-408e-bd42-882380ee4531","Type":"ContainerDied","Data":"d53e766765cca6556c78ec739eedf608f4f47cd66149a3c21d3d9dc064a880c0"} Mar 19 09:49:21 crc kubenswrapper[4835]: I0319 09:49:21.162274 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8613286f-5060-408e-bd42-882380ee4531","Type":"ContainerDied","Data":"0f7950cc20013a190bd603d82491b3bdbe2b96255f9910fa97820beaf5cd2cd1"} Mar 19 09:49:22 crc kubenswrapper[4835]: I0319 09:49:22.198338 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbxd8" event={"ID":"0722dbab-2741-445d-8585-04eaf3738c0d","Type":"ContainerStarted","Data":"a5b978755ae87a445fa89c43173d1251bf6dceafe546a4dbf3cc4982f81bf7bd"} Mar 19 09:49:22 crc kubenswrapper[4835]: I0319 09:49:22.211508 4835 generic.go:334] "Generic (PLEG): container finished" podID="bf435b6c-8025-4835-9e66-dad5ca6a95c3" containerID="ff636b8ace26f3a68c898e60bc6db9609e6fb220bca6425a914cba993210d0b7" exitCode=0 Mar 19 09:49:22 crc kubenswrapper[4835]: I0319 09:49:22.211560 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bf435b6c-8025-4835-9e66-dad5ca6a95c3","Type":"ContainerDied","Data":"ff636b8ace26f3a68c898e60bc6db9609e6fb220bca6425a914cba993210d0b7"} Mar 19 09:49:22 crc kubenswrapper[4835]: I0319 09:49:22.363266 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 09:49:22 crc kubenswrapper[4835]: I0319 09:49:22.422870 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="d3ca55c1-1697-4203-bcec-3a9e5bd64c59" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.230:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:49:22 crc kubenswrapper[4835]: I0319 09:49:22.455833 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf435b6c-8025-4835-9e66-dad5ca6a95c3-logs\") pod \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\" (UID: \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\") " Mar 19 09:49:22 crc kubenswrapper[4835]: I0319 09:49:22.455925 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf435b6c-8025-4835-9e66-dad5ca6a95c3-scripts\") pod \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\" (UID: \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\") " Mar 19 09:49:22 crc kubenswrapper[4835]: I0319 09:49:22.455961 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf435b6c-8025-4835-9e66-dad5ca6a95c3-config-data\") pod \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\" (UID: \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\") " Mar 19 09:49:22 crc kubenswrapper[4835]: I0319 09:49:22.456782 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf435b6c-8025-4835-9e66-dad5ca6a95c3-logs" (OuterVolumeSpecName: "logs") pod "bf435b6c-8025-4835-9e66-dad5ca6a95c3" (UID: "bf435b6c-8025-4835-9e66-dad5ca6a95c3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:49:22 crc kubenswrapper[4835]: I0319 09:49:22.457153 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f195f04e-c0f1-495c-870b-981275804688\") pod \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\" (UID: \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\") " Mar 19 09:49:22 crc kubenswrapper[4835]: I0319 09:49:22.457226 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf435b6c-8025-4835-9e66-dad5ca6a95c3-public-tls-certs\") pod \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\" (UID: \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\") " Mar 19 09:49:22 crc kubenswrapper[4835]: I0319 09:49:22.457275 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf435b6c-8025-4835-9e66-dad5ca6a95c3-combined-ca-bundle\") pod \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\" (UID: \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\") " Mar 19 09:49:22 crc kubenswrapper[4835]: I0319 09:49:22.457374 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bf435b6c-8025-4835-9e66-dad5ca6a95c3-httpd-run\") pod \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\" (UID: \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\") " Mar 19 09:49:22 crc kubenswrapper[4835]: I0319 09:49:22.457456 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6dp8\" (UniqueName: \"kubernetes.io/projected/bf435b6c-8025-4835-9e66-dad5ca6a95c3-kube-api-access-q6dp8\") pod \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\" (UID: \"bf435b6c-8025-4835-9e66-dad5ca6a95c3\") " Mar 19 09:49:22 crc kubenswrapper[4835]: I0319 09:49:22.459006 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/bf435b6c-8025-4835-9e66-dad5ca6a95c3-logs\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:22 crc kubenswrapper[4835]: I0319 09:49:22.463018 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf435b6c-8025-4835-9e66-dad5ca6a95c3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bf435b6c-8025-4835-9e66-dad5ca6a95c3" (UID: "bf435b6c-8025-4835-9e66-dad5ca6a95c3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:49:22 crc kubenswrapper[4835]: I0319 09:49:22.469886 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf435b6c-8025-4835-9e66-dad5ca6a95c3-scripts" (OuterVolumeSpecName: "scripts") pod "bf435b6c-8025-4835-9e66-dad5ca6a95c3" (UID: "bf435b6c-8025-4835-9e66-dad5ca6a95c3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:22 crc kubenswrapper[4835]: I0319 09:49:22.504112 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf435b6c-8025-4835-9e66-dad5ca6a95c3-kube-api-access-q6dp8" (OuterVolumeSpecName: "kube-api-access-q6dp8") pod "bf435b6c-8025-4835-9e66-dad5ca6a95c3" (UID: "bf435b6c-8025-4835-9e66-dad5ca6a95c3"). InnerVolumeSpecName "kube-api-access-q6dp8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:49:22 crc kubenswrapper[4835]: I0319 09:49:22.561540 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf435b6c-8025-4835-9e66-dad5ca6a95c3-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:22 crc kubenswrapper[4835]: I0319 09:49:22.568881 4835 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bf435b6c-8025-4835-9e66-dad5ca6a95c3-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:22 crc kubenswrapper[4835]: I0319 09:49:22.568903 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6dp8\" (UniqueName: \"kubernetes.io/projected/bf435b6c-8025-4835-9e66-dad5ca6a95c3-kube-api-access-q6dp8\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:22 crc kubenswrapper[4835]: I0319 09:49:22.599218 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf435b6c-8025-4835-9e66-dad5ca6a95c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf435b6c-8025-4835-9e66-dad5ca6a95c3" (UID: "bf435b6c-8025-4835-9e66-dad5ca6a95c3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:22 crc kubenswrapper[4835]: I0319 09:49:22.622688 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf435b6c-8025-4835-9e66-dad5ca6a95c3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bf435b6c-8025-4835-9e66-dad5ca6a95c3" (UID: "bf435b6c-8025-4835-9e66-dad5ca6a95c3"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:22 crc kubenswrapper[4835]: I0319 09:49:22.647084 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 19 09:49:22 crc kubenswrapper[4835]: I0319 09:49:22.656897 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf435b6c-8025-4835-9e66-dad5ca6a95c3-config-data" (OuterVolumeSpecName: "config-data") pod "bf435b6c-8025-4835-9e66-dad5ca6a95c3" (UID: "bf435b6c-8025-4835-9e66-dad5ca6a95c3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:22 crc kubenswrapper[4835]: I0319 09:49:22.696308 4835 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf435b6c-8025-4835-9e66-dad5ca6a95c3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:22 crc kubenswrapper[4835]: I0319 09:49:22.696350 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf435b6c-8025-4835-9e66-dad5ca6a95c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:22 crc kubenswrapper[4835]: I0319 09:49:22.696362 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf435b6c-8025-4835-9e66-dad5ca6a95c3-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:22 crc kubenswrapper[4835]: I0319 09:49:22.911058 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f195f04e-c0f1-495c-870b-981275804688" (OuterVolumeSpecName: "glance") pod "bf435b6c-8025-4835-9e66-dad5ca6a95c3" (UID: "bf435b6c-8025-4835-9e66-dad5ca6a95c3"). InnerVolumeSpecName "pvc-f195f04e-c0f1-495c-870b-981275804688". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.004139 4835 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-f195f04e-c0f1-495c-870b-981275804688\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f195f04e-c0f1-495c-870b-981275804688\") on node \"crc\" " Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.037441 4835 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.037619 4835 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-f195f04e-c0f1-495c-870b-981275804688" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f195f04e-c0f1-495c-870b-981275804688") on node "crc" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.106206 4835 reconciler_common.go:293] "Volume detached for volume \"pvc-f195f04e-c0f1-495c-870b-981275804688\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f195f04e-c0f1-495c-870b-981275804688\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.249627 4835 generic.go:334] "Generic (PLEG): container finished" podID="0722dbab-2741-445d-8585-04eaf3738c0d" containerID="a5b978755ae87a445fa89c43173d1251bf6dceafe546a4dbf3cc4982f81bf7bd" exitCode=0 Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.250859 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbxd8" event={"ID":"0722dbab-2741-445d-8585-04eaf3738c0d","Type":"ContainerDied","Data":"a5b978755ae87a445fa89c43173d1251bf6dceafe546a4dbf3cc4982f81bf7bd"} Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.264685 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"bf435b6c-8025-4835-9e66-dad5ca6a95c3","Type":"ContainerDied","Data":"ac7e4c0970e84d95e6c65037a78e2addc754236d5e42a603a6f92c7a801534e1"} Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.264734 4835 scope.go:117] "RemoveContainer" containerID="ff636b8ace26f3a68c898e60bc6db9609e6fb220bca6425a914cba993210d0b7" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.264859 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.298687 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-gdrnz"] Mar 19 09:49:23 crc kubenswrapper[4835]: E0319 09:49:23.299611 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf435b6c-8025-4835-9e66-dad5ca6a95c3" containerName="glance-log" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.299634 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf435b6c-8025-4835-9e66-dad5ca6a95c3" containerName="glance-log" Mar 19 09:49:23 crc kubenswrapper[4835]: E0319 09:49:23.299667 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf435b6c-8025-4835-9e66-dad5ca6a95c3" containerName="glance-httpd" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.299678 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf435b6c-8025-4835-9e66-dad5ca6a95c3" containerName="glance-httpd" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.300026 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf435b6c-8025-4835-9e66-dad5ca6a95c3" containerName="glance-log" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.300052 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf435b6c-8025-4835-9e66-dad5ca6a95c3" containerName="glance-httpd" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.301421 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-gdrnz" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.406450 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-gdrnz"] Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.467487 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vshkp\" (UniqueName: \"kubernetes.io/projected/f917b1a5-922f-444e-b621-b7a89e8df050-kube-api-access-vshkp\") pod \"nova-api-db-create-gdrnz\" (UID: \"f917b1a5-922f-444e-b621-b7a89e8df050\") " pod="openstack/nova-api-db-create-gdrnz" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.511850 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f917b1a5-922f-444e-b621-b7a89e8df050-operator-scripts\") pod \"nova-api-db-create-gdrnz\" (UID: \"f917b1a5-922f-444e-b621-b7a89e8df050\") " pod="openstack/nova-api-db-create-gdrnz" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.524533 4835 scope.go:117] "RemoveContainer" containerID="0176626d9fd33b2d9cd7286414558f248abac68798ee6b1d280ebbb62215e2c5" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.549822 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.598800 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.617573 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vshkp\" (UniqueName: \"kubernetes.io/projected/f917b1a5-922f-444e-b621-b7a89e8df050-kube-api-access-vshkp\") pod \"nova-api-db-create-gdrnz\" (UID: \"f917b1a5-922f-444e-b621-b7a89e8df050\") " pod="openstack/nova-api-db-create-gdrnz" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 
09:49:23.617820 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f917b1a5-922f-444e-b621-b7a89e8df050-operator-scripts\") pod \"nova-api-db-create-gdrnz\" (UID: \"f917b1a5-922f-444e-b621-b7a89e8df050\") " pod="openstack/nova-api-db-create-gdrnz" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.623279 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f917b1a5-922f-444e-b621-b7a89e8df050-operator-scripts\") pod \"nova-api-db-create-gdrnz\" (UID: \"f917b1a5-922f-444e-b621-b7a89e8df050\") " pod="openstack/nova-api-db-create-gdrnz" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.644373 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vshkp\" (UniqueName: \"kubernetes.io/projected/f917b1a5-922f-444e-b621-b7a89e8df050-kube-api-access-vshkp\") pod \"nova-api-db-create-gdrnz\" (UID: \"f917b1a5-922f-444e-b621-b7a89e8df050\") " pod="openstack/nova-api-db-create-gdrnz" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.650948 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-rvdwp"] Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.652496 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-rvdwp" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.654081 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-gdrnz" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.663493 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.666021 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.675630 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.675794 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-62ed-account-create-update-8hlr5"] Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.677350 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-62ed-account-create-update-8hlr5" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.677576 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.680464 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.701330 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-rvdwp"] Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.722721 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/755c8e66-e5e7-491b-9d77-a4b702ef6b92-operator-scripts\") pod \"nova-cell0-db-create-rvdwp\" (UID: \"755c8e66-e5e7-491b-9d77-a4b702ef6b92\") " pod="openstack/nova-cell0-db-create-rvdwp" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.723085 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48489f77-65d9-4e0d-88f5-951d6928dcb3-config-data\") pod \"glance-default-external-api-0\" (UID: \"48489f77-65d9-4e0d-88f5-951d6928dcb3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.723128 4835 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd8s8\" (UniqueName: \"kubernetes.io/projected/48489f77-65d9-4e0d-88f5-951d6928dcb3-kube-api-access-qd8s8\") pod \"glance-default-external-api-0\" (UID: \"48489f77-65d9-4e0d-88f5-951d6928dcb3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.723149 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48489f77-65d9-4e0d-88f5-951d6928dcb3-logs\") pod \"glance-default-external-api-0\" (UID: \"48489f77-65d9-4e0d-88f5-951d6928dcb3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.723184 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgs2n\" (UniqueName: \"kubernetes.io/projected/755c8e66-e5e7-491b-9d77-a4b702ef6b92-kube-api-access-kgs2n\") pod \"nova-cell0-db-create-rvdwp\" (UID: \"755c8e66-e5e7-491b-9d77-a4b702ef6b92\") " pod="openstack/nova-cell0-db-create-rvdwp" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.723211 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f195f04e-c0f1-495c-870b-981275804688\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f195f04e-c0f1-495c-870b-981275804688\") pod \"glance-default-external-api-0\" (UID: \"48489f77-65d9-4e0d-88f5-951d6928dcb3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.723243 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48489f77-65d9-4e0d-88f5-951d6928dcb3-scripts\") pod \"glance-default-external-api-0\" (UID: \"48489f77-65d9-4e0d-88f5-951d6928dcb3\") " 
pod="openstack/glance-default-external-api-0" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.723263 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48489f77-65d9-4e0d-88f5-951d6928dcb3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"48489f77-65d9-4e0d-88f5-951d6928dcb3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.723281 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48489f77-65d9-4e0d-88f5-951d6928dcb3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"48489f77-65d9-4e0d-88f5-951d6928dcb3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.723303 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c3ca948-8ce8-402a-995a-ffdfe23ceaa5-operator-scripts\") pod \"nova-api-62ed-account-create-update-8hlr5\" (UID: \"6c3ca948-8ce8-402a-995a-ffdfe23ceaa5\") " pod="openstack/nova-api-62ed-account-create-update-8hlr5" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.723380 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqk7z\" (UniqueName: \"kubernetes.io/projected/6c3ca948-8ce8-402a-995a-ffdfe23ceaa5-kube-api-access-mqk7z\") pod \"nova-api-62ed-account-create-update-8hlr5\" (UID: \"6c3ca948-8ce8-402a-995a-ffdfe23ceaa5\") " pod="openstack/nova-api-62ed-account-create-update-8hlr5" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.723397 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/48489f77-65d9-4e0d-88f5-951d6928dcb3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"48489f77-65d9-4e0d-88f5-951d6928dcb3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.726800 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.794811 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-62ed-account-create-update-8hlr5"] Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.824985 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-kxp5r"] Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.826364 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-kxp5r" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.831082 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqk7z\" (UniqueName: \"kubernetes.io/projected/6c3ca948-8ce8-402a-995a-ffdfe23ceaa5-kube-api-access-mqk7z\") pod \"nova-api-62ed-account-create-update-8hlr5\" (UID: \"6c3ca948-8ce8-402a-995a-ffdfe23ceaa5\") " pod="openstack/nova-api-62ed-account-create-update-8hlr5" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.831123 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/48489f77-65d9-4e0d-88f5-951d6928dcb3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"48489f77-65d9-4e0d-88f5-951d6928dcb3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.831228 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/755c8e66-e5e7-491b-9d77-a4b702ef6b92-operator-scripts\") pod \"nova-cell0-db-create-rvdwp\" 
(UID: \"755c8e66-e5e7-491b-9d77-a4b702ef6b92\") " pod="openstack/nova-cell0-db-create-rvdwp" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.831263 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48489f77-65d9-4e0d-88f5-951d6928dcb3-config-data\") pod \"glance-default-external-api-0\" (UID: \"48489f77-65d9-4e0d-88f5-951d6928dcb3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.831296 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qd8s8\" (UniqueName: \"kubernetes.io/projected/48489f77-65d9-4e0d-88f5-951d6928dcb3-kube-api-access-qd8s8\") pod \"glance-default-external-api-0\" (UID: \"48489f77-65d9-4e0d-88f5-951d6928dcb3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.831313 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48489f77-65d9-4e0d-88f5-951d6928dcb3-logs\") pod \"glance-default-external-api-0\" (UID: \"48489f77-65d9-4e0d-88f5-951d6928dcb3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.831346 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgs2n\" (UniqueName: \"kubernetes.io/projected/755c8e66-e5e7-491b-9d77-a4b702ef6b92-kube-api-access-kgs2n\") pod \"nova-cell0-db-create-rvdwp\" (UID: \"755c8e66-e5e7-491b-9d77-a4b702ef6b92\") " pod="openstack/nova-cell0-db-create-rvdwp" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.831372 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f195f04e-c0f1-495c-870b-981275804688\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f195f04e-c0f1-495c-870b-981275804688\") pod \"glance-default-external-api-0\" (UID: 
\"48489f77-65d9-4e0d-88f5-951d6928dcb3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.831400 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48489f77-65d9-4e0d-88f5-951d6928dcb3-scripts\") pod \"glance-default-external-api-0\" (UID: \"48489f77-65d9-4e0d-88f5-951d6928dcb3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.831424 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48489f77-65d9-4e0d-88f5-951d6928dcb3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"48489f77-65d9-4e0d-88f5-951d6928dcb3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.831446 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48489f77-65d9-4e0d-88f5-951d6928dcb3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"48489f77-65d9-4e0d-88f5-951d6928dcb3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.831460 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c3ca948-8ce8-402a-995a-ffdfe23ceaa5-operator-scripts\") pod \"nova-api-62ed-account-create-update-8hlr5\" (UID: \"6c3ca948-8ce8-402a-995a-ffdfe23ceaa5\") " pod="openstack/nova-api-62ed-account-create-update-8hlr5" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.832326 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/48489f77-65d9-4e0d-88f5-951d6928dcb3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"48489f77-65d9-4e0d-88f5-951d6928dcb3\") " 
pod="openstack/glance-default-external-api-0" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.832874 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/755c8e66-e5e7-491b-9d77-a4b702ef6b92-operator-scripts\") pod \"nova-cell0-db-create-rvdwp\" (UID: \"755c8e66-e5e7-491b-9d77-a4b702ef6b92\") " pod="openstack/nova-cell0-db-create-rvdwp" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.835116 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48489f77-65d9-4e0d-88f5-951d6928dcb3-logs\") pod \"glance-default-external-api-0\" (UID: \"48489f77-65d9-4e0d-88f5-951d6928dcb3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.836347 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c3ca948-8ce8-402a-995a-ffdfe23ceaa5-operator-scripts\") pod \"nova-api-62ed-account-create-update-8hlr5\" (UID: \"6c3ca948-8ce8-402a-995a-ffdfe23ceaa5\") " pod="openstack/nova-api-62ed-account-create-update-8hlr5" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.845917 4835 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.847301 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f195f04e-c0f1-495c-870b-981275804688\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f195f04e-c0f1-495c-870b-981275804688\") pod \"glance-default-external-api-0\" (UID: \"48489f77-65d9-4e0d-88f5-951d6928dcb3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4d183d018680155306ca255d36c5357de351d0f1f91a4bb616234d2bb906544a/globalmount\"" pod="openstack/glance-default-external-api-0" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.849799 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48489f77-65d9-4e0d-88f5-951d6928dcb3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"48489f77-65d9-4e0d-88f5-951d6928dcb3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.850125 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48489f77-65d9-4e0d-88f5-951d6928dcb3-scripts\") pod \"glance-default-external-api-0\" (UID: \"48489f77-65d9-4e0d-88f5-951d6928dcb3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.864534 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48489f77-65d9-4e0d-88f5-951d6928dcb3-config-data\") pod \"glance-default-external-api-0\" (UID: \"48489f77-65d9-4e0d-88f5-951d6928dcb3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.865503 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd8s8\" (UniqueName: \"kubernetes.io/projected/48489f77-65d9-4e0d-88f5-951d6928dcb3-kube-api-access-qd8s8\") pod 
\"glance-default-external-api-0\" (UID: \"48489f77-65d9-4e0d-88f5-951d6928dcb3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.882091 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgs2n\" (UniqueName: \"kubernetes.io/projected/755c8e66-e5e7-491b-9d77-a4b702ef6b92-kube-api-access-kgs2n\") pod \"nova-cell0-db-create-rvdwp\" (UID: \"755c8e66-e5e7-491b-9d77-a4b702ef6b92\") " pod="openstack/nova-cell0-db-create-rvdwp" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.882160 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-kxp5r"] Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.901911 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqk7z\" (UniqueName: \"kubernetes.io/projected/6c3ca948-8ce8-402a-995a-ffdfe23ceaa5-kube-api-access-mqk7z\") pod \"nova-api-62ed-account-create-update-8hlr5\" (UID: \"6c3ca948-8ce8-402a-995a-ffdfe23ceaa5\") " pod="openstack/nova-api-62ed-account-create-update-8hlr5" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.910614 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48489f77-65d9-4e0d-88f5-951d6928dcb3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"48489f77-65d9-4e0d-88f5-951d6928dcb3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.948791 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-d261-account-create-update-tskmn"] Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.950205 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-d261-account-create-update-tskmn" Mar 19 09:49:23 crc kubenswrapper[4835]: I0319 09:49:23.962225 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 19 09:49:24 crc kubenswrapper[4835]: I0319 09:49:24.032064 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-d261-account-create-update-tskmn"] Mar 19 09:49:24 crc kubenswrapper[4835]: I0319 09:49:24.037674 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk5fd\" (UniqueName: \"kubernetes.io/projected/097405cc-a2f0-4f58-875e-b9433a53bde9-kube-api-access-rk5fd\") pod \"nova-cell1-db-create-kxp5r\" (UID: \"097405cc-a2f0-4f58-875e-b9433a53bde9\") " pod="openstack/nova-cell1-db-create-kxp5r" Mar 19 09:49:24 crc kubenswrapper[4835]: I0319 09:49:24.037725 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/097405cc-a2f0-4f58-875e-b9433a53bde9-operator-scripts\") pod \"nova-cell1-db-create-kxp5r\" (UID: \"097405cc-a2f0-4f58-875e-b9433a53bde9\") " pod="openstack/nova-cell1-db-create-kxp5r" Mar 19 09:49:24 crc kubenswrapper[4835]: I0319 09:49:24.037810 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25d11e7d-9d08-4f99-b447-0a80feec9c95-operator-scripts\") pod \"nova-cell0-d261-account-create-update-tskmn\" (UID: \"25d11e7d-9d08-4f99-b447-0a80feec9c95\") " pod="openstack/nova-cell0-d261-account-create-update-tskmn" Mar 19 09:49:24 crc kubenswrapper[4835]: I0319 09:49:24.037853 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2m2t\" (UniqueName: \"kubernetes.io/projected/25d11e7d-9d08-4f99-b447-0a80feec9c95-kube-api-access-m2m2t\") pod 
\"nova-cell0-d261-account-create-update-tskmn\" (UID: \"25d11e7d-9d08-4f99-b447-0a80feec9c95\") " pod="openstack/nova-cell0-d261-account-create-update-tskmn" Mar 19 09:49:24 crc kubenswrapper[4835]: I0319 09:49:24.097369 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-rvdwp" Mar 19 09:49:24 crc kubenswrapper[4835]: I0319 09:49:24.109861 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f195f04e-c0f1-495c-870b-981275804688\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f195f04e-c0f1-495c-870b-981275804688\") pod \"glance-default-external-api-0\" (UID: \"48489f77-65d9-4e0d-88f5-951d6928dcb3\") " pod="openstack/glance-default-external-api-0" Mar 19 09:49:24 crc kubenswrapper[4835]: I0319 09:49:24.135766 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-62ed-account-create-update-8hlr5" Mar 19 09:49:24 crc kubenswrapper[4835]: I0319 09:49:24.143366 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25d11e7d-9d08-4f99-b447-0a80feec9c95-operator-scripts\") pod \"nova-cell0-d261-account-create-update-tskmn\" (UID: \"25d11e7d-9d08-4f99-b447-0a80feec9c95\") " pod="openstack/nova-cell0-d261-account-create-update-tskmn" Mar 19 09:49:24 crc kubenswrapper[4835]: I0319 09:49:24.143447 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2m2t\" (UniqueName: \"kubernetes.io/projected/25d11e7d-9d08-4f99-b447-0a80feec9c95-kube-api-access-m2m2t\") pod \"nova-cell0-d261-account-create-update-tskmn\" (UID: \"25d11e7d-9d08-4f99-b447-0a80feec9c95\") " pod="openstack/nova-cell0-d261-account-create-update-tskmn" Mar 19 09:49:24 crc kubenswrapper[4835]: I0319 09:49:24.143766 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk5fd\" 
(UniqueName: \"kubernetes.io/projected/097405cc-a2f0-4f58-875e-b9433a53bde9-kube-api-access-rk5fd\") pod \"nova-cell1-db-create-kxp5r\" (UID: \"097405cc-a2f0-4f58-875e-b9433a53bde9\") " pod="openstack/nova-cell1-db-create-kxp5r" Mar 19 09:49:24 crc kubenswrapper[4835]: I0319 09:49:24.143824 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/097405cc-a2f0-4f58-875e-b9433a53bde9-operator-scripts\") pod \"nova-cell1-db-create-kxp5r\" (UID: \"097405cc-a2f0-4f58-875e-b9433a53bde9\") " pod="openstack/nova-cell1-db-create-kxp5r" Mar 19 09:49:24 crc kubenswrapper[4835]: I0319 09:49:24.144809 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/097405cc-a2f0-4f58-875e-b9433a53bde9-operator-scripts\") pod \"nova-cell1-db-create-kxp5r\" (UID: \"097405cc-a2f0-4f58-875e-b9433a53bde9\") " pod="openstack/nova-cell1-db-create-kxp5r" Mar 19 09:49:24 crc kubenswrapper[4835]: I0319 09:49:24.146330 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25d11e7d-9d08-4f99-b447-0a80feec9c95-operator-scripts\") pod \"nova-cell0-d261-account-create-update-tskmn\" (UID: \"25d11e7d-9d08-4f99-b447-0a80feec9c95\") " pod="openstack/nova-cell0-d261-account-create-update-tskmn" Mar 19 09:49:24 crc kubenswrapper[4835]: I0319 09:49:24.149820 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-38a1-account-create-update-csb2n"] Mar 19 09:49:24 crc kubenswrapper[4835]: I0319 09:49:24.151503 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-38a1-account-create-update-csb2n" Mar 19 09:49:24 crc kubenswrapper[4835]: I0319 09:49:24.154254 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 19 09:49:24 crc kubenswrapper[4835]: I0319 09:49:24.168986 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-38a1-account-create-update-csb2n"] Mar 19 09:49:24 crc kubenswrapper[4835]: I0319 09:49:24.203457 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2m2t\" (UniqueName: \"kubernetes.io/projected/25d11e7d-9d08-4f99-b447-0a80feec9c95-kube-api-access-m2m2t\") pod \"nova-cell0-d261-account-create-update-tskmn\" (UID: \"25d11e7d-9d08-4f99-b447-0a80feec9c95\") " pod="openstack/nova-cell0-d261-account-create-update-tskmn" Mar 19 09:49:24 crc kubenswrapper[4835]: I0319 09:49:24.237437 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xf267" podUID="14dd8f89-f4d8-4618-b417-f1a802f3517d" containerName="registry-server" probeResult="failure" output=< Mar 19 09:49:24 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 09:49:24 crc kubenswrapper[4835]: > Mar 19 09:49:24 crc kubenswrapper[4835]: I0319 09:49:24.254424 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpgf7\" (UniqueName: \"kubernetes.io/projected/905929b3-4296-4ec7-9532-d7ad6e80ad84-kube-api-access-zpgf7\") pod \"nova-cell1-38a1-account-create-update-csb2n\" (UID: \"905929b3-4296-4ec7-9532-d7ad6e80ad84\") " pod="openstack/nova-cell1-38a1-account-create-update-csb2n" Mar 19 09:49:24 crc kubenswrapper[4835]: I0319 09:49:24.254550 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/905929b3-4296-4ec7-9532-d7ad6e80ad84-operator-scripts\") pod \"nova-cell1-38a1-account-create-update-csb2n\" (UID: \"905929b3-4296-4ec7-9532-d7ad6e80ad84\") " pod="openstack/nova-cell1-38a1-account-create-update-csb2n" Mar 19 09:49:24 crc kubenswrapper[4835]: I0319 09:49:24.290907 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk5fd\" (UniqueName: \"kubernetes.io/projected/097405cc-a2f0-4f58-875e-b9433a53bde9-kube-api-access-rk5fd\") pod \"nova-cell1-db-create-kxp5r\" (UID: \"097405cc-a2f0-4f58-875e-b9433a53bde9\") " pod="openstack/nova-cell1-db-create-kxp5r" Mar 19 09:49:24 crc kubenswrapper[4835]: I0319 09:49:24.352204 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d261-account-create-update-tskmn" Mar 19 09:49:24 crc kubenswrapper[4835]: I0319 09:49:24.357349 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/905929b3-4296-4ec7-9532-d7ad6e80ad84-operator-scripts\") pod \"nova-cell1-38a1-account-create-update-csb2n\" (UID: \"905929b3-4296-4ec7-9532-d7ad6e80ad84\") " pod="openstack/nova-cell1-38a1-account-create-update-csb2n" Mar 19 09:49:24 crc kubenswrapper[4835]: I0319 09:49:24.358626 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/905929b3-4296-4ec7-9532-d7ad6e80ad84-operator-scripts\") pod \"nova-cell1-38a1-account-create-update-csb2n\" (UID: \"905929b3-4296-4ec7-9532-d7ad6e80ad84\") " pod="openstack/nova-cell1-38a1-account-create-update-csb2n" Mar 19 09:49:24 crc kubenswrapper[4835]: I0319 09:49:24.377430 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpgf7\" (UniqueName: \"kubernetes.io/projected/905929b3-4296-4ec7-9532-d7ad6e80ad84-kube-api-access-zpgf7\") pod \"nova-cell1-38a1-account-create-update-csb2n\" 
(UID: \"905929b3-4296-4ec7-9532-d7ad6e80ad84\") " pod="openstack/nova-cell1-38a1-account-create-update-csb2n" Mar 19 09:49:24 crc kubenswrapper[4835]: I0319 09:49:24.414380 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 09:49:24 crc kubenswrapper[4835]: I0319 09:49:24.419963 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpgf7\" (UniqueName: \"kubernetes.io/projected/905929b3-4296-4ec7-9532-d7ad6e80ad84-kube-api-access-zpgf7\") pod \"nova-cell1-38a1-account-create-update-csb2n\" (UID: \"905929b3-4296-4ec7-9532-d7ad6e80ad84\") " pod="openstack/nova-cell1-38a1-account-create-update-csb2n" Mar 19 09:49:24 crc kubenswrapper[4835]: I0319 09:49:24.461438 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf435b6c-8025-4835-9e66-dad5ca6a95c3" path="/var/lib/kubelet/pods/bf435b6c-8025-4835-9e66-dad5ca6a95c3/volumes" Mar 19 09:49:24 crc kubenswrapper[4835]: I0319 09:49:24.570190 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-kxp5r" Mar 19 09:49:24 crc kubenswrapper[4835]: I0319 09:49:24.570690 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-gdrnz"] Mar 19 09:49:24 crc kubenswrapper[4835]: I0319 09:49:24.677042 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-38a1-account-create-update-csb2n" Mar 19 09:49:24 crc kubenswrapper[4835]: I0319 09:49:24.937643 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-rvdwp"] Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.124444 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-62ed-account-create-update-8hlr5"] Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.411656 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-gdrnz" event={"ID":"f917b1a5-922f-444e-b621-b7a89e8df050","Type":"ContainerStarted","Data":"6d0fa28b880c3a0a7ce0ed4caec5d24109a7dfc5741ff7f60c5b3b2c46f62de6"} Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.412019 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-gdrnz" event={"ID":"f917b1a5-922f-444e-b621-b7a89e8df050","Type":"ContainerStarted","Data":"dfd89894b217253ba47da3f19cb3d9e932ab1bfd844e822e4a678060fa1de8ee"} Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.425076 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-62ed-account-create-update-8hlr5" event={"ID":"6c3ca948-8ce8-402a-995a-ffdfe23ceaa5","Type":"ContainerStarted","Data":"56c09323454810f084829ddc2082f73e1b0fc342585c9d8a906739919dd14759"} Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.432340 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbxd8" event={"ID":"0722dbab-2741-445d-8585-04eaf3738c0d","Type":"ContainerStarted","Data":"84e6424ea427e17a2fd8692f654e8b3e2d680f8d5cfb5b1f03b45085a09f703e"} Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.438492 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-gdrnz" podStartSLOduration=2.4384713 podStartE2EDuration="2.4384713s" podCreationTimestamp="2026-03-19 09:49:23 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:49:25.432903671 +0000 UTC m=+1620.281502268" watchObservedRunningTime="2026-03-19 09:49:25.4384713 +0000 UTC m=+1620.287069887" Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.453194 4835 generic.go:334] "Generic (PLEG): container finished" podID="236685e9-820f-4a95-85fc-c47b24cc0a73" containerID="0147565355a1e16a24b8881476ad11558f8d50a68c44e87936dd43a70e0530ff" exitCode=0 Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.453273 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"236685e9-820f-4a95-85fc-c47b24cc0a73","Type":"ContainerDied","Data":"0147565355a1e16a24b8881476ad11558f8d50a68c44e87936dd43a70e0530ff"} Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.462457 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rvdwp" event={"ID":"755c8e66-e5e7-491b-9d77-a4b702ef6b92","Type":"ContainerStarted","Data":"39eca1cbcc12413b3409d31da7b6a1fa5e14b10da642233afc209e77be3fdea7"} Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.462505 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rvdwp" event={"ID":"755c8e66-e5e7-491b-9d77-a4b702ef6b92","Type":"ContainerStarted","Data":"d03f62a3f5b1cb93a4acb4ec22252e892299c4fe1e2a2e751c3f2f1cf0a962c3"} Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.479097 4835 generic.go:334] "Generic (PLEG): container finished" podID="1342aee7-66dc-4df2-b44a-970b332ffc8d" containerID="a9d7b18f48ece6410e4b81f72a2d34c9b47a755bf156c0def1a5db05c2152b5d" exitCode=0 Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.479169 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6ffb669df8-qcq6h" 
event={"ID":"1342aee7-66dc-4df2-b44a-970b332ffc8d","Type":"ContainerDied","Data":"a9d7b18f48ece6410e4b81f72a2d34c9b47a755bf156c0def1a5db05c2152b5d"} Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.506097 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zbxd8" podStartSLOduration=4.878219189 podStartE2EDuration="8.506079059s" podCreationTimestamp="2026-03-19 09:49:17 +0000 UTC" firstStartedPulling="2026-03-19 09:49:20.147610578 +0000 UTC m=+1614.996209165" lastFinishedPulling="2026-03-19 09:49:23.775470448 +0000 UTC m=+1618.624069035" observedRunningTime="2026-03-19 09:49:25.45840203 +0000 UTC m=+1620.307000637" watchObservedRunningTime="2026-03-19 09:49:25.506079059 +0000 UTC m=+1620.354677646" Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.522248 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-rvdwp" podStartSLOduration=2.522232119 podStartE2EDuration="2.522232119s" podCreationTimestamp="2026-03-19 09:49:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:49:25.499228047 +0000 UTC m=+1620.347826634" watchObservedRunningTime="2026-03-19 09:49:25.522232119 +0000 UTC m=+1620.370830706" Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.649203 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.736527 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/236685e9-820f-4a95-85fc-c47b24cc0a73-httpd-run\") pod \"236685e9-820f-4a95-85fc-c47b24cc0a73\" (UID: \"236685e9-820f-4a95-85fc-c47b24cc0a73\") " Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.736910 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/236685e9-820f-4a95-85fc-c47b24cc0a73-config-data\") pod \"236685e9-820f-4a95-85fc-c47b24cc0a73\" (UID: \"236685e9-820f-4a95-85fc-c47b24cc0a73\") " Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.737081 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/236685e9-820f-4a95-85fc-c47b24cc0a73-logs\") pod \"236685e9-820f-4a95-85fc-c47b24cc0a73\" (UID: \"236685e9-820f-4a95-85fc-c47b24cc0a73\") " Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.737127 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/236685e9-820f-4a95-85fc-c47b24cc0a73-scripts\") pod \"236685e9-820f-4a95-85fc-c47b24cc0a73\" (UID: \"236685e9-820f-4a95-85fc-c47b24cc0a73\") " Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.737257 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/236685e9-820f-4a95-85fc-c47b24cc0a73-internal-tls-certs\") pod \"236685e9-820f-4a95-85fc-c47b24cc0a73\" (UID: \"236685e9-820f-4a95-85fc-c47b24cc0a73\") " Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.738205 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/236685e9-820f-4a95-85fc-c47b24cc0a73-logs" 
(OuterVolumeSpecName: "logs") pod "236685e9-820f-4a95-85fc-c47b24cc0a73" (UID: "236685e9-820f-4a95-85fc-c47b24cc0a73"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.738582 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9eba352b-5554-4046-b92a-7c10e5b6276a\") pod \"236685e9-820f-4a95-85fc-c47b24cc0a73\" (UID: \"236685e9-820f-4a95-85fc-c47b24cc0a73\") " Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.738660 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/236685e9-820f-4a95-85fc-c47b24cc0a73-combined-ca-bundle\") pod \"236685e9-820f-4a95-85fc-c47b24cc0a73\" (UID: \"236685e9-820f-4a95-85fc-c47b24cc0a73\") " Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.738819 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltvpq\" (UniqueName: \"kubernetes.io/projected/236685e9-820f-4a95-85fc-c47b24cc0a73-kube-api-access-ltvpq\") pod \"236685e9-820f-4a95-85fc-c47b24cc0a73\" (UID: \"236685e9-820f-4a95-85fc-c47b24cc0a73\") " Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.739917 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/236685e9-820f-4a95-85fc-c47b24cc0a73-logs\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.740509 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/236685e9-820f-4a95-85fc-c47b24cc0a73-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "236685e9-820f-4a95-85fc-c47b24cc0a73" (UID: "236685e9-820f-4a95-85fc-c47b24cc0a73"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.764857 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/236685e9-820f-4a95-85fc-c47b24cc0a73-scripts" (OuterVolumeSpecName: "scripts") pod "236685e9-820f-4a95-85fc-c47b24cc0a73" (UID: "236685e9-820f-4a95-85fc-c47b24cc0a73"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.767938 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/236685e9-820f-4a95-85fc-c47b24cc0a73-kube-api-access-ltvpq" (OuterVolumeSpecName: "kube-api-access-ltvpq") pod "236685e9-820f-4a95-85fc-c47b24cc0a73" (UID: "236685e9-820f-4a95-85fc-c47b24cc0a73"). InnerVolumeSpecName "kube-api-access-ltvpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.831680 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/236685e9-820f-4a95-85fc-c47b24cc0a73-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "236685e9-820f-4a95-85fc-c47b24cc0a73" (UID: "236685e9-820f-4a95-85fc-c47b24cc0a73"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.841531 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-d261-account-create-update-tskmn"] Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.845800 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/236685e9-820f-4a95-85fc-c47b24cc0a73-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.845838 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/236685e9-820f-4a95-85fc-c47b24cc0a73-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.845852 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltvpq\" (UniqueName: \"kubernetes.io/projected/236685e9-820f-4a95-85fc-c47b24cc0a73-kube-api-access-ltvpq\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.845865 4835 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/236685e9-820f-4a95-85fc-c47b24cc0a73-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.862505 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/236685e9-820f-4a95-85fc-c47b24cc0a73-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "236685e9-820f-4a95-85fc-c47b24cc0a73" (UID: "236685e9-820f-4a95-85fc-c47b24cc0a73"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.884061 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9eba352b-5554-4046-b92a-7c10e5b6276a" (OuterVolumeSpecName: "glance") pod "236685e9-820f-4a95-85fc-c47b24cc0a73" (UID: "236685e9-820f-4a95-85fc-c47b24cc0a73"). InnerVolumeSpecName "pvc-9eba352b-5554-4046-b92a-7c10e5b6276a". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.888504 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/236685e9-820f-4a95-85fc-c47b24cc0a73-config-data" (OuterVolumeSpecName: "config-data") pod "236685e9-820f-4a95-85fc-c47b24cc0a73" (UID: "236685e9-820f-4a95-85fc-c47b24cc0a73"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.957460 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/236685e9-820f-4a95-85fc-c47b24cc0a73-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.957496 4835 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/236685e9-820f-4a95-85fc-c47b24cc0a73-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:25 crc kubenswrapper[4835]: I0319 09:49:25.957526 4835 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-9eba352b-5554-4046-b92a-7c10e5b6276a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9eba352b-5554-4046-b92a-7c10e5b6276a\") on node \"crc\" " Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.042368 4835 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.042798 4835 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-9eba352b-5554-4046-b92a-7c10e5b6276a" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9eba352b-5554-4046-b92a-7c10e5b6276a") on node "crc" Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.048878 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.059802 4835 reconciler_common.go:293] "Volume detached for volume \"pvc-9eba352b-5554-4046-b92a-7c10e5b6276a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9eba352b-5554-4046-b92a-7c10e5b6276a\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.228866 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-kxp5r"] Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.254723 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-38a1-account-create-update-csb2n"] Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.258907 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-6ffb669df8-qcq6h" Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.368047 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8q5g\" (UniqueName: \"kubernetes.io/projected/1342aee7-66dc-4df2-b44a-970b332ffc8d-kube-api-access-n8q5g\") pod \"1342aee7-66dc-4df2-b44a-970b332ffc8d\" (UID: \"1342aee7-66dc-4df2-b44a-970b332ffc8d\") " Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.368450 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1342aee7-66dc-4df2-b44a-970b332ffc8d-config-data-custom\") pod \"1342aee7-66dc-4df2-b44a-970b332ffc8d\" (UID: \"1342aee7-66dc-4df2-b44a-970b332ffc8d\") " Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.368689 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1342aee7-66dc-4df2-b44a-970b332ffc8d-combined-ca-bundle\") pod \"1342aee7-66dc-4df2-b44a-970b332ffc8d\" (UID: \"1342aee7-66dc-4df2-b44a-970b332ffc8d\") " Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.368811 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1342aee7-66dc-4df2-b44a-970b332ffc8d-config-data\") pod \"1342aee7-66dc-4df2-b44a-970b332ffc8d\" (UID: \"1342aee7-66dc-4df2-b44a-970b332ffc8d\") " Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.379939 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1342aee7-66dc-4df2-b44a-970b332ffc8d-kube-api-access-n8q5g" (OuterVolumeSpecName: "kube-api-access-n8q5g") pod "1342aee7-66dc-4df2-b44a-970b332ffc8d" (UID: "1342aee7-66dc-4df2-b44a-970b332ffc8d"). InnerVolumeSpecName "kube-api-access-n8q5g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.380193 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1342aee7-66dc-4df2-b44a-970b332ffc8d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1342aee7-66dc-4df2-b44a-970b332ffc8d" (UID: "1342aee7-66dc-4df2-b44a-970b332ffc8d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.474587 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8q5g\" (UniqueName: \"kubernetes.io/projected/1342aee7-66dc-4df2-b44a-970b332ffc8d-kube-api-access-n8q5g\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.474629 4835 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1342aee7-66dc-4df2-b44a-970b332ffc8d-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.486645 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1342aee7-66dc-4df2-b44a-970b332ffc8d-config-data" (OuterVolumeSpecName: "config-data") pod "1342aee7-66dc-4df2-b44a-970b332ffc8d" (UID: "1342aee7-66dc-4df2-b44a-970b332ffc8d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.498894 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1342aee7-66dc-4df2-b44a-970b332ffc8d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1342aee7-66dc-4df2-b44a-970b332ffc8d" (UID: "1342aee7-66dc-4df2-b44a-970b332ffc8d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.588321 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1342aee7-66dc-4df2-b44a-970b332ffc8d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.588382 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1342aee7-66dc-4df2-b44a-970b332ffc8d-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.591265 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d261-account-create-update-tskmn" event={"ID":"25d11e7d-9d08-4f99-b447-0a80feec9c95","Type":"ContainerStarted","Data":"62013a64e403873a78d0a883c182c65f16986079912450b96c7878082449b42c"} Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.591318 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d261-account-create-update-tskmn" event={"ID":"25d11e7d-9d08-4f99-b447-0a80feec9c95","Type":"ContainerStarted","Data":"3dae514869909c43ff008a9f0dcc4f9589942799725a7a39ed9097e64315a000"} Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.596272 4835 generic.go:334] "Generic (PLEG): container finished" podID="755c8e66-e5e7-491b-9d77-a4b702ef6b92" containerID="39eca1cbcc12413b3409d31da7b6a1fa5e14b10da642233afc209e77be3fdea7" exitCode=0 Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.596509 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rvdwp" event={"ID":"755c8e66-e5e7-491b-9d77-a4b702ef6b92","Type":"ContainerDied","Data":"39eca1cbcc12413b3409d31da7b6a1fa5e14b10da642233afc209e77be3fdea7"} Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.607203 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"48489f77-65d9-4e0d-88f5-951d6928dcb3","Type":"ContainerStarted","Data":"70035a95abd9612ca29d19d772ee6028d15d2ebedc4be1b1ca76e6ec3664e746"} Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.611538 4835 generic.go:334] "Generic (PLEG): container finished" podID="f917b1a5-922f-444e-b621-b7a89e8df050" containerID="6d0fa28b880c3a0a7ce0ed4caec5d24109a7dfc5741ff7f60c5b3b2c46f62de6" exitCode=0 Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.612049 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-gdrnz" event={"ID":"f917b1a5-922f-444e-b621-b7a89e8df050","Type":"ContainerDied","Data":"6d0fa28b880c3a0a7ce0ed4caec5d24109a7dfc5741ff7f60c5b3b2c46f62de6"} Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.616041 4835 generic.go:334] "Generic (PLEG): container finished" podID="6c3ca948-8ce8-402a-995a-ffdfe23ceaa5" containerID="776d6925a324fb465f8d28244bd55ae19a439e475e5f0aba8f8306aa99d0f8c8" exitCode=0 Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.616147 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-62ed-account-create-update-8hlr5" event={"ID":"6c3ca948-8ce8-402a-995a-ffdfe23ceaa5","Type":"ContainerDied","Data":"776d6925a324fb465f8d28244bd55ae19a439e475e5f0aba8f8306aa99d0f8c8"} Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.628822 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kxp5r" event={"ID":"097405cc-a2f0-4f58-875e-b9433a53bde9","Type":"ContainerStarted","Data":"55acfce6940ce5b1a8adb679a6788fc7b781dcfbd38825fa3ea5d3329016d5c0"} Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.635241 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-38a1-account-create-update-csb2n" event={"ID":"905929b3-4296-4ec7-9532-d7ad6e80ad84","Type":"ContainerStarted","Data":"0b48b90c8a80536a66d268ccd8e65ba3aba664036fb8c4b3c7e5e967aca53428"} Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 
09:49:26.644643 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"236685e9-820f-4a95-85fc-c47b24cc0a73","Type":"ContainerDied","Data":"bdb5b9aab1e7d9d20a30ba02a0bd6cb67f4b1a3d7411457aded4deac37be1795"} Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.644680 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.644699 4835 scope.go:117] "RemoveContainer" containerID="0147565355a1e16a24b8881476ad11558f8d50a68c44e87936dd43a70e0530ff" Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.678112 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-d261-account-create-update-tskmn" podStartSLOduration=3.678089251 podStartE2EDuration="3.678089251s" podCreationTimestamp="2026-03-19 09:49:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:49:26.617832806 +0000 UTC m=+1621.466431393" watchObservedRunningTime="2026-03-19 09:49:26.678089251 +0000 UTC m=+1621.526687838" Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.719220 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-6ffb669df8-qcq6h" Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.719243 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6ffb669df8-qcq6h" event={"ID":"1342aee7-66dc-4df2-b44a-970b332ffc8d","Type":"ContainerDied","Data":"38534877503fd88b2a46dea46dd31fb7133bf40e642174cb24bfe770fdcbd001"} Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.756040 4835 scope.go:117] "RemoveContainer" containerID="6145fee7cb692671de67b9f4efa736538c1b33e71134adacfff51490c26d1bb3" Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.851348 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.871918 4835 scope.go:117] "RemoveContainer" containerID="a9d7b18f48ece6410e4b81f72a2d34c9b47a755bf156c0def1a5db05c2152b5d" Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.940886 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.979665 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 09:49:26 crc kubenswrapper[4835]: E0319 09:49:26.980798 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1342aee7-66dc-4df2-b44a-970b332ffc8d" containerName="heat-engine" Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.980822 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1342aee7-66dc-4df2-b44a-970b332ffc8d" containerName="heat-engine" Mar 19 09:49:26 crc kubenswrapper[4835]: E0319 09:49:26.980866 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="236685e9-820f-4a95-85fc-c47b24cc0a73" containerName="glance-httpd" Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.980878 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="236685e9-820f-4a95-85fc-c47b24cc0a73" 
containerName="glance-httpd" Mar 19 09:49:26 crc kubenswrapper[4835]: E0319 09:49:26.980899 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="236685e9-820f-4a95-85fc-c47b24cc0a73" containerName="glance-log" Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.980933 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="236685e9-820f-4a95-85fc-c47b24cc0a73" containerName="glance-log" Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.982395 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="1342aee7-66dc-4df2-b44a-970b332ffc8d" containerName="heat-engine" Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.982442 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="236685e9-820f-4a95-85fc-c47b24cc0a73" containerName="glance-httpd" Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.982463 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="236685e9-820f-4a95-85fc-c47b24cc0a73" containerName="glance-log" Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.989454 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.994055 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.995120 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 19 09:49:26 crc kubenswrapper[4835]: I0319 09:49:26.995342 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.025275 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-6ffb669df8-qcq6h"] Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.074227 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-6ffb669df8-qcq6h"] Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.104502 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9eba352b-5554-4046-b92a-7c10e5b6276a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9eba352b-5554-4046-b92a-7c10e5b6276a\") pod \"glance-default-internal-api-0\" (UID: \"ff2e9785-1044-4469-a034-9baeb46ff607\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.104960 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff2e9785-1044-4469-a034-9baeb46ff607-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ff2e9785-1044-4469-a034-9baeb46ff607\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.105123 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8666\" (UniqueName: 
\"kubernetes.io/projected/ff2e9785-1044-4469-a034-9baeb46ff607-kube-api-access-d8666\") pod \"glance-default-internal-api-0\" (UID: \"ff2e9785-1044-4469-a034-9baeb46ff607\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.106787 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff2e9785-1044-4469-a034-9baeb46ff607-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ff2e9785-1044-4469-a034-9baeb46ff607\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.107118 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff2e9785-1044-4469-a034-9baeb46ff607-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ff2e9785-1044-4469-a034-9baeb46ff607\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.107380 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff2e9785-1044-4469-a034-9baeb46ff607-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ff2e9785-1044-4469-a034-9baeb46ff607\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.107992 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff2e9785-1044-4469-a034-9baeb46ff607-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ff2e9785-1044-4469-a034-9baeb46ff607\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.108226 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff2e9785-1044-4469-a034-9baeb46ff607-logs\") pod \"glance-default-internal-api-0\" (UID: \"ff2e9785-1044-4469-a034-9baeb46ff607\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.210683 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff2e9785-1044-4469-a034-9baeb46ff607-logs\") pod \"glance-default-internal-api-0\" (UID: \"ff2e9785-1044-4469-a034-9baeb46ff607\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.211065 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9eba352b-5554-4046-b92a-7c10e5b6276a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9eba352b-5554-4046-b92a-7c10e5b6276a\") pod \"glance-default-internal-api-0\" (UID: \"ff2e9785-1044-4469-a034-9baeb46ff607\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.211167 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff2e9785-1044-4469-a034-9baeb46ff607-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ff2e9785-1044-4469-a034-9baeb46ff607\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.211239 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8666\" (UniqueName: \"kubernetes.io/projected/ff2e9785-1044-4469-a034-9baeb46ff607-kube-api-access-d8666\") pod \"glance-default-internal-api-0\" (UID: \"ff2e9785-1044-4469-a034-9baeb46ff607\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.211329 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/ff2e9785-1044-4469-a034-9baeb46ff607-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ff2e9785-1044-4469-a034-9baeb46ff607\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.211407 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff2e9785-1044-4469-a034-9baeb46ff607-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ff2e9785-1044-4469-a034-9baeb46ff607\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.211488 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff2e9785-1044-4469-a034-9baeb46ff607-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ff2e9785-1044-4469-a034-9baeb46ff607\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.211569 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff2e9785-1044-4469-a034-9baeb46ff607-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ff2e9785-1044-4469-a034-9baeb46ff607\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.211331 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff2e9785-1044-4469-a034-9baeb46ff607-logs\") pod \"glance-default-internal-api-0\" (UID: \"ff2e9785-1044-4469-a034-9baeb46ff607\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.212913 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff2e9785-1044-4469-a034-9baeb46ff607-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"ff2e9785-1044-4469-a034-9baeb46ff607\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.228477 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff2e9785-1044-4469-a034-9baeb46ff607-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ff2e9785-1044-4469-a034-9baeb46ff607\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.228486 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff2e9785-1044-4469-a034-9baeb46ff607-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ff2e9785-1044-4469-a034-9baeb46ff607\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.229158 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff2e9785-1044-4469-a034-9baeb46ff607-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ff2e9785-1044-4469-a034-9baeb46ff607\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.238714 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff2e9785-1044-4469-a034-9baeb46ff607-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ff2e9785-1044-4469-a034-9baeb46ff607\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.249643 4835 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.249870 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9eba352b-5554-4046-b92a-7c10e5b6276a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9eba352b-5554-4046-b92a-7c10e5b6276a\") pod \"glance-default-internal-api-0\" (UID: \"ff2e9785-1044-4469-a034-9baeb46ff607\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/89205a9f8fafa059e3c8d5aabb19b28af3510e2ed70a778ed91ec4c662cc8e20/globalmount\"" pod="openstack/glance-default-internal-api-0" Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.264381 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8666\" (UniqueName: \"kubernetes.io/projected/ff2e9785-1044-4469-a034-9baeb46ff607-kube-api-access-d8666\") pod \"glance-default-internal-api-0\" (UID: \"ff2e9785-1044-4469-a034-9baeb46ff607\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.307097 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9eba352b-5554-4046-b92a-7c10e5b6276a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9eba352b-5554-4046-b92a-7c10e5b6276a\") pod \"glance-default-internal-api-0\" (UID: \"ff2e9785-1044-4469-a034-9baeb46ff607\") " pod="openstack/glance-default-internal-api-0" Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.360905 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.783347 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"48489f77-65d9-4e0d-88f5-951d6928dcb3","Type":"ContainerStarted","Data":"5974c8f40a401df773dd9af6a91d018159565fb47569f34dba43c9b4e565e38e"} Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.811550 4835 generic.go:334] "Generic (PLEG): container finished" podID="097405cc-a2f0-4f58-875e-b9433a53bde9" containerID="b3f18845241f81076d63e42367c4a9c57e07b759e305dfdddecbe577b512e3a5" exitCode=0 Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.813232 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kxp5r" event={"ID":"097405cc-a2f0-4f58-875e-b9433a53bde9","Type":"ContainerDied","Data":"b3f18845241f81076d63e42367c4a9c57e07b759e305dfdddecbe577b512e3a5"} Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.827780 4835 generic.go:334] "Generic (PLEG): container finished" podID="25d11e7d-9d08-4f99-b447-0a80feec9c95" containerID="62013a64e403873a78d0a883c182c65f16986079912450b96c7878082449b42c" exitCode=0 Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.827881 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d261-account-create-update-tskmn" event={"ID":"25d11e7d-9d08-4f99-b447-0a80feec9c95","Type":"ContainerDied","Data":"62013a64e403873a78d0a883c182c65f16986079912450b96c7878082449b42c"} Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.851794 4835 generic.go:334] "Generic (PLEG): container finished" podID="905929b3-4296-4ec7-9532-d7ad6e80ad84" containerID="e38fc7e1c8eceb13aaeca3ac1b819419c6c411847893dd642c51685b1de2119c" exitCode=0 Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.852531 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-38a1-account-create-update-csb2n" 
event={"ID":"905929b3-4296-4ec7-9532-d7ad6e80ad84","Type":"ContainerDied","Data":"e38fc7e1c8eceb13aaeca3ac1b819419c6c411847893dd642c51685b1de2119c"} Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.989328 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zbxd8" Mar 19 09:49:27 crc kubenswrapper[4835]: I0319 09:49:27.989488 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zbxd8" Mar 19 09:49:28 crc kubenswrapper[4835]: I0319 09:49:28.117102 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zbxd8" Mar 19 09:49:28 crc kubenswrapper[4835]: I0319 09:49:28.390245 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 09:49:28 crc kubenswrapper[4835]: I0319 09:49:28.608923 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1342aee7-66dc-4df2-b44a-970b332ffc8d" path="/var/lib/kubelet/pods/1342aee7-66dc-4df2-b44a-970b332ffc8d/volumes" Mar 19 09:49:28 crc kubenswrapper[4835]: I0319 09:49:28.610069 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="236685e9-820f-4a95-85fc-c47b24cc0a73" path="/var/lib/kubelet/pods/236685e9-820f-4a95-85fc-c47b24cc0a73/volumes" Mar 19 09:49:28 crc kubenswrapper[4835]: I0319 09:49:28.842251 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-rvdwp" Mar 19 09:49:28 crc kubenswrapper[4835]: I0319 09:49:28.876910 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-62ed-account-create-update-8hlr5" Mar 19 09:49:28 crc kubenswrapper[4835]: I0319 09:49:28.884225 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-gdrnz" Mar 19 09:49:28 crc kubenswrapper[4835]: I0319 09:49:28.885825 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ff2e9785-1044-4469-a034-9baeb46ff607","Type":"ContainerStarted","Data":"2dcef76ca9acc6e5855264d351a70303636eb6ecb60668281462370f1ce81295"} Mar 19 09:49:28 crc kubenswrapper[4835]: I0319 09:49:28.891972 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-rvdwp" event={"ID":"755c8e66-e5e7-491b-9d77-a4b702ef6b92","Type":"ContainerDied","Data":"d03f62a3f5b1cb93a4acb4ec22252e892299c4fe1e2a2e751c3f2f1cf0a962c3"} Mar 19 09:49:28 crc kubenswrapper[4835]: I0319 09:49:28.892019 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d03f62a3f5b1cb93a4acb4ec22252e892299c4fe1e2a2e751c3f2f1cf0a962c3" Mar 19 09:49:28 crc kubenswrapper[4835]: I0319 09:49:28.892086 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-rvdwp" Mar 19 09:49:28 crc kubenswrapper[4835]: I0319 09:49:28.903105 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-gdrnz" event={"ID":"f917b1a5-922f-444e-b621-b7a89e8df050","Type":"ContainerDied","Data":"dfd89894b217253ba47da3f19cb3d9e932ab1bfd844e822e4a678060fa1de8ee"} Mar 19 09:49:28 crc kubenswrapper[4835]: I0319 09:49:28.903149 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfd89894b217253ba47da3f19cb3d9e932ab1bfd844e822e4a678060fa1de8ee" Mar 19 09:49:28 crc kubenswrapper[4835]: I0319 09:49:28.903218 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-gdrnz" Mar 19 09:49:28 crc kubenswrapper[4835]: I0319 09:49:28.914858 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-62ed-account-create-update-8hlr5" event={"ID":"6c3ca948-8ce8-402a-995a-ffdfe23ceaa5","Type":"ContainerDied","Data":"56c09323454810f084829ddc2082f73e1b0fc342585c9d8a906739919dd14759"} Mar 19 09:49:28 crc kubenswrapper[4835]: I0319 09:49:28.914918 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56c09323454810f084829ddc2082f73e1b0fc342585c9d8a906739919dd14759" Mar 19 09:49:28 crc kubenswrapper[4835]: I0319 09:49:28.914997 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-62ed-account-create-update-8hlr5" Mar 19 09:49:28 crc kubenswrapper[4835]: I0319 09:49:28.916801 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c3ca948-8ce8-402a-995a-ffdfe23ceaa5-operator-scripts\") pod \"6c3ca948-8ce8-402a-995a-ffdfe23ceaa5\" (UID: \"6c3ca948-8ce8-402a-995a-ffdfe23ceaa5\") " Mar 19 09:49:28 crc kubenswrapper[4835]: I0319 09:49:28.916908 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/755c8e66-e5e7-491b-9d77-a4b702ef6b92-operator-scripts\") pod \"755c8e66-e5e7-491b-9d77-a4b702ef6b92\" (UID: \"755c8e66-e5e7-491b-9d77-a4b702ef6b92\") " Mar 19 09:49:28 crc kubenswrapper[4835]: I0319 09:49:28.916977 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgs2n\" (UniqueName: \"kubernetes.io/projected/755c8e66-e5e7-491b-9d77-a4b702ef6b92-kube-api-access-kgs2n\") pod \"755c8e66-e5e7-491b-9d77-a4b702ef6b92\" (UID: \"755c8e66-e5e7-491b-9d77-a4b702ef6b92\") " Mar 19 09:49:28 crc kubenswrapper[4835]: I0319 09:49:28.917073 4835 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-mqk7z\" (UniqueName: \"kubernetes.io/projected/6c3ca948-8ce8-402a-995a-ffdfe23ceaa5-kube-api-access-mqk7z\") pod \"6c3ca948-8ce8-402a-995a-ffdfe23ceaa5\" (UID: \"6c3ca948-8ce8-402a-995a-ffdfe23ceaa5\") " Mar 19 09:49:28 crc kubenswrapper[4835]: I0319 09:49:28.917133 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vshkp\" (UniqueName: \"kubernetes.io/projected/f917b1a5-922f-444e-b621-b7a89e8df050-kube-api-access-vshkp\") pod \"f917b1a5-922f-444e-b621-b7a89e8df050\" (UID: \"f917b1a5-922f-444e-b621-b7a89e8df050\") " Mar 19 09:49:28 crc kubenswrapper[4835]: I0319 09:49:28.917274 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f917b1a5-922f-444e-b621-b7a89e8df050-operator-scripts\") pod \"f917b1a5-922f-444e-b621-b7a89e8df050\" (UID: \"f917b1a5-922f-444e-b621-b7a89e8df050\") " Mar 19 09:49:28 crc kubenswrapper[4835]: I0319 09:49:28.918088 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/755c8e66-e5e7-491b-9d77-a4b702ef6b92-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "755c8e66-e5e7-491b-9d77-a4b702ef6b92" (UID: "755c8e66-e5e7-491b-9d77-a4b702ef6b92"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:49:28 crc kubenswrapper[4835]: I0319 09:49:28.918609 4835 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/755c8e66-e5e7-491b-9d77-a4b702ef6b92-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:28 crc kubenswrapper[4835]: I0319 09:49:28.918651 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c3ca948-8ce8-402a-995a-ffdfe23ceaa5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6c3ca948-8ce8-402a-995a-ffdfe23ceaa5" (UID: "6c3ca948-8ce8-402a-995a-ffdfe23ceaa5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:49:28 crc kubenswrapper[4835]: I0319 09:49:28.919327 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f917b1a5-922f-444e-b621-b7a89e8df050-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f917b1a5-922f-444e-b621-b7a89e8df050" (UID: "f917b1a5-922f-444e-b621-b7a89e8df050"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:49:28 crc kubenswrapper[4835]: I0319 09:49:28.935394 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c3ca948-8ce8-402a-995a-ffdfe23ceaa5-kube-api-access-mqk7z" (OuterVolumeSpecName: "kube-api-access-mqk7z") pod "6c3ca948-8ce8-402a-995a-ffdfe23ceaa5" (UID: "6c3ca948-8ce8-402a-995a-ffdfe23ceaa5"). InnerVolumeSpecName "kube-api-access-mqk7z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:49:28 crc kubenswrapper[4835]: I0319 09:49:28.935583 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f917b1a5-922f-444e-b621-b7a89e8df050-kube-api-access-vshkp" (OuterVolumeSpecName: "kube-api-access-vshkp") pod "f917b1a5-922f-444e-b621-b7a89e8df050" (UID: "f917b1a5-922f-444e-b621-b7a89e8df050"). InnerVolumeSpecName "kube-api-access-vshkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:49:28 crc kubenswrapper[4835]: I0319 09:49:28.935607 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/755c8e66-e5e7-491b-9d77-a4b702ef6b92-kube-api-access-kgs2n" (OuterVolumeSpecName: "kube-api-access-kgs2n") pod "755c8e66-e5e7-491b-9d77-a4b702ef6b92" (UID: "755c8e66-e5e7-491b-9d77-a4b702ef6b92"). InnerVolumeSpecName "kube-api-access-kgs2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:49:29 crc kubenswrapper[4835]: I0319 09:49:29.019965 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgs2n\" (UniqueName: \"kubernetes.io/projected/755c8e66-e5e7-491b-9d77-a4b702ef6b92-kube-api-access-kgs2n\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:29 crc kubenswrapper[4835]: I0319 09:49:29.020268 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqk7z\" (UniqueName: \"kubernetes.io/projected/6c3ca948-8ce8-402a-995a-ffdfe23ceaa5-kube-api-access-mqk7z\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:29 crc kubenswrapper[4835]: I0319 09:49:29.020281 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vshkp\" (UniqueName: \"kubernetes.io/projected/f917b1a5-922f-444e-b621-b7a89e8df050-kube-api-access-vshkp\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:29 crc kubenswrapper[4835]: I0319 09:49:29.020290 4835 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/f917b1a5-922f-444e-b621-b7a89e8df050-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:29 crc kubenswrapper[4835]: I0319 09:49:29.020298 4835 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c3ca948-8ce8-402a-995a-ffdfe23ceaa5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:29 crc kubenswrapper[4835]: I0319 09:49:29.598604 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d261-account-create-update-tskmn" Mar 19 09:49:29 crc kubenswrapper[4835]: I0319 09:49:29.737035 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-38a1-account-create-update-csb2n" Mar 19 09:49:29 crc kubenswrapper[4835]: I0319 09:49:29.747576 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25d11e7d-9d08-4f99-b447-0a80feec9c95-operator-scripts\") pod \"25d11e7d-9d08-4f99-b447-0a80feec9c95\" (UID: \"25d11e7d-9d08-4f99-b447-0a80feec9c95\") " Mar 19 09:49:29 crc kubenswrapper[4835]: I0319 09:49:29.747641 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2m2t\" (UniqueName: \"kubernetes.io/projected/25d11e7d-9d08-4f99-b447-0a80feec9c95-kube-api-access-m2m2t\") pod \"25d11e7d-9d08-4f99-b447-0a80feec9c95\" (UID: \"25d11e7d-9d08-4f99-b447-0a80feec9c95\") " Mar 19 09:49:29 crc kubenswrapper[4835]: I0319 09:49:29.751917 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25d11e7d-9d08-4f99-b447-0a80feec9c95-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "25d11e7d-9d08-4f99-b447-0a80feec9c95" (UID: "25d11e7d-9d08-4f99-b447-0a80feec9c95"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:49:29 crc kubenswrapper[4835]: I0319 09:49:29.761022 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25d11e7d-9d08-4f99-b447-0a80feec9c95-kube-api-access-m2m2t" (OuterVolumeSpecName: "kube-api-access-m2m2t") pod "25d11e7d-9d08-4f99-b447-0a80feec9c95" (UID: "25d11e7d-9d08-4f99-b447-0a80feec9c95"). InnerVolumeSpecName "kube-api-access-m2m2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:49:29 crc kubenswrapper[4835]: I0319 09:49:29.849678 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpgf7\" (UniqueName: \"kubernetes.io/projected/905929b3-4296-4ec7-9532-d7ad6e80ad84-kube-api-access-zpgf7\") pod \"905929b3-4296-4ec7-9532-d7ad6e80ad84\" (UID: \"905929b3-4296-4ec7-9532-d7ad6e80ad84\") " Mar 19 09:49:29 crc kubenswrapper[4835]: I0319 09:49:29.850956 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/905929b3-4296-4ec7-9532-d7ad6e80ad84-operator-scripts\") pod \"905929b3-4296-4ec7-9532-d7ad6e80ad84\" (UID: \"905929b3-4296-4ec7-9532-d7ad6e80ad84\") " Mar 19 09:49:29 crc kubenswrapper[4835]: I0319 09:49:29.852306 4835 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25d11e7d-9d08-4f99-b447-0a80feec9c95-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:29 crc kubenswrapper[4835]: I0319 09:49:29.852333 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2m2t\" (UniqueName: \"kubernetes.io/projected/25d11e7d-9d08-4f99-b447-0a80feec9c95-kube-api-access-m2m2t\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:29 crc kubenswrapper[4835]: I0319 09:49:29.864080 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/905929b3-4296-4ec7-9532-d7ad6e80ad84-kube-api-access-zpgf7" (OuterVolumeSpecName: "kube-api-access-zpgf7") pod "905929b3-4296-4ec7-9532-d7ad6e80ad84" (UID: "905929b3-4296-4ec7-9532-d7ad6e80ad84"). InnerVolumeSpecName "kube-api-access-zpgf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:49:29 crc kubenswrapper[4835]: I0319 09:49:29.870464 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/905929b3-4296-4ec7-9532-d7ad6e80ad84-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "905929b3-4296-4ec7-9532-d7ad6e80ad84" (UID: "905929b3-4296-4ec7-9532-d7ad6e80ad84"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:49:29 crc kubenswrapper[4835]: I0319 09:49:29.940049 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kxp5r" event={"ID":"097405cc-a2f0-4f58-875e-b9433a53bde9","Type":"ContainerDied","Data":"55acfce6940ce5b1a8adb679a6788fc7b781dcfbd38825fa3ea5d3329016d5c0"} Mar 19 09:49:29 crc kubenswrapper[4835]: I0319 09:49:29.940086 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55acfce6940ce5b1a8adb679a6788fc7b781dcfbd38825fa3ea5d3329016d5c0" Mar 19 09:49:29 crc kubenswrapper[4835]: I0319 09:49:29.941527 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d261-account-create-update-tskmn" event={"ID":"25d11e7d-9d08-4f99-b447-0a80feec9c95","Type":"ContainerDied","Data":"3dae514869909c43ff008a9f0dcc4f9589942799725a7a39ed9097e64315a000"} Mar 19 09:49:29 crc kubenswrapper[4835]: I0319 09:49:29.941569 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3dae514869909c43ff008a9f0dcc4f9589942799725a7a39ed9097e64315a000" Mar 19 09:49:29 crc kubenswrapper[4835]: I0319 09:49:29.941625 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-d261-account-create-update-tskmn" Mar 19 09:49:29 crc kubenswrapper[4835]: I0319 09:49:29.952788 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ff2e9785-1044-4469-a034-9baeb46ff607","Type":"ContainerStarted","Data":"ac4da3e023f47301c4f22e4265a329ee11fec2d84349edf018a14f0b057cfa30"} Mar 19 09:49:29 crc kubenswrapper[4835]: I0319 09:49:29.954203 4835 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/905929b3-4296-4ec7-9532-d7ad6e80ad84-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:29 crc kubenswrapper[4835]: I0319 09:49:29.954288 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpgf7\" (UniqueName: \"kubernetes.io/projected/905929b3-4296-4ec7-9532-d7ad6e80ad84-kube-api-access-zpgf7\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:29 crc kubenswrapper[4835]: I0319 09:49:29.962720 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-38a1-account-create-update-csb2n" event={"ID":"905929b3-4296-4ec7-9532-d7ad6e80ad84","Type":"ContainerDied","Data":"0b48b90c8a80536a66d268ccd8e65ba3aba664036fb8c4b3c7e5e967aca53428"} Mar 19 09:49:29 crc kubenswrapper[4835]: I0319 09:49:29.962796 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b48b90c8a80536a66d268ccd8e65ba3aba664036fb8c4b3c7e5e967aca53428" Mar 19 09:49:29 crc kubenswrapper[4835]: I0319 09:49:29.962847 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-38a1-account-create-update-csb2n" Mar 19 09:49:29 crc kubenswrapper[4835]: I0319 09:49:29.969602 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"48489f77-65d9-4e0d-88f5-951d6928dcb3","Type":"ContainerStarted","Data":"9b2af355bb78696be759f784c15920aaaba38f8a6882059eb5c8750026c6a673"} Mar 19 09:49:30 crc kubenswrapper[4835]: I0319 09:49:30.011405 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.011385479 podStartE2EDuration="7.011385479s" podCreationTimestamp="2026-03-19 09:49:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:49:29.989933087 +0000 UTC m=+1624.838531694" watchObservedRunningTime="2026-03-19 09:49:30.011385479 +0000 UTC m=+1624.859984066" Mar 19 09:49:30 crc kubenswrapper[4835]: I0319 09:49:30.044450 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-kxp5r" Mar 19 09:49:30 crc kubenswrapper[4835]: I0319 09:49:30.157760 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rk5fd\" (UniqueName: \"kubernetes.io/projected/097405cc-a2f0-4f58-875e-b9433a53bde9-kube-api-access-rk5fd\") pod \"097405cc-a2f0-4f58-875e-b9433a53bde9\" (UID: \"097405cc-a2f0-4f58-875e-b9433a53bde9\") " Mar 19 09:49:30 crc kubenswrapper[4835]: I0319 09:49:30.157931 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/097405cc-a2f0-4f58-875e-b9433a53bde9-operator-scripts\") pod \"097405cc-a2f0-4f58-875e-b9433a53bde9\" (UID: \"097405cc-a2f0-4f58-875e-b9433a53bde9\") " Mar 19 09:49:30 crc kubenswrapper[4835]: I0319 09:49:30.158445 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/097405cc-a2f0-4f58-875e-b9433a53bde9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "097405cc-a2f0-4f58-875e-b9433a53bde9" (UID: "097405cc-a2f0-4f58-875e-b9433a53bde9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:49:30 crc kubenswrapper[4835]: I0319 09:49:30.159078 4835 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/097405cc-a2f0-4f58-875e-b9433a53bde9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:30 crc kubenswrapper[4835]: I0319 09:49:30.161417 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/097405cc-a2f0-4f58-875e-b9433a53bde9-kube-api-access-rk5fd" (OuterVolumeSpecName: "kube-api-access-rk5fd") pod "097405cc-a2f0-4f58-875e-b9433a53bde9" (UID: "097405cc-a2f0-4f58-875e-b9433a53bde9"). InnerVolumeSpecName "kube-api-access-rk5fd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:49:30 crc kubenswrapper[4835]: I0319 09:49:30.261243 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rk5fd\" (UniqueName: \"kubernetes.io/projected/097405cc-a2f0-4f58-875e-b9433a53bde9-kube-api-access-rk5fd\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:30 crc kubenswrapper[4835]: I0319 09:49:30.983834 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-kxp5r" Mar 19 09:49:30 crc kubenswrapper[4835]: I0319 09:49:30.985306 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ff2e9785-1044-4469-a034-9baeb46ff607","Type":"ContainerStarted","Data":"a5ef4f48f4c99e06206c6764663f822a78925806be8cd2fc849adc5e4897ec94"} Mar 19 09:49:31 crc kubenswrapper[4835]: I0319 09:49:31.014435 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.014406911 podStartE2EDuration="5.014406911s" podCreationTimestamp="2026-03-19 09:49:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:49:31.00688474 +0000 UTC m=+1625.855483357" watchObservedRunningTime="2026-03-19 09:49:31.014406911 +0000 UTC m=+1625.863005498" Mar 19 09:49:32 crc kubenswrapper[4835]: I0319 09:49:32.402230 4835 scope.go:117] "RemoveContainer" containerID="d93f2f0fef5a3fe52d6e4aab02e5290ac85405643bc520caaef82b7b23fd8ee3" Mar 19 09:49:32 crc kubenswrapper[4835]: E0319 09:49:32.402942 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 09:49:33 crc kubenswrapper[4835]: I0319 09:49:33.209682 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xf267" Mar 19 09:49:33 crc kubenswrapper[4835]: I0319 09:49:33.279866 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xf267" Mar 19 09:49:33 crc kubenswrapper[4835]: I0319 09:49:33.454352 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xf267"] Mar 19 09:49:34 crc kubenswrapper[4835]: I0319 09:49:34.061131 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qfttg"] Mar 19 09:49:34 crc kubenswrapper[4835]: E0319 09:49:34.062029 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f917b1a5-922f-444e-b621-b7a89e8df050" containerName="mariadb-database-create" Mar 19 09:49:34 crc kubenswrapper[4835]: I0319 09:49:34.062052 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f917b1a5-922f-444e-b621-b7a89e8df050" containerName="mariadb-database-create" Mar 19 09:49:34 crc kubenswrapper[4835]: E0319 09:49:34.062080 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c3ca948-8ce8-402a-995a-ffdfe23ceaa5" containerName="mariadb-account-create-update" Mar 19 09:49:34 crc kubenswrapper[4835]: I0319 09:49:34.062088 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c3ca948-8ce8-402a-995a-ffdfe23ceaa5" containerName="mariadb-account-create-update" Mar 19 09:49:34 crc kubenswrapper[4835]: E0319 09:49:34.062102 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="755c8e66-e5e7-491b-9d77-a4b702ef6b92" containerName="mariadb-database-create" Mar 19 09:49:34 crc kubenswrapper[4835]: I0319 09:49:34.062110 4835 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="755c8e66-e5e7-491b-9d77-a4b702ef6b92" containerName="mariadb-database-create" Mar 19 09:49:34 crc kubenswrapper[4835]: E0319 09:49:34.062129 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="097405cc-a2f0-4f58-875e-b9433a53bde9" containerName="mariadb-database-create" Mar 19 09:49:34 crc kubenswrapper[4835]: I0319 09:49:34.062139 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="097405cc-a2f0-4f58-875e-b9433a53bde9" containerName="mariadb-database-create" Mar 19 09:49:34 crc kubenswrapper[4835]: E0319 09:49:34.062165 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d11e7d-9d08-4f99-b447-0a80feec9c95" containerName="mariadb-account-create-update" Mar 19 09:49:34 crc kubenswrapper[4835]: I0319 09:49:34.062172 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d11e7d-9d08-4f99-b447-0a80feec9c95" containerName="mariadb-account-create-update" Mar 19 09:49:34 crc kubenswrapper[4835]: E0319 09:49:34.062194 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="905929b3-4296-4ec7-9532-d7ad6e80ad84" containerName="mariadb-account-create-update" Mar 19 09:49:34 crc kubenswrapper[4835]: I0319 09:49:34.062203 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="905929b3-4296-4ec7-9532-d7ad6e80ad84" containerName="mariadb-account-create-update" Mar 19 09:49:34 crc kubenswrapper[4835]: I0319 09:49:34.062446 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="25d11e7d-9d08-4f99-b447-0a80feec9c95" containerName="mariadb-account-create-update" Mar 19 09:49:34 crc kubenswrapper[4835]: I0319 09:49:34.062470 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="097405cc-a2f0-4f58-875e-b9433a53bde9" containerName="mariadb-database-create" Mar 19 09:49:34 crc kubenswrapper[4835]: I0319 09:49:34.062489 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="905929b3-4296-4ec7-9532-d7ad6e80ad84" containerName="mariadb-account-create-update" Mar 
19 09:49:34 crc kubenswrapper[4835]: I0319 09:49:34.062504 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c3ca948-8ce8-402a-995a-ffdfe23ceaa5" containerName="mariadb-account-create-update" Mar 19 09:49:34 crc kubenswrapper[4835]: I0319 09:49:34.062527 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="755c8e66-e5e7-491b-9d77-a4b702ef6b92" containerName="mariadb-database-create" Mar 19 09:49:34 crc kubenswrapper[4835]: I0319 09:49:34.062544 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="f917b1a5-922f-444e-b621-b7a89e8df050" containerName="mariadb-database-create" Mar 19 09:49:34 crc kubenswrapper[4835]: I0319 09:49:34.063550 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qfttg" Mar 19 09:49:34 crc kubenswrapper[4835]: I0319 09:49:34.069729 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 19 09:49:34 crc kubenswrapper[4835]: I0319 09:49:34.069942 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-6dxmm" Mar 19 09:49:34 crc kubenswrapper[4835]: I0319 09:49:34.070228 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 19 09:49:34 crc kubenswrapper[4835]: I0319 09:49:34.078168 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qfttg"] Mar 19 09:49:34 crc kubenswrapper[4835]: I0319 09:49:34.160137 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda-scripts\") pod \"nova-cell0-conductor-db-sync-qfttg\" (UID: \"2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda\") " pod="openstack/nova-cell0-conductor-db-sync-qfttg" Mar 19 09:49:34 crc kubenswrapper[4835]: I0319 09:49:34.160281 4835 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda-config-data\") pod \"nova-cell0-conductor-db-sync-qfttg\" (UID: \"2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda\") " pod="openstack/nova-cell0-conductor-db-sync-qfttg" Mar 19 09:49:34 crc kubenswrapper[4835]: I0319 09:49:34.160377 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcwkb\" (UniqueName: \"kubernetes.io/projected/2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda-kube-api-access-dcwkb\") pod \"nova-cell0-conductor-db-sync-qfttg\" (UID: \"2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda\") " pod="openstack/nova-cell0-conductor-db-sync-qfttg" Mar 19 09:49:34 crc kubenswrapper[4835]: I0319 09:49:34.160533 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qfttg\" (UID: \"2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda\") " pod="openstack/nova-cell0-conductor-db-sync-qfttg" Mar 19 09:49:34 crc kubenswrapper[4835]: I0319 09:49:34.262346 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcwkb\" (UniqueName: \"kubernetes.io/projected/2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda-kube-api-access-dcwkb\") pod \"nova-cell0-conductor-db-sync-qfttg\" (UID: \"2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda\") " pod="openstack/nova-cell0-conductor-db-sync-qfttg" Mar 19 09:49:34 crc kubenswrapper[4835]: I0319 09:49:34.262502 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qfttg\" (UID: \"2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda\") " 
pod="openstack/nova-cell0-conductor-db-sync-qfttg" Mar 19 09:49:34 crc kubenswrapper[4835]: I0319 09:49:34.262571 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda-scripts\") pod \"nova-cell0-conductor-db-sync-qfttg\" (UID: \"2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda\") " pod="openstack/nova-cell0-conductor-db-sync-qfttg" Mar 19 09:49:34 crc kubenswrapper[4835]: I0319 09:49:34.262660 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda-config-data\") pod \"nova-cell0-conductor-db-sync-qfttg\" (UID: \"2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda\") " pod="openstack/nova-cell0-conductor-db-sync-qfttg" Mar 19 09:49:34 crc kubenswrapper[4835]: I0319 09:49:34.269440 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda-scripts\") pod \"nova-cell0-conductor-db-sync-qfttg\" (UID: \"2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda\") " pod="openstack/nova-cell0-conductor-db-sync-qfttg" Mar 19 09:49:34 crc kubenswrapper[4835]: I0319 09:49:34.270202 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda-config-data\") pod \"nova-cell0-conductor-db-sync-qfttg\" (UID: \"2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda\") " pod="openstack/nova-cell0-conductor-db-sync-qfttg" Mar 19 09:49:34 crc kubenswrapper[4835]: I0319 09:49:34.273392 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qfttg\" (UID: \"2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda\") " pod="openstack/nova-cell0-conductor-db-sync-qfttg" Mar 19 
09:49:34 crc kubenswrapper[4835]: I0319 09:49:34.294456 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcwkb\" (UniqueName: \"kubernetes.io/projected/2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda-kube-api-access-dcwkb\") pod \"nova-cell0-conductor-db-sync-qfttg\" (UID: \"2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda\") " pod="openstack/nova-cell0-conductor-db-sync-qfttg" Mar 19 09:49:34 crc kubenswrapper[4835]: I0319 09:49:34.399884 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qfttg" Mar 19 09:49:34 crc kubenswrapper[4835]: I0319 09:49:34.425786 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 19 09:49:34 crc kubenswrapper[4835]: I0319 09:49:34.425825 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 19 09:49:34 crc kubenswrapper[4835]: I0319 09:49:34.457883 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 19 09:49:34 crc kubenswrapper[4835]: I0319 09:49:34.469456 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 19 09:49:34 crc kubenswrapper[4835]: I0319 09:49:34.999107 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qfttg"] Mar 19 09:49:35 crc kubenswrapper[4835]: I0319 09:49:35.028541 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qfttg" event={"ID":"2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda","Type":"ContainerStarted","Data":"d3be4a6f45d0194627a14761d8da9d7cbc2548ed63f4b09dee3f84dba4a01135"} Mar 19 09:49:35 crc kubenswrapper[4835]: I0319 09:49:35.028786 4835 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-xf267" podUID="14dd8f89-f4d8-4618-b417-f1a802f3517d" containerName="registry-server" containerID="cri-o://093a0e0f3ac88cc43e95efed6152e26f264f80e51097c1ced1604bf15f67eea3" gracePeriod=2 Mar 19 09:49:35 crc kubenswrapper[4835]: I0319 09:49:35.029192 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 19 09:49:35 crc kubenswrapper[4835]: I0319 09:49:35.030066 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 19 09:49:35 crc kubenswrapper[4835]: I0319 09:49:35.572769 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xf267" Mar 19 09:49:35 crc kubenswrapper[4835]: I0319 09:49:35.702856 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj9qr\" (UniqueName: \"kubernetes.io/projected/14dd8f89-f4d8-4618-b417-f1a802f3517d-kube-api-access-pj9qr\") pod \"14dd8f89-f4d8-4618-b417-f1a802f3517d\" (UID: \"14dd8f89-f4d8-4618-b417-f1a802f3517d\") " Mar 19 09:49:35 crc kubenswrapper[4835]: I0319 09:49:35.703488 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14dd8f89-f4d8-4618-b417-f1a802f3517d-catalog-content\") pod \"14dd8f89-f4d8-4618-b417-f1a802f3517d\" (UID: \"14dd8f89-f4d8-4618-b417-f1a802f3517d\") " Mar 19 09:49:35 crc kubenswrapper[4835]: I0319 09:49:35.703515 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14dd8f89-f4d8-4618-b417-f1a802f3517d-utilities\") pod \"14dd8f89-f4d8-4618-b417-f1a802f3517d\" (UID: \"14dd8f89-f4d8-4618-b417-f1a802f3517d\") " Mar 19 09:49:35 crc kubenswrapper[4835]: I0319 09:49:35.704111 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/14dd8f89-f4d8-4618-b417-f1a802f3517d-utilities" (OuterVolumeSpecName: "utilities") pod "14dd8f89-f4d8-4618-b417-f1a802f3517d" (UID: "14dd8f89-f4d8-4618-b417-f1a802f3517d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:49:35 crc kubenswrapper[4835]: I0319 09:49:35.704316 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14dd8f89-f4d8-4618-b417-f1a802f3517d-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:35 crc kubenswrapper[4835]: I0319 09:49:35.725709 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14dd8f89-f4d8-4618-b417-f1a802f3517d-kube-api-access-pj9qr" (OuterVolumeSpecName: "kube-api-access-pj9qr") pod "14dd8f89-f4d8-4618-b417-f1a802f3517d" (UID: "14dd8f89-f4d8-4618-b417-f1a802f3517d"). InnerVolumeSpecName "kube-api-access-pj9qr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:49:35 crc kubenswrapper[4835]: I0319 09:49:35.819780 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj9qr\" (UniqueName: \"kubernetes.io/projected/14dd8f89-f4d8-4618-b417-f1a802f3517d-kube-api-access-pj9qr\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:35 crc kubenswrapper[4835]: I0319 09:49:35.882949 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14dd8f89-f4d8-4618-b417-f1a802f3517d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "14dd8f89-f4d8-4618-b417-f1a802f3517d" (UID: "14dd8f89-f4d8-4618-b417-f1a802f3517d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:49:35 crc kubenswrapper[4835]: I0319 09:49:35.922130 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14dd8f89-f4d8-4618-b417-f1a802f3517d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.087522 4835 generic.go:334] "Generic (PLEG): container finished" podID="8613286f-5060-408e-bd42-882380ee4531" containerID="e8a57d5295d0b06faf4fa322e20bb248db0a809a95e67febff392ee2c29f6787" exitCode=0 Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.087705 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8613286f-5060-408e-bd42-882380ee4531","Type":"ContainerDied","Data":"e8a57d5295d0b06faf4fa322e20bb248db0a809a95e67febff392ee2c29f6787"} Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.095071 4835 generic.go:334] "Generic (PLEG): container finished" podID="14dd8f89-f4d8-4618-b417-f1a802f3517d" containerID="093a0e0f3ac88cc43e95efed6152e26f264f80e51097c1ced1604bf15f67eea3" exitCode=0 Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.095838 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xf267" Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.095860 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xf267" event={"ID":"14dd8f89-f4d8-4618-b417-f1a802f3517d","Type":"ContainerDied","Data":"093a0e0f3ac88cc43e95efed6152e26f264f80e51097c1ced1604bf15f67eea3"} Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.095950 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xf267" event={"ID":"14dd8f89-f4d8-4618-b417-f1a802f3517d","Type":"ContainerDied","Data":"c01fa43644fe9c912dc990d213caf7cee69a076315a384ef3be92ad3525d60fe"} Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.095991 4835 scope.go:117] "RemoveContainer" containerID="093a0e0f3ac88cc43e95efed6152e26f264f80e51097c1ced1604bf15f67eea3" Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.144611 4835 scope.go:117] "RemoveContainer" containerID="faf638b3da1d3aa319ffc3728b067d666da4fc5bbf3b3708dab6100b717cd82b" Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.148313 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xf267"] Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.163437 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xf267"] Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.163507 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.220729 4835 scope.go:117] "RemoveContainer" containerID="245978a12c18c0619eaf73e3d094eea099dad2716fdc8e8fb3d5e3a0dffe57a8" Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.230648 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8613286f-5060-408e-bd42-882380ee4531-combined-ca-bundle\") pod \"8613286f-5060-408e-bd42-882380ee4531\" (UID: \"8613286f-5060-408e-bd42-882380ee4531\") " Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.230775 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8613286f-5060-408e-bd42-882380ee4531-sg-core-conf-yaml\") pod \"8613286f-5060-408e-bd42-882380ee4531\" (UID: \"8613286f-5060-408e-bd42-882380ee4531\") " Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.230829 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8613286f-5060-408e-bd42-882380ee4531-log-httpd\") pod \"8613286f-5060-408e-bd42-882380ee4531\" (UID: \"8613286f-5060-408e-bd42-882380ee4531\") " Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.230878 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8613286f-5060-408e-bd42-882380ee4531-scripts\") pod \"8613286f-5060-408e-bd42-882380ee4531\" (UID: \"8613286f-5060-408e-bd42-882380ee4531\") " Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.230931 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w67pt\" (UniqueName: \"kubernetes.io/projected/8613286f-5060-408e-bd42-882380ee4531-kube-api-access-w67pt\") pod \"8613286f-5060-408e-bd42-882380ee4531\" (UID: \"8613286f-5060-408e-bd42-882380ee4531\") " 
Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.230985 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8613286f-5060-408e-bd42-882380ee4531-run-httpd\") pod \"8613286f-5060-408e-bd42-882380ee4531\" (UID: \"8613286f-5060-408e-bd42-882380ee4531\") " Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.231212 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8613286f-5060-408e-bd42-882380ee4531-config-data\") pod \"8613286f-5060-408e-bd42-882380ee4531\" (UID: \"8613286f-5060-408e-bd42-882380ee4531\") " Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.231871 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8613286f-5060-408e-bd42-882380ee4531-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8613286f-5060-408e-bd42-882380ee4531" (UID: "8613286f-5060-408e-bd42-882380ee4531"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.232373 4835 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8613286f-5060-408e-bd42-882380ee4531-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.233937 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8613286f-5060-408e-bd42-882380ee4531-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8613286f-5060-408e-bd42-882380ee4531" (UID: "8613286f-5060-408e-bd42-882380ee4531"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.237646 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8613286f-5060-408e-bd42-882380ee4531-scripts" (OuterVolumeSpecName: "scripts") pod "8613286f-5060-408e-bd42-882380ee4531" (UID: "8613286f-5060-408e-bd42-882380ee4531"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.237790 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8613286f-5060-408e-bd42-882380ee4531-kube-api-access-w67pt" (OuterVolumeSpecName: "kube-api-access-w67pt") pod "8613286f-5060-408e-bd42-882380ee4531" (UID: "8613286f-5060-408e-bd42-882380ee4531"). InnerVolumeSpecName "kube-api-access-w67pt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.280068 4835 scope.go:117] "RemoveContainer" containerID="093a0e0f3ac88cc43e95efed6152e26f264f80e51097c1ced1604bf15f67eea3" Mar 19 09:49:36 crc kubenswrapper[4835]: E0319 09:49:36.286537 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"093a0e0f3ac88cc43e95efed6152e26f264f80e51097c1ced1604bf15f67eea3\": container with ID starting with 093a0e0f3ac88cc43e95efed6152e26f264f80e51097c1ced1604bf15f67eea3 not found: ID does not exist" containerID="093a0e0f3ac88cc43e95efed6152e26f264f80e51097c1ced1604bf15f67eea3" Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.286621 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"093a0e0f3ac88cc43e95efed6152e26f264f80e51097c1ced1604bf15f67eea3"} err="failed to get container status \"093a0e0f3ac88cc43e95efed6152e26f264f80e51097c1ced1604bf15f67eea3\": rpc error: code = NotFound desc = could not find container 
\"093a0e0f3ac88cc43e95efed6152e26f264f80e51097c1ced1604bf15f67eea3\": container with ID starting with 093a0e0f3ac88cc43e95efed6152e26f264f80e51097c1ced1604bf15f67eea3 not found: ID does not exist" Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.286652 4835 scope.go:117] "RemoveContainer" containerID="faf638b3da1d3aa319ffc3728b067d666da4fc5bbf3b3708dab6100b717cd82b" Mar 19 09:49:36 crc kubenswrapper[4835]: E0319 09:49:36.287357 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"faf638b3da1d3aa319ffc3728b067d666da4fc5bbf3b3708dab6100b717cd82b\": container with ID starting with faf638b3da1d3aa319ffc3728b067d666da4fc5bbf3b3708dab6100b717cd82b not found: ID does not exist" containerID="faf638b3da1d3aa319ffc3728b067d666da4fc5bbf3b3708dab6100b717cd82b" Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.287453 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faf638b3da1d3aa319ffc3728b067d666da4fc5bbf3b3708dab6100b717cd82b"} err="failed to get container status \"faf638b3da1d3aa319ffc3728b067d666da4fc5bbf3b3708dab6100b717cd82b\": rpc error: code = NotFound desc = could not find container \"faf638b3da1d3aa319ffc3728b067d666da4fc5bbf3b3708dab6100b717cd82b\": container with ID starting with faf638b3da1d3aa319ffc3728b067d666da4fc5bbf3b3708dab6100b717cd82b not found: ID does not exist" Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.287522 4835 scope.go:117] "RemoveContainer" containerID="245978a12c18c0619eaf73e3d094eea099dad2716fdc8e8fb3d5e3a0dffe57a8" Mar 19 09:49:36 crc kubenswrapper[4835]: E0319 09:49:36.294384 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"245978a12c18c0619eaf73e3d094eea099dad2716fdc8e8fb3d5e3a0dffe57a8\": container with ID starting with 245978a12c18c0619eaf73e3d094eea099dad2716fdc8e8fb3d5e3a0dffe57a8 not found: ID does not exist" 
containerID="245978a12c18c0619eaf73e3d094eea099dad2716fdc8e8fb3d5e3a0dffe57a8" Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.294440 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"245978a12c18c0619eaf73e3d094eea099dad2716fdc8e8fb3d5e3a0dffe57a8"} err="failed to get container status \"245978a12c18c0619eaf73e3d094eea099dad2716fdc8e8fb3d5e3a0dffe57a8\": rpc error: code = NotFound desc = could not find container \"245978a12c18c0619eaf73e3d094eea099dad2716fdc8e8fb3d5e3a0dffe57a8\": container with ID starting with 245978a12c18c0619eaf73e3d094eea099dad2716fdc8e8fb3d5e3a0dffe57a8 not found: ID does not exist" Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.305764 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8613286f-5060-408e-bd42-882380ee4531-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8613286f-5060-408e-bd42-882380ee4531" (UID: "8613286f-5060-408e-bd42-882380ee4531"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.337082 4835 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8613286f-5060-408e-bd42-882380ee4531-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.337126 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8613286f-5060-408e-bd42-882380ee4531-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.337142 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w67pt\" (UniqueName: \"kubernetes.io/projected/8613286f-5060-408e-bd42-882380ee4531-kube-api-access-w67pt\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.337156 4835 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8613286f-5060-408e-bd42-882380ee4531-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.391859 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8613286f-5060-408e-bd42-882380ee4531-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8613286f-5060-408e-bd42-882380ee4531" (UID: "8613286f-5060-408e-bd42-882380ee4531"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.428699 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14dd8f89-f4d8-4618-b417-f1a802f3517d" path="/var/lib/kubelet/pods/14dd8f89-f4d8-4618-b417-f1a802f3517d/volumes" Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.445029 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8613286f-5060-408e-bd42-882380ee4531-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.448623 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8613286f-5060-408e-bd42-882380ee4531-config-data" (OuterVolumeSpecName: "config-data") pod "8613286f-5060-408e-bd42-882380ee4531" (UID: "8613286f-5060-408e-bd42-882380ee4531"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:36 crc kubenswrapper[4835]: I0319 09:49:36.546793 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8613286f-5060-408e-bd42-882380ee4531-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.112168 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8613286f-5060-408e-bd42-882380ee4531","Type":"ContainerDied","Data":"5f24481f59209ed33048f806fbdd65ac9378297e79dce087be9ee0025c771bf7"} Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.112569 4835 scope.go:117] "RemoveContainer" containerID="8e215d67922c00be594b2f7a755500093b9ed0d10cf8a7e9a25864736f641da3" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.112581 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.162474 4835 scope.go:117] "RemoveContainer" containerID="d53e766765cca6556c78ec739eedf608f4f47cd66149a3c21d3d9dc064a880c0" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.191302 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.199229 4835 scope.go:117] "RemoveContainer" containerID="0f7950cc20013a190bd603d82491b3bdbe2b96255f9910fa97820beaf5cd2cd1" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.212888 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.227940 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:49:37 crc kubenswrapper[4835]: E0319 09:49:37.228627 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8613286f-5060-408e-bd42-882380ee4531" containerName="sg-core" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.228652 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="8613286f-5060-408e-bd42-882380ee4531" containerName="sg-core" Mar 19 09:49:37 crc kubenswrapper[4835]: E0319 09:49:37.228678 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14dd8f89-f4d8-4618-b417-f1a802f3517d" containerName="extract-utilities" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.228689 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="14dd8f89-f4d8-4618-b417-f1a802f3517d" containerName="extract-utilities" Mar 19 09:49:37 crc kubenswrapper[4835]: E0319 09:49:37.228700 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14dd8f89-f4d8-4618-b417-f1a802f3517d" containerName="extract-content" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.228707 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="14dd8f89-f4d8-4618-b417-f1a802f3517d" 
containerName="extract-content" Mar 19 09:49:37 crc kubenswrapper[4835]: E0319 09:49:37.228726 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14dd8f89-f4d8-4618-b417-f1a802f3517d" containerName="registry-server" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.228733 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="14dd8f89-f4d8-4618-b417-f1a802f3517d" containerName="registry-server" Mar 19 09:49:37 crc kubenswrapper[4835]: E0319 09:49:37.228786 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8613286f-5060-408e-bd42-882380ee4531" containerName="ceilometer-notification-agent" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.228796 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="8613286f-5060-408e-bd42-882380ee4531" containerName="ceilometer-notification-agent" Mar 19 09:49:37 crc kubenswrapper[4835]: E0319 09:49:37.228823 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8613286f-5060-408e-bd42-882380ee4531" containerName="proxy-httpd" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.228831 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="8613286f-5060-408e-bd42-882380ee4531" containerName="proxy-httpd" Mar 19 09:49:37 crc kubenswrapper[4835]: E0319 09:49:37.228846 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8613286f-5060-408e-bd42-882380ee4531" containerName="ceilometer-central-agent" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.228854 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="8613286f-5060-408e-bd42-882380ee4531" containerName="ceilometer-central-agent" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.229114 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="8613286f-5060-408e-bd42-882380ee4531" containerName="ceilometer-notification-agent" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.229137 4835 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8613286f-5060-408e-bd42-882380ee4531" containerName="ceilometer-central-agent" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.229152 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="8613286f-5060-408e-bd42-882380ee4531" containerName="proxy-httpd" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.229157 4835 scope.go:117] "RemoveContainer" containerID="e8a57d5295d0b06faf4fa322e20bb248db0a809a95e67febff392ee2c29f6787" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.229181 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="14dd8f89-f4d8-4618-b417-f1a802f3517d" containerName="registry-server" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.229320 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="8613286f-5060-408e-bd42-882380ee4531" containerName="sg-core" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.231806 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.234096 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.234606 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.244920 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.362083 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.362155 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.384678 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fc25787-db79-456d-ab01-93be995791cd-run-httpd\") pod \"ceilometer-0\" (UID: \"8fc25787-db79-456d-ab01-93be995791cd\") " pod="openstack/ceilometer-0" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.384727 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fc25787-db79-456d-ab01-93be995791cd-log-httpd\") pod \"ceilometer-0\" (UID: \"8fc25787-db79-456d-ab01-93be995791cd\") " pod="openstack/ceilometer-0" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.384940 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fc25787-db79-456d-ab01-93be995791cd-config-data\") pod \"ceilometer-0\" (UID: \"8fc25787-db79-456d-ab01-93be995791cd\") " pod="openstack/ceilometer-0" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.384995 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fc25787-db79-456d-ab01-93be995791cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8fc25787-db79-456d-ab01-93be995791cd\") " pod="openstack/ceilometer-0" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.385049 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fc25787-db79-456d-ab01-93be995791cd-scripts\") pod \"ceilometer-0\" (UID: \"8fc25787-db79-456d-ab01-93be995791cd\") " pod="openstack/ceilometer-0" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.385132 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/8fc25787-db79-456d-ab01-93be995791cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8fc25787-db79-456d-ab01-93be995791cd\") " pod="openstack/ceilometer-0" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.385178 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgcxs\" (UniqueName: \"kubernetes.io/projected/8fc25787-db79-456d-ab01-93be995791cd-kube-api-access-qgcxs\") pod \"ceilometer-0\" (UID: \"8fc25787-db79-456d-ab01-93be995791cd\") " pod="openstack/ceilometer-0" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.405139 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.412057 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.486728 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8fc25787-db79-456d-ab01-93be995791cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8fc25787-db79-456d-ab01-93be995791cd\") " pod="openstack/ceilometer-0" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.486798 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgcxs\" (UniqueName: \"kubernetes.io/projected/8fc25787-db79-456d-ab01-93be995791cd-kube-api-access-qgcxs\") pod \"ceilometer-0\" (UID: \"8fc25787-db79-456d-ab01-93be995791cd\") " pod="openstack/ceilometer-0" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.486871 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fc25787-db79-456d-ab01-93be995791cd-run-httpd\") pod \"ceilometer-0\" (UID: \"8fc25787-db79-456d-ab01-93be995791cd\") " 
pod="openstack/ceilometer-0" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.486889 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fc25787-db79-456d-ab01-93be995791cd-log-httpd\") pod \"ceilometer-0\" (UID: \"8fc25787-db79-456d-ab01-93be995791cd\") " pod="openstack/ceilometer-0" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.487197 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fc25787-db79-456d-ab01-93be995791cd-config-data\") pod \"ceilometer-0\" (UID: \"8fc25787-db79-456d-ab01-93be995791cd\") " pod="openstack/ceilometer-0" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.487254 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fc25787-db79-456d-ab01-93be995791cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8fc25787-db79-456d-ab01-93be995791cd\") " pod="openstack/ceilometer-0" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.487309 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fc25787-db79-456d-ab01-93be995791cd-scripts\") pod \"ceilometer-0\" (UID: \"8fc25787-db79-456d-ab01-93be995791cd\") " pod="openstack/ceilometer-0" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.487989 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fc25787-db79-456d-ab01-93be995791cd-log-httpd\") pod \"ceilometer-0\" (UID: \"8fc25787-db79-456d-ab01-93be995791cd\") " pod="openstack/ceilometer-0" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.489413 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/8fc25787-db79-456d-ab01-93be995791cd-run-httpd\") pod \"ceilometer-0\" (UID: \"8fc25787-db79-456d-ab01-93be995791cd\") " pod="openstack/ceilometer-0" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.496476 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8fc25787-db79-456d-ab01-93be995791cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8fc25787-db79-456d-ab01-93be995791cd\") " pod="openstack/ceilometer-0" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.496940 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fc25787-db79-456d-ab01-93be995791cd-config-data\") pod \"ceilometer-0\" (UID: \"8fc25787-db79-456d-ab01-93be995791cd\") " pod="openstack/ceilometer-0" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.510484 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgcxs\" (UniqueName: \"kubernetes.io/projected/8fc25787-db79-456d-ab01-93be995791cd-kube-api-access-qgcxs\") pod \"ceilometer-0\" (UID: \"8fc25787-db79-456d-ab01-93be995791cd\") " pod="openstack/ceilometer-0" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.510523 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fc25787-db79-456d-ab01-93be995791cd-scripts\") pod \"ceilometer-0\" (UID: \"8fc25787-db79-456d-ab01-93be995791cd\") " pod="openstack/ceilometer-0" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.510620 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fc25787-db79-456d-ab01-93be995791cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8fc25787-db79-456d-ab01-93be995791cd\") " pod="openstack/ceilometer-0" Mar 19 09:49:37 crc kubenswrapper[4835]: I0319 09:49:37.553819 4835 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 09:49:38 crc kubenswrapper[4835]: I0319 09:49:38.045272 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zbxd8" Mar 19 09:49:38 crc kubenswrapper[4835]: I0319 09:49:38.067899 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:49:38 crc kubenswrapper[4835]: I0319 09:49:38.118266 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zbxd8"] Mar 19 09:49:38 crc kubenswrapper[4835]: I0319 09:49:38.128172 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fc25787-db79-456d-ab01-93be995791cd","Type":"ContainerStarted","Data":"5f8f6f7fc199a90c8704b7c5ff6b120e666a08d18e1298567ed12b0ced40ee80"} Mar 19 09:49:38 crc kubenswrapper[4835]: I0319 09:49:38.131557 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 19 09:49:38 crc kubenswrapper[4835]: I0319 09:49:38.131825 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zbxd8" podUID="0722dbab-2741-445d-8585-04eaf3738c0d" containerName="registry-server" containerID="cri-o://84e6424ea427e17a2fd8692f654e8b3e2d680f8d5cfb5b1f03b45085a09f703e" gracePeriod=2 Mar 19 09:49:38 crc kubenswrapper[4835]: I0319 09:49:38.132035 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 19 09:49:38 crc kubenswrapper[4835]: I0319 09:49:38.416376 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8613286f-5060-408e-bd42-882380ee4531" path="/var/lib/kubelet/pods/8613286f-5060-408e-bd42-882380ee4531/volumes" Mar 19 09:49:38 crc kubenswrapper[4835]: I0319 09:49:38.861470 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zbxd8" Mar 19 09:49:38 crc kubenswrapper[4835]: I0319 09:49:38.924367 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0722dbab-2741-445d-8585-04eaf3738c0d-catalog-content\") pod \"0722dbab-2741-445d-8585-04eaf3738c0d\" (UID: \"0722dbab-2741-445d-8585-04eaf3738c0d\") " Mar 19 09:49:38 crc kubenswrapper[4835]: I0319 09:49:38.924681 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0722dbab-2741-445d-8585-04eaf3738c0d-utilities\") pod \"0722dbab-2741-445d-8585-04eaf3738c0d\" (UID: \"0722dbab-2741-445d-8585-04eaf3738c0d\") " Mar 19 09:49:38 crc kubenswrapper[4835]: I0319 09:49:38.925046 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nntb9\" (UniqueName: \"kubernetes.io/projected/0722dbab-2741-445d-8585-04eaf3738c0d-kube-api-access-nntb9\") pod \"0722dbab-2741-445d-8585-04eaf3738c0d\" (UID: \"0722dbab-2741-445d-8585-04eaf3738c0d\") " Mar 19 09:49:38 crc kubenswrapper[4835]: I0319 09:49:38.926004 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0722dbab-2741-445d-8585-04eaf3738c0d-utilities" (OuterVolumeSpecName: "utilities") pod "0722dbab-2741-445d-8585-04eaf3738c0d" (UID: "0722dbab-2741-445d-8585-04eaf3738c0d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:49:38 crc kubenswrapper[4835]: I0319 09:49:38.931685 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0722dbab-2741-445d-8585-04eaf3738c0d-kube-api-access-nntb9" (OuterVolumeSpecName: "kube-api-access-nntb9") pod "0722dbab-2741-445d-8585-04eaf3738c0d" (UID: "0722dbab-2741-445d-8585-04eaf3738c0d"). InnerVolumeSpecName "kube-api-access-nntb9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:49:38 crc kubenswrapper[4835]: I0319 09:49:38.977231 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0722dbab-2741-445d-8585-04eaf3738c0d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0722dbab-2741-445d-8585-04eaf3738c0d" (UID: "0722dbab-2741-445d-8585-04eaf3738c0d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:49:39 crc kubenswrapper[4835]: I0319 09:49:39.025126 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 19 09:49:39 crc kubenswrapper[4835]: I0319 09:49:39.025274 4835 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 09:49:39 crc kubenswrapper[4835]: I0319 09:49:39.026849 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 19 09:49:39 crc kubenswrapper[4835]: I0319 09:49:39.027178 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nntb9\" (UniqueName: \"kubernetes.io/projected/0722dbab-2741-445d-8585-04eaf3738c0d-kube-api-access-nntb9\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:39 crc kubenswrapper[4835]: I0319 09:49:39.027208 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0722dbab-2741-445d-8585-04eaf3738c0d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:39 crc kubenswrapper[4835]: I0319 09:49:39.027218 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0722dbab-2741-445d-8585-04eaf3738c0d-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:39 crc kubenswrapper[4835]: I0319 09:49:39.157171 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"8fc25787-db79-456d-ab01-93be995791cd","Type":"ContainerStarted","Data":"8bd27381994e8a8c1cfc58b27cb298b4fc9bdab8c89c87569d475053d317c8a6"} Mar 19 09:49:39 crc kubenswrapper[4835]: I0319 09:49:39.169435 4835 generic.go:334] "Generic (PLEG): container finished" podID="0722dbab-2741-445d-8585-04eaf3738c0d" containerID="84e6424ea427e17a2fd8692f654e8b3e2d680f8d5cfb5b1f03b45085a09f703e" exitCode=0 Mar 19 09:49:39 crc kubenswrapper[4835]: I0319 09:49:39.169507 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zbxd8" Mar 19 09:49:39 crc kubenswrapper[4835]: I0319 09:49:39.169524 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbxd8" event={"ID":"0722dbab-2741-445d-8585-04eaf3738c0d","Type":"ContainerDied","Data":"84e6424ea427e17a2fd8692f654e8b3e2d680f8d5cfb5b1f03b45085a09f703e"} Mar 19 09:49:39 crc kubenswrapper[4835]: I0319 09:49:39.170279 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbxd8" event={"ID":"0722dbab-2741-445d-8585-04eaf3738c0d","Type":"ContainerDied","Data":"1d4e761a01920eae9082e14e20b564baa5580af6ad598017d6ee989319921941"} Mar 19 09:49:39 crc kubenswrapper[4835]: I0319 09:49:39.170316 4835 scope.go:117] "RemoveContainer" containerID="84e6424ea427e17a2fd8692f654e8b3e2d680f8d5cfb5b1f03b45085a09f703e" Mar 19 09:49:39 crc kubenswrapper[4835]: I0319 09:49:39.255242 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zbxd8"] Mar 19 09:49:39 crc kubenswrapper[4835]: I0319 09:49:39.300129 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zbxd8"] Mar 19 09:49:40 crc kubenswrapper[4835]: I0319 09:49:40.179035 4835 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 09:49:40 crc kubenswrapper[4835]: I0319 09:49:40.179073 4835 
prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 09:49:40 crc kubenswrapper[4835]: I0319 09:49:40.416785 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0722dbab-2741-445d-8585-04eaf3738c0d" path="/var/lib/kubelet/pods/0722dbab-2741-445d-8585-04eaf3738c0d/volumes" Mar 19 09:49:40 crc kubenswrapper[4835]: I0319 09:49:40.514661 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 19 09:49:40 crc kubenswrapper[4835]: I0319 09:49:40.514759 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 19 09:49:43 crc kubenswrapper[4835]: I0319 09:49:43.402971 4835 scope.go:117] "RemoveContainer" containerID="d93f2f0fef5a3fe52d6e4aab02e5290ac85405643bc520caaef82b7b23fd8ee3" Mar 19 09:49:43 crc kubenswrapper[4835]: E0319 09:49:43.403592 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 09:49:44 crc kubenswrapper[4835]: I0319 09:49:44.705895 4835 scope.go:117] "RemoveContainer" containerID="a5b978755ae87a445fa89c43173d1251bf6dceafe546a4dbf3cc4982f81bf7bd" Mar 19 09:49:44 crc kubenswrapper[4835]: I0319 09:49:44.780924 4835 scope.go:117] "RemoveContainer" containerID="a5eec26c4305c6b97108fcf86d873ecc84ebfb5e256db9dfdf33d4091bd0f83f" Mar 19 09:49:44 crc kubenswrapper[4835]: I0319 09:49:44.828927 4835 scope.go:117] "RemoveContainer" containerID="84e6424ea427e17a2fd8692f654e8b3e2d680f8d5cfb5b1f03b45085a09f703e" Mar 19 09:49:44 crc kubenswrapper[4835]: E0319 09:49:44.830489 4835 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84e6424ea427e17a2fd8692f654e8b3e2d680f8d5cfb5b1f03b45085a09f703e\": container with ID starting with 84e6424ea427e17a2fd8692f654e8b3e2d680f8d5cfb5b1f03b45085a09f703e not found: ID does not exist" containerID="84e6424ea427e17a2fd8692f654e8b3e2d680f8d5cfb5b1f03b45085a09f703e" Mar 19 09:49:44 crc kubenswrapper[4835]: I0319 09:49:44.830531 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84e6424ea427e17a2fd8692f654e8b3e2d680f8d5cfb5b1f03b45085a09f703e"} err="failed to get container status \"84e6424ea427e17a2fd8692f654e8b3e2d680f8d5cfb5b1f03b45085a09f703e\": rpc error: code = NotFound desc = could not find container \"84e6424ea427e17a2fd8692f654e8b3e2d680f8d5cfb5b1f03b45085a09f703e\": container with ID starting with 84e6424ea427e17a2fd8692f654e8b3e2d680f8d5cfb5b1f03b45085a09f703e not found: ID does not exist" Mar 19 09:49:44 crc kubenswrapper[4835]: I0319 09:49:44.830556 4835 scope.go:117] "RemoveContainer" containerID="a5b978755ae87a445fa89c43173d1251bf6dceafe546a4dbf3cc4982f81bf7bd" Mar 19 09:49:44 crc kubenswrapper[4835]: E0319 09:49:44.831852 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5b978755ae87a445fa89c43173d1251bf6dceafe546a4dbf3cc4982f81bf7bd\": container with ID starting with a5b978755ae87a445fa89c43173d1251bf6dceafe546a4dbf3cc4982f81bf7bd not found: ID does not exist" containerID="a5b978755ae87a445fa89c43173d1251bf6dceafe546a4dbf3cc4982f81bf7bd" Mar 19 09:49:44 crc kubenswrapper[4835]: I0319 09:49:44.831886 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5b978755ae87a445fa89c43173d1251bf6dceafe546a4dbf3cc4982f81bf7bd"} err="failed to get container status \"a5b978755ae87a445fa89c43173d1251bf6dceafe546a4dbf3cc4982f81bf7bd\": rpc error: code = NotFound desc = could not find container 
\"a5b978755ae87a445fa89c43173d1251bf6dceafe546a4dbf3cc4982f81bf7bd\": container with ID starting with a5b978755ae87a445fa89c43173d1251bf6dceafe546a4dbf3cc4982f81bf7bd not found: ID does not exist" Mar 19 09:49:44 crc kubenswrapper[4835]: I0319 09:49:44.831905 4835 scope.go:117] "RemoveContainer" containerID="a5eec26c4305c6b97108fcf86d873ecc84ebfb5e256db9dfdf33d4091bd0f83f" Mar 19 09:49:44 crc kubenswrapper[4835]: E0319 09:49:44.832442 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5eec26c4305c6b97108fcf86d873ecc84ebfb5e256db9dfdf33d4091bd0f83f\": container with ID starting with a5eec26c4305c6b97108fcf86d873ecc84ebfb5e256db9dfdf33d4091bd0f83f not found: ID does not exist" containerID="a5eec26c4305c6b97108fcf86d873ecc84ebfb5e256db9dfdf33d4091bd0f83f" Mar 19 09:49:44 crc kubenswrapper[4835]: I0319 09:49:44.832465 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5eec26c4305c6b97108fcf86d873ecc84ebfb5e256db9dfdf33d4091bd0f83f"} err="failed to get container status \"a5eec26c4305c6b97108fcf86d873ecc84ebfb5e256db9dfdf33d4091bd0f83f\": rpc error: code = NotFound desc = could not find container \"a5eec26c4305c6b97108fcf86d873ecc84ebfb5e256db9dfdf33d4091bd0f83f\": container with ID starting with a5eec26c4305c6b97108fcf86d873ecc84ebfb5e256db9dfdf33d4091bd0f83f not found: ID does not exist" Mar 19 09:49:45 crc kubenswrapper[4835]: I0319 09:49:45.242781 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qfttg" event={"ID":"2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda","Type":"ContainerStarted","Data":"b273eb566783dbb8ddac7dc0d86f30d341fc5479f6e0d007c37b2fe663cf1f0c"} Mar 19 09:49:45 crc kubenswrapper[4835]: I0319 09:49:45.245441 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"8fc25787-db79-456d-ab01-93be995791cd","Type":"ContainerStarted","Data":"ee32aafe31d0cad8fe30769b35038a2e96d7b5bfada158807b7687af8ce48996"} Mar 19 09:49:45 crc kubenswrapper[4835]: I0319 09:49:45.274566 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-qfttg" podStartSLOduration=1.465257489 podStartE2EDuration="11.274543101s" podCreationTimestamp="2026-03-19 09:49:34 +0000 UTC" firstStartedPulling="2026-03-19 09:49:34.985909018 +0000 UTC m=+1629.834507605" lastFinishedPulling="2026-03-19 09:49:44.79519463 +0000 UTC m=+1639.643793217" observedRunningTime="2026-03-19 09:49:45.264298048 +0000 UTC m=+1640.112896635" watchObservedRunningTime="2026-03-19 09:49:45.274543101 +0000 UTC m=+1640.123141688" Mar 19 09:49:46 crc kubenswrapper[4835]: I0319 09:49:46.258862 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fc25787-db79-456d-ab01-93be995791cd","Type":"ContainerStarted","Data":"6a0d6a80dccde0b9fce495b6fada5fea9f4f1551089f32a8de55479e9047f41e"} Mar 19 09:49:48 crc kubenswrapper[4835]: I0319 09:49:48.297959 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fc25787-db79-456d-ab01-93be995791cd","Type":"ContainerStarted","Data":"a1718b277f7b11a044e80c9fcf2c82e9c4a3e5d0bfa0041ee551fb6694713049"} Mar 19 09:49:48 crc kubenswrapper[4835]: I0319 09:49:48.298760 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 09:49:48 crc kubenswrapper[4835]: I0319 09:49:48.355431 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.199971374 podStartE2EDuration="11.355402561s" podCreationTimestamp="2026-03-19 09:49:37 +0000 UTC" firstStartedPulling="2026-03-19 09:49:38.076001372 +0000 UTC m=+1632.924599969" lastFinishedPulling="2026-03-19 09:49:47.231432569 +0000 UTC m=+1642.080031156" 
observedRunningTime="2026-03-19 09:49:48.338283635 +0000 UTC m=+1643.186882262" watchObservedRunningTime="2026-03-19 09:49:48.355402561 +0000 UTC m=+1643.204001168" Mar 19 09:49:52 crc kubenswrapper[4835]: I0319 09:49:52.227177 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:49:52 crc kubenswrapper[4835]: I0319 09:49:52.227923 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8fc25787-db79-456d-ab01-93be995791cd" containerName="ceilometer-central-agent" containerID="cri-o://8bd27381994e8a8c1cfc58b27cb298b4fc9bdab8c89c87569d475053d317c8a6" gracePeriod=30 Mar 19 09:49:52 crc kubenswrapper[4835]: I0319 09:49:52.227953 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8fc25787-db79-456d-ab01-93be995791cd" containerName="sg-core" containerID="cri-o://6a0d6a80dccde0b9fce495b6fada5fea9f4f1551089f32a8de55479e9047f41e" gracePeriod=30 Mar 19 09:49:52 crc kubenswrapper[4835]: I0319 09:49:52.227932 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8fc25787-db79-456d-ab01-93be995791cd" containerName="proxy-httpd" containerID="cri-o://a1718b277f7b11a044e80c9fcf2c82e9c4a3e5d0bfa0041ee551fb6694713049" gracePeriod=30 Mar 19 09:49:52 crc kubenswrapper[4835]: I0319 09:49:52.228006 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8fc25787-db79-456d-ab01-93be995791cd" containerName="ceilometer-notification-agent" containerID="cri-o://ee32aafe31d0cad8fe30769b35038a2e96d7b5bfada158807b7687af8ce48996" gracePeriod=30 Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.110208 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.298430 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fc25787-db79-456d-ab01-93be995791cd-combined-ca-bundle\") pod \"8fc25787-db79-456d-ab01-93be995791cd\" (UID: \"8fc25787-db79-456d-ab01-93be995791cd\") " Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.298809 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fc25787-db79-456d-ab01-93be995791cd-scripts\") pod \"8fc25787-db79-456d-ab01-93be995791cd\" (UID: \"8fc25787-db79-456d-ab01-93be995791cd\") " Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.298874 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgcxs\" (UniqueName: \"kubernetes.io/projected/8fc25787-db79-456d-ab01-93be995791cd-kube-api-access-qgcxs\") pod \"8fc25787-db79-456d-ab01-93be995791cd\" (UID: \"8fc25787-db79-456d-ab01-93be995791cd\") " Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.299069 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8fc25787-db79-456d-ab01-93be995791cd-sg-core-conf-yaml\") pod \"8fc25787-db79-456d-ab01-93be995791cd\" (UID: \"8fc25787-db79-456d-ab01-93be995791cd\") " Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.299138 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fc25787-db79-456d-ab01-93be995791cd-log-httpd\") pod \"8fc25787-db79-456d-ab01-93be995791cd\" (UID: \"8fc25787-db79-456d-ab01-93be995791cd\") " Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.299195 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8fc25787-db79-456d-ab01-93be995791cd-config-data\") pod \"8fc25787-db79-456d-ab01-93be995791cd\" (UID: \"8fc25787-db79-456d-ab01-93be995791cd\") " Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.299233 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fc25787-db79-456d-ab01-93be995791cd-run-httpd\") pod \"8fc25787-db79-456d-ab01-93be995791cd\" (UID: \"8fc25787-db79-456d-ab01-93be995791cd\") " Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.299673 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fc25787-db79-456d-ab01-93be995791cd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8fc25787-db79-456d-ab01-93be995791cd" (UID: "8fc25787-db79-456d-ab01-93be995791cd"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.300092 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fc25787-db79-456d-ab01-93be995791cd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8fc25787-db79-456d-ab01-93be995791cd" (UID: "8fc25787-db79-456d-ab01-93be995791cd"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.301423 4835 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fc25787-db79-456d-ab01-93be995791cd-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.301471 4835 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fc25787-db79-456d-ab01-93be995791cd-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.305009 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fc25787-db79-456d-ab01-93be995791cd-scripts" (OuterVolumeSpecName: "scripts") pod "8fc25787-db79-456d-ab01-93be995791cd" (UID: "8fc25787-db79-456d-ab01-93be995791cd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.305094 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fc25787-db79-456d-ab01-93be995791cd-kube-api-access-qgcxs" (OuterVolumeSpecName: "kube-api-access-qgcxs") pod "8fc25787-db79-456d-ab01-93be995791cd" (UID: "8fc25787-db79-456d-ab01-93be995791cd"). InnerVolumeSpecName "kube-api-access-qgcxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.336946 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fc25787-db79-456d-ab01-93be995791cd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8fc25787-db79-456d-ab01-93be995791cd" (UID: "8fc25787-db79-456d-ab01-93be995791cd"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.359200 4835 generic.go:334] "Generic (PLEG): container finished" podID="8fc25787-db79-456d-ab01-93be995791cd" containerID="a1718b277f7b11a044e80c9fcf2c82e9c4a3e5d0bfa0041ee551fb6694713049" exitCode=0 Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.359238 4835 generic.go:334] "Generic (PLEG): container finished" podID="8fc25787-db79-456d-ab01-93be995791cd" containerID="6a0d6a80dccde0b9fce495b6fada5fea9f4f1551089f32a8de55479e9047f41e" exitCode=2 Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.359248 4835 generic.go:334] "Generic (PLEG): container finished" podID="8fc25787-db79-456d-ab01-93be995791cd" containerID="ee32aafe31d0cad8fe30769b35038a2e96d7b5bfada158807b7687af8ce48996" exitCode=0 Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.359258 4835 generic.go:334] "Generic (PLEG): container finished" podID="8fc25787-db79-456d-ab01-93be995791cd" containerID="8bd27381994e8a8c1cfc58b27cb298b4fc9bdab8c89c87569d475053d317c8a6" exitCode=0 Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.359280 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fc25787-db79-456d-ab01-93be995791cd","Type":"ContainerDied","Data":"a1718b277f7b11a044e80c9fcf2c82e9c4a3e5d0bfa0041ee551fb6694713049"} Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.359294 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.359312 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fc25787-db79-456d-ab01-93be995791cd","Type":"ContainerDied","Data":"6a0d6a80dccde0b9fce495b6fada5fea9f4f1551089f32a8de55479e9047f41e"} Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.359328 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fc25787-db79-456d-ab01-93be995791cd","Type":"ContainerDied","Data":"ee32aafe31d0cad8fe30769b35038a2e96d7b5bfada158807b7687af8ce48996"} Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.359341 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fc25787-db79-456d-ab01-93be995791cd","Type":"ContainerDied","Data":"8bd27381994e8a8c1cfc58b27cb298b4fc9bdab8c89c87569d475053d317c8a6"} Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.359352 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fc25787-db79-456d-ab01-93be995791cd","Type":"ContainerDied","Data":"5f8f6f7fc199a90c8704b7c5ff6b120e666a08d18e1298567ed12b0ced40ee80"} Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.359353 4835 scope.go:117] "RemoveContainer" containerID="a1718b277f7b11a044e80c9fcf2c82e9c4a3e5d0bfa0041ee551fb6694713049" Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.398062 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fc25787-db79-456d-ab01-93be995791cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8fc25787-db79-456d-ab01-93be995791cd" (UID: "8fc25787-db79-456d-ab01-93be995791cd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.404995 4835 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8fc25787-db79-456d-ab01-93be995791cd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.405024 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fc25787-db79-456d-ab01-93be995791cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.405039 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fc25787-db79-456d-ab01-93be995791cd-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.405049 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgcxs\" (UniqueName: \"kubernetes.io/projected/8fc25787-db79-456d-ab01-93be995791cd-kube-api-access-qgcxs\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.433086 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fc25787-db79-456d-ab01-93be995791cd-config-data" (OuterVolumeSpecName: "config-data") pod "8fc25787-db79-456d-ab01-93be995791cd" (UID: "8fc25787-db79-456d-ab01-93be995791cd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.507271 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fc25787-db79-456d-ab01-93be995791cd-config-data\") on node \"crc\" DevicePath \"\""
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.554357 4835 scope.go:117] "RemoveContainer" containerID="6a0d6a80dccde0b9fce495b6fada5fea9f4f1551089f32a8de55479e9047f41e"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.576286 4835 scope.go:117] "RemoveContainer" containerID="ee32aafe31d0cad8fe30769b35038a2e96d7b5bfada158807b7687af8ce48996"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.603361 4835 scope.go:117] "RemoveContainer" containerID="8bd27381994e8a8c1cfc58b27cb298b4fc9bdab8c89c87569d475053d317c8a6"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.629915 4835 scope.go:117] "RemoveContainer" containerID="a1718b277f7b11a044e80c9fcf2c82e9c4a3e5d0bfa0041ee551fb6694713049"
Mar 19 09:49:53 crc kubenswrapper[4835]: E0319 09:49:53.630485 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1718b277f7b11a044e80c9fcf2c82e9c4a3e5d0bfa0041ee551fb6694713049\": container with ID starting with a1718b277f7b11a044e80c9fcf2c82e9c4a3e5d0bfa0041ee551fb6694713049 not found: ID does not exist" containerID="a1718b277f7b11a044e80c9fcf2c82e9c4a3e5d0bfa0041ee551fb6694713049"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.630524 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1718b277f7b11a044e80c9fcf2c82e9c4a3e5d0bfa0041ee551fb6694713049"} err="failed to get container status \"a1718b277f7b11a044e80c9fcf2c82e9c4a3e5d0bfa0041ee551fb6694713049\": rpc error: code = NotFound desc = could not find container \"a1718b277f7b11a044e80c9fcf2c82e9c4a3e5d0bfa0041ee551fb6694713049\": container with ID starting with a1718b277f7b11a044e80c9fcf2c82e9c4a3e5d0bfa0041ee551fb6694713049 not found: ID does not exist"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.630554 4835 scope.go:117] "RemoveContainer" containerID="6a0d6a80dccde0b9fce495b6fada5fea9f4f1551089f32a8de55479e9047f41e"
Mar 19 09:49:53 crc kubenswrapper[4835]: E0319 09:49:53.631525 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a0d6a80dccde0b9fce495b6fada5fea9f4f1551089f32a8de55479e9047f41e\": container with ID starting with 6a0d6a80dccde0b9fce495b6fada5fea9f4f1551089f32a8de55479e9047f41e not found: ID does not exist" containerID="6a0d6a80dccde0b9fce495b6fada5fea9f4f1551089f32a8de55479e9047f41e"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.631558 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a0d6a80dccde0b9fce495b6fada5fea9f4f1551089f32a8de55479e9047f41e"} err="failed to get container status \"6a0d6a80dccde0b9fce495b6fada5fea9f4f1551089f32a8de55479e9047f41e\": rpc error: code = NotFound desc = could not find container \"6a0d6a80dccde0b9fce495b6fada5fea9f4f1551089f32a8de55479e9047f41e\": container with ID starting with 6a0d6a80dccde0b9fce495b6fada5fea9f4f1551089f32a8de55479e9047f41e not found: ID does not exist"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.631576 4835 scope.go:117] "RemoveContainer" containerID="ee32aafe31d0cad8fe30769b35038a2e96d7b5bfada158807b7687af8ce48996"
Mar 19 09:49:53 crc kubenswrapper[4835]: E0319 09:49:53.631908 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee32aafe31d0cad8fe30769b35038a2e96d7b5bfada158807b7687af8ce48996\": container with ID starting with ee32aafe31d0cad8fe30769b35038a2e96d7b5bfada158807b7687af8ce48996 not found: ID does not exist" containerID="ee32aafe31d0cad8fe30769b35038a2e96d7b5bfada158807b7687af8ce48996"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.631935 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee32aafe31d0cad8fe30769b35038a2e96d7b5bfada158807b7687af8ce48996"} err="failed to get container status \"ee32aafe31d0cad8fe30769b35038a2e96d7b5bfada158807b7687af8ce48996\": rpc error: code = NotFound desc = could not find container \"ee32aafe31d0cad8fe30769b35038a2e96d7b5bfada158807b7687af8ce48996\": container with ID starting with ee32aafe31d0cad8fe30769b35038a2e96d7b5bfada158807b7687af8ce48996 not found: ID does not exist"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.631948 4835 scope.go:117] "RemoveContainer" containerID="8bd27381994e8a8c1cfc58b27cb298b4fc9bdab8c89c87569d475053d317c8a6"
Mar 19 09:49:53 crc kubenswrapper[4835]: E0319 09:49:53.632558 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bd27381994e8a8c1cfc58b27cb298b4fc9bdab8c89c87569d475053d317c8a6\": container with ID starting with 8bd27381994e8a8c1cfc58b27cb298b4fc9bdab8c89c87569d475053d317c8a6 not found: ID does not exist" containerID="8bd27381994e8a8c1cfc58b27cb298b4fc9bdab8c89c87569d475053d317c8a6"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.632580 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bd27381994e8a8c1cfc58b27cb298b4fc9bdab8c89c87569d475053d317c8a6"} err="failed to get container status \"8bd27381994e8a8c1cfc58b27cb298b4fc9bdab8c89c87569d475053d317c8a6\": rpc error: code = NotFound desc = could not find container \"8bd27381994e8a8c1cfc58b27cb298b4fc9bdab8c89c87569d475053d317c8a6\": container with ID starting with 8bd27381994e8a8c1cfc58b27cb298b4fc9bdab8c89c87569d475053d317c8a6 not found: ID does not exist"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.632596 4835 scope.go:117] "RemoveContainer" containerID="a1718b277f7b11a044e80c9fcf2c82e9c4a3e5d0bfa0041ee551fb6694713049"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.633108 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1718b277f7b11a044e80c9fcf2c82e9c4a3e5d0bfa0041ee551fb6694713049"} err="failed to get container status \"a1718b277f7b11a044e80c9fcf2c82e9c4a3e5d0bfa0041ee551fb6694713049\": rpc error: code = NotFound desc = could not find container \"a1718b277f7b11a044e80c9fcf2c82e9c4a3e5d0bfa0041ee551fb6694713049\": container with ID starting with a1718b277f7b11a044e80c9fcf2c82e9c4a3e5d0bfa0041ee551fb6694713049 not found: ID does not exist"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.633126 4835 scope.go:117] "RemoveContainer" containerID="6a0d6a80dccde0b9fce495b6fada5fea9f4f1551089f32a8de55479e9047f41e"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.633404 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a0d6a80dccde0b9fce495b6fada5fea9f4f1551089f32a8de55479e9047f41e"} err="failed to get container status \"6a0d6a80dccde0b9fce495b6fada5fea9f4f1551089f32a8de55479e9047f41e\": rpc error: code = NotFound desc = could not find container \"6a0d6a80dccde0b9fce495b6fada5fea9f4f1551089f32a8de55479e9047f41e\": container with ID starting with 6a0d6a80dccde0b9fce495b6fada5fea9f4f1551089f32a8de55479e9047f41e not found: ID does not exist"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.633425 4835 scope.go:117] "RemoveContainer" containerID="ee32aafe31d0cad8fe30769b35038a2e96d7b5bfada158807b7687af8ce48996"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.636363 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee32aafe31d0cad8fe30769b35038a2e96d7b5bfada158807b7687af8ce48996"} err="failed to get container status \"ee32aafe31d0cad8fe30769b35038a2e96d7b5bfada158807b7687af8ce48996\": rpc error: code = NotFound desc = could not find container \"ee32aafe31d0cad8fe30769b35038a2e96d7b5bfada158807b7687af8ce48996\": container with ID starting with ee32aafe31d0cad8fe30769b35038a2e96d7b5bfada158807b7687af8ce48996 not found: ID does not exist"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.636419 4835 scope.go:117] "RemoveContainer" containerID="8bd27381994e8a8c1cfc58b27cb298b4fc9bdab8c89c87569d475053d317c8a6"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.636910 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bd27381994e8a8c1cfc58b27cb298b4fc9bdab8c89c87569d475053d317c8a6"} err="failed to get container status \"8bd27381994e8a8c1cfc58b27cb298b4fc9bdab8c89c87569d475053d317c8a6\": rpc error: code = NotFound desc = could not find container \"8bd27381994e8a8c1cfc58b27cb298b4fc9bdab8c89c87569d475053d317c8a6\": container with ID starting with 8bd27381994e8a8c1cfc58b27cb298b4fc9bdab8c89c87569d475053d317c8a6 not found: ID does not exist"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.636937 4835 scope.go:117] "RemoveContainer" containerID="a1718b277f7b11a044e80c9fcf2c82e9c4a3e5d0bfa0041ee551fb6694713049"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.637244 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1718b277f7b11a044e80c9fcf2c82e9c4a3e5d0bfa0041ee551fb6694713049"} err="failed to get container status \"a1718b277f7b11a044e80c9fcf2c82e9c4a3e5d0bfa0041ee551fb6694713049\": rpc error: code = NotFound desc = could not find container \"a1718b277f7b11a044e80c9fcf2c82e9c4a3e5d0bfa0041ee551fb6694713049\": container with ID starting with a1718b277f7b11a044e80c9fcf2c82e9c4a3e5d0bfa0041ee551fb6694713049 not found: ID does not exist"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.637289 4835 scope.go:117] "RemoveContainer" containerID="6a0d6a80dccde0b9fce495b6fada5fea9f4f1551089f32a8de55479e9047f41e"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.638203 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a0d6a80dccde0b9fce495b6fada5fea9f4f1551089f32a8de55479e9047f41e"} err="failed to get container status \"6a0d6a80dccde0b9fce495b6fada5fea9f4f1551089f32a8de55479e9047f41e\": rpc error: code = NotFound desc = could not find container \"6a0d6a80dccde0b9fce495b6fada5fea9f4f1551089f32a8de55479e9047f41e\": container with ID starting with 6a0d6a80dccde0b9fce495b6fada5fea9f4f1551089f32a8de55479e9047f41e not found: ID does not exist"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.638226 4835 scope.go:117] "RemoveContainer" containerID="ee32aafe31d0cad8fe30769b35038a2e96d7b5bfada158807b7687af8ce48996"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.639608 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee32aafe31d0cad8fe30769b35038a2e96d7b5bfada158807b7687af8ce48996"} err="failed to get container status \"ee32aafe31d0cad8fe30769b35038a2e96d7b5bfada158807b7687af8ce48996\": rpc error: code = NotFound desc = could not find container \"ee32aafe31d0cad8fe30769b35038a2e96d7b5bfada158807b7687af8ce48996\": container with ID starting with ee32aafe31d0cad8fe30769b35038a2e96d7b5bfada158807b7687af8ce48996 not found: ID does not exist"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.639633 4835 scope.go:117] "RemoveContainer" containerID="8bd27381994e8a8c1cfc58b27cb298b4fc9bdab8c89c87569d475053d317c8a6"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.640683 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bd27381994e8a8c1cfc58b27cb298b4fc9bdab8c89c87569d475053d317c8a6"} err="failed to get container status \"8bd27381994e8a8c1cfc58b27cb298b4fc9bdab8c89c87569d475053d317c8a6\": rpc error: code = NotFound desc = could not find container \"8bd27381994e8a8c1cfc58b27cb298b4fc9bdab8c89c87569d475053d317c8a6\": container with ID starting with 8bd27381994e8a8c1cfc58b27cb298b4fc9bdab8c89c87569d475053d317c8a6 not found: ID does not exist"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.640704 4835 scope.go:117] "RemoveContainer" containerID="a1718b277f7b11a044e80c9fcf2c82e9c4a3e5d0bfa0041ee551fb6694713049"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.641049 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1718b277f7b11a044e80c9fcf2c82e9c4a3e5d0bfa0041ee551fb6694713049"} err="failed to get container status \"a1718b277f7b11a044e80c9fcf2c82e9c4a3e5d0bfa0041ee551fb6694713049\": rpc error: code = NotFound desc = could not find container \"a1718b277f7b11a044e80c9fcf2c82e9c4a3e5d0bfa0041ee551fb6694713049\": container with ID starting with a1718b277f7b11a044e80c9fcf2c82e9c4a3e5d0bfa0041ee551fb6694713049 not found: ID does not exist"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.641084 4835 scope.go:117] "RemoveContainer" containerID="6a0d6a80dccde0b9fce495b6fada5fea9f4f1551089f32a8de55479e9047f41e"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.641333 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a0d6a80dccde0b9fce495b6fada5fea9f4f1551089f32a8de55479e9047f41e"} err="failed to get container status \"6a0d6a80dccde0b9fce495b6fada5fea9f4f1551089f32a8de55479e9047f41e\": rpc error: code = NotFound desc = could not find container \"6a0d6a80dccde0b9fce495b6fada5fea9f4f1551089f32a8de55479e9047f41e\": container with ID starting with 6a0d6a80dccde0b9fce495b6fada5fea9f4f1551089f32a8de55479e9047f41e not found: ID does not exist"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.641348 4835 scope.go:117] "RemoveContainer" containerID="ee32aafe31d0cad8fe30769b35038a2e96d7b5bfada158807b7687af8ce48996"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.641641 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee32aafe31d0cad8fe30769b35038a2e96d7b5bfada158807b7687af8ce48996"} err="failed to get container status \"ee32aafe31d0cad8fe30769b35038a2e96d7b5bfada158807b7687af8ce48996\": rpc error: code = NotFound desc = could not find container \"ee32aafe31d0cad8fe30769b35038a2e96d7b5bfada158807b7687af8ce48996\": container with ID starting with ee32aafe31d0cad8fe30769b35038a2e96d7b5bfada158807b7687af8ce48996 not found: ID does not exist"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.641681 4835 scope.go:117] "RemoveContainer" containerID="8bd27381994e8a8c1cfc58b27cb298b4fc9bdab8c89c87569d475053d317c8a6"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.641964 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bd27381994e8a8c1cfc58b27cb298b4fc9bdab8c89c87569d475053d317c8a6"} err="failed to get container status \"8bd27381994e8a8c1cfc58b27cb298b4fc9bdab8c89c87569d475053d317c8a6\": rpc error: code = NotFound desc = could not find container \"8bd27381994e8a8c1cfc58b27cb298b4fc9bdab8c89c87569d475053d317c8a6\": container with ID starting with 8bd27381994e8a8c1cfc58b27cb298b4fc9bdab8c89c87569d475053d317c8a6 not found: ID does not exist"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.736492 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.749385 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.759856 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 19 09:49:53 crc kubenswrapper[4835]: E0319 09:49:53.760571 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fc25787-db79-456d-ab01-93be995791cd" containerName="sg-core"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.760599 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fc25787-db79-456d-ab01-93be995791cd" containerName="sg-core"
Mar 19 09:49:53 crc kubenswrapper[4835]: E0319 09:49:53.760626 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0722dbab-2741-445d-8585-04eaf3738c0d" containerName="extract-content"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.760635 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="0722dbab-2741-445d-8585-04eaf3738c0d" containerName="extract-content"
Mar 19 09:49:53 crc kubenswrapper[4835]: E0319 09:49:53.760650 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0722dbab-2741-445d-8585-04eaf3738c0d" containerName="registry-server"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.760658 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="0722dbab-2741-445d-8585-04eaf3738c0d" containerName="registry-server"
Mar 19 09:49:53 crc kubenswrapper[4835]: E0319 09:49:53.760668 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fc25787-db79-456d-ab01-93be995791cd" containerName="ceilometer-central-agent"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.760675 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fc25787-db79-456d-ab01-93be995791cd" containerName="ceilometer-central-agent"
Mar 19 09:49:53 crc kubenswrapper[4835]: E0319 09:49:53.760696 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fc25787-db79-456d-ab01-93be995791cd" containerName="proxy-httpd"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.760704 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fc25787-db79-456d-ab01-93be995791cd" containerName="proxy-httpd"
Mar 19 09:49:53 crc kubenswrapper[4835]: E0319 09:49:53.760715 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0722dbab-2741-445d-8585-04eaf3738c0d" containerName="extract-utilities"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.760724 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="0722dbab-2741-445d-8585-04eaf3738c0d" containerName="extract-utilities"
Mar 19 09:49:53 crc kubenswrapper[4835]: E0319 09:49:53.760778 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fc25787-db79-456d-ab01-93be995791cd" containerName="ceilometer-notification-agent"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.760787 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fc25787-db79-456d-ab01-93be995791cd" containerName="ceilometer-notification-agent"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.761144 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="0722dbab-2741-445d-8585-04eaf3738c0d" containerName="registry-server"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.761169 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fc25787-db79-456d-ab01-93be995791cd" containerName="ceilometer-notification-agent"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.761179 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fc25787-db79-456d-ab01-93be995791cd" containerName="sg-core"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.761199 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fc25787-db79-456d-ab01-93be995791cd" containerName="proxy-httpd"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.761207 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fc25787-db79-456d-ab01-93be995791cd" containerName="ceilometer-central-agent"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.763793 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.767372 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.767460 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.775036 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.824493 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x7bn\" (UniqueName: \"kubernetes.io/projected/9cb07c1b-0e6d-4511-ad45-910d6184955f-kube-api-access-9x7bn\") pod \"ceilometer-0\" (UID: \"9cb07c1b-0e6d-4511-ad45-910d6184955f\") " pod="openstack/ceilometer-0"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.824538 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9cb07c1b-0e6d-4511-ad45-910d6184955f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9cb07c1b-0e6d-4511-ad45-910d6184955f\") " pod="openstack/ceilometer-0"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.824576 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cb07c1b-0e6d-4511-ad45-910d6184955f-config-data\") pod \"ceilometer-0\" (UID: \"9cb07c1b-0e6d-4511-ad45-910d6184955f\") " pod="openstack/ceilometer-0"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.824616 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cb07c1b-0e6d-4511-ad45-910d6184955f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9cb07c1b-0e6d-4511-ad45-910d6184955f\") " pod="openstack/ceilometer-0"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.824695 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cb07c1b-0e6d-4511-ad45-910d6184955f-scripts\") pod \"ceilometer-0\" (UID: \"9cb07c1b-0e6d-4511-ad45-910d6184955f\") " pod="openstack/ceilometer-0"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.825178 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cb07c1b-0e6d-4511-ad45-910d6184955f-run-httpd\") pod \"ceilometer-0\" (UID: \"9cb07c1b-0e6d-4511-ad45-910d6184955f\") " pod="openstack/ceilometer-0"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.825289 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cb07c1b-0e6d-4511-ad45-910d6184955f-log-httpd\") pod \"ceilometer-0\" (UID: \"9cb07c1b-0e6d-4511-ad45-910d6184955f\") " pod="openstack/ceilometer-0"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.927395 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cb07c1b-0e6d-4511-ad45-910d6184955f-log-httpd\") pod \"ceilometer-0\" (UID: \"9cb07c1b-0e6d-4511-ad45-910d6184955f\") " pod="openstack/ceilometer-0"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.927854 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x7bn\" (UniqueName: \"kubernetes.io/projected/9cb07c1b-0e6d-4511-ad45-910d6184955f-kube-api-access-9x7bn\") pod \"ceilometer-0\" (UID: \"9cb07c1b-0e6d-4511-ad45-910d6184955f\") " pod="openstack/ceilometer-0"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.927885 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9cb07c1b-0e6d-4511-ad45-910d6184955f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9cb07c1b-0e6d-4511-ad45-910d6184955f\") " pod="openstack/ceilometer-0"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.927915 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cb07c1b-0e6d-4511-ad45-910d6184955f-config-data\") pod \"ceilometer-0\" (UID: \"9cb07c1b-0e6d-4511-ad45-910d6184955f\") " pod="openstack/ceilometer-0"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.927958 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cb07c1b-0e6d-4511-ad45-910d6184955f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9cb07c1b-0e6d-4511-ad45-910d6184955f\") " pod="openstack/ceilometer-0"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.927975 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cb07c1b-0e6d-4511-ad45-910d6184955f-scripts\") pod \"ceilometer-0\" (UID: \"9cb07c1b-0e6d-4511-ad45-910d6184955f\") " pod="openstack/ceilometer-0"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.928034 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cb07c1b-0e6d-4511-ad45-910d6184955f-run-httpd\") pod \"ceilometer-0\" (UID: \"9cb07c1b-0e6d-4511-ad45-910d6184955f\") " pod="openstack/ceilometer-0"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.928305 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cb07c1b-0e6d-4511-ad45-910d6184955f-log-httpd\") pod \"ceilometer-0\" (UID: \"9cb07c1b-0e6d-4511-ad45-910d6184955f\") " pod="openstack/ceilometer-0"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.928947 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cb07c1b-0e6d-4511-ad45-910d6184955f-run-httpd\") pod \"ceilometer-0\" (UID: \"9cb07c1b-0e6d-4511-ad45-910d6184955f\") " pod="openstack/ceilometer-0"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.933472 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cb07c1b-0e6d-4511-ad45-910d6184955f-config-data\") pod \"ceilometer-0\" (UID: \"9cb07c1b-0e6d-4511-ad45-910d6184955f\") " pod="openstack/ceilometer-0"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.934637 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cb07c1b-0e6d-4511-ad45-910d6184955f-scripts\") pod \"ceilometer-0\" (UID: \"9cb07c1b-0e6d-4511-ad45-910d6184955f\") " pod="openstack/ceilometer-0"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.935302 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9cb07c1b-0e6d-4511-ad45-910d6184955f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9cb07c1b-0e6d-4511-ad45-910d6184955f\") " pod="openstack/ceilometer-0"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.935515 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cb07c1b-0e6d-4511-ad45-910d6184955f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9cb07c1b-0e6d-4511-ad45-910d6184955f\") " pod="openstack/ceilometer-0"
Mar 19 09:49:53 crc kubenswrapper[4835]: I0319 09:49:53.957303 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x7bn\" (UniqueName: \"kubernetes.io/projected/9cb07c1b-0e6d-4511-ad45-910d6184955f-kube-api-access-9x7bn\") pod \"ceilometer-0\" (UID: \"9cb07c1b-0e6d-4511-ad45-910d6184955f\") " pod="openstack/ceilometer-0"
Mar 19 09:49:54 crc kubenswrapper[4835]: I0319 09:49:54.099671 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 19 09:49:54 crc kubenswrapper[4835]: I0319 09:49:54.414335 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fc25787-db79-456d-ab01-93be995791cd" path="/var/lib/kubelet/pods/8fc25787-db79-456d-ab01-93be995791cd/volumes"
Mar 19 09:49:54 crc kubenswrapper[4835]: I0319 09:49:54.569646 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 19 09:49:55 crc kubenswrapper[4835]: I0319 09:49:55.045462 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-vx7t2"]
Mar 19 09:49:55 crc kubenswrapper[4835]: I0319 09:49:55.046894 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-vx7t2"
Mar 19 09:49:55 crc kubenswrapper[4835]: I0319 09:49:55.051392 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt2hw\" (UniqueName: \"kubernetes.io/projected/c2715f44-163f-4954-aa37-86a59c886a56-kube-api-access-tt2hw\") pod \"aodh-db-create-vx7t2\" (UID: \"c2715f44-163f-4954-aa37-86a59c886a56\") " pod="openstack/aodh-db-create-vx7t2"
Mar 19 09:49:55 crc kubenswrapper[4835]: I0319 09:49:55.051544 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2715f44-163f-4954-aa37-86a59c886a56-operator-scripts\") pod \"aodh-db-create-vx7t2\" (UID: \"c2715f44-163f-4954-aa37-86a59c886a56\") " pod="openstack/aodh-db-create-vx7t2"
Mar 19 09:49:55 crc kubenswrapper[4835]: I0319 09:49:55.072549 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-vx7t2"]
Mar 19 09:49:55 crc kubenswrapper[4835]: I0319 09:49:55.153224 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt2hw\" (UniqueName: \"kubernetes.io/projected/c2715f44-163f-4954-aa37-86a59c886a56-kube-api-access-tt2hw\") pod \"aodh-db-create-vx7t2\" (UID: \"c2715f44-163f-4954-aa37-86a59c886a56\") " pod="openstack/aodh-db-create-vx7t2"
Mar 19 09:49:55 crc kubenswrapper[4835]: I0319 09:49:55.153290 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2715f44-163f-4954-aa37-86a59c886a56-operator-scripts\") pod \"aodh-db-create-vx7t2\" (UID: \"c2715f44-163f-4954-aa37-86a59c886a56\") " pod="openstack/aodh-db-create-vx7t2"
Mar 19 09:49:55 crc kubenswrapper[4835]: I0319 09:49:55.154313 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2715f44-163f-4954-aa37-86a59c886a56-operator-scripts\") pod \"aodh-db-create-vx7t2\" (UID: \"c2715f44-163f-4954-aa37-86a59c886a56\") " pod="openstack/aodh-db-create-vx7t2"
Mar 19 09:49:55 crc kubenswrapper[4835]: I0319 09:49:55.173447 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt2hw\" (UniqueName: \"kubernetes.io/projected/c2715f44-163f-4954-aa37-86a59c886a56-kube-api-access-tt2hw\") pod \"aodh-db-create-vx7t2\" (UID: \"c2715f44-163f-4954-aa37-86a59c886a56\") " pod="openstack/aodh-db-create-vx7t2"
Mar 19 09:49:55 crc kubenswrapper[4835]: I0319 09:49:55.188238 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-vx7t2"
Mar 19 09:49:55 crc kubenswrapper[4835]: I0319 09:49:55.268977 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-c643-account-create-update-rvjfw"]
Mar 19 09:49:55 crc kubenswrapper[4835]: I0319 09:49:55.270532 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-c643-account-create-update-rvjfw"
Mar 19 09:49:55 crc kubenswrapper[4835]: I0319 09:49:55.301125 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret"
Mar 19 09:49:55 crc kubenswrapper[4835]: I0319 09:49:55.302994 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-c643-account-create-update-rvjfw"]
Mar 19 09:49:55 crc kubenswrapper[4835]: I0319 09:49:55.356764 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-875c5\" (UniqueName: \"kubernetes.io/projected/2740571b-8f0c-4e05-a01a-659337fd2d6e-kube-api-access-875c5\") pod \"aodh-c643-account-create-update-rvjfw\" (UID: \"2740571b-8f0c-4e05-a01a-659337fd2d6e\") " pod="openstack/aodh-c643-account-create-update-rvjfw"
Mar 19 09:49:55 crc kubenswrapper[4835]: I0319 09:49:55.357289 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2740571b-8f0c-4e05-a01a-659337fd2d6e-operator-scripts\") pod \"aodh-c643-account-create-update-rvjfw\" (UID: \"2740571b-8f0c-4e05-a01a-659337fd2d6e\") " pod="openstack/aodh-c643-account-create-update-rvjfw"
Mar 19 09:49:55 crc kubenswrapper[4835]: I0319 09:49:55.405138 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cb07c1b-0e6d-4511-ad45-910d6184955f","Type":"ContainerStarted","Data":"73e272be769b264df39b2b96fc79c138a15d5f5ef348307c8ca2543e4e51218c"}
Mar 19 09:49:55 crc kubenswrapper[4835]: I0319 09:49:55.405173 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cb07c1b-0e6d-4511-ad45-910d6184955f","Type":"ContainerStarted","Data":"cf6fba0d54b2fd585767e6fb2199bc9ff32b23f1f18a6c9387ca521cdc3a5068"}
Mar 19 09:49:55 crc kubenswrapper[4835]: I0319 09:49:55.458194 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2740571b-8f0c-4e05-a01a-659337fd2d6e-operator-scripts\") pod \"aodh-c643-account-create-update-rvjfw\" (UID: \"2740571b-8f0c-4e05-a01a-659337fd2d6e\") " pod="openstack/aodh-c643-account-create-update-rvjfw"
Mar 19 09:49:55 crc kubenswrapper[4835]: I0319 09:49:55.458318 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-875c5\" (UniqueName: \"kubernetes.io/projected/2740571b-8f0c-4e05-a01a-659337fd2d6e-kube-api-access-875c5\") pod \"aodh-c643-account-create-update-rvjfw\" (UID: \"2740571b-8f0c-4e05-a01a-659337fd2d6e\") " pod="openstack/aodh-c643-account-create-update-rvjfw"
Mar 19 09:49:55 crc kubenswrapper[4835]: I0319 09:49:55.460499 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2740571b-8f0c-4e05-a01a-659337fd2d6e-operator-scripts\") pod \"aodh-c643-account-create-update-rvjfw\" (UID: \"2740571b-8f0c-4e05-a01a-659337fd2d6e\") " pod="openstack/aodh-c643-account-create-update-rvjfw"
Mar 19 09:49:55 crc kubenswrapper[4835]: I0319 09:49:55.479489 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-875c5\" (UniqueName: \"kubernetes.io/projected/2740571b-8f0c-4e05-a01a-659337fd2d6e-kube-api-access-875c5\") pod \"aodh-c643-account-create-update-rvjfw\" (UID: \"2740571b-8f0c-4e05-a01a-659337fd2d6e\") " pod="openstack/aodh-c643-account-create-update-rvjfw"
Mar 19 09:49:55 crc kubenswrapper[4835]: I0319 09:49:55.623398 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-c643-account-create-update-rvjfw"
Mar 19 09:49:55 crc kubenswrapper[4835]: I0319 09:49:55.837134 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-vx7t2"]
Mar 19 09:49:56 crc kubenswrapper[4835]: I0319 09:49:56.466767 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-vx7t2" event={"ID":"c2715f44-163f-4954-aa37-86a59c886a56","Type":"ContainerStarted","Data":"858dcd9b670da439092c4843aa4aeb2a939b0a8b6bb3610e20716707f4f54aa4"}
Mar 19 09:49:56 crc kubenswrapper[4835]: I0319 09:49:56.467361 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-vx7t2" event={"ID":"c2715f44-163f-4954-aa37-86a59c886a56","Type":"ContainerStarted","Data":"49e5cf1046e31bf6488a09124fed6c2e3312606bf34879d1be9fd4d79e7b473e"}
Mar 19 09:49:56 crc kubenswrapper[4835]: I0319 09:49:56.488763 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-c643-account-create-update-rvjfw"]
Mar 19 09:49:56 crc kubenswrapper[4835]: I0319 09:49:56.490833 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cb07c1b-0e6d-4511-ad45-910d6184955f","Type":"ContainerStarted","Data":"e14b92e6568b49b4097120376ff652cdcd9d4915b576e308b6a0139c21f67f7c"}
Mar 19 09:49:56 crc kubenswrapper[4835]: I0319 09:49:56.541822 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-create-vx7t2" podStartSLOduration=1.541803797 podStartE2EDuration="1.541803797s" podCreationTimestamp="2026-03-19 09:49:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:49:56.520053907 +0000 UTC m=+1651.368652504" watchObservedRunningTime="2026-03-19 09:49:56.541803797 +0000 UTC m=+1651.390402384"
Mar 19 09:49:57 crc kubenswrapper[4835]: I0319 09:49:57.506447 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cb07c1b-0e6d-4511-ad45-910d6184955f","Type":"ContainerStarted","Data":"ea229e32dd4141b273c3bc2060b1b86866a3b363b7a2e4344155ec3663d0ca21"}
Mar 19 09:49:57 crc kubenswrapper[4835]: I0319 09:49:57.509090 4835 generic.go:334] "Generic (PLEG): container finished" podID="2740571b-8f0c-4e05-a01a-659337fd2d6e" containerID="91455d98d539f0429600b1823978388be42ffdc40431d43259b6e10024ca66be" exitCode=0
Mar 19 09:49:57 crc kubenswrapper[4835]: I0319 09:49:57.509189 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-c643-account-create-update-rvjfw" event={"ID":"2740571b-8f0c-4e05-a01a-659337fd2d6e","Type":"ContainerDied","Data":"91455d98d539f0429600b1823978388be42ffdc40431d43259b6e10024ca66be"}
Mar 19 09:49:57 crc kubenswrapper[4835]: I0319 09:49:57.509604 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-c643-account-create-update-rvjfw" event={"ID":"2740571b-8f0c-4e05-a01a-659337fd2d6e","Type":"ContainerStarted","Data":"777a70063ee214006a1c02c9f75617927736dc35612f45e2488b777212a736fa"}
Mar 19 09:49:57 crc kubenswrapper[4835]: I0319 09:49:57.511656 4835 generic.go:334] "Generic (PLEG): container finished" podID="c2715f44-163f-4954-aa37-86a59c886a56" containerID="858dcd9b670da439092c4843aa4aeb2a939b0a8b6bb3610e20716707f4f54aa4" exitCode=0
Mar 19 09:49:57 crc kubenswrapper[4835]: I0319 09:49:57.511700 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-vx7t2" event={"ID":"c2715f44-163f-4954-aa37-86a59c886a56","Type":"ContainerDied","Data":"858dcd9b670da439092c4843aa4aeb2a939b0a8b6bb3610e20716707f4f54aa4"}
Mar 19 09:49:58 crc kubenswrapper[4835]: I0319 09:49:58.401797 4835 scope.go:117] "RemoveContainer" containerID="d93f2f0fef5a3fe52d6e4aab02e5290ac85405643bc520caaef82b7b23fd8ee3"
Mar 19 09:49:58 crc kubenswrapper[4835]: E0319 09:49:58.402456 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353"
Mar 19 09:49:59 crc kubenswrapper[4835]: I0319 09:49:59.112597 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-vx7t2"
Mar 19 09:49:59 crc kubenswrapper[4835]: I0319 09:49:59.124540 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-c643-account-create-update-rvjfw"
Mar 19 09:49:59 crc kubenswrapper[4835]: I0319 09:49:59.270128 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2715f44-163f-4954-aa37-86a59c886a56-operator-scripts\") pod \"c2715f44-163f-4954-aa37-86a59c886a56\" (UID: \"c2715f44-163f-4954-aa37-86a59c886a56\") "
Mar 19 09:49:59 crc kubenswrapper[4835]: I0319 09:49:59.270212 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2740571b-8f0c-4e05-a01a-659337fd2d6e-operator-scripts\") pod \"2740571b-8f0c-4e05-a01a-659337fd2d6e\" (UID: \"2740571b-8f0c-4e05-a01a-659337fd2d6e\") "
Mar 19 09:49:59 crc kubenswrapper[4835]: I0319 09:49:59.270448 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt2hw\" (UniqueName: \"kubernetes.io/projected/c2715f44-163f-4954-aa37-86a59c886a56-kube-api-access-tt2hw\") pod \"c2715f44-163f-4954-aa37-86a59c886a56\" (UID: \"c2715f44-163f-4954-aa37-86a59c886a56\") "
Mar 19 09:49:59 crc kubenswrapper[4835]: I0319 09:49:59.272547 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume
"kubernetes.io/configmap/c2715f44-163f-4954-aa37-86a59c886a56-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c2715f44-163f-4954-aa37-86a59c886a56" (UID: "c2715f44-163f-4954-aa37-86a59c886a56"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:49:59 crc kubenswrapper[4835]: I0319 09:49:59.270498 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-875c5\" (UniqueName: \"kubernetes.io/projected/2740571b-8f0c-4e05-a01a-659337fd2d6e-kube-api-access-875c5\") pod \"2740571b-8f0c-4e05-a01a-659337fd2d6e\" (UID: \"2740571b-8f0c-4e05-a01a-659337fd2d6e\") " Mar 19 09:49:59 crc kubenswrapper[4835]: I0319 09:49:59.272908 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2740571b-8f0c-4e05-a01a-659337fd2d6e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2740571b-8f0c-4e05-a01a-659337fd2d6e" (UID: "2740571b-8f0c-4e05-a01a-659337fd2d6e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:49:59 crc kubenswrapper[4835]: I0319 09:49:59.274172 4835 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2715f44-163f-4954-aa37-86a59c886a56-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:59 crc kubenswrapper[4835]: I0319 09:49:59.274191 4835 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2740571b-8f0c-4e05-a01a-659337fd2d6e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:59 crc kubenswrapper[4835]: I0319 09:49:59.281755 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2740571b-8f0c-4e05-a01a-659337fd2d6e-kube-api-access-875c5" (OuterVolumeSpecName: "kube-api-access-875c5") pod "2740571b-8f0c-4e05-a01a-659337fd2d6e" (UID: "2740571b-8f0c-4e05-a01a-659337fd2d6e"). InnerVolumeSpecName "kube-api-access-875c5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:49:59 crc kubenswrapper[4835]: I0319 09:49:59.289155 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2715f44-163f-4954-aa37-86a59c886a56-kube-api-access-tt2hw" (OuterVolumeSpecName: "kube-api-access-tt2hw") pod "c2715f44-163f-4954-aa37-86a59c886a56" (UID: "c2715f44-163f-4954-aa37-86a59c886a56"). InnerVolumeSpecName "kube-api-access-tt2hw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:49:59 crc kubenswrapper[4835]: I0319 09:49:59.376916 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tt2hw\" (UniqueName: \"kubernetes.io/projected/c2715f44-163f-4954-aa37-86a59c886a56-kube-api-access-tt2hw\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:59 crc kubenswrapper[4835]: I0319 09:49:59.376967 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-875c5\" (UniqueName: \"kubernetes.io/projected/2740571b-8f0c-4e05-a01a-659337fd2d6e-kube-api-access-875c5\") on node \"crc\" DevicePath \"\"" Mar 19 09:49:59 crc kubenswrapper[4835]: I0319 09:49:59.535211 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cb07c1b-0e6d-4511-ad45-910d6184955f","Type":"ContainerStarted","Data":"7c85e958887dc67c722623ad2e7b668df81a1b3e7371f31cc1201e8b70a197f8"} Mar 19 09:49:59 crc kubenswrapper[4835]: I0319 09:49:59.535400 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 09:49:59 crc kubenswrapper[4835]: I0319 09:49:59.538416 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-c643-account-create-update-rvjfw" Mar 19 09:49:59 crc kubenswrapper[4835]: I0319 09:49:59.538988 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-c643-account-create-update-rvjfw" event={"ID":"2740571b-8f0c-4e05-a01a-659337fd2d6e","Type":"ContainerDied","Data":"777a70063ee214006a1c02c9f75617927736dc35612f45e2488b777212a736fa"} Mar 19 09:49:59 crc kubenswrapper[4835]: I0319 09:49:59.539066 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="777a70063ee214006a1c02c9f75617927736dc35612f45e2488b777212a736fa" Mar 19 09:49:59 crc kubenswrapper[4835]: I0319 09:49:59.541004 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-vx7t2" event={"ID":"c2715f44-163f-4954-aa37-86a59c886a56","Type":"ContainerDied","Data":"49e5cf1046e31bf6488a09124fed6c2e3312606bf34879d1be9fd4d79e7b473e"} Mar 19 09:49:59 crc kubenswrapper[4835]: I0319 09:49:59.541142 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49e5cf1046e31bf6488a09124fed6c2e3312606bf34879d1be9fd4d79e7b473e" Mar 19 09:49:59 crc kubenswrapper[4835]: I0319 09:49:59.541143 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-vx7t2" Mar 19 09:49:59 crc kubenswrapper[4835]: I0319 09:49:59.581412 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.588447676 podStartE2EDuration="6.581390375s" podCreationTimestamp="2026-03-19 09:49:53 +0000 UTC" firstStartedPulling="2026-03-19 09:49:54.569612254 +0000 UTC m=+1649.418210851" lastFinishedPulling="2026-03-19 09:49:58.562554963 +0000 UTC m=+1653.411153550" observedRunningTime="2026-03-19 09:49:59.574578075 +0000 UTC m=+1654.423176652" watchObservedRunningTime="2026-03-19 09:49:59.581390375 +0000 UTC m=+1654.429988972" Mar 19 09:50:00 crc kubenswrapper[4835]: I0319 09:50:00.147779 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565230-k576b"] Mar 19 09:50:00 crc kubenswrapper[4835]: E0319 09:50:00.148592 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2740571b-8f0c-4e05-a01a-659337fd2d6e" containerName="mariadb-account-create-update" Mar 19 09:50:00 crc kubenswrapper[4835]: I0319 09:50:00.148606 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2740571b-8f0c-4e05-a01a-659337fd2d6e" containerName="mariadb-account-create-update" Mar 19 09:50:00 crc kubenswrapper[4835]: E0319 09:50:00.148654 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2715f44-163f-4954-aa37-86a59c886a56" containerName="mariadb-database-create" Mar 19 09:50:00 crc kubenswrapper[4835]: I0319 09:50:00.148660 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2715f44-163f-4954-aa37-86a59c886a56" containerName="mariadb-database-create" Mar 19 09:50:00 crc kubenswrapper[4835]: I0319 09:50:00.148876 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2715f44-163f-4954-aa37-86a59c886a56" containerName="mariadb-database-create" Mar 19 09:50:00 crc kubenswrapper[4835]: I0319 09:50:00.148905 4835 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="2740571b-8f0c-4e05-a01a-659337fd2d6e" containerName="mariadb-account-create-update" Mar 19 09:50:00 crc kubenswrapper[4835]: I0319 09:50:00.149717 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565230-k576b" Mar 19 09:50:00 crc kubenswrapper[4835]: I0319 09:50:00.151327 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 09:50:00 crc kubenswrapper[4835]: I0319 09:50:00.151818 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 09:50:00 crc kubenswrapper[4835]: I0319 09:50:00.153784 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 09:50:00 crc kubenswrapper[4835]: I0319 09:50:00.161916 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565230-k576b"] Mar 19 09:50:00 crc kubenswrapper[4835]: I0319 09:50:00.192893 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrb2s\" (UniqueName: \"kubernetes.io/projected/716e3a02-418e-4d74-a63c-43b2e51ef67e-kube-api-access-hrb2s\") pod \"auto-csr-approver-29565230-k576b\" (UID: \"716e3a02-418e-4d74-a63c-43b2e51ef67e\") " pod="openshift-infra/auto-csr-approver-29565230-k576b" Mar 19 09:50:00 crc kubenswrapper[4835]: I0319 09:50:00.294337 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrb2s\" (UniqueName: \"kubernetes.io/projected/716e3a02-418e-4d74-a63c-43b2e51ef67e-kube-api-access-hrb2s\") pod \"auto-csr-approver-29565230-k576b\" (UID: \"716e3a02-418e-4d74-a63c-43b2e51ef67e\") " pod="openshift-infra/auto-csr-approver-29565230-k576b" Mar 19 09:50:00 crc kubenswrapper[4835]: I0319 09:50:00.323908 4835 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-hrb2s\" (UniqueName: \"kubernetes.io/projected/716e3a02-418e-4d74-a63c-43b2e51ef67e-kube-api-access-hrb2s\") pod \"auto-csr-approver-29565230-k576b\" (UID: \"716e3a02-418e-4d74-a63c-43b2e51ef67e\") " pod="openshift-infra/auto-csr-approver-29565230-k576b" Mar 19 09:50:00 crc kubenswrapper[4835]: I0319 09:50:00.497799 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565230-k576b" Mar 19 09:50:00 crc kubenswrapper[4835]: I0319 09:50:00.549369 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-8xr9d"] Mar 19 09:50:00 crc kubenswrapper[4835]: I0319 09:50:00.550933 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-8xr9d" Mar 19 09:50:00 crc kubenswrapper[4835]: I0319 09:50:00.553056 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 19 09:50:00 crc kubenswrapper[4835]: I0319 09:50:00.553448 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-fxchn" Mar 19 09:50:00 crc kubenswrapper[4835]: I0319 09:50:00.553502 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 19 09:50:00 crc kubenswrapper[4835]: I0319 09:50:00.558292 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 19 09:50:00 crc kubenswrapper[4835]: I0319 09:50:00.571370 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-8xr9d"] Mar 19 09:50:00 crc kubenswrapper[4835]: I0319 09:50:00.587436 4835 generic.go:334] "Generic (PLEG): container finished" podID="2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda" containerID="b273eb566783dbb8ddac7dc0d86f30d341fc5479f6e0d007c37b2fe663cf1f0c" exitCode=0 Mar 19 09:50:00 crc kubenswrapper[4835]: I0319 09:50:00.589992 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-conductor-db-sync-qfttg" event={"ID":"2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda","Type":"ContainerDied","Data":"b273eb566783dbb8ddac7dc0d86f30d341fc5479f6e0d007c37b2fe663cf1f0c"} Mar 19 09:50:00 crc kubenswrapper[4835]: I0319 09:50:00.709110 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e79bbfb4-e971-492f-95b6-fdf2cb5df595-config-data\") pod \"aodh-db-sync-8xr9d\" (UID: \"e79bbfb4-e971-492f-95b6-fdf2cb5df595\") " pod="openstack/aodh-db-sync-8xr9d" Mar 19 09:50:00 crc kubenswrapper[4835]: I0319 09:50:00.709267 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-625h9\" (UniqueName: \"kubernetes.io/projected/e79bbfb4-e971-492f-95b6-fdf2cb5df595-kube-api-access-625h9\") pod \"aodh-db-sync-8xr9d\" (UID: \"e79bbfb4-e971-492f-95b6-fdf2cb5df595\") " pod="openstack/aodh-db-sync-8xr9d" Mar 19 09:50:00 crc kubenswrapper[4835]: I0319 09:50:00.709307 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e79bbfb4-e971-492f-95b6-fdf2cb5df595-scripts\") pod \"aodh-db-sync-8xr9d\" (UID: \"e79bbfb4-e971-492f-95b6-fdf2cb5df595\") " pod="openstack/aodh-db-sync-8xr9d" Mar 19 09:50:00 crc kubenswrapper[4835]: I0319 09:50:00.709362 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e79bbfb4-e971-492f-95b6-fdf2cb5df595-combined-ca-bundle\") pod \"aodh-db-sync-8xr9d\" (UID: \"e79bbfb4-e971-492f-95b6-fdf2cb5df595\") " pod="openstack/aodh-db-sync-8xr9d" Mar 19 09:50:00 crc kubenswrapper[4835]: I0319 09:50:00.813935 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e79bbfb4-e971-492f-95b6-fdf2cb5df595-config-data\") pod 
\"aodh-db-sync-8xr9d\" (UID: \"e79bbfb4-e971-492f-95b6-fdf2cb5df595\") " pod="openstack/aodh-db-sync-8xr9d" Mar 19 09:50:00 crc kubenswrapper[4835]: I0319 09:50:00.814211 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-625h9\" (UniqueName: \"kubernetes.io/projected/e79bbfb4-e971-492f-95b6-fdf2cb5df595-kube-api-access-625h9\") pod \"aodh-db-sync-8xr9d\" (UID: \"e79bbfb4-e971-492f-95b6-fdf2cb5df595\") " pod="openstack/aodh-db-sync-8xr9d" Mar 19 09:50:00 crc kubenswrapper[4835]: I0319 09:50:00.814282 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e79bbfb4-e971-492f-95b6-fdf2cb5df595-scripts\") pod \"aodh-db-sync-8xr9d\" (UID: \"e79bbfb4-e971-492f-95b6-fdf2cb5df595\") " pod="openstack/aodh-db-sync-8xr9d" Mar 19 09:50:00 crc kubenswrapper[4835]: I0319 09:50:00.814373 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e79bbfb4-e971-492f-95b6-fdf2cb5df595-combined-ca-bundle\") pod \"aodh-db-sync-8xr9d\" (UID: \"e79bbfb4-e971-492f-95b6-fdf2cb5df595\") " pod="openstack/aodh-db-sync-8xr9d" Mar 19 09:50:00 crc kubenswrapper[4835]: I0319 09:50:00.820552 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e79bbfb4-e971-492f-95b6-fdf2cb5df595-combined-ca-bundle\") pod \"aodh-db-sync-8xr9d\" (UID: \"e79bbfb4-e971-492f-95b6-fdf2cb5df595\") " pod="openstack/aodh-db-sync-8xr9d" Mar 19 09:50:00 crc kubenswrapper[4835]: I0319 09:50:00.825713 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e79bbfb4-e971-492f-95b6-fdf2cb5df595-config-data\") pod \"aodh-db-sync-8xr9d\" (UID: \"e79bbfb4-e971-492f-95b6-fdf2cb5df595\") " pod="openstack/aodh-db-sync-8xr9d" Mar 19 09:50:00 crc kubenswrapper[4835]: I0319 09:50:00.826899 
4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e79bbfb4-e971-492f-95b6-fdf2cb5df595-scripts\") pod \"aodh-db-sync-8xr9d\" (UID: \"e79bbfb4-e971-492f-95b6-fdf2cb5df595\") " pod="openstack/aodh-db-sync-8xr9d" Mar 19 09:50:00 crc kubenswrapper[4835]: I0319 09:50:00.840480 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-625h9\" (UniqueName: \"kubernetes.io/projected/e79bbfb4-e971-492f-95b6-fdf2cb5df595-kube-api-access-625h9\") pod \"aodh-db-sync-8xr9d\" (UID: \"e79bbfb4-e971-492f-95b6-fdf2cb5df595\") " pod="openstack/aodh-db-sync-8xr9d" Mar 19 09:50:01 crc kubenswrapper[4835]: I0319 09:50:01.009079 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-8xr9d" Mar 19 09:50:01 crc kubenswrapper[4835]: I0319 09:50:01.065510 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565230-k576b"] Mar 19 09:50:01 crc kubenswrapper[4835]: W0319 09:50:01.073084 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod716e3a02_418e_4d74_a63c_43b2e51ef67e.slice/crio-75b9df116fc9a5c87b30341713f98127dec7876717a5a2770440bd0c4a6481b6 WatchSource:0}: Error finding container 75b9df116fc9a5c87b30341713f98127dec7876717a5a2770440bd0c4a6481b6: Status 404 returned error can't find the container with id 75b9df116fc9a5c87b30341713f98127dec7876717a5a2770440bd0c4a6481b6 Mar 19 09:50:01 crc kubenswrapper[4835]: W0319 09:50:01.556239 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode79bbfb4_e971_492f_95b6_fdf2cb5df595.slice/crio-152c451c718a78596827a5718cfb7a98563b5e05473b41f142d934b1d047d3ca WatchSource:0}: Error finding container 152c451c718a78596827a5718cfb7a98563b5e05473b41f142d934b1d047d3ca: Status 404 returned error can't find 
the container with id 152c451c718a78596827a5718cfb7a98563b5e05473b41f142d934b1d047d3ca Mar 19 09:50:01 crc kubenswrapper[4835]: I0319 09:50:01.559549 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-8xr9d"] Mar 19 09:50:01 crc kubenswrapper[4835]: I0319 09:50:01.599968 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-8xr9d" event={"ID":"e79bbfb4-e971-492f-95b6-fdf2cb5df595","Type":"ContainerStarted","Data":"152c451c718a78596827a5718cfb7a98563b5e05473b41f142d934b1d047d3ca"} Mar 19 09:50:01 crc kubenswrapper[4835]: I0319 09:50:01.601045 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565230-k576b" event={"ID":"716e3a02-418e-4d74-a63c-43b2e51ef67e","Type":"ContainerStarted","Data":"75b9df116fc9a5c87b30341713f98127dec7876717a5a2770440bd0c4a6481b6"} Mar 19 09:50:02 crc kubenswrapper[4835]: I0319 09:50:02.122600 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qfttg" Mar 19 09:50:02 crc kubenswrapper[4835]: I0319 09:50:02.250989 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda-config-data\") pod \"2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda\" (UID: \"2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda\") " Mar 19 09:50:02 crc kubenswrapper[4835]: I0319 09:50:02.251043 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcwkb\" (UniqueName: \"kubernetes.io/projected/2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda-kube-api-access-dcwkb\") pod \"2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda\" (UID: \"2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda\") " Mar 19 09:50:02 crc kubenswrapper[4835]: I0319 09:50:02.251082 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda-scripts\") pod \"2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda\" (UID: \"2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda\") " Mar 19 09:50:02 crc kubenswrapper[4835]: I0319 09:50:02.251103 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda-combined-ca-bundle\") pod \"2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda\" (UID: \"2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda\") " Mar 19 09:50:02 crc kubenswrapper[4835]: I0319 09:50:02.256491 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda-scripts" (OuterVolumeSpecName: "scripts") pod "2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda" (UID: "2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:02 crc kubenswrapper[4835]: I0319 09:50:02.273899 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda-kube-api-access-dcwkb" (OuterVolumeSpecName: "kube-api-access-dcwkb") pod "2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda" (UID: "2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda"). InnerVolumeSpecName "kube-api-access-dcwkb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:50:02 crc kubenswrapper[4835]: I0319 09:50:02.292513 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda-config-data" (OuterVolumeSpecName: "config-data") pod "2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda" (UID: "2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:02 crc kubenswrapper[4835]: I0319 09:50:02.309959 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda" (UID: "2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:02 crc kubenswrapper[4835]: I0319 09:50:02.353429 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:02 crc kubenswrapper[4835]: I0319 09:50:02.353467 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcwkb\" (UniqueName: \"kubernetes.io/projected/2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda-kube-api-access-dcwkb\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:02 crc kubenswrapper[4835]: I0319 09:50:02.353480 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:02 crc kubenswrapper[4835]: I0319 09:50:02.353487 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:02 crc kubenswrapper[4835]: I0319 09:50:02.619782 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qfttg" event={"ID":"2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda","Type":"ContainerDied","Data":"d3be4a6f45d0194627a14761d8da9d7cbc2548ed63f4b09dee3f84dba4a01135"} Mar 19 09:50:02 crc kubenswrapper[4835]: I0319 09:50:02.619843 4835 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="d3be4a6f45d0194627a14761d8da9d7cbc2548ed63f4b09dee3f84dba4a01135" Mar 19 09:50:02 crc kubenswrapper[4835]: I0319 09:50:02.619929 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qfttg" Mar 19 09:50:02 crc kubenswrapper[4835]: I0319 09:50:02.698309 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 19 09:50:02 crc kubenswrapper[4835]: E0319 09:50:02.699122 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda" containerName="nova-cell0-conductor-db-sync" Mar 19 09:50:02 crc kubenswrapper[4835]: I0319 09:50:02.699152 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda" containerName="nova-cell0-conductor-db-sync" Mar 19 09:50:02 crc kubenswrapper[4835]: I0319 09:50:02.699440 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda" containerName="nova-cell0-conductor-db-sync" Mar 19 09:50:02 crc kubenswrapper[4835]: I0319 09:50:02.700462 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 19 09:50:02 crc kubenswrapper[4835]: I0319 09:50:02.706328 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-6dxmm" Mar 19 09:50:02 crc kubenswrapper[4835]: I0319 09:50:02.706920 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 19 09:50:02 crc kubenswrapper[4835]: I0319 09:50:02.721033 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 19 09:50:02 crc kubenswrapper[4835]: I0319 09:50:02.885706 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55a6fe01-83c8-4e8a-a871-c94a13bc4bd3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"55a6fe01-83c8-4e8a-a871-c94a13bc4bd3\") " pod="openstack/nova-cell0-conductor-0" Mar 19 09:50:02 crc kubenswrapper[4835]: I0319 09:50:02.885785 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g45fs\" (UniqueName: \"kubernetes.io/projected/55a6fe01-83c8-4e8a-a871-c94a13bc4bd3-kube-api-access-g45fs\") pod \"nova-cell0-conductor-0\" (UID: \"55a6fe01-83c8-4e8a-a871-c94a13bc4bd3\") " pod="openstack/nova-cell0-conductor-0" Mar 19 09:50:02 crc kubenswrapper[4835]: I0319 09:50:02.885855 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a6fe01-83c8-4e8a-a871-c94a13bc4bd3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"55a6fe01-83c8-4e8a-a871-c94a13bc4bd3\") " pod="openstack/nova-cell0-conductor-0" Mar 19 09:50:02 crc kubenswrapper[4835]: I0319 09:50:02.988998 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/55a6fe01-83c8-4e8a-a871-c94a13bc4bd3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"55a6fe01-83c8-4e8a-a871-c94a13bc4bd3\") " pod="openstack/nova-cell0-conductor-0" Mar 19 09:50:02 crc kubenswrapper[4835]: I0319 09:50:02.989333 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g45fs\" (UniqueName: \"kubernetes.io/projected/55a6fe01-83c8-4e8a-a871-c94a13bc4bd3-kube-api-access-g45fs\") pod \"nova-cell0-conductor-0\" (UID: \"55a6fe01-83c8-4e8a-a871-c94a13bc4bd3\") " pod="openstack/nova-cell0-conductor-0" Mar 19 09:50:02 crc kubenswrapper[4835]: I0319 09:50:02.989368 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a6fe01-83c8-4e8a-a871-c94a13bc4bd3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"55a6fe01-83c8-4e8a-a871-c94a13bc4bd3\") " pod="openstack/nova-cell0-conductor-0" Mar 19 09:50:03 crc kubenswrapper[4835]: I0319 09:50:03.008265 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g45fs\" (UniqueName: \"kubernetes.io/projected/55a6fe01-83c8-4e8a-a871-c94a13bc4bd3-kube-api-access-g45fs\") pod \"nova-cell0-conductor-0\" (UID: \"55a6fe01-83c8-4e8a-a871-c94a13bc4bd3\") " pod="openstack/nova-cell0-conductor-0" Mar 19 09:50:03 crc kubenswrapper[4835]: I0319 09:50:03.027987 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a6fe01-83c8-4e8a-a871-c94a13bc4bd3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"55a6fe01-83c8-4e8a-a871-c94a13bc4bd3\") " pod="openstack/nova-cell0-conductor-0" Mar 19 09:50:03 crc kubenswrapper[4835]: I0319 09:50:03.029163 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55a6fe01-83c8-4e8a-a871-c94a13bc4bd3-config-data\") pod \"nova-cell0-conductor-0\" 
(UID: \"55a6fe01-83c8-4e8a-a871-c94a13bc4bd3\") " pod="openstack/nova-cell0-conductor-0" Mar 19 09:50:03 crc kubenswrapper[4835]: I0319 09:50:03.030681 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 19 09:50:03 crc kubenswrapper[4835]: I0319 09:50:03.592462 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 19 09:50:03 crc kubenswrapper[4835]: I0319 09:50:03.672445 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565230-k576b" event={"ID":"716e3a02-418e-4d74-a63c-43b2e51ef67e","Type":"ContainerStarted","Data":"68fbf8e477c2c0d0fa051fb2488b6fe60ab3f1e2975d56f66359b4e75d4ccc4f"} Mar 19 09:50:03 crc kubenswrapper[4835]: I0319 09:50:03.675659 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"55a6fe01-83c8-4e8a-a871-c94a13bc4bd3","Type":"ContainerStarted","Data":"f41e2f85aa00f09e5f0d187e941ab1ecda100630c07d49dffc599f679d47e0ca"} Mar 19 09:50:03 crc kubenswrapper[4835]: I0319 09:50:03.700465 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565230-k576b" podStartSLOduration=2.823443735 podStartE2EDuration="3.700446132s" podCreationTimestamp="2026-03-19 09:50:00 +0000 UTC" firstStartedPulling="2026-03-19 09:50:01.076918269 +0000 UTC m=+1655.925516856" lastFinishedPulling="2026-03-19 09:50:01.953920656 +0000 UTC m=+1656.802519253" observedRunningTime="2026-03-19 09:50:03.690028725 +0000 UTC m=+1658.538627312" watchObservedRunningTime="2026-03-19 09:50:03.700446132 +0000 UTC m=+1658.549044719" Mar 19 09:50:04 crc kubenswrapper[4835]: I0319 09:50:04.693964 4835 generic.go:334] "Generic (PLEG): container finished" podID="716e3a02-418e-4d74-a63c-43b2e51ef67e" containerID="68fbf8e477c2c0d0fa051fb2488b6fe60ab3f1e2975d56f66359b4e75d4ccc4f" exitCode=0 Mar 19 09:50:04 crc kubenswrapper[4835]: I0319 
09:50:04.694420 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565230-k576b" event={"ID":"716e3a02-418e-4d74-a63c-43b2e51ef67e","Type":"ContainerDied","Data":"68fbf8e477c2c0d0fa051fb2488b6fe60ab3f1e2975d56f66359b4e75d4ccc4f"} Mar 19 09:50:04 crc kubenswrapper[4835]: I0319 09:50:04.697401 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"55a6fe01-83c8-4e8a-a871-c94a13bc4bd3","Type":"ContainerStarted","Data":"ac25b4015a1882207df90ab3a145b1d2817a7e7425085f78236412c9fd843261"} Mar 19 09:50:04 crc kubenswrapper[4835]: I0319 09:50:04.697641 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 19 09:50:04 crc kubenswrapper[4835]: I0319 09:50:04.765411 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.765388933 podStartE2EDuration="2.765388933s" podCreationTimestamp="2026-03-19 09:50:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:50:04.7450079 +0000 UTC m=+1659.593606497" watchObservedRunningTime="2026-03-19 09:50:04.765388933 +0000 UTC m=+1659.613987520" Mar 19 09:50:06 crc kubenswrapper[4835]: I0319 09:50:06.729625 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565230-k576b" event={"ID":"716e3a02-418e-4d74-a63c-43b2e51ef67e","Type":"ContainerDied","Data":"75b9df116fc9a5c87b30341713f98127dec7876717a5a2770440bd0c4a6481b6"} Mar 19 09:50:06 crc kubenswrapper[4835]: I0319 09:50:06.730107 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75b9df116fc9a5c87b30341713f98127dec7876717a5a2770440bd0c4a6481b6" Mar 19 09:50:06 crc kubenswrapper[4835]: I0319 09:50:06.910379 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565230-k576b" Mar 19 09:50:07 crc kubenswrapper[4835]: I0319 09:50:07.112728 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrb2s\" (UniqueName: \"kubernetes.io/projected/716e3a02-418e-4d74-a63c-43b2e51ef67e-kube-api-access-hrb2s\") pod \"716e3a02-418e-4d74-a63c-43b2e51ef67e\" (UID: \"716e3a02-418e-4d74-a63c-43b2e51ef67e\") " Mar 19 09:50:07 crc kubenswrapper[4835]: I0319 09:50:07.117737 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/716e3a02-418e-4d74-a63c-43b2e51ef67e-kube-api-access-hrb2s" (OuterVolumeSpecName: "kube-api-access-hrb2s") pod "716e3a02-418e-4d74-a63c-43b2e51ef67e" (UID: "716e3a02-418e-4d74-a63c-43b2e51ef67e"). InnerVolumeSpecName "kube-api-access-hrb2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:50:07 crc kubenswrapper[4835]: I0319 09:50:07.216117 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrb2s\" (UniqueName: \"kubernetes.io/projected/716e3a02-418e-4d74-a63c-43b2e51ef67e-kube-api-access-hrb2s\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:07 crc kubenswrapper[4835]: I0319 09:50:07.741847 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-8xr9d" event={"ID":"e79bbfb4-e971-492f-95b6-fdf2cb5df595","Type":"ContainerStarted","Data":"47f0dc4adf7815a272f96cd54d5674b6e3cc4b22f40bee83110cdea6a4301556"} Mar 19 09:50:07 crc kubenswrapper[4835]: I0319 09:50:07.741866 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565230-k576b" Mar 19 09:50:07 crc kubenswrapper[4835]: I0319 09:50:07.773334 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-8xr9d" podStartSLOduration=2.6042983790000003 podStartE2EDuration="7.773317568s" podCreationTimestamp="2026-03-19 09:50:00 +0000 UTC" firstStartedPulling="2026-03-19 09:50:01.559578648 +0000 UTC m=+1656.408177235" lastFinishedPulling="2026-03-19 09:50:06.728597837 +0000 UTC m=+1661.577196424" observedRunningTime="2026-03-19 09:50:07.760169449 +0000 UTC m=+1662.608768056" watchObservedRunningTime="2026-03-19 09:50:07.773317568 +0000 UTC m=+1662.621916155" Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.018625 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565224-v6z65"] Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.030031 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565224-v6z65"] Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.064258 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.417319 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e9b9bfc-6170-489b-a1b6-702e2904705b" path="/var/lib/kubelet/pods/9e9b9bfc-6170-489b-a1b6-702e2904705b/volumes" Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.583114 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-q8bz6"] Mar 19 09:50:08 crc kubenswrapper[4835]: E0319 09:50:08.583599 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="716e3a02-418e-4d74-a63c-43b2e51ef67e" containerName="oc" Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.583616 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="716e3a02-418e-4d74-a63c-43b2e51ef67e" 
containerName="oc" Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.583826 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="716e3a02-418e-4d74-a63c-43b2e51ef67e" containerName="oc" Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.584583 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-q8bz6" Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.589251 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.593493 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.604954 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-q8bz6"] Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.646264 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/385f4a9d-4a91-4e88-a364-9f54b2a1a1cb-scripts\") pod \"nova-cell0-cell-mapping-q8bz6\" (UID: \"385f4a9d-4a91-4e88-a364-9f54b2a1a1cb\") " pod="openstack/nova-cell0-cell-mapping-q8bz6" Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.646555 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzk75\" (UniqueName: \"kubernetes.io/projected/385f4a9d-4a91-4e88-a364-9f54b2a1a1cb-kube-api-access-bzk75\") pod \"nova-cell0-cell-mapping-q8bz6\" (UID: \"385f4a9d-4a91-4e88-a364-9f54b2a1a1cb\") " pod="openstack/nova-cell0-cell-mapping-q8bz6" Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.646704 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/385f4a9d-4a91-4e88-a364-9f54b2a1a1cb-config-data\") pod 
\"nova-cell0-cell-mapping-q8bz6\" (UID: \"385f4a9d-4a91-4e88-a364-9f54b2a1a1cb\") " pod="openstack/nova-cell0-cell-mapping-q8bz6" Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.646936 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/385f4a9d-4a91-4e88-a364-9f54b2a1a1cb-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-q8bz6\" (UID: \"385f4a9d-4a91-4e88-a364-9f54b2a1a1cb\") " pod="openstack/nova-cell0-cell-mapping-q8bz6" Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.750434 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzk75\" (UniqueName: \"kubernetes.io/projected/385f4a9d-4a91-4e88-a364-9f54b2a1a1cb-kube-api-access-bzk75\") pod \"nova-cell0-cell-mapping-q8bz6\" (UID: \"385f4a9d-4a91-4e88-a364-9f54b2a1a1cb\") " pod="openstack/nova-cell0-cell-mapping-q8bz6" Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.750521 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/385f4a9d-4a91-4e88-a364-9f54b2a1a1cb-config-data\") pod \"nova-cell0-cell-mapping-q8bz6\" (UID: \"385f4a9d-4a91-4e88-a364-9f54b2a1a1cb\") " pod="openstack/nova-cell0-cell-mapping-q8bz6" Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.750616 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/385f4a9d-4a91-4e88-a364-9f54b2a1a1cb-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-q8bz6\" (UID: \"385f4a9d-4a91-4e88-a364-9f54b2a1a1cb\") " pod="openstack/nova-cell0-cell-mapping-q8bz6" Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.750723 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/385f4a9d-4a91-4e88-a364-9f54b2a1a1cb-scripts\") pod 
\"nova-cell0-cell-mapping-q8bz6\" (UID: \"385f4a9d-4a91-4e88-a364-9f54b2a1a1cb\") " pod="openstack/nova-cell0-cell-mapping-q8bz6" Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.757577 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/385f4a9d-4a91-4e88-a364-9f54b2a1a1cb-config-data\") pod \"nova-cell0-cell-mapping-q8bz6\" (UID: \"385f4a9d-4a91-4e88-a364-9f54b2a1a1cb\") " pod="openstack/nova-cell0-cell-mapping-q8bz6" Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.764158 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/385f4a9d-4a91-4e88-a364-9f54b2a1a1cb-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-q8bz6\" (UID: \"385f4a9d-4a91-4e88-a364-9f54b2a1a1cb\") " pod="openstack/nova-cell0-cell-mapping-q8bz6" Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.764887 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/385f4a9d-4a91-4e88-a364-9f54b2a1a1cb-scripts\") pod \"nova-cell0-cell-mapping-q8bz6\" (UID: \"385f4a9d-4a91-4e88-a364-9f54b2a1a1cb\") " pod="openstack/nova-cell0-cell-mapping-q8bz6" Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.783731 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.785618 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzk75\" (UniqueName: \"kubernetes.io/projected/385f4a9d-4a91-4e88-a364-9f54b2a1a1cb-kube-api-access-bzk75\") pod \"nova-cell0-cell-mapping-q8bz6\" (UID: \"385f4a9d-4a91-4e88-a364-9f54b2a1a1cb\") " pod="openstack/nova-cell0-cell-mapping-q8bz6" Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.793667 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.797633 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.832698 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.882532 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.885333 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.903568 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.907316 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-q8bz6" Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.962242 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c41cc34a-6fbb-45a1-b39d-309f669e1ce9-config-data\") pod \"nova-api-0\" (UID: \"c41cc34a-6fbb-45a1-b39d-309f669e1ce9\") " pod="openstack/nova-api-0" Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.962608 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hctgm\" (UniqueName: \"kubernetes.io/projected/c70af041-db0f-4afa-ba0a-bdfee01d5865-kube-api-access-hctgm\") pod \"nova-scheduler-0\" (UID: \"c70af041-db0f-4afa-ba0a-bdfee01d5865\") " pod="openstack/nova-scheduler-0" Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.962664 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c41cc34a-6fbb-45a1-b39d-309f669e1ce9-logs\") pod \"nova-api-0\" (UID: \"c41cc34a-6fbb-45a1-b39d-309f669e1ce9\") " pod="openstack/nova-api-0" Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.962703 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c70af041-db0f-4afa-ba0a-bdfee01d5865-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c70af041-db0f-4afa-ba0a-bdfee01d5865\") " pod="openstack/nova-scheduler-0" Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.962788 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c41cc34a-6fbb-45a1-b39d-309f669e1ce9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c41cc34a-6fbb-45a1-b39d-309f669e1ce9\") " pod="openstack/nova-api-0" Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.962957 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgrfd\" (UniqueName: \"kubernetes.io/projected/c41cc34a-6fbb-45a1-b39d-309f669e1ce9-kube-api-access-dgrfd\") pod \"nova-api-0\" (UID: \"c41cc34a-6fbb-45a1-b39d-309f669e1ce9\") " pod="openstack/nova-api-0" Mar 19 09:50:08 crc kubenswrapper[4835]: I0319 09:50:08.962984 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c70af041-db0f-4afa-ba0a-bdfee01d5865-config-data\") pod \"nova-scheduler-0\" (UID: \"c70af041-db0f-4afa-ba0a-bdfee01d5865\") " pod="openstack/nova-scheduler-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.073112 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hctgm\" (UniqueName: \"kubernetes.io/projected/c70af041-db0f-4afa-ba0a-bdfee01d5865-kube-api-access-hctgm\") pod \"nova-scheduler-0\" (UID: 
\"c70af041-db0f-4afa-ba0a-bdfee01d5865\") " pod="openstack/nova-scheduler-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.073196 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c41cc34a-6fbb-45a1-b39d-309f669e1ce9-logs\") pod \"nova-api-0\" (UID: \"c41cc34a-6fbb-45a1-b39d-309f669e1ce9\") " pod="openstack/nova-api-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.073242 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c70af041-db0f-4afa-ba0a-bdfee01d5865-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c70af041-db0f-4afa-ba0a-bdfee01d5865\") " pod="openstack/nova-scheduler-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.073291 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c41cc34a-6fbb-45a1-b39d-309f669e1ce9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c41cc34a-6fbb-45a1-b39d-309f669e1ce9\") " pod="openstack/nova-api-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.073440 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgrfd\" (UniqueName: \"kubernetes.io/projected/c41cc34a-6fbb-45a1-b39d-309f669e1ce9-kube-api-access-dgrfd\") pod \"nova-api-0\" (UID: \"c41cc34a-6fbb-45a1-b39d-309f669e1ce9\") " pod="openstack/nova-api-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.073463 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c70af041-db0f-4afa-ba0a-bdfee01d5865-config-data\") pod \"nova-scheduler-0\" (UID: \"c70af041-db0f-4afa-ba0a-bdfee01d5865\") " pod="openstack/nova-scheduler-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.073541 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c41cc34a-6fbb-45a1-b39d-309f669e1ce9-config-data\") pod \"nova-api-0\" (UID: \"c41cc34a-6fbb-45a1-b39d-309f669e1ce9\") " pod="openstack/nova-api-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.075716 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c41cc34a-6fbb-45a1-b39d-309f669e1ce9-logs\") pod \"nova-api-0\" (UID: \"c41cc34a-6fbb-45a1-b39d-309f669e1ce9\") " pod="openstack/nova-api-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.089222 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c41cc34a-6fbb-45a1-b39d-309f669e1ce9-config-data\") pod \"nova-api-0\" (UID: \"c41cc34a-6fbb-45a1-b39d-309f669e1ce9\") " pod="openstack/nova-api-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.090693 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c41cc34a-6fbb-45a1-b39d-309f669e1ce9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c41cc34a-6fbb-45a1-b39d-309f669e1ce9\") " pod="openstack/nova-api-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.094544 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c70af041-db0f-4afa-ba0a-bdfee01d5865-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c70af041-db0f-4afa-ba0a-bdfee01d5865\") " pod="openstack/nova-scheduler-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.102633 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.134226 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c70af041-db0f-4afa-ba0a-bdfee01d5865-config-data\") pod \"nova-scheduler-0\" 
(UID: \"c70af041-db0f-4afa-ba0a-bdfee01d5865\") " pod="openstack/nova-scheduler-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.161461 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hctgm\" (UniqueName: \"kubernetes.io/projected/c70af041-db0f-4afa-ba0a-bdfee01d5865-kube-api-access-hctgm\") pod \"nova-scheduler-0\" (UID: \"c70af041-db0f-4afa-ba0a-bdfee01d5865\") " pod="openstack/nova-scheduler-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.185379 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgrfd\" (UniqueName: \"kubernetes.io/projected/c41cc34a-6fbb-45a1-b39d-309f669e1ce9-kube-api-access-dgrfd\") pod \"nova-api-0\" (UID: \"c41cc34a-6fbb-45a1-b39d-309f669e1ce9\") " pod="openstack/nova-api-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.198423 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.200576 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.202563 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.205262 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.236800 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.283236 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e09cb38-2cc5-45ca-9a6b-6950ebef13c4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6e09cb38-2cc5-45ca-9a6b-6950ebef13c4\") " pod="openstack/nova-metadata-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.283571 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e09cb38-2cc5-45ca-9a6b-6950ebef13c4-logs\") pod \"nova-metadata-0\" (UID: \"6e09cb38-2cc5-45ca-9a6b-6950ebef13c4\") " pod="openstack/nova-metadata-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.283649 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e09cb38-2cc5-45ca-9a6b-6950ebef13c4-config-data\") pod \"nova-metadata-0\" (UID: \"6e09cb38-2cc5-45ca-9a6b-6950ebef13c4\") " pod="openstack/nova-metadata-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.283678 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4mdt\" (UniqueName: \"kubernetes.io/projected/6e09cb38-2cc5-45ca-9a6b-6950ebef13c4-kube-api-access-m4mdt\") pod \"nova-metadata-0\" (UID: \"6e09cb38-2cc5-45ca-9a6b-6950ebef13c4\") " pod="openstack/nova-metadata-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.389063 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e09cb38-2cc5-45ca-9a6b-6950ebef13c4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6e09cb38-2cc5-45ca-9a6b-6950ebef13c4\") " pod="openstack/nova-metadata-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 
09:50:09.389175 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e09cb38-2cc5-45ca-9a6b-6950ebef13c4-logs\") pod \"nova-metadata-0\" (UID: \"6e09cb38-2cc5-45ca-9a6b-6950ebef13c4\") " pod="openstack/nova-metadata-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.389249 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e09cb38-2cc5-45ca-9a6b-6950ebef13c4-config-data\") pod \"nova-metadata-0\" (UID: \"6e09cb38-2cc5-45ca-9a6b-6950ebef13c4\") " pod="openstack/nova-metadata-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.389281 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4mdt\" (UniqueName: \"kubernetes.io/projected/6e09cb38-2cc5-45ca-9a6b-6950ebef13c4-kube-api-access-m4mdt\") pod \"nova-metadata-0\" (UID: \"6e09cb38-2cc5-45ca-9a6b-6950ebef13c4\") " pod="openstack/nova-metadata-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.391163 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e09cb38-2cc5-45ca-9a6b-6950ebef13c4-logs\") pod \"nova-metadata-0\" (UID: \"6e09cb38-2cc5-45ca-9a6b-6950ebef13c4\") " pod="openstack/nova-metadata-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.396649 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.397462 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e09cb38-2cc5-45ca-9a6b-6950ebef13c4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6e09cb38-2cc5-45ca-9a6b-6950ebef13c4\") " pod="openstack/nova-metadata-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.428513 4835 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e09cb38-2cc5-45ca-9a6b-6950ebef13c4-config-data\") pod \"nova-metadata-0\" (UID: \"6e09cb38-2cc5-45ca-9a6b-6950ebef13c4\") " pod="openstack/nova-metadata-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.432801 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.434385 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.456329 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.503894 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4mdt\" (UniqueName: \"kubernetes.io/projected/6e09cb38-2cc5-45ca-9a6b-6950ebef13c4-kube-api-access-m4mdt\") pod \"nova-metadata-0\" (UID: \"6e09cb38-2cc5-45ca-9a6b-6950ebef13c4\") " pod="openstack/nova-metadata-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.523733 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-vkfwv"] Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.529580 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-vkfwv" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.611009 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.630405 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b2ecdd4-520f-43bc-9b83-aafa999c5dbe-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b2ecdd4-520f-43bc-9b83-aafa999c5dbe\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.630532 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b2ecdd4-520f-43bc-9b83-aafa999c5dbe-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b2ecdd4-520f-43bc-9b83-aafa999c5dbe\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.630611 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpwbj\" (UniqueName: \"kubernetes.io/projected/0b2ecdd4-520f-43bc-9b83-aafa999c5dbe-kube-api-access-kpwbj\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b2ecdd4-520f-43bc-9b83-aafa999c5dbe\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.647515 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-vkfwv"] Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.657322 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.732238 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b139b769-9c57-4ef8-b917-06fc8a1a2aca-config\") pod \"dnsmasq-dns-9b86998b5-vkfwv\" (UID: \"b139b769-9c57-4ef8-b917-06fc8a1a2aca\") " pod="openstack/dnsmasq-dns-9b86998b5-vkfwv" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.732320 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2vg9\" (UniqueName: \"kubernetes.io/projected/b139b769-9c57-4ef8-b917-06fc8a1a2aca-kube-api-access-b2vg9\") pod \"dnsmasq-dns-9b86998b5-vkfwv\" (UID: \"b139b769-9c57-4ef8-b917-06fc8a1a2aca\") " pod="openstack/dnsmasq-dns-9b86998b5-vkfwv" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.732369 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b2ecdd4-520f-43bc-9b83-aafa999c5dbe-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b2ecdd4-520f-43bc-9b83-aafa999c5dbe\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.732400 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b139b769-9c57-4ef8-b917-06fc8a1a2aca-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-vkfwv\" (UID: \"b139b769-9c57-4ef8-b917-06fc8a1a2aca\") " pod="openstack/dnsmasq-dns-9b86998b5-vkfwv" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.733271 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b139b769-9c57-4ef8-b917-06fc8a1a2aca-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-vkfwv\" (UID: 
\"b139b769-9c57-4ef8-b917-06fc8a1a2aca\") " pod="openstack/dnsmasq-dns-9b86998b5-vkfwv" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.733314 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b2ecdd4-520f-43bc-9b83-aafa999c5dbe-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b2ecdd4-520f-43bc-9b83-aafa999c5dbe\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.733340 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b139b769-9c57-4ef8-b917-06fc8a1a2aca-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-vkfwv\" (UID: \"b139b769-9c57-4ef8-b917-06fc8a1a2aca\") " pod="openstack/dnsmasq-dns-9b86998b5-vkfwv" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.733366 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b139b769-9c57-4ef8-b917-06fc8a1a2aca-dns-svc\") pod \"dnsmasq-dns-9b86998b5-vkfwv\" (UID: \"b139b769-9c57-4ef8-b917-06fc8a1a2aca\") " pod="openstack/dnsmasq-dns-9b86998b5-vkfwv" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.733422 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpwbj\" (UniqueName: \"kubernetes.io/projected/0b2ecdd4-520f-43bc-9b83-aafa999c5dbe-kube-api-access-kpwbj\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b2ecdd4-520f-43bc-9b83-aafa999c5dbe\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.736651 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b2ecdd4-520f-43bc-9b83-aafa999c5dbe-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b2ecdd4-520f-43bc-9b83-aafa999c5dbe\") " 
pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.747833 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b2ecdd4-520f-43bc-9b83-aafa999c5dbe-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b2ecdd4-520f-43bc-9b83-aafa999c5dbe\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.761959 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpwbj\" (UniqueName: \"kubernetes.io/projected/0b2ecdd4-520f-43bc-9b83-aafa999c5dbe-kube-api-access-kpwbj\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b2ecdd4-520f-43bc-9b83-aafa999c5dbe\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.837384 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b139b769-9c57-4ef8-b917-06fc8a1a2aca-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-vkfwv\" (UID: \"b139b769-9c57-4ef8-b917-06fc8a1a2aca\") " pod="openstack/dnsmasq-dns-9b86998b5-vkfwv" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.837480 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b139b769-9c57-4ef8-b917-06fc8a1a2aca-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-vkfwv\" (UID: \"b139b769-9c57-4ef8-b917-06fc8a1a2aca\") " pod="openstack/dnsmasq-dns-9b86998b5-vkfwv" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.837552 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b139b769-9c57-4ef8-b917-06fc8a1a2aca-dns-svc\") pod \"dnsmasq-dns-9b86998b5-vkfwv\" (UID: \"b139b769-9c57-4ef8-b917-06fc8a1a2aca\") " pod="openstack/dnsmasq-dns-9b86998b5-vkfwv" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 
09:50:09.837981 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b139b769-9c57-4ef8-b917-06fc8a1a2aca-config\") pod \"dnsmasq-dns-9b86998b5-vkfwv\" (UID: \"b139b769-9c57-4ef8-b917-06fc8a1a2aca\") " pod="openstack/dnsmasq-dns-9b86998b5-vkfwv" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.838115 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2vg9\" (UniqueName: \"kubernetes.io/projected/b139b769-9c57-4ef8-b917-06fc8a1a2aca-kube-api-access-b2vg9\") pod \"dnsmasq-dns-9b86998b5-vkfwv\" (UID: \"b139b769-9c57-4ef8-b917-06fc8a1a2aca\") " pod="openstack/dnsmasq-dns-9b86998b5-vkfwv" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.838232 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b139b769-9c57-4ef8-b917-06fc8a1a2aca-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-vkfwv\" (UID: \"b139b769-9c57-4ef8-b917-06fc8a1a2aca\") " pod="openstack/dnsmasq-dns-9b86998b5-vkfwv" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.838488 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b139b769-9c57-4ef8-b917-06fc8a1a2aca-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-vkfwv\" (UID: \"b139b769-9c57-4ef8-b917-06fc8a1a2aca\") " pod="openstack/dnsmasq-dns-9b86998b5-vkfwv" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.839033 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b139b769-9c57-4ef8-b917-06fc8a1a2aca-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-vkfwv\" (UID: \"b139b769-9c57-4ef8-b917-06fc8a1a2aca\") " pod="openstack/dnsmasq-dns-9b86998b5-vkfwv" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.839203 4835 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b139b769-9c57-4ef8-b917-06fc8a1a2aca-config\") pod \"dnsmasq-dns-9b86998b5-vkfwv\" (UID: \"b139b769-9c57-4ef8-b917-06fc8a1a2aca\") " pod="openstack/dnsmasq-dns-9b86998b5-vkfwv" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.839052 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b139b769-9c57-4ef8-b917-06fc8a1a2aca-dns-svc\") pod \"dnsmasq-dns-9b86998b5-vkfwv\" (UID: \"b139b769-9c57-4ef8-b917-06fc8a1a2aca\") " pod="openstack/dnsmasq-dns-9b86998b5-vkfwv" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.840416 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b139b769-9c57-4ef8-b917-06fc8a1a2aca-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-vkfwv\" (UID: \"b139b769-9c57-4ef8-b917-06fc8a1a2aca\") " pod="openstack/dnsmasq-dns-9b86998b5-vkfwv" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.861651 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:50:09 crc kubenswrapper[4835]: I0319 09:50:09.889596 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2vg9\" (UniqueName: \"kubernetes.io/projected/b139b769-9c57-4ef8-b917-06fc8a1a2aca-kube-api-access-b2vg9\") pod \"dnsmasq-dns-9b86998b5-vkfwv\" (UID: \"b139b769-9c57-4ef8-b917-06fc8a1a2aca\") " pod="openstack/dnsmasq-dns-9b86998b5-vkfwv" Mar 19 09:50:10 crc kubenswrapper[4835]: W0319 09:50:10.162601 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod385f4a9d_4a91_4e88_a364_9f54b2a1a1cb.slice/crio-49d9026973e8225b142432df096866c66de49d8451377a3de3f9f94b8c638b05 WatchSource:0}: Error finding container 49d9026973e8225b142432df096866c66de49d8451377a3de3f9f94b8c638b05: Status 404 returned error can't find the container with id 49d9026973e8225b142432df096866c66de49d8451377a3de3f9f94b8c638b05 Mar 19 09:50:10 crc kubenswrapper[4835]: I0319 09:50:10.176218 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-vkfwv" Mar 19 09:50:10 crc kubenswrapper[4835]: I0319 09:50:10.205003 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-q8bz6"] Mar 19 09:50:10 crc kubenswrapper[4835]: I0319 09:50:10.225571 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 09:50:10 crc kubenswrapper[4835]: W0319 09:50:10.425929 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc70af041_db0f_4afa_ba0a_bdfee01d5865.slice/crio-497e3d0c15e3184e85a18d212b27f5546840d25a71d91fb3d73d23b550a6d1bc WatchSource:0}: Error finding container 497e3d0c15e3184e85a18d212b27f5546840d25a71d91fb3d73d23b550a6d1bc: Status 404 returned error can't find the container with id 497e3d0c15e3184e85a18d212b27f5546840d25a71d91fb3d73d23b550a6d1bc Mar 19 09:50:10 crc kubenswrapper[4835]: I0319 09:50:10.428416 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 09:50:10 crc kubenswrapper[4835]: I0319 09:50:10.446997 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 09:50:10 crc kubenswrapper[4835]: I0319 09:50:10.628436 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 09:50:10 crc kubenswrapper[4835]: I0319 09:50:10.814809 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0b2ecdd4-520f-43bc-9b83-aafa999c5dbe","Type":"ContainerStarted","Data":"c6fb972a28216fd76aa3cd1d45f9e4032b164f8043273871e1b38462e563a420"} Mar 19 09:50:10 crc kubenswrapper[4835]: I0319 09:50:10.817022 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6e09cb38-2cc5-45ca-9a6b-6950ebef13c4","Type":"ContainerStarted","Data":"41bebb62b5d792ee24c205c9efc949e0c7276a29d0041936edb0516f641f3174"} Mar 
19 09:50:10 crc kubenswrapper[4835]: I0319 09:50:10.825454 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c41cc34a-6fbb-45a1-b39d-309f669e1ce9","Type":"ContainerStarted","Data":"287b756f475fa92802468099f881cf878e73f5c85536f8c33277d8faf4543a24"} Mar 19 09:50:10 crc kubenswrapper[4835]: I0319 09:50:10.838811 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-q8bz6" event={"ID":"385f4a9d-4a91-4e88-a364-9f54b2a1a1cb","Type":"ContainerStarted","Data":"1a4e6f4481934c5abbe4f3763f236ac4089c1257cdc66bfb59cce1c8019e5592"} Mar 19 09:50:10 crc kubenswrapper[4835]: I0319 09:50:10.838854 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-q8bz6" event={"ID":"385f4a9d-4a91-4e88-a364-9f54b2a1a1cb","Type":"ContainerStarted","Data":"49d9026973e8225b142432df096866c66de49d8451377a3de3f9f94b8c638b05"} Mar 19 09:50:10 crc kubenswrapper[4835]: I0319 09:50:10.842074 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c70af041-db0f-4afa-ba0a-bdfee01d5865","Type":"ContainerStarted","Data":"497e3d0c15e3184e85a18d212b27f5546840d25a71d91fb3d73d23b550a6d1bc"} Mar 19 09:50:10 crc kubenswrapper[4835]: I0319 09:50:10.844542 4835 generic.go:334] "Generic (PLEG): container finished" podID="e79bbfb4-e971-492f-95b6-fdf2cb5df595" containerID="47f0dc4adf7815a272f96cd54d5674b6e3cc4b22f40bee83110cdea6a4301556" exitCode=0 Mar 19 09:50:10 crc kubenswrapper[4835]: I0319 09:50:10.844573 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-8xr9d" event={"ID":"e79bbfb4-e971-492f-95b6-fdf2cb5df595","Type":"ContainerDied","Data":"47f0dc4adf7815a272f96cd54d5674b6e3cc4b22f40bee83110cdea6a4301556"} Mar 19 09:50:10 crc kubenswrapper[4835]: I0319 09:50:10.859509 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-q8bz6" podStartSLOduration=2.859485528 
podStartE2EDuration="2.859485528s" podCreationTimestamp="2026-03-19 09:50:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:50:10.855003568 +0000 UTC m=+1665.703602155" watchObservedRunningTime="2026-03-19 09:50:10.859485528 +0000 UTC m=+1665.708084115" Mar 19 09:50:10 crc kubenswrapper[4835]: I0319 09:50:10.951268 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-vkfwv"] Mar 19 09:50:11 crc kubenswrapper[4835]: I0319 09:50:11.015828 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-d85wk"] Mar 19 09:50:11 crc kubenswrapper[4835]: I0319 09:50:11.017437 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-d85wk" Mar 19 09:50:11 crc kubenswrapper[4835]: I0319 09:50:11.040422 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 19 09:50:11 crc kubenswrapper[4835]: I0319 09:50:11.040502 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-d85wk"] Mar 19 09:50:11 crc kubenswrapper[4835]: I0319 09:50:11.040711 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 19 09:50:11 crc kubenswrapper[4835]: I0319 09:50:11.075679 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv9hp\" (UniqueName: \"kubernetes.io/projected/bad8f4bf-8deb-49f9-9bbc-a72db22918cd-kube-api-access-jv9hp\") pod \"nova-cell1-conductor-db-sync-d85wk\" (UID: \"bad8f4bf-8deb-49f9-9bbc-a72db22918cd\") " pod="openstack/nova-cell1-conductor-db-sync-d85wk" Mar 19 09:50:11 crc kubenswrapper[4835]: I0319 09:50:11.075836 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/bad8f4bf-8deb-49f9-9bbc-a72db22918cd-config-data\") pod \"nova-cell1-conductor-db-sync-d85wk\" (UID: \"bad8f4bf-8deb-49f9-9bbc-a72db22918cd\") " pod="openstack/nova-cell1-conductor-db-sync-d85wk" Mar 19 09:50:11 crc kubenswrapper[4835]: I0319 09:50:11.075866 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bad8f4bf-8deb-49f9-9bbc-a72db22918cd-scripts\") pod \"nova-cell1-conductor-db-sync-d85wk\" (UID: \"bad8f4bf-8deb-49f9-9bbc-a72db22918cd\") " pod="openstack/nova-cell1-conductor-db-sync-d85wk" Mar 19 09:50:11 crc kubenswrapper[4835]: I0319 09:50:11.075908 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bad8f4bf-8deb-49f9-9bbc-a72db22918cd-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-d85wk\" (UID: \"bad8f4bf-8deb-49f9-9bbc-a72db22918cd\") " pod="openstack/nova-cell1-conductor-db-sync-d85wk" Mar 19 09:50:11 crc kubenswrapper[4835]: I0319 09:50:11.179248 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bad8f4bf-8deb-49f9-9bbc-a72db22918cd-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-d85wk\" (UID: \"bad8f4bf-8deb-49f9-9bbc-a72db22918cd\") " pod="openstack/nova-cell1-conductor-db-sync-d85wk" Mar 19 09:50:11 crc kubenswrapper[4835]: I0319 09:50:11.181090 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jv9hp\" (UniqueName: \"kubernetes.io/projected/bad8f4bf-8deb-49f9-9bbc-a72db22918cd-kube-api-access-jv9hp\") pod \"nova-cell1-conductor-db-sync-d85wk\" (UID: \"bad8f4bf-8deb-49f9-9bbc-a72db22918cd\") " pod="openstack/nova-cell1-conductor-db-sync-d85wk" Mar 19 09:50:11 crc kubenswrapper[4835]: I0319 09:50:11.181549 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bad8f4bf-8deb-49f9-9bbc-a72db22918cd-config-data\") pod \"nova-cell1-conductor-db-sync-d85wk\" (UID: \"bad8f4bf-8deb-49f9-9bbc-a72db22918cd\") " pod="openstack/nova-cell1-conductor-db-sync-d85wk" Mar 19 09:50:11 crc kubenswrapper[4835]: I0319 09:50:11.181637 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bad8f4bf-8deb-49f9-9bbc-a72db22918cd-scripts\") pod \"nova-cell1-conductor-db-sync-d85wk\" (UID: \"bad8f4bf-8deb-49f9-9bbc-a72db22918cd\") " pod="openstack/nova-cell1-conductor-db-sync-d85wk" Mar 19 09:50:11 crc kubenswrapper[4835]: I0319 09:50:11.186122 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bad8f4bf-8deb-49f9-9bbc-a72db22918cd-config-data\") pod \"nova-cell1-conductor-db-sync-d85wk\" (UID: \"bad8f4bf-8deb-49f9-9bbc-a72db22918cd\") " pod="openstack/nova-cell1-conductor-db-sync-d85wk" Mar 19 09:50:11 crc kubenswrapper[4835]: I0319 09:50:11.187611 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bad8f4bf-8deb-49f9-9bbc-a72db22918cd-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-d85wk\" (UID: \"bad8f4bf-8deb-49f9-9bbc-a72db22918cd\") " pod="openstack/nova-cell1-conductor-db-sync-d85wk" Mar 19 09:50:11 crc kubenswrapper[4835]: I0319 09:50:11.188782 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bad8f4bf-8deb-49f9-9bbc-a72db22918cd-scripts\") pod \"nova-cell1-conductor-db-sync-d85wk\" (UID: \"bad8f4bf-8deb-49f9-9bbc-a72db22918cd\") " pod="openstack/nova-cell1-conductor-db-sync-d85wk" Mar 19 09:50:11 crc kubenswrapper[4835]: I0319 09:50:11.200513 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jv9hp\" (UniqueName: \"kubernetes.io/projected/bad8f4bf-8deb-49f9-9bbc-a72db22918cd-kube-api-access-jv9hp\") pod \"nova-cell1-conductor-db-sync-d85wk\" (UID: \"bad8f4bf-8deb-49f9-9bbc-a72db22918cd\") " pod="openstack/nova-cell1-conductor-db-sync-d85wk" Mar 19 09:50:11 crc kubenswrapper[4835]: I0319 09:50:11.403663 4835 scope.go:117] "RemoveContainer" containerID="d93f2f0fef5a3fe52d6e4aab02e5290ac85405643bc520caaef82b7b23fd8ee3" Mar 19 09:50:11 crc kubenswrapper[4835]: E0319 09:50:11.404021 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 09:50:11 crc kubenswrapper[4835]: I0319 09:50:11.415528 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-d85wk" Mar 19 09:50:11 crc kubenswrapper[4835]: I0319 09:50:11.864881 4835 generic.go:334] "Generic (PLEG): container finished" podID="b139b769-9c57-4ef8-b917-06fc8a1a2aca" containerID="cc4c66b25d2e1ef20810f9468b887be6093cc74e2e5ef52d7173dcc87d8342aa" exitCode=0 Mar 19 09:50:11 crc kubenswrapper[4835]: I0319 09:50:11.870345 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-vkfwv" event={"ID":"b139b769-9c57-4ef8-b917-06fc8a1a2aca","Type":"ContainerDied","Data":"cc4c66b25d2e1ef20810f9468b887be6093cc74e2e5ef52d7173dcc87d8342aa"} Mar 19 09:50:11 crc kubenswrapper[4835]: I0319 09:50:11.870386 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-vkfwv" event={"ID":"b139b769-9c57-4ef8-b917-06fc8a1a2aca","Type":"ContainerStarted","Data":"c434046edafd05669346b25e2bbbf0c853f32d389c61d268e718f36393e7ece8"} Mar 19 09:50:12 crc kubenswrapper[4835]: I0319 09:50:12.048902 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-d85wk"] Mar 19 09:50:12 crc kubenswrapper[4835]: I0319 09:50:12.354144 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 09:50:12 crc kubenswrapper[4835]: I0319 09:50:12.370309 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 09:50:12 crc kubenswrapper[4835]: I0319 09:50:12.452761 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-8xr9d" Mar 19 09:50:12 crc kubenswrapper[4835]: I0319 09:50:12.530517 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-625h9\" (UniqueName: \"kubernetes.io/projected/e79bbfb4-e971-492f-95b6-fdf2cb5df595-kube-api-access-625h9\") pod \"e79bbfb4-e971-492f-95b6-fdf2cb5df595\" (UID: \"e79bbfb4-e971-492f-95b6-fdf2cb5df595\") " Mar 19 09:50:12 crc kubenswrapper[4835]: I0319 09:50:12.530818 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e79bbfb4-e971-492f-95b6-fdf2cb5df595-config-data\") pod \"e79bbfb4-e971-492f-95b6-fdf2cb5df595\" (UID: \"e79bbfb4-e971-492f-95b6-fdf2cb5df595\") " Mar 19 09:50:12 crc kubenswrapper[4835]: I0319 09:50:12.530862 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e79bbfb4-e971-492f-95b6-fdf2cb5df595-combined-ca-bundle\") pod \"e79bbfb4-e971-492f-95b6-fdf2cb5df595\" (UID: \"e79bbfb4-e971-492f-95b6-fdf2cb5df595\") " Mar 19 09:50:12 crc kubenswrapper[4835]: I0319 09:50:12.530894 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e79bbfb4-e971-492f-95b6-fdf2cb5df595-scripts\") pod \"e79bbfb4-e971-492f-95b6-fdf2cb5df595\" (UID: \"e79bbfb4-e971-492f-95b6-fdf2cb5df595\") " Mar 19 09:50:12 crc kubenswrapper[4835]: I0319 09:50:12.538387 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e79bbfb4-e971-492f-95b6-fdf2cb5df595-scripts" (OuterVolumeSpecName: "scripts") pod "e79bbfb4-e971-492f-95b6-fdf2cb5df595" (UID: "e79bbfb4-e971-492f-95b6-fdf2cb5df595"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:12 crc kubenswrapper[4835]: I0319 09:50:12.541898 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e79bbfb4-e971-492f-95b6-fdf2cb5df595-kube-api-access-625h9" (OuterVolumeSpecName: "kube-api-access-625h9") pod "e79bbfb4-e971-492f-95b6-fdf2cb5df595" (UID: "e79bbfb4-e971-492f-95b6-fdf2cb5df595"). InnerVolumeSpecName "kube-api-access-625h9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:50:12 crc kubenswrapper[4835]: I0319 09:50:12.582054 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e79bbfb4-e971-492f-95b6-fdf2cb5df595-config-data" (OuterVolumeSpecName: "config-data") pod "e79bbfb4-e971-492f-95b6-fdf2cb5df595" (UID: "e79bbfb4-e971-492f-95b6-fdf2cb5df595"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:12 crc kubenswrapper[4835]: I0319 09:50:12.588915 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e79bbfb4-e971-492f-95b6-fdf2cb5df595-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e79bbfb4-e971-492f-95b6-fdf2cb5df595" (UID: "e79bbfb4-e971-492f-95b6-fdf2cb5df595"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:12 crc kubenswrapper[4835]: I0319 09:50:12.634095 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e79bbfb4-e971-492f-95b6-fdf2cb5df595-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:12 crc kubenswrapper[4835]: I0319 09:50:12.634123 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e79bbfb4-e971-492f-95b6-fdf2cb5df595-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:12 crc kubenswrapper[4835]: I0319 09:50:12.634132 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e79bbfb4-e971-492f-95b6-fdf2cb5df595-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:12 crc kubenswrapper[4835]: I0319 09:50:12.634144 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-625h9\" (UniqueName: \"kubernetes.io/projected/e79bbfb4-e971-492f-95b6-fdf2cb5df595-kube-api-access-625h9\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:12 crc kubenswrapper[4835]: I0319 09:50:12.889354 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-vkfwv" event={"ID":"b139b769-9c57-4ef8-b917-06fc8a1a2aca","Type":"ContainerStarted","Data":"7fdce34f0b15010cf36afd31b2bf5dccd827fd202633a8b77615e64ee2be71ec"} Mar 19 09:50:12 crc kubenswrapper[4835]: I0319 09:50:12.889750 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9b86998b5-vkfwv" Mar 19 09:50:12 crc kubenswrapper[4835]: I0319 09:50:12.892619 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-8xr9d" event={"ID":"e79bbfb4-e971-492f-95b6-fdf2cb5df595","Type":"ContainerDied","Data":"152c451c718a78596827a5718cfb7a98563b5e05473b41f142d934b1d047d3ca"} Mar 19 09:50:12 crc kubenswrapper[4835]: I0319 09:50:12.892677 4835 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="152c451c718a78596827a5718cfb7a98563b5e05473b41f142d934b1d047d3ca" Mar 19 09:50:12 crc kubenswrapper[4835]: I0319 09:50:12.892635 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-8xr9d" Mar 19 09:50:12 crc kubenswrapper[4835]: I0319 09:50:12.894592 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-d85wk" event={"ID":"bad8f4bf-8deb-49f9-9bbc-a72db22918cd","Type":"ContainerStarted","Data":"9e9893f0a835674377b4ba88abc28010f6657ae93eb955efbc6dc9dcaed5d4a1"} Mar 19 09:50:12 crc kubenswrapper[4835]: I0319 09:50:12.894633 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-d85wk" event={"ID":"bad8f4bf-8deb-49f9-9bbc-a72db22918cd","Type":"ContainerStarted","Data":"7f03613b4c68f92714e1f48048ed4ed8fe9a139d9aec3375a0bf15cb723af2ea"} Mar 19 09:50:12 crc kubenswrapper[4835]: I0319 09:50:12.917547 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9b86998b5-vkfwv" podStartSLOduration=3.917521766 podStartE2EDuration="3.917521766s" podCreationTimestamp="2026-03-19 09:50:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:50:12.913511209 +0000 UTC m=+1667.762109796" watchObservedRunningTime="2026-03-19 09:50:12.917521766 +0000 UTC m=+1667.766120353" Mar 19 09:50:12 crc kubenswrapper[4835]: I0319 09:50:12.954754 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-d85wk" podStartSLOduration=2.954716656 podStartE2EDuration="2.954716656s" podCreationTimestamp="2026-03-19 09:50:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 
09:50:12.930721937 +0000 UTC m=+1667.779320524" watchObservedRunningTime="2026-03-19 09:50:12.954716656 +0000 UTC m=+1667.803315263" Mar 19 09:50:15 crc kubenswrapper[4835]: I0319 09:50:15.315991 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 19 09:50:15 crc kubenswrapper[4835]: E0319 09:50:15.317073 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e79bbfb4-e971-492f-95b6-fdf2cb5df595" containerName="aodh-db-sync" Mar 19 09:50:15 crc kubenswrapper[4835]: I0319 09:50:15.317096 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e79bbfb4-e971-492f-95b6-fdf2cb5df595" containerName="aodh-db-sync" Mar 19 09:50:15 crc kubenswrapper[4835]: I0319 09:50:15.317464 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e79bbfb4-e971-492f-95b6-fdf2cb5df595" containerName="aodh-db-sync" Mar 19 09:50:15 crc kubenswrapper[4835]: I0319 09:50:15.320066 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 19 09:50:15 crc kubenswrapper[4835]: I0319 09:50:15.323300 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-fxchn" Mar 19 09:50:15 crc kubenswrapper[4835]: I0319 09:50:15.326159 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 19 09:50:15 crc kubenswrapper[4835]: I0319 09:50:15.326727 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 19 09:50:15 crc kubenswrapper[4835]: I0319 09:50:15.338487 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 19 09:50:15 crc kubenswrapper[4835]: I0319 09:50:15.429309 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1-scripts\") pod \"aodh-0\" (UID: \"7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1\") " 
pod="openstack/aodh-0" Mar 19 09:50:15 crc kubenswrapper[4835]: I0319 09:50:15.429871 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1-combined-ca-bundle\") pod \"aodh-0\" (UID: \"7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1\") " pod="openstack/aodh-0" Mar 19 09:50:15 crc kubenswrapper[4835]: I0319 09:50:15.429983 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr7rp\" (UniqueName: \"kubernetes.io/projected/7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1-kube-api-access-cr7rp\") pod \"aodh-0\" (UID: \"7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1\") " pod="openstack/aodh-0" Mar 19 09:50:15 crc kubenswrapper[4835]: I0319 09:50:15.430130 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1-config-data\") pod \"aodh-0\" (UID: \"7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1\") " pod="openstack/aodh-0" Mar 19 09:50:15 crc kubenswrapper[4835]: I0319 09:50:15.532968 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1-scripts\") pod \"aodh-0\" (UID: \"7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1\") " pod="openstack/aodh-0" Mar 19 09:50:15 crc kubenswrapper[4835]: I0319 09:50:15.533013 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1-combined-ca-bundle\") pod \"aodh-0\" (UID: \"7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1\") " pod="openstack/aodh-0" Mar 19 09:50:15 crc kubenswrapper[4835]: I0319 09:50:15.533048 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr7rp\" (UniqueName: 
\"kubernetes.io/projected/7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1-kube-api-access-cr7rp\") pod \"aodh-0\" (UID: \"7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1\") " pod="openstack/aodh-0" Mar 19 09:50:15 crc kubenswrapper[4835]: I0319 09:50:15.533106 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1-config-data\") pod \"aodh-0\" (UID: \"7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1\") " pod="openstack/aodh-0" Mar 19 09:50:15 crc kubenswrapper[4835]: I0319 09:50:15.540164 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1-combined-ca-bundle\") pod \"aodh-0\" (UID: \"7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1\") " pod="openstack/aodh-0" Mar 19 09:50:15 crc kubenswrapper[4835]: I0319 09:50:15.540833 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1-config-data\") pod \"aodh-0\" (UID: \"7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1\") " pod="openstack/aodh-0" Mar 19 09:50:15 crc kubenswrapper[4835]: I0319 09:50:15.551216 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1-scripts\") pod \"aodh-0\" (UID: \"7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1\") " pod="openstack/aodh-0" Mar 19 09:50:15 crc kubenswrapper[4835]: I0319 09:50:15.576395 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr7rp\" (UniqueName: \"kubernetes.io/projected/7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1-kube-api-access-cr7rp\") pod \"aodh-0\" (UID: \"7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1\") " pod="openstack/aodh-0" Mar 19 09:50:15 crc kubenswrapper[4835]: I0319 09:50:15.654201 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 19 09:50:15 crc kubenswrapper[4835]: I0319 09:50:15.950661 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c41cc34a-6fbb-45a1-b39d-309f669e1ce9","Type":"ContainerStarted","Data":"5cc2987af60cd4e21558373fdcf7abd7a62c5f61a6e4f7c8d92fdd3ee73a1a6c"} Mar 19 09:50:15 crc kubenswrapper[4835]: I0319 09:50:15.950992 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c41cc34a-6fbb-45a1-b39d-309f669e1ce9","Type":"ContainerStarted","Data":"99312d22ce11a44a903904ff2f210aa4f80d087c0e91f43b338f444120884237"} Mar 19 09:50:15 crc kubenswrapper[4835]: I0319 09:50:15.958509 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c70af041-db0f-4afa-ba0a-bdfee01d5865","Type":"ContainerStarted","Data":"9c8e99f0eec569cbba76191cdd41fb16c88775efabc919f60260c311a15b31db"} Mar 19 09:50:15 crc kubenswrapper[4835]: I0319 09:50:15.971257 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0b2ecdd4-520f-43bc-9b83-aafa999c5dbe","Type":"ContainerStarted","Data":"05e455e50ad1afda46c218fe6cc469815cdbe961a5efb688d1f5a1fc57ba7053"} Mar 19 09:50:15 crc kubenswrapper[4835]: I0319 09:50:15.971338 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="0b2ecdd4-520f-43bc-9b83-aafa999c5dbe" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://05e455e50ad1afda46c218fe6cc469815cdbe961a5efb688d1f5a1fc57ba7053" gracePeriod=30 Mar 19 09:50:15 crc kubenswrapper[4835]: I0319 09:50:15.988572 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.632084775 podStartE2EDuration="7.988545492s" podCreationTimestamp="2026-03-19 09:50:08 +0000 UTC" firstStartedPulling="2026-03-19 09:50:10.240474078 +0000 UTC m=+1665.089072665" 
lastFinishedPulling="2026-03-19 09:50:14.596934795 +0000 UTC m=+1669.445533382" observedRunningTime="2026-03-19 09:50:15.977921789 +0000 UTC m=+1670.826520396" watchObservedRunningTime="2026-03-19 09:50:15.988545492 +0000 UTC m=+1670.837144079" Mar 19 09:50:15 crc kubenswrapper[4835]: I0319 09:50:15.991821 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6e09cb38-2cc5-45ca-9a6b-6950ebef13c4","Type":"ContainerStarted","Data":"c5ca8060137070ca053cf2504ed8d9cc0124b296939b5d86952d7ad36a773a78"} Mar 19 09:50:15 crc kubenswrapper[4835]: I0319 09:50:15.991877 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6e09cb38-2cc5-45ca-9a6b-6950ebef13c4","Type":"ContainerStarted","Data":"7126d98e9d76bc7ace8f4948fccf21fde05f8092fe838c315e488cef23a797c8"} Mar 19 09:50:15 crc kubenswrapper[4835]: I0319 09:50:15.992022 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6e09cb38-2cc5-45ca-9a6b-6950ebef13c4" containerName="nova-metadata-log" containerID="cri-o://7126d98e9d76bc7ace8f4948fccf21fde05f8092fe838c315e488cef23a797c8" gracePeriod=30 Mar 19 09:50:15 crc kubenswrapper[4835]: I0319 09:50:15.992297 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6e09cb38-2cc5-45ca-9a6b-6950ebef13c4" containerName="nova-metadata-metadata" containerID="cri-o://c5ca8060137070ca053cf2504ed8d9cc0124b296939b5d86952d7ad36a773a78" gracePeriod=30 Mar 19 09:50:16 crc kubenswrapper[4835]: I0319 09:50:16.007543 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.118999438 podStartE2EDuration="7.007512487s" podCreationTimestamp="2026-03-19 09:50:09 +0000 UTC" firstStartedPulling="2026-03-19 09:50:10.671717909 +0000 UTC m=+1665.520316496" lastFinishedPulling="2026-03-19 09:50:14.560230938 +0000 UTC 
m=+1669.408829545" observedRunningTime="2026-03-19 09:50:16.006918162 +0000 UTC m=+1670.855516759" watchObservedRunningTime="2026-03-19 09:50:16.007512487 +0000 UTC m=+1670.856111074" Mar 19 09:50:16 crc kubenswrapper[4835]: I0319 09:50:16.042989 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.911546724 podStartE2EDuration="8.04297019s" podCreationTimestamp="2026-03-19 09:50:08 +0000 UTC" firstStartedPulling="2026-03-19 09:50:10.428288818 +0000 UTC m=+1665.276887405" lastFinishedPulling="2026-03-19 09:50:14.559712284 +0000 UTC m=+1669.408310871" observedRunningTime="2026-03-19 09:50:16.032967595 +0000 UTC m=+1670.881566182" watchObservedRunningTime="2026-03-19 09:50:16.04297019 +0000 UTC m=+1670.891568777" Mar 19 09:50:16 crc kubenswrapper[4835]: I0319 09:50:16.070390 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.966856649 podStartE2EDuration="8.07036307s" podCreationTimestamp="2026-03-19 09:50:08 +0000 UTC" firstStartedPulling="2026-03-19 09:50:10.457400194 +0000 UTC m=+1665.305998781" lastFinishedPulling="2026-03-19 09:50:14.560906625 +0000 UTC m=+1669.409505202" observedRunningTime="2026-03-19 09:50:16.062059899 +0000 UTC m=+1670.910658506" watchObservedRunningTime="2026-03-19 09:50:16.07036307 +0000 UTC m=+1670.918961657" Mar 19 09:50:16 crc kubenswrapper[4835]: I0319 09:50:16.261002 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 19 09:50:16 crc kubenswrapper[4835]: W0319 09:50:16.273444 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7fb13de1_3e5d_4eeb_b0f7_d8f4e3cb44d1.slice/crio-c3035c3d797987dc426d823101c7047e77b8a1eef81851b27513d63110a899bb WatchSource:0}: Error finding container c3035c3d797987dc426d823101c7047e77b8a1eef81851b27513d63110a899bb: Status 404 returned error can't 
find the container with id c3035c3d797987dc426d823101c7047e77b8a1eef81851b27513d63110a899bb Mar 19 09:50:17 crc kubenswrapper[4835]: I0319 09:50:17.044930 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1","Type":"ContainerStarted","Data":"c3035c3d797987dc426d823101c7047e77b8a1eef81851b27513d63110a899bb"} Mar 19 09:50:17 crc kubenswrapper[4835]: I0319 09:50:17.061321 4835 generic.go:334] "Generic (PLEG): container finished" podID="6e09cb38-2cc5-45ca-9a6b-6950ebef13c4" containerID="c5ca8060137070ca053cf2504ed8d9cc0124b296939b5d86952d7ad36a773a78" exitCode=0 Mar 19 09:50:17 crc kubenswrapper[4835]: I0319 09:50:17.061350 4835 generic.go:334] "Generic (PLEG): container finished" podID="6e09cb38-2cc5-45ca-9a6b-6950ebef13c4" containerID="7126d98e9d76bc7ace8f4948fccf21fde05f8092fe838c315e488cef23a797c8" exitCode=143 Mar 19 09:50:17 crc kubenswrapper[4835]: I0319 09:50:17.062402 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6e09cb38-2cc5-45ca-9a6b-6950ebef13c4","Type":"ContainerDied","Data":"c5ca8060137070ca053cf2504ed8d9cc0124b296939b5d86952d7ad36a773a78"} Mar 19 09:50:17 crc kubenswrapper[4835]: I0319 09:50:17.062432 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6e09cb38-2cc5-45ca-9a6b-6950ebef13c4","Type":"ContainerDied","Data":"7126d98e9d76bc7ace8f4948fccf21fde05f8092fe838c315e488cef23a797c8"} Mar 19 09:50:17 crc kubenswrapper[4835]: I0319 09:50:17.322579 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 09:50:17 crc kubenswrapper[4835]: I0319 09:50:17.507329 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e09cb38-2cc5-45ca-9a6b-6950ebef13c4-combined-ca-bundle\") pod \"6e09cb38-2cc5-45ca-9a6b-6950ebef13c4\" (UID: \"6e09cb38-2cc5-45ca-9a6b-6950ebef13c4\") " Mar 19 09:50:17 crc kubenswrapper[4835]: I0319 09:50:17.507410 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e09cb38-2cc5-45ca-9a6b-6950ebef13c4-config-data\") pod \"6e09cb38-2cc5-45ca-9a6b-6950ebef13c4\" (UID: \"6e09cb38-2cc5-45ca-9a6b-6950ebef13c4\") " Mar 19 09:50:17 crc kubenswrapper[4835]: I0319 09:50:17.507441 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4mdt\" (UniqueName: \"kubernetes.io/projected/6e09cb38-2cc5-45ca-9a6b-6950ebef13c4-kube-api-access-m4mdt\") pod \"6e09cb38-2cc5-45ca-9a6b-6950ebef13c4\" (UID: \"6e09cb38-2cc5-45ca-9a6b-6950ebef13c4\") " Mar 19 09:50:17 crc kubenswrapper[4835]: I0319 09:50:17.507581 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e09cb38-2cc5-45ca-9a6b-6950ebef13c4-logs\") pod \"6e09cb38-2cc5-45ca-9a6b-6950ebef13c4\" (UID: \"6e09cb38-2cc5-45ca-9a6b-6950ebef13c4\") " Mar 19 09:50:17 crc kubenswrapper[4835]: I0319 09:50:17.513406 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e09cb38-2cc5-45ca-9a6b-6950ebef13c4-logs" (OuterVolumeSpecName: "logs") pod "6e09cb38-2cc5-45ca-9a6b-6950ebef13c4" (UID: "6e09cb38-2cc5-45ca-9a6b-6950ebef13c4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:50:17 crc kubenswrapper[4835]: I0319 09:50:17.519672 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e09cb38-2cc5-45ca-9a6b-6950ebef13c4-kube-api-access-m4mdt" (OuterVolumeSpecName: "kube-api-access-m4mdt") pod "6e09cb38-2cc5-45ca-9a6b-6950ebef13c4" (UID: "6e09cb38-2cc5-45ca-9a6b-6950ebef13c4"). InnerVolumeSpecName "kube-api-access-m4mdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:50:17 crc kubenswrapper[4835]: I0319 09:50:17.542806 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e09cb38-2cc5-45ca-9a6b-6950ebef13c4-config-data" (OuterVolumeSpecName: "config-data") pod "6e09cb38-2cc5-45ca-9a6b-6950ebef13c4" (UID: "6e09cb38-2cc5-45ca-9a6b-6950ebef13c4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:17 crc kubenswrapper[4835]: I0319 09:50:17.554319 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e09cb38-2cc5-45ca-9a6b-6950ebef13c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e09cb38-2cc5-45ca-9a6b-6950ebef13c4" (UID: "6e09cb38-2cc5-45ca-9a6b-6950ebef13c4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:17 crc kubenswrapper[4835]: I0319 09:50:17.610941 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e09cb38-2cc5-45ca-9a6b-6950ebef13c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:17 crc kubenswrapper[4835]: I0319 09:50:17.611221 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e09cb38-2cc5-45ca-9a6b-6950ebef13c4-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:17 crc kubenswrapper[4835]: I0319 09:50:17.611314 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4mdt\" (UniqueName: \"kubernetes.io/projected/6e09cb38-2cc5-45ca-9a6b-6950ebef13c4-kube-api-access-m4mdt\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:17 crc kubenswrapper[4835]: I0319 09:50:17.611403 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e09cb38-2cc5-45ca-9a6b-6950ebef13c4-logs\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.080131 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1","Type":"ContainerStarted","Data":"adc5d7c2b66c221d0c916644b0727b87690fa775f05fa48bd92ba903f0b9aabf"} Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.083429 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6e09cb38-2cc5-45ca-9a6b-6950ebef13c4","Type":"ContainerDied","Data":"41bebb62b5d792ee24c205c9efc949e0c7276a29d0041936edb0516f641f3174"} Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.083474 4835 scope.go:117] "RemoveContainer" containerID="c5ca8060137070ca053cf2504ed8d9cc0124b296939b5d86952d7ad36a773a78" Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.083635 4835 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.130318 4835 scope.go:117] "RemoveContainer" containerID="7126d98e9d76bc7ace8f4948fccf21fde05f8092fe838c315e488cef23a797c8" Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.130452 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.143831 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.159028 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 19 09:50:18 crc kubenswrapper[4835]: E0319 09:50:18.159679 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e09cb38-2cc5-45ca-9a6b-6950ebef13c4" containerName="nova-metadata-metadata" Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.159699 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e09cb38-2cc5-45ca-9a6b-6950ebef13c4" containerName="nova-metadata-metadata" Mar 19 09:50:18 crc kubenswrapper[4835]: E0319 09:50:18.159867 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e09cb38-2cc5-45ca-9a6b-6950ebef13c4" containerName="nova-metadata-log" Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.159878 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e09cb38-2cc5-45ca-9a6b-6950ebef13c4" containerName="nova-metadata-log" Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.160105 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e09cb38-2cc5-45ca-9a6b-6950ebef13c4" containerName="nova-metadata-metadata" Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.160124 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e09cb38-2cc5-45ca-9a6b-6950ebef13c4" containerName="nova-metadata-log" Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.161630 4835 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.164944 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.165231 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.205061 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.247594 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f179ccd-e0de-49b0-a709-150a668680ff-logs\") pod \"nova-metadata-0\" (UID: \"7f179ccd-e0de-49b0-a709-150a668680ff\") " pod="openstack/nova-metadata-0" Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.248359 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f179ccd-e0de-49b0-a709-150a668680ff-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7f179ccd-e0de-49b0-a709-150a668680ff\") " pod="openstack/nova-metadata-0" Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.248401 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m94fr\" (UniqueName: \"kubernetes.io/projected/7f179ccd-e0de-49b0-a709-150a668680ff-kube-api-access-m94fr\") pod \"nova-metadata-0\" (UID: \"7f179ccd-e0de-49b0-a709-150a668680ff\") " pod="openstack/nova-metadata-0" Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.248599 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7f179ccd-e0de-49b0-a709-150a668680ff-config-data\") pod \"nova-metadata-0\" (UID: \"7f179ccd-e0de-49b0-a709-150a668680ff\") " pod="openstack/nova-metadata-0" Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.248889 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f179ccd-e0de-49b0-a709-150a668680ff-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7f179ccd-e0de-49b0-a709-150a668680ff\") " pod="openstack/nova-metadata-0" Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.351307 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f179ccd-e0de-49b0-a709-150a668680ff-logs\") pod \"nova-metadata-0\" (UID: \"7f179ccd-e0de-49b0-a709-150a668680ff\") " pod="openstack/nova-metadata-0" Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.351382 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f179ccd-e0de-49b0-a709-150a668680ff-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7f179ccd-e0de-49b0-a709-150a668680ff\") " pod="openstack/nova-metadata-0" Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.351408 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m94fr\" (UniqueName: \"kubernetes.io/projected/7f179ccd-e0de-49b0-a709-150a668680ff-kube-api-access-m94fr\") pod \"nova-metadata-0\" (UID: \"7f179ccd-e0de-49b0-a709-150a668680ff\") " pod="openstack/nova-metadata-0" Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.351493 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f179ccd-e0de-49b0-a709-150a668680ff-config-data\") pod \"nova-metadata-0\" (UID: \"7f179ccd-e0de-49b0-a709-150a668680ff\") " 
pod="openstack/nova-metadata-0" Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.351608 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f179ccd-e0de-49b0-a709-150a668680ff-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7f179ccd-e0de-49b0-a709-150a668680ff\") " pod="openstack/nova-metadata-0" Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.354223 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f179ccd-e0de-49b0-a709-150a668680ff-logs\") pod \"nova-metadata-0\" (UID: \"7f179ccd-e0de-49b0-a709-150a668680ff\") " pod="openstack/nova-metadata-0" Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.356987 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f179ccd-e0de-49b0-a709-150a668680ff-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7f179ccd-e0de-49b0-a709-150a668680ff\") " pod="openstack/nova-metadata-0" Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.357766 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f179ccd-e0de-49b0-a709-150a668680ff-config-data\") pod \"nova-metadata-0\" (UID: \"7f179ccd-e0de-49b0-a709-150a668680ff\") " pod="openstack/nova-metadata-0" Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.363795 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f179ccd-e0de-49b0-a709-150a668680ff-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7f179ccd-e0de-49b0-a709-150a668680ff\") " pod="openstack/nova-metadata-0" Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.379938 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m94fr\" (UniqueName: 
\"kubernetes.io/projected/7f179ccd-e0de-49b0-a709-150a668680ff-kube-api-access-m94fr\") pod \"nova-metadata-0\" (UID: \"7f179ccd-e0de-49b0-a709-150a668680ff\") " pod="openstack/nova-metadata-0" Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.418456 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e09cb38-2cc5-45ca-9a6b-6950ebef13c4" path="/var/lib/kubelet/pods/6e09cb38-2cc5-45ca-9a6b-6950ebef13c4/volumes" Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.489775 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.619481 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.825898 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.826189 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9cb07c1b-0e6d-4511-ad45-910d6184955f" containerName="ceilometer-central-agent" containerID="cri-o://73e272be769b264df39b2b96fc79c138a15d5f5ef348307c8ca2543e4e51218c" gracePeriod=30 Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.826324 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9cb07c1b-0e6d-4511-ad45-910d6184955f" containerName="sg-core" containerID="cri-o://ea229e32dd4141b273c3bc2060b1b86866a3b363b7a2e4344155ec3663d0ca21" gracePeriod=30 Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.826371 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9cb07c1b-0e6d-4511-ad45-910d6184955f" containerName="ceilometer-notification-agent" containerID="cri-o://e14b92e6568b49b4097120376ff652cdcd9d4915b576e308b6a0139c21f67f7c" gracePeriod=30 Mar 19 09:50:18 crc 
kubenswrapper[4835]: I0319 09:50:18.826479 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9cb07c1b-0e6d-4511-ad45-910d6184955f" containerName="proxy-httpd" containerID="cri-o://7c85e958887dc67c722623ad2e7b668df81a1b3e7371f31cc1201e8b70a197f8" gracePeriod=30 Mar 19 09:50:18 crc kubenswrapper[4835]: I0319 09:50:18.855869 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 19 09:50:19 crc kubenswrapper[4835]: I0319 09:50:19.126950 4835 generic.go:334] "Generic (PLEG): container finished" podID="9cb07c1b-0e6d-4511-ad45-910d6184955f" containerID="ea229e32dd4141b273c3bc2060b1b86866a3b363b7a2e4344155ec3663d0ca21" exitCode=2 Mar 19 09:50:19 crc kubenswrapper[4835]: I0319 09:50:19.127318 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cb07c1b-0e6d-4511-ad45-910d6184955f","Type":"ContainerDied","Data":"ea229e32dd4141b273c3bc2060b1b86866a3b363b7a2e4344155ec3663d0ca21"} Mar 19 09:50:19 crc kubenswrapper[4835]: I0319 09:50:19.205288 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 09:50:19 crc kubenswrapper[4835]: I0319 09:50:19.205332 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 09:50:19 crc kubenswrapper[4835]: I0319 09:50:19.237115 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 19 09:50:19 crc kubenswrapper[4835]: I0319 09:50:19.239964 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 19 09:50:19 crc kubenswrapper[4835]: I0319 09:50:19.302843 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 19 09:50:19 crc kubenswrapper[4835]: E0319 09:50:19.486509 4835 cadvisor_stats_provider.go:516] "Partial failure 
issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cb07c1b_0e6d_4511_ad45_910d6184955f.slice/crio-7c85e958887dc67c722623ad2e7b668df81a1b3e7371f31cc1201e8b70a197f8.scope\": RecentStats: unable to find data in memory cache]" Mar 19 09:50:19 crc kubenswrapper[4835]: W0319 09:50:19.676178 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f179ccd_e0de_49b0_a709_150a668680ff.slice/crio-e421cb9ffe8d7dca5aa3c55c9169bf29949abf0d37a33f7d4feb9f311eaf6413 WatchSource:0}: Error finding container e421cb9ffe8d7dca5aa3c55c9169bf29949abf0d37a33f7d4feb9f311eaf6413: Status 404 returned error can't find the container with id e421cb9ffe8d7dca5aa3c55c9169bf29949abf0d37a33f7d4feb9f311eaf6413 Mar 19 09:50:19 crc kubenswrapper[4835]: I0319 09:50:19.676248 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 09:50:19 crc kubenswrapper[4835]: I0319 09:50:19.862363 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:50:20 crc kubenswrapper[4835]: I0319 09:50:20.167811 4835 generic.go:334] "Generic (PLEG): container finished" podID="385f4a9d-4a91-4e88-a364-9f54b2a1a1cb" containerID="1a4e6f4481934c5abbe4f3763f236ac4089c1257cdc66bfb59cce1c8019e5592" exitCode=0 Mar 19 09:50:20 crc kubenswrapper[4835]: I0319 09:50:20.168269 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-q8bz6" event={"ID":"385f4a9d-4a91-4e88-a364-9f54b2a1a1cb","Type":"ContainerDied","Data":"1a4e6f4481934c5abbe4f3763f236ac4089c1257cdc66bfb59cce1c8019e5592"} Mar 19 09:50:20 crc kubenswrapper[4835]: I0319 09:50:20.176028 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"7f179ccd-e0de-49b0-a709-150a668680ff","Type":"ContainerStarted","Data":"33335a25cf5e9262273d0e578f8ac2bf7166bb191eedff3d6cd126adeb1df8d3"} Mar 19 09:50:20 crc kubenswrapper[4835]: I0319 09:50:20.176097 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f179ccd-e0de-49b0-a709-150a668680ff","Type":"ContainerStarted","Data":"aa0b128f0d527be6e56aff04258690a3148913f715a3cd14958b6dbf3c31bc60"} Mar 19 09:50:20 crc kubenswrapper[4835]: I0319 09:50:20.176110 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f179ccd-e0de-49b0-a709-150a668680ff","Type":"ContainerStarted","Data":"e421cb9ffe8d7dca5aa3c55c9169bf29949abf0d37a33f7d4feb9f311eaf6413"} Mar 19 09:50:20 crc kubenswrapper[4835]: I0319 09:50:20.181268 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9b86998b5-vkfwv" Mar 19 09:50:20 crc kubenswrapper[4835]: I0319 09:50:20.189214 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1","Type":"ContainerStarted","Data":"16f4ed4345a8836d91b8d99c7107598c9e21972bd841c7dd8a1f17bd53fdeacd"} Mar 19 09:50:20 crc kubenswrapper[4835]: I0319 09:50:20.197152 4835 generic.go:334] "Generic (PLEG): container finished" podID="9cb07c1b-0e6d-4511-ad45-910d6184955f" containerID="7c85e958887dc67c722623ad2e7b668df81a1b3e7371f31cc1201e8b70a197f8" exitCode=0 Mar 19 09:50:20 crc kubenswrapper[4835]: I0319 09:50:20.197176 4835 generic.go:334] "Generic (PLEG): container finished" podID="9cb07c1b-0e6d-4511-ad45-910d6184955f" containerID="73e272be769b264df39b2b96fc79c138a15d5f5ef348307c8ca2543e4e51218c" exitCode=0 Mar 19 09:50:20 crc kubenswrapper[4835]: I0319 09:50:20.197613 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"9cb07c1b-0e6d-4511-ad45-910d6184955f","Type":"ContainerDied","Data":"7c85e958887dc67c722623ad2e7b668df81a1b3e7371f31cc1201e8b70a197f8"} Mar 19 09:50:20 crc kubenswrapper[4835]: I0319 09:50:20.197687 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cb07c1b-0e6d-4511-ad45-910d6184955f","Type":"ContainerDied","Data":"73e272be769b264df39b2b96fc79c138a15d5f5ef348307c8ca2543e4e51218c"} Mar 19 09:50:20 crc kubenswrapper[4835]: I0319 09:50:20.237577 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 19 09:50:20 crc kubenswrapper[4835]: I0319 09:50:20.240913 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.240888017 podStartE2EDuration="2.240888017s" podCreationTimestamp="2026-03-19 09:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:50:20.209890272 +0000 UTC m=+1675.058488899" watchObservedRunningTime="2026-03-19 09:50:20.240888017 +0000 UTC m=+1675.089486614" Mar 19 09:50:20 crc kubenswrapper[4835]: I0319 09:50:20.247910 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c41cc34a-6fbb-45a1-b39d-309f669e1ce9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.251:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 09:50:20 crc kubenswrapper[4835]: I0319 09:50:20.290141 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c41cc34a-6fbb-45a1-b39d-309f669e1ce9" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.251:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 09:50:20 crc kubenswrapper[4835]: I0319 09:50:20.340857 4835 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-9b8ks"] Mar 19 09:50:20 crc kubenswrapper[4835]: I0319 09:50:20.342698 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7756b9d78c-9b8ks" podUID="cd860306-3e91-4d1d-ae71-cb67e787d22a" containerName="dnsmasq-dns" containerID="cri-o://a5a235769b9aae535cffdbfea40679b6c14b38ffc77ae5005851256d1d79503e" gracePeriod=10 Mar 19 09:50:21 crc kubenswrapper[4835]: I0319 09:50:21.211234 4835 generic.go:334] "Generic (PLEG): container finished" podID="cd860306-3e91-4d1d-ae71-cb67e787d22a" containerID="a5a235769b9aae535cffdbfea40679b6c14b38ffc77ae5005851256d1d79503e" exitCode=0 Mar 19 09:50:21 crc kubenswrapper[4835]: I0319 09:50:21.211325 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-9b8ks" event={"ID":"cd860306-3e91-4d1d-ae71-cb67e787d22a","Type":"ContainerDied","Data":"a5a235769b9aae535cffdbfea40679b6c14b38ffc77ae5005851256d1d79503e"} Mar 19 09:50:21 crc kubenswrapper[4835]: I0319 09:50:21.212306 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-9b8ks" event={"ID":"cd860306-3e91-4d1d-ae71-cb67e787d22a","Type":"ContainerDied","Data":"21781bc77fb5fba98c89dd052659f5fe5fd3dee4355c07044538e34155320acc"} Mar 19 09:50:21 crc kubenswrapper[4835]: I0319 09:50:21.212343 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21781bc77fb5fba98c89dd052659f5fe5fd3dee4355c07044538e34155320acc" Mar 19 09:50:21 crc kubenswrapper[4835]: I0319 09:50:21.268183 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-9b8ks" Mar 19 09:50:21 crc kubenswrapper[4835]: I0319 09:50:21.425377 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8r8j8\" (UniqueName: \"kubernetes.io/projected/cd860306-3e91-4d1d-ae71-cb67e787d22a-kube-api-access-8r8j8\") pod \"cd860306-3e91-4d1d-ae71-cb67e787d22a\" (UID: \"cd860306-3e91-4d1d-ae71-cb67e787d22a\") " Mar 19 09:50:21 crc kubenswrapper[4835]: I0319 09:50:21.425488 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd860306-3e91-4d1d-ae71-cb67e787d22a-dns-svc\") pod \"cd860306-3e91-4d1d-ae71-cb67e787d22a\" (UID: \"cd860306-3e91-4d1d-ae71-cb67e787d22a\") " Mar 19 09:50:21 crc kubenswrapper[4835]: I0319 09:50:21.425557 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd860306-3e91-4d1d-ae71-cb67e787d22a-ovsdbserver-sb\") pod \"cd860306-3e91-4d1d-ae71-cb67e787d22a\" (UID: \"cd860306-3e91-4d1d-ae71-cb67e787d22a\") " Mar 19 09:50:21 crc kubenswrapper[4835]: I0319 09:50:21.425592 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd860306-3e91-4d1d-ae71-cb67e787d22a-config\") pod \"cd860306-3e91-4d1d-ae71-cb67e787d22a\" (UID: \"cd860306-3e91-4d1d-ae71-cb67e787d22a\") " Mar 19 09:50:21 crc kubenswrapper[4835]: I0319 09:50:21.425619 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cd860306-3e91-4d1d-ae71-cb67e787d22a-dns-swift-storage-0\") pod \"cd860306-3e91-4d1d-ae71-cb67e787d22a\" (UID: \"cd860306-3e91-4d1d-ae71-cb67e787d22a\") " Mar 19 09:50:21 crc kubenswrapper[4835]: I0319 09:50:21.425794 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/cd860306-3e91-4d1d-ae71-cb67e787d22a-ovsdbserver-nb\") pod \"cd860306-3e91-4d1d-ae71-cb67e787d22a\" (UID: \"cd860306-3e91-4d1d-ae71-cb67e787d22a\") " Mar 19 09:50:21 crc kubenswrapper[4835]: I0319 09:50:21.474930 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd860306-3e91-4d1d-ae71-cb67e787d22a-kube-api-access-8r8j8" (OuterVolumeSpecName: "kube-api-access-8r8j8") pod "cd860306-3e91-4d1d-ae71-cb67e787d22a" (UID: "cd860306-3e91-4d1d-ae71-cb67e787d22a"). InnerVolumeSpecName "kube-api-access-8r8j8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:50:21 crc kubenswrapper[4835]: I0319 09:50:21.530733 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8r8j8\" (UniqueName: \"kubernetes.io/projected/cd860306-3e91-4d1d-ae71-cb67e787d22a-kube-api-access-8r8j8\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:21 crc kubenswrapper[4835]: I0319 09:50:21.576482 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd860306-3e91-4d1d-ae71-cb67e787d22a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cd860306-3e91-4d1d-ae71-cb67e787d22a" (UID: "cd860306-3e91-4d1d-ae71-cb67e787d22a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:50:21 crc kubenswrapper[4835]: I0319 09:50:21.603508 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd860306-3e91-4d1d-ae71-cb67e787d22a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cd860306-3e91-4d1d-ae71-cb67e787d22a" (UID: "cd860306-3e91-4d1d-ae71-cb67e787d22a"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:50:21 crc kubenswrapper[4835]: I0319 09:50:21.634882 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cd860306-3e91-4d1d-ae71-cb67e787d22a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:21 crc kubenswrapper[4835]: I0319 09:50:21.634914 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cd860306-3e91-4d1d-ae71-cb67e787d22a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:21 crc kubenswrapper[4835]: I0319 09:50:21.640382 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd860306-3e91-4d1d-ae71-cb67e787d22a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cd860306-3e91-4d1d-ae71-cb67e787d22a" (UID: "cd860306-3e91-4d1d-ae71-cb67e787d22a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:50:21 crc kubenswrapper[4835]: I0319 09:50:21.666031 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd860306-3e91-4d1d-ae71-cb67e787d22a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cd860306-3e91-4d1d-ae71-cb67e787d22a" (UID: "cd860306-3e91-4d1d-ae71-cb67e787d22a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:50:21 crc kubenswrapper[4835]: I0319 09:50:21.671961 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd860306-3e91-4d1d-ae71-cb67e787d22a-config" (OuterVolumeSpecName: "config") pod "cd860306-3e91-4d1d-ae71-cb67e787d22a" (UID: "cd860306-3e91-4d1d-ae71-cb67e787d22a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:50:21 crc kubenswrapper[4835]: I0319 09:50:21.710459 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-q8bz6" Mar 19 09:50:21 crc kubenswrapper[4835]: I0319 09:50:21.739825 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cd860306-3e91-4d1d-ae71-cb67e787d22a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:21 crc kubenswrapper[4835]: I0319 09:50:21.739866 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd860306-3e91-4d1d-ae71-cb67e787d22a-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:21 crc kubenswrapper[4835]: I0319 09:50:21.739876 4835 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cd860306-3e91-4d1d-ae71-cb67e787d22a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:21 crc kubenswrapper[4835]: I0319 09:50:21.841133 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzk75\" (UniqueName: \"kubernetes.io/projected/385f4a9d-4a91-4e88-a364-9f54b2a1a1cb-kube-api-access-bzk75\") pod \"385f4a9d-4a91-4e88-a364-9f54b2a1a1cb\" (UID: \"385f4a9d-4a91-4e88-a364-9f54b2a1a1cb\") " Mar 19 09:50:21 crc kubenswrapper[4835]: I0319 09:50:21.841187 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/385f4a9d-4a91-4e88-a364-9f54b2a1a1cb-scripts\") pod \"385f4a9d-4a91-4e88-a364-9f54b2a1a1cb\" (UID: \"385f4a9d-4a91-4e88-a364-9f54b2a1a1cb\") " Mar 19 09:50:21 crc kubenswrapper[4835]: I0319 09:50:21.841337 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/385f4a9d-4a91-4e88-a364-9f54b2a1a1cb-config-data\") pod 
\"385f4a9d-4a91-4e88-a364-9f54b2a1a1cb\" (UID: \"385f4a9d-4a91-4e88-a364-9f54b2a1a1cb\") " Mar 19 09:50:21 crc kubenswrapper[4835]: I0319 09:50:21.841622 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/385f4a9d-4a91-4e88-a364-9f54b2a1a1cb-combined-ca-bundle\") pod \"385f4a9d-4a91-4e88-a364-9f54b2a1a1cb\" (UID: \"385f4a9d-4a91-4e88-a364-9f54b2a1a1cb\") " Mar 19 09:50:21 crc kubenswrapper[4835]: I0319 09:50:21.846205 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/385f4a9d-4a91-4e88-a364-9f54b2a1a1cb-kube-api-access-bzk75" (OuterVolumeSpecName: "kube-api-access-bzk75") pod "385f4a9d-4a91-4e88-a364-9f54b2a1a1cb" (UID: "385f4a9d-4a91-4e88-a364-9f54b2a1a1cb"). InnerVolumeSpecName "kube-api-access-bzk75". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:50:21 crc kubenswrapper[4835]: I0319 09:50:21.848466 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/385f4a9d-4a91-4e88-a364-9f54b2a1a1cb-scripts" (OuterVolumeSpecName: "scripts") pod "385f4a9d-4a91-4e88-a364-9f54b2a1a1cb" (UID: "385f4a9d-4a91-4e88-a364-9f54b2a1a1cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:21 crc kubenswrapper[4835]: I0319 09:50:21.875823 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/385f4a9d-4a91-4e88-a364-9f54b2a1a1cb-config-data" (OuterVolumeSpecName: "config-data") pod "385f4a9d-4a91-4e88-a364-9f54b2a1a1cb" (UID: "385f4a9d-4a91-4e88-a364-9f54b2a1a1cb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:21 crc kubenswrapper[4835]: I0319 09:50:21.875884 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/385f4a9d-4a91-4e88-a364-9f54b2a1a1cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "385f4a9d-4a91-4e88-a364-9f54b2a1a1cb" (UID: "385f4a9d-4a91-4e88-a364-9f54b2a1a1cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:21 crc kubenswrapper[4835]: I0319 09:50:21.944768 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/385f4a9d-4a91-4e88-a364-9f54b2a1a1cb-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:21 crc kubenswrapper[4835]: I0319 09:50:21.944802 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/385f4a9d-4a91-4e88-a364-9f54b2a1a1cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:21 crc kubenswrapper[4835]: I0319 09:50:21.944816 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzk75\" (UniqueName: \"kubernetes.io/projected/385f4a9d-4a91-4e88-a364-9f54b2a1a1cb-kube-api-access-bzk75\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:21 crc kubenswrapper[4835]: I0319 09:50:21.944826 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/385f4a9d-4a91-4e88-a364-9f54b2a1a1cb-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:22 crc kubenswrapper[4835]: I0319 09:50:22.231668 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-9b8ks" Mar 19 09:50:22 crc kubenswrapper[4835]: I0319 09:50:22.243399 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-q8bz6" Mar 19 09:50:22 crc kubenswrapper[4835]: I0319 09:50:22.244987 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-q8bz6" event={"ID":"385f4a9d-4a91-4e88-a364-9f54b2a1a1cb","Type":"ContainerDied","Data":"49d9026973e8225b142432df096866c66de49d8451377a3de3f9f94b8c638b05"} Mar 19 09:50:22 crc kubenswrapper[4835]: I0319 09:50:22.245039 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49d9026973e8225b142432df096866c66de49d8451377a3de3f9f94b8c638b05" Mar 19 09:50:22 crc kubenswrapper[4835]: I0319 09:50:22.281941 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-9b8ks"] Mar 19 09:50:22 crc kubenswrapper[4835]: I0319 09:50:22.294471 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-9b8ks"] Mar 19 09:50:22 crc kubenswrapper[4835]: I0319 09:50:22.366811 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 09:50:22 crc kubenswrapper[4835]: I0319 09:50:22.367101 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c41cc34a-6fbb-45a1-b39d-309f669e1ce9" containerName="nova-api-log" containerID="cri-o://99312d22ce11a44a903904ff2f210aa4f80d087c0e91f43b338f444120884237" gracePeriod=30 Mar 19 09:50:22 crc kubenswrapper[4835]: I0319 09:50:22.367375 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c41cc34a-6fbb-45a1-b39d-309f669e1ce9" containerName="nova-api-api" containerID="cri-o://5cc2987af60cd4e21558373fdcf7abd7a62c5f61a6e4f7c8d92fdd3ee73a1a6c" gracePeriod=30 Mar 19 09:50:22 crc kubenswrapper[4835]: I0319 09:50:22.387323 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 09:50:22 crc kubenswrapper[4835]: I0319 09:50:22.402283 4835 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 09:50:22 crc kubenswrapper[4835]: I0319 09:50:22.403012 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7f179ccd-e0de-49b0-a709-150a668680ff" containerName="nova-metadata-log" containerID="cri-o://aa0b128f0d527be6e56aff04258690a3148913f715a3cd14958b6dbf3c31bc60" gracePeriod=30 Mar 19 09:50:22 crc kubenswrapper[4835]: I0319 09:50:22.403205 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7f179ccd-e0de-49b0-a709-150a668680ff" containerName="nova-metadata-metadata" containerID="cri-o://33335a25cf5e9262273d0e578f8ac2bf7166bb191eedff3d6cd126adeb1df8d3" gracePeriod=30 Mar 19 09:50:22 crc kubenswrapper[4835]: I0319 09:50:22.432529 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd860306-3e91-4d1d-ae71-cb67e787d22a" path="/var/lib/kubelet/pods/cd860306-3e91-4d1d-ae71-cb67e787d22a/volumes" Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.214547 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.277185 4835 generic.go:334] "Generic (PLEG): container finished" podID="c41cc34a-6fbb-45a1-b39d-309f669e1ce9" containerID="99312d22ce11a44a903904ff2f210aa4f80d087c0e91f43b338f444120884237" exitCode=143 Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.277501 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c41cc34a-6fbb-45a1-b39d-309f669e1ce9","Type":"ContainerDied","Data":"99312d22ce11a44a903904ff2f210aa4f80d087c0e91f43b338f444120884237"} Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.302896 4835 generic.go:334] "Generic (PLEG): container finished" podID="7f179ccd-e0de-49b0-a709-150a668680ff" containerID="33335a25cf5e9262273d0e578f8ac2bf7166bb191eedff3d6cd126adeb1df8d3" exitCode=0 Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.302938 4835 generic.go:334] "Generic (PLEG): container finished" podID="7f179ccd-e0de-49b0-a709-150a668680ff" containerID="aa0b128f0d527be6e56aff04258690a3148913f715a3cd14958b6dbf3c31bc60" exitCode=143 Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.303035 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f179ccd-e0de-49b0-a709-150a668680ff","Type":"ContainerDied","Data":"33335a25cf5e9262273d0e578f8ac2bf7166bb191eedff3d6cd126adeb1df8d3"} Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.303060 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.303082 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f179ccd-e0de-49b0-a709-150a668680ff","Type":"ContainerDied","Data":"aa0b128f0d527be6e56aff04258690a3148913f715a3cd14958b6dbf3c31bc60"} Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.303098 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f179ccd-e0de-49b0-a709-150a668680ff","Type":"ContainerDied","Data":"e421cb9ffe8d7dca5aa3c55c9169bf29949abf0d37a33f7d4feb9f311eaf6413"} Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.303117 4835 scope.go:117] "RemoveContainer" containerID="33335a25cf5e9262273d0e578f8ac2bf7166bb191eedff3d6cd126adeb1df8d3" Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.316376 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="c70af041-db0f-4afa-ba0a-bdfee01d5865" containerName="nova-scheduler-scheduler" containerID="cri-o://9c8e99f0eec569cbba76191cdd41fb16c88775efabc919f60260c311a15b31db" gracePeriod=30 Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.316730 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1","Type":"ContainerStarted","Data":"da7defb6a8d0db3b519ca0428186dd8b848d699da13a2f7660ea692b54bdf642"} Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.345496 4835 scope.go:117] "RemoveContainer" containerID="aa0b128f0d527be6e56aff04258690a3148913f715a3cd14958b6dbf3c31bc60" Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.381574 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f179ccd-e0de-49b0-a709-150a668680ff-combined-ca-bundle\") pod \"7f179ccd-e0de-49b0-a709-150a668680ff\" (UID: 
\"7f179ccd-e0de-49b0-a709-150a668680ff\") " Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.381698 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m94fr\" (UniqueName: \"kubernetes.io/projected/7f179ccd-e0de-49b0-a709-150a668680ff-kube-api-access-m94fr\") pod \"7f179ccd-e0de-49b0-a709-150a668680ff\" (UID: \"7f179ccd-e0de-49b0-a709-150a668680ff\") " Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.381810 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f179ccd-e0de-49b0-a709-150a668680ff-logs\") pod \"7f179ccd-e0de-49b0-a709-150a668680ff\" (UID: \"7f179ccd-e0de-49b0-a709-150a668680ff\") " Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.381973 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f179ccd-e0de-49b0-a709-150a668680ff-config-data\") pod \"7f179ccd-e0de-49b0-a709-150a668680ff\" (UID: \"7f179ccd-e0de-49b0-a709-150a668680ff\") " Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.382063 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f179ccd-e0de-49b0-a709-150a668680ff-nova-metadata-tls-certs\") pod \"7f179ccd-e0de-49b0-a709-150a668680ff\" (UID: \"7f179ccd-e0de-49b0-a709-150a668680ff\") " Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.382829 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f179ccd-e0de-49b0-a709-150a668680ff-logs" (OuterVolumeSpecName: "logs") pod "7f179ccd-e0de-49b0-a709-150a668680ff" (UID: "7f179ccd-e0de-49b0-a709-150a668680ff"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.390031 4835 scope.go:117] "RemoveContainer" containerID="33335a25cf5e9262273d0e578f8ac2bf7166bb191eedff3d6cd126adeb1df8d3" Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.402650 4835 scope.go:117] "RemoveContainer" containerID="d93f2f0fef5a3fe52d6e4aab02e5290ac85405643bc520caaef82b7b23fd8ee3" Mar 19 09:50:23 crc kubenswrapper[4835]: E0319 09:50:23.403328 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 09:50:23 crc kubenswrapper[4835]: E0319 09:50:23.405390 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33335a25cf5e9262273d0e578f8ac2bf7166bb191eedff3d6cd126adeb1df8d3\": container with ID starting with 33335a25cf5e9262273d0e578f8ac2bf7166bb191eedff3d6cd126adeb1df8d3 not found: ID does not exist" containerID="33335a25cf5e9262273d0e578f8ac2bf7166bb191eedff3d6cd126adeb1df8d3" Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.405431 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33335a25cf5e9262273d0e578f8ac2bf7166bb191eedff3d6cd126adeb1df8d3"} err="failed to get container status \"33335a25cf5e9262273d0e578f8ac2bf7166bb191eedff3d6cd126adeb1df8d3\": rpc error: code = NotFound desc = could not find container \"33335a25cf5e9262273d0e578f8ac2bf7166bb191eedff3d6cd126adeb1df8d3\": container with ID starting with 33335a25cf5e9262273d0e578f8ac2bf7166bb191eedff3d6cd126adeb1df8d3 not found: ID does not exist" Mar 19 09:50:23 crc 
kubenswrapper[4835]: I0319 09:50:23.405454 4835 scope.go:117] "RemoveContainer" containerID="aa0b128f0d527be6e56aff04258690a3148913f715a3cd14958b6dbf3c31bc60" Mar 19 09:50:23 crc kubenswrapper[4835]: E0319 09:50:23.406116 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa0b128f0d527be6e56aff04258690a3148913f715a3cd14958b6dbf3c31bc60\": container with ID starting with aa0b128f0d527be6e56aff04258690a3148913f715a3cd14958b6dbf3c31bc60 not found: ID does not exist" containerID="aa0b128f0d527be6e56aff04258690a3148913f715a3cd14958b6dbf3c31bc60" Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.406150 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa0b128f0d527be6e56aff04258690a3148913f715a3cd14958b6dbf3c31bc60"} err="failed to get container status \"aa0b128f0d527be6e56aff04258690a3148913f715a3cd14958b6dbf3c31bc60\": rpc error: code = NotFound desc = could not find container \"aa0b128f0d527be6e56aff04258690a3148913f715a3cd14958b6dbf3c31bc60\": container with ID starting with aa0b128f0d527be6e56aff04258690a3148913f715a3cd14958b6dbf3c31bc60 not found: ID does not exist" Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.406167 4835 scope.go:117] "RemoveContainer" containerID="33335a25cf5e9262273d0e578f8ac2bf7166bb191eedff3d6cd126adeb1df8d3" Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.406441 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33335a25cf5e9262273d0e578f8ac2bf7166bb191eedff3d6cd126adeb1df8d3"} err="failed to get container status \"33335a25cf5e9262273d0e578f8ac2bf7166bb191eedff3d6cd126adeb1df8d3\": rpc error: code = NotFound desc = could not find container \"33335a25cf5e9262273d0e578f8ac2bf7166bb191eedff3d6cd126adeb1df8d3\": container with ID starting with 33335a25cf5e9262273d0e578f8ac2bf7166bb191eedff3d6cd126adeb1df8d3 not found: ID does not exist" Mar 19 
09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.406462 4835 scope.go:117] "RemoveContainer" containerID="aa0b128f0d527be6e56aff04258690a3148913f715a3cd14958b6dbf3c31bc60" Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.406674 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa0b128f0d527be6e56aff04258690a3148913f715a3cd14958b6dbf3c31bc60"} err="failed to get container status \"aa0b128f0d527be6e56aff04258690a3148913f715a3cd14958b6dbf3c31bc60\": rpc error: code = NotFound desc = could not find container \"aa0b128f0d527be6e56aff04258690a3148913f715a3cd14958b6dbf3c31bc60\": container with ID starting with aa0b128f0d527be6e56aff04258690a3148913f715a3cd14958b6dbf3c31bc60 not found: ID does not exist" Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.407261 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f179ccd-e0de-49b0-a709-150a668680ff-kube-api-access-m94fr" (OuterVolumeSpecName: "kube-api-access-m94fr") pod "7f179ccd-e0de-49b0-a709-150a668680ff" (UID: "7f179ccd-e0de-49b0-a709-150a668680ff"). InnerVolumeSpecName "kube-api-access-m94fr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.417080 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f179ccd-e0de-49b0-a709-150a668680ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f179ccd-e0de-49b0-a709-150a668680ff" (UID: "7f179ccd-e0de-49b0-a709-150a668680ff"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.451940 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f179ccd-e0de-49b0-a709-150a668680ff-config-data" (OuterVolumeSpecName: "config-data") pod "7f179ccd-e0de-49b0-a709-150a668680ff" (UID: "7f179ccd-e0de-49b0-a709-150a668680ff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.485445 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f179ccd-e0de-49b0-a709-150a668680ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.485660 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m94fr\" (UniqueName: \"kubernetes.io/projected/7f179ccd-e0de-49b0-a709-150a668680ff-kube-api-access-m94fr\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.485733 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f179ccd-e0de-49b0-a709-150a668680ff-logs\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.485827 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f179ccd-e0de-49b0-a709-150a668680ff-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.499025 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f179ccd-e0de-49b0-a709-150a668680ff-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "7f179ccd-e0de-49b0-a709-150a668680ff" (UID: "7f179ccd-e0de-49b0-a709-150a668680ff"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.589961 4835 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f179ccd-e0de-49b0-a709-150a668680ff-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.730415 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.789952 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.809203 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 19 09:50:23 crc kubenswrapper[4835]: E0319 09:50:23.809673 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd860306-3e91-4d1d-ae71-cb67e787d22a" containerName="dnsmasq-dns" Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.809685 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd860306-3e91-4d1d-ae71-cb67e787d22a" containerName="dnsmasq-dns" Mar 19 09:50:23 crc kubenswrapper[4835]: E0319 09:50:23.809703 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f179ccd-e0de-49b0-a709-150a668680ff" containerName="nova-metadata-metadata" Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.809710 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f179ccd-e0de-49b0-a709-150a668680ff" containerName="nova-metadata-metadata" Mar 19 09:50:23 crc kubenswrapper[4835]: E0319 09:50:23.809776 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f179ccd-e0de-49b0-a709-150a668680ff" containerName="nova-metadata-log" Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.809784 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f179ccd-e0de-49b0-a709-150a668680ff" containerName="nova-metadata-log" Mar 19 
09:50:23 crc kubenswrapper[4835]: E0319 09:50:23.809792 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd860306-3e91-4d1d-ae71-cb67e787d22a" containerName="init" Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.809797 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd860306-3e91-4d1d-ae71-cb67e787d22a" containerName="init" Mar 19 09:50:23 crc kubenswrapper[4835]: E0319 09:50:23.809814 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="385f4a9d-4a91-4e88-a364-9f54b2a1a1cb" containerName="nova-manage" Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.809819 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="385f4a9d-4a91-4e88-a364-9f54b2a1a1cb" containerName="nova-manage" Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.810014 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="385f4a9d-4a91-4e88-a364-9f54b2a1a1cb" containerName="nova-manage" Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.810034 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f179ccd-e0de-49b0-a709-150a668680ff" containerName="nova-metadata-metadata" Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.810049 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd860306-3e91-4d1d-ae71-cb67e787d22a" containerName="dnsmasq-dns" Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.810058 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f179ccd-e0de-49b0-a709-150a668680ff" containerName="nova-metadata-log" Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.811953 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.817040 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.817196 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 19 09:50:23 crc kubenswrapper[4835]: I0319 09:50:23.823531 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.001896 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2efa60f-c31d-4e06-9a21-fc066054bc8d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c2efa60f-c31d-4e06-9a21-fc066054bc8d\") " pod="openstack/nova-metadata-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.002065 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2efa60f-c31d-4e06-9a21-fc066054bc8d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c2efa60f-c31d-4e06-9a21-fc066054bc8d\") " pod="openstack/nova-metadata-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.002090 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fbws\" (UniqueName: \"kubernetes.io/projected/c2efa60f-c31d-4e06-9a21-fc066054bc8d-kube-api-access-5fbws\") pod \"nova-metadata-0\" (UID: \"c2efa60f-c31d-4e06-9a21-fc066054bc8d\") " pod="openstack/nova-metadata-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.002118 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2efa60f-c31d-4e06-9a21-fc066054bc8d-logs\") pod 
\"nova-metadata-0\" (UID: \"c2efa60f-c31d-4e06-9a21-fc066054bc8d\") " pod="openstack/nova-metadata-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.002193 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2efa60f-c31d-4e06-9a21-fc066054bc8d-config-data\") pod \"nova-metadata-0\" (UID: \"c2efa60f-c31d-4e06-9a21-fc066054bc8d\") " pod="openstack/nova-metadata-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.103956 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2efa60f-c31d-4e06-9a21-fc066054bc8d-config-data\") pod \"nova-metadata-0\" (UID: \"c2efa60f-c31d-4e06-9a21-fc066054bc8d\") " pod="openstack/nova-metadata-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.104059 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2efa60f-c31d-4e06-9a21-fc066054bc8d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c2efa60f-c31d-4e06-9a21-fc066054bc8d\") " pod="openstack/nova-metadata-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.104188 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2efa60f-c31d-4e06-9a21-fc066054bc8d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c2efa60f-c31d-4e06-9a21-fc066054bc8d\") " pod="openstack/nova-metadata-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.104214 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fbws\" (UniqueName: \"kubernetes.io/projected/c2efa60f-c31d-4e06-9a21-fc066054bc8d-kube-api-access-5fbws\") pod \"nova-metadata-0\" (UID: \"c2efa60f-c31d-4e06-9a21-fc066054bc8d\") " pod="openstack/nova-metadata-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 
09:50:24.104235 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2efa60f-c31d-4e06-9a21-fc066054bc8d-logs\") pod \"nova-metadata-0\" (UID: \"c2efa60f-c31d-4e06-9a21-fc066054bc8d\") " pod="openstack/nova-metadata-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.104705 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2efa60f-c31d-4e06-9a21-fc066054bc8d-logs\") pod \"nova-metadata-0\" (UID: \"c2efa60f-c31d-4e06-9a21-fc066054bc8d\") " pod="openstack/nova-metadata-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.110048 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2efa60f-c31d-4e06-9a21-fc066054bc8d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c2efa60f-c31d-4e06-9a21-fc066054bc8d\") " pod="openstack/nova-metadata-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.110129 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2efa60f-c31d-4e06-9a21-fc066054bc8d-config-data\") pod \"nova-metadata-0\" (UID: \"c2efa60f-c31d-4e06-9a21-fc066054bc8d\") " pod="openstack/nova-metadata-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.110583 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2efa60f-c31d-4e06-9a21-fc066054bc8d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c2efa60f-c31d-4e06-9a21-fc066054bc8d\") " pod="openstack/nova-metadata-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.127591 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fbws\" (UniqueName: \"kubernetes.io/projected/c2efa60f-c31d-4e06-9a21-fc066054bc8d-kube-api-access-5fbws\") pod \"nova-metadata-0\" (UID: 
\"c2efa60f-c31d-4e06-9a21-fc066054bc8d\") " pod="openstack/nova-metadata-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.150402 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 09:50:24 crc kubenswrapper[4835]: E0319 09:50:24.237919 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9c8e99f0eec569cbba76191cdd41fb16c88775efabc919f60260c311a15b31db is running failed: container process not found" containerID="9c8e99f0eec569cbba76191cdd41fb16c88775efabc919f60260c311a15b31db" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 09:50:24 crc kubenswrapper[4835]: E0319 09:50:24.239009 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9c8e99f0eec569cbba76191cdd41fb16c88775efabc919f60260c311a15b31db is running failed: container process not found" containerID="9c8e99f0eec569cbba76191cdd41fb16c88775efabc919f60260c311a15b31db" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 09:50:24 crc kubenswrapper[4835]: E0319 09:50:24.239822 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9c8e99f0eec569cbba76191cdd41fb16c88775efabc919f60260c311a15b31db is running failed: container process not found" containerID="9c8e99f0eec569cbba76191cdd41fb16c88775efabc919f60260c311a15b31db" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 09:50:24 crc kubenswrapper[4835]: E0319 09:50:24.239855 4835 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9c8e99f0eec569cbba76191cdd41fb16c88775efabc919f60260c311a15b31db is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" 
podUID="c70af041-db0f-4afa-ba0a-bdfee01d5865" containerName="nova-scheduler-scheduler" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.316416 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.350260 4835 generic.go:334] "Generic (PLEG): container finished" podID="bad8f4bf-8deb-49f9-9bbc-a72db22918cd" containerID="9e9893f0a835674377b4ba88abc28010f6657ae93eb955efbc6dc9dcaed5d4a1" exitCode=0 Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.350324 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-d85wk" event={"ID":"bad8f4bf-8deb-49f9-9bbc-a72db22918cd","Type":"ContainerDied","Data":"9e9893f0a835674377b4ba88abc28010f6657ae93eb955efbc6dc9dcaed5d4a1"} Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.360934 4835 generic.go:334] "Generic (PLEG): container finished" podID="9cb07c1b-0e6d-4511-ad45-910d6184955f" containerID="e14b92e6568b49b4097120376ff652cdcd9d4915b576e308b6a0139c21f67f7c" exitCode=0 Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.361216 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cb07c1b-0e6d-4511-ad45-910d6184955f","Type":"ContainerDied","Data":"e14b92e6568b49b4097120376ff652cdcd9d4915b576e308b6a0139c21f67f7c"} Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.361262 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cb07c1b-0e6d-4511-ad45-910d6184955f","Type":"ContainerDied","Data":"cf6fba0d54b2fd585767e6fb2199bc9ff32b23f1f18a6c9387ca521cdc3a5068"} Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.361281 4835 scope.go:117] "RemoveContainer" containerID="7c85e958887dc67c722623ad2e7b668df81a1b3e7371f31cc1201e8b70a197f8" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.361466 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.370268 4835 generic.go:334] "Generic (PLEG): container finished" podID="c70af041-db0f-4afa-ba0a-bdfee01d5865" containerID="9c8e99f0eec569cbba76191cdd41fb16c88775efabc919f60260c311a15b31db" exitCode=0 Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.370355 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c70af041-db0f-4afa-ba0a-bdfee01d5865","Type":"ContainerDied","Data":"9c8e99f0eec569cbba76191cdd41fb16c88775efabc919f60260c311a15b31db"} Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.408026 4835 scope.go:117] "RemoveContainer" containerID="ea229e32dd4141b273c3bc2060b1b86866a3b363b7a2e4344155ec3663d0ca21" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.410612 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9cb07c1b-0e6d-4511-ad45-910d6184955f-sg-core-conf-yaml\") pod \"9cb07c1b-0e6d-4511-ad45-910d6184955f\" (UID: \"9cb07c1b-0e6d-4511-ad45-910d6184955f\") " Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.410796 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cb07c1b-0e6d-4511-ad45-910d6184955f-combined-ca-bundle\") pod \"9cb07c1b-0e6d-4511-ad45-910d6184955f\" (UID: \"9cb07c1b-0e6d-4511-ad45-910d6184955f\") " Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.410854 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cb07c1b-0e6d-4511-ad45-910d6184955f-scripts\") pod \"9cb07c1b-0e6d-4511-ad45-910d6184955f\" (UID: \"9cb07c1b-0e6d-4511-ad45-910d6184955f\") " Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.410932 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cb07c1b-0e6d-4511-ad45-910d6184955f-log-httpd\") pod \"9cb07c1b-0e6d-4511-ad45-910d6184955f\" (UID: \"9cb07c1b-0e6d-4511-ad45-910d6184955f\") " Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.410990 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cb07c1b-0e6d-4511-ad45-910d6184955f-config-data\") pod \"9cb07c1b-0e6d-4511-ad45-910d6184955f\" (UID: \"9cb07c1b-0e6d-4511-ad45-910d6184955f\") " Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.411101 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cb07c1b-0e6d-4511-ad45-910d6184955f-run-httpd\") pod \"9cb07c1b-0e6d-4511-ad45-910d6184955f\" (UID: \"9cb07c1b-0e6d-4511-ad45-910d6184955f\") " Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.411192 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9x7bn\" (UniqueName: \"kubernetes.io/projected/9cb07c1b-0e6d-4511-ad45-910d6184955f-kube-api-access-9x7bn\") pod \"9cb07c1b-0e6d-4511-ad45-910d6184955f\" (UID: \"9cb07c1b-0e6d-4511-ad45-910d6184955f\") " Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.411886 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cb07c1b-0e6d-4511-ad45-910d6184955f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9cb07c1b-0e6d-4511-ad45-910d6184955f" (UID: "9cb07c1b-0e6d-4511-ad45-910d6184955f"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.412260 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cb07c1b-0e6d-4511-ad45-910d6184955f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9cb07c1b-0e6d-4511-ad45-910d6184955f" (UID: "9cb07c1b-0e6d-4511-ad45-910d6184955f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.412731 4835 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cb07c1b-0e6d-4511-ad45-910d6184955f-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.412843 4835 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cb07c1b-0e6d-4511-ad45-910d6184955f-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.422481 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cb07c1b-0e6d-4511-ad45-910d6184955f-scripts" (OuterVolumeSpecName: "scripts") pod "9cb07c1b-0e6d-4511-ad45-910d6184955f" (UID: "9cb07c1b-0e6d-4511-ad45-910d6184955f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.427672 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cb07c1b-0e6d-4511-ad45-910d6184955f-kube-api-access-9x7bn" (OuterVolumeSpecName: "kube-api-access-9x7bn") pod "9cb07c1b-0e6d-4511-ad45-910d6184955f" (UID: "9cb07c1b-0e6d-4511-ad45-910d6184955f"). InnerVolumeSpecName "kube-api-access-9x7bn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.430371 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f179ccd-e0de-49b0-a709-150a668680ff" path="/var/lib/kubelet/pods/7f179ccd-e0de-49b0-a709-150a668680ff/volumes" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.517202 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cb07c1b-0e6d-4511-ad45-910d6184955f-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.517442 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9x7bn\" (UniqueName: \"kubernetes.io/projected/9cb07c1b-0e6d-4511-ad45-910d6184955f-kube-api-access-9x7bn\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.524014 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cb07c1b-0e6d-4511-ad45-910d6184955f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9cb07c1b-0e6d-4511-ad45-910d6184955f" (UID: "9cb07c1b-0e6d-4511-ad45-910d6184955f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.572016 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cb07c1b-0e6d-4511-ad45-910d6184955f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9cb07c1b-0e6d-4511-ad45-910d6184955f" (UID: "9cb07c1b-0e6d-4511-ad45-910d6184955f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.590778 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.628984 4835 scope.go:117] "RemoveContainer" containerID="e14b92e6568b49b4097120376ff652cdcd9d4915b576e308b6a0139c21f67f7c" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.631862 4835 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9cb07c1b-0e6d-4511-ad45-910d6184955f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.631884 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cb07c1b-0e6d-4511-ad45-910d6184955f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.645821 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cb07c1b-0e6d-4511-ad45-910d6184955f-config-data" (OuterVolumeSpecName: "config-data") pod "9cb07c1b-0e6d-4511-ad45-910d6184955f" (UID: "9cb07c1b-0e6d-4511-ad45-910d6184955f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.678345 4835 scope.go:117] "RemoveContainer" containerID="73e272be769b264df39b2b96fc79c138a15d5f5ef348307c8ca2543e4e51218c" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.737051 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c70af041-db0f-4afa-ba0a-bdfee01d5865-combined-ca-bundle\") pod \"c70af041-db0f-4afa-ba0a-bdfee01d5865\" (UID: \"c70af041-db0f-4afa-ba0a-bdfee01d5865\") " Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.737214 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c70af041-db0f-4afa-ba0a-bdfee01d5865-config-data\") pod \"c70af041-db0f-4afa-ba0a-bdfee01d5865\" (UID: \"c70af041-db0f-4afa-ba0a-bdfee01d5865\") " Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.737314 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hctgm\" (UniqueName: \"kubernetes.io/projected/c70af041-db0f-4afa-ba0a-bdfee01d5865-kube-api-access-hctgm\") pod \"c70af041-db0f-4afa-ba0a-bdfee01d5865\" (UID: \"c70af041-db0f-4afa-ba0a-bdfee01d5865\") " Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.737970 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cb07c1b-0e6d-4511-ad45-910d6184955f-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.749610 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c70af041-db0f-4afa-ba0a-bdfee01d5865-kube-api-access-hctgm" (OuterVolumeSpecName: "kube-api-access-hctgm") pod "c70af041-db0f-4afa-ba0a-bdfee01d5865" (UID: "c70af041-db0f-4afa-ba0a-bdfee01d5865"). InnerVolumeSpecName "kube-api-access-hctgm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.759297 4835 scope.go:117] "RemoveContainer" containerID="7c85e958887dc67c722623ad2e7b668df81a1b3e7371f31cc1201e8b70a197f8" Mar 19 09:50:24 crc kubenswrapper[4835]: E0319 09:50:24.759852 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c85e958887dc67c722623ad2e7b668df81a1b3e7371f31cc1201e8b70a197f8\": container with ID starting with 7c85e958887dc67c722623ad2e7b668df81a1b3e7371f31cc1201e8b70a197f8 not found: ID does not exist" containerID="7c85e958887dc67c722623ad2e7b668df81a1b3e7371f31cc1201e8b70a197f8" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.759897 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c85e958887dc67c722623ad2e7b668df81a1b3e7371f31cc1201e8b70a197f8"} err="failed to get container status \"7c85e958887dc67c722623ad2e7b668df81a1b3e7371f31cc1201e8b70a197f8\": rpc error: code = NotFound desc = could not find container \"7c85e958887dc67c722623ad2e7b668df81a1b3e7371f31cc1201e8b70a197f8\": container with ID starting with 7c85e958887dc67c722623ad2e7b668df81a1b3e7371f31cc1201e8b70a197f8 not found: ID does not exist" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.759920 4835 scope.go:117] "RemoveContainer" containerID="ea229e32dd4141b273c3bc2060b1b86866a3b363b7a2e4344155ec3663d0ca21" Mar 19 09:50:24 crc kubenswrapper[4835]: E0319 09:50:24.760663 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea229e32dd4141b273c3bc2060b1b86866a3b363b7a2e4344155ec3663d0ca21\": container with ID starting with ea229e32dd4141b273c3bc2060b1b86866a3b363b7a2e4344155ec3663d0ca21 not found: ID does not exist" containerID="ea229e32dd4141b273c3bc2060b1b86866a3b363b7a2e4344155ec3663d0ca21" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.760681 
4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea229e32dd4141b273c3bc2060b1b86866a3b363b7a2e4344155ec3663d0ca21"} err="failed to get container status \"ea229e32dd4141b273c3bc2060b1b86866a3b363b7a2e4344155ec3663d0ca21\": rpc error: code = NotFound desc = could not find container \"ea229e32dd4141b273c3bc2060b1b86866a3b363b7a2e4344155ec3663d0ca21\": container with ID starting with ea229e32dd4141b273c3bc2060b1b86866a3b363b7a2e4344155ec3663d0ca21 not found: ID does not exist" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.760710 4835 scope.go:117] "RemoveContainer" containerID="e14b92e6568b49b4097120376ff652cdcd9d4915b576e308b6a0139c21f67f7c" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.760774 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:50:24 crc kubenswrapper[4835]: E0319 09:50:24.761664 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e14b92e6568b49b4097120376ff652cdcd9d4915b576e308b6a0139c21f67f7c\": container with ID starting with e14b92e6568b49b4097120376ff652cdcd9d4915b576e308b6a0139c21f67f7c not found: ID does not exist" containerID="e14b92e6568b49b4097120376ff652cdcd9d4915b576e308b6a0139c21f67f7c" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.761684 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e14b92e6568b49b4097120376ff652cdcd9d4915b576e308b6a0139c21f67f7c"} err="failed to get container status \"e14b92e6568b49b4097120376ff652cdcd9d4915b576e308b6a0139c21f67f7c\": rpc error: code = NotFound desc = could not find container \"e14b92e6568b49b4097120376ff652cdcd9d4915b576e308b6a0139c21f67f7c\": container with ID starting with e14b92e6568b49b4097120376ff652cdcd9d4915b576e308b6a0139c21f67f7c not found: ID does not exist" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.761700 4835 scope.go:117] 
"RemoveContainer" containerID="73e272be769b264df39b2b96fc79c138a15d5f5ef348307c8ca2543e4e51218c" Mar 19 09:50:24 crc kubenswrapper[4835]: E0319 09:50:24.761920 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73e272be769b264df39b2b96fc79c138a15d5f5ef348307c8ca2543e4e51218c\": container with ID starting with 73e272be769b264df39b2b96fc79c138a15d5f5ef348307c8ca2543e4e51218c not found: ID does not exist" containerID="73e272be769b264df39b2b96fc79c138a15d5f5ef348307c8ca2543e4e51218c" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.761941 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73e272be769b264df39b2b96fc79c138a15d5f5ef348307c8ca2543e4e51218c"} err="failed to get container status \"73e272be769b264df39b2b96fc79c138a15d5f5ef348307c8ca2543e4e51218c\": rpc error: code = NotFound desc = could not find container \"73e272be769b264df39b2b96fc79c138a15d5f5ef348307c8ca2543e4e51218c\": container with ID starting with 73e272be769b264df39b2b96fc79c138a15d5f5ef348307c8ca2543e4e51218c not found: ID does not exist" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.776115 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70af041-db0f-4afa-ba0a-bdfee01d5865-config-data" (OuterVolumeSpecName: "config-data") pod "c70af041-db0f-4afa-ba0a-bdfee01d5865" (UID: "c70af041-db0f-4afa-ba0a-bdfee01d5865"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.801799 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.809748 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70af041-db0f-4afa-ba0a-bdfee01d5865-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c70af041-db0f-4afa-ba0a-bdfee01d5865" (UID: "c70af041-db0f-4afa-ba0a-bdfee01d5865"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.828792 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:50:24 crc kubenswrapper[4835]: E0319 09:50:24.829334 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cb07c1b-0e6d-4511-ad45-910d6184955f" containerName="ceilometer-central-agent" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.829354 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cb07c1b-0e6d-4511-ad45-910d6184955f" containerName="ceilometer-central-agent" Mar 19 09:50:24 crc kubenswrapper[4835]: E0319 09:50:24.829377 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cb07c1b-0e6d-4511-ad45-910d6184955f" containerName="sg-core" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.829383 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cb07c1b-0e6d-4511-ad45-910d6184955f" containerName="sg-core" Mar 19 09:50:24 crc kubenswrapper[4835]: E0319 09:50:24.829402 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cb07c1b-0e6d-4511-ad45-910d6184955f" containerName="proxy-httpd" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.829409 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cb07c1b-0e6d-4511-ad45-910d6184955f" containerName="proxy-httpd" Mar 19 09:50:24 crc 
kubenswrapper[4835]: E0319 09:50:24.829434 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c70af041-db0f-4afa-ba0a-bdfee01d5865" containerName="nova-scheduler-scheduler" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.829443 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c70af041-db0f-4afa-ba0a-bdfee01d5865" containerName="nova-scheduler-scheduler" Mar 19 09:50:24 crc kubenswrapper[4835]: E0319 09:50:24.829456 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cb07c1b-0e6d-4511-ad45-910d6184955f" containerName="ceilometer-notification-agent" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.829462 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cb07c1b-0e6d-4511-ad45-910d6184955f" containerName="ceilometer-notification-agent" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.829672 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cb07c1b-0e6d-4511-ad45-910d6184955f" containerName="proxy-httpd" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.829702 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cb07c1b-0e6d-4511-ad45-910d6184955f" containerName="ceilometer-central-agent" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.829711 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="c70af041-db0f-4afa-ba0a-bdfee01d5865" containerName="nova-scheduler-scheduler" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.829719 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cb07c1b-0e6d-4511-ad45-910d6184955f" containerName="sg-core" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.829734 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cb07c1b-0e6d-4511-ad45-910d6184955f" containerName="ceilometer-notification-agent" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.831767 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.834075 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.834247 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.839850 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38a5570d-749f-4582-a565-c3e5c19c0c5b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"38a5570d-749f-4582-a565-c3e5c19c0c5b\") " pod="openstack/ceilometer-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.839911 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38a5570d-749f-4582-a565-c3e5c19c0c5b-log-httpd\") pod \"ceilometer-0\" (UID: \"38a5570d-749f-4582-a565-c3e5c19c0c5b\") " pod="openstack/ceilometer-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.839933 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb4lt\" (UniqueName: \"kubernetes.io/projected/38a5570d-749f-4582-a565-c3e5c19c0c5b-kube-api-access-gb4lt\") pod \"ceilometer-0\" (UID: \"38a5570d-749f-4582-a565-c3e5c19c0c5b\") " pod="openstack/ceilometer-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.839981 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38a5570d-749f-4582-a565-c3e5c19c0c5b-config-data\") pod \"ceilometer-0\" (UID: \"38a5570d-749f-4582-a565-c3e5c19c0c5b\") " pod="openstack/ceilometer-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.840002 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38a5570d-749f-4582-a565-c3e5c19c0c5b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"38a5570d-749f-4582-a565-c3e5c19c0c5b\") " pod="openstack/ceilometer-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.840029 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38a5570d-749f-4582-a565-c3e5c19c0c5b-scripts\") pod \"ceilometer-0\" (UID: \"38a5570d-749f-4582-a565-c3e5c19c0c5b\") " pod="openstack/ceilometer-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.840085 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38a5570d-749f-4582-a565-c3e5c19c0c5b-run-httpd\") pod \"ceilometer-0\" (UID: \"38a5570d-749f-4582-a565-c3e5c19c0c5b\") " pod="openstack/ceilometer-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.840206 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c70af041-db0f-4afa-ba0a-bdfee01d5865-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.840218 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c70af041-db0f-4afa-ba0a-bdfee01d5865-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.840229 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hctgm\" (UniqueName: \"kubernetes.io/projected/c70af041-db0f-4afa-ba0a-bdfee01d5865-kube-api-access-hctgm\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.857847 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:50:24 crc 
kubenswrapper[4835]: I0319 09:50:24.876834 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.942419 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38a5570d-749f-4582-a565-c3e5c19c0c5b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"38a5570d-749f-4582-a565-c3e5c19c0c5b\") " pod="openstack/ceilometer-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.942467 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38a5570d-749f-4582-a565-c3e5c19c0c5b-log-httpd\") pod \"ceilometer-0\" (UID: \"38a5570d-749f-4582-a565-c3e5c19c0c5b\") " pod="openstack/ceilometer-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.942492 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb4lt\" (UniqueName: \"kubernetes.io/projected/38a5570d-749f-4582-a565-c3e5c19c0c5b-kube-api-access-gb4lt\") pod \"ceilometer-0\" (UID: \"38a5570d-749f-4582-a565-c3e5c19c0c5b\") " pod="openstack/ceilometer-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.942547 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38a5570d-749f-4582-a565-c3e5c19c0c5b-config-data\") pod \"ceilometer-0\" (UID: \"38a5570d-749f-4582-a565-c3e5c19c0c5b\") " pod="openstack/ceilometer-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.942568 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38a5570d-749f-4582-a565-c3e5c19c0c5b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"38a5570d-749f-4582-a565-c3e5c19c0c5b\") " pod="openstack/ceilometer-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.942597 4835 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38a5570d-749f-4582-a565-c3e5c19c0c5b-scripts\") pod \"ceilometer-0\" (UID: \"38a5570d-749f-4582-a565-c3e5c19c0c5b\") " pod="openstack/ceilometer-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.943775 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38a5570d-749f-4582-a565-c3e5c19c0c5b-run-httpd\") pod \"ceilometer-0\" (UID: \"38a5570d-749f-4582-a565-c3e5c19c0c5b\") " pod="openstack/ceilometer-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.944099 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38a5570d-749f-4582-a565-c3e5c19c0c5b-log-httpd\") pod \"ceilometer-0\" (UID: \"38a5570d-749f-4582-a565-c3e5c19c0c5b\") " pod="openstack/ceilometer-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.944182 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38a5570d-749f-4582-a565-c3e5c19c0c5b-run-httpd\") pod \"ceilometer-0\" (UID: \"38a5570d-749f-4582-a565-c3e5c19c0c5b\") " pod="openstack/ceilometer-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.949202 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38a5570d-749f-4582-a565-c3e5c19c0c5b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"38a5570d-749f-4582-a565-c3e5c19c0c5b\") " pod="openstack/ceilometer-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.950412 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38a5570d-749f-4582-a565-c3e5c19c0c5b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"38a5570d-749f-4582-a565-c3e5c19c0c5b\") " 
pod="openstack/ceilometer-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.953152 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38a5570d-749f-4582-a565-c3e5c19c0c5b-scripts\") pod \"ceilometer-0\" (UID: \"38a5570d-749f-4582-a565-c3e5c19c0c5b\") " pod="openstack/ceilometer-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.954234 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38a5570d-749f-4582-a565-c3e5c19c0c5b-config-data\") pod \"ceilometer-0\" (UID: \"38a5570d-749f-4582-a565-c3e5c19c0c5b\") " pod="openstack/ceilometer-0" Mar 19 09:50:24 crc kubenswrapper[4835]: I0319 09:50:24.963286 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb4lt\" (UniqueName: \"kubernetes.io/projected/38a5570d-749f-4582-a565-c3e5c19c0c5b-kube-api-access-gb4lt\") pod \"ceilometer-0\" (UID: \"38a5570d-749f-4582-a565-c3e5c19c0c5b\") " pod="openstack/ceilometer-0" Mar 19 09:50:25 crc kubenswrapper[4835]: I0319 09:50:25.166384 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 09:50:25 crc kubenswrapper[4835]: I0319 09:50:25.386444 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c70af041-db0f-4afa-ba0a-bdfee01d5865","Type":"ContainerDied","Data":"497e3d0c15e3184e85a18d212b27f5546840d25a71d91fb3d73d23b550a6d1bc"} Mar 19 09:50:25 crc kubenswrapper[4835]: I0319 09:50:25.386860 4835 scope.go:117] "RemoveContainer" containerID="9c8e99f0eec569cbba76191cdd41fb16c88775efabc919f60260c311a15b31db" Mar 19 09:50:25 crc kubenswrapper[4835]: I0319 09:50:25.386607 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 09:50:25 crc kubenswrapper[4835]: I0319 09:50:25.393344 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c2efa60f-c31d-4e06-9a21-fc066054bc8d","Type":"ContainerStarted","Data":"1bc4d60be8da07a45ae1397a34e240c6d48252301fb68ac1491ddfa827c767f1"} Mar 19 09:50:25 crc kubenswrapper[4835]: I0319 09:50:25.393854 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c2efa60f-c31d-4e06-9a21-fc066054bc8d","Type":"ContainerStarted","Data":"7284e6b6fab45bcaa06a3923fd49d73b55785568ea4bacddd0ab41c842ed77e6"} Mar 19 09:50:25 crc kubenswrapper[4835]: I0319 09:50:25.393927 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c2efa60f-c31d-4e06-9a21-fc066054bc8d","Type":"ContainerStarted","Data":"a9a923edcfcf2a1ab8601d82382705749fe34df2c3ebb535cce319d88d6df562"} Mar 19 09:50:25 crc kubenswrapper[4835]: I0319 09:50:25.419111 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.41908284 podStartE2EDuration="2.41908284s" podCreationTimestamp="2026-03-19 09:50:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:50:25.414210931 +0000 UTC m=+1680.262809518" watchObservedRunningTime="2026-03-19 09:50:25.41908284 +0000 UTC m=+1680.267681427" Mar 19 09:50:25 crc kubenswrapper[4835]: I0319 09:50:25.471061 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 09:50:25 crc kubenswrapper[4835]: I0319 09:50:25.490379 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 09:50:25 crc kubenswrapper[4835]: I0319 09:50:25.504612 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 09:50:25 crc 
kubenswrapper[4835]: I0319 09:50:25.506535 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 09:50:25 crc kubenswrapper[4835]: I0319 09:50:25.511093 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 19 09:50:25 crc kubenswrapper[4835]: I0319 09:50:25.519717 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 09:50:25 crc kubenswrapper[4835]: I0319 09:50:25.659180 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6bb0461-be40-4d05-893b-c9b3cc97c134-config-data\") pod \"nova-scheduler-0\" (UID: \"f6bb0461-be40-4d05-893b-c9b3cc97c134\") " pod="openstack/nova-scheduler-0" Mar 19 09:50:25 crc kubenswrapper[4835]: I0319 09:50:25.659233 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6bb0461-be40-4d05-893b-c9b3cc97c134-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f6bb0461-be40-4d05-893b-c9b3cc97c134\") " pod="openstack/nova-scheduler-0" Mar 19 09:50:25 crc kubenswrapper[4835]: I0319 09:50:25.659399 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54qbf\" (UniqueName: \"kubernetes.io/projected/f6bb0461-be40-4d05-893b-c9b3cc97c134-kube-api-access-54qbf\") pod \"nova-scheduler-0\" (UID: \"f6bb0461-be40-4d05-893b-c9b3cc97c134\") " pod="openstack/nova-scheduler-0" Mar 19 09:50:25 crc kubenswrapper[4835]: I0319 09:50:25.762226 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6bb0461-be40-4d05-893b-c9b3cc97c134-config-data\") pod \"nova-scheduler-0\" (UID: \"f6bb0461-be40-4d05-893b-c9b3cc97c134\") " pod="openstack/nova-scheduler-0" Mar 19 
09:50:25 crc kubenswrapper[4835]: I0319 09:50:25.762278 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6bb0461-be40-4d05-893b-c9b3cc97c134-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f6bb0461-be40-4d05-893b-c9b3cc97c134\") " pod="openstack/nova-scheduler-0" Mar 19 09:50:25 crc kubenswrapper[4835]: I0319 09:50:25.762469 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54qbf\" (UniqueName: \"kubernetes.io/projected/f6bb0461-be40-4d05-893b-c9b3cc97c134-kube-api-access-54qbf\") pod \"nova-scheduler-0\" (UID: \"f6bb0461-be40-4d05-893b-c9b3cc97c134\") " pod="openstack/nova-scheduler-0" Mar 19 09:50:25 crc kubenswrapper[4835]: I0319 09:50:25.771000 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6bb0461-be40-4d05-893b-c9b3cc97c134-config-data\") pod \"nova-scheduler-0\" (UID: \"f6bb0461-be40-4d05-893b-c9b3cc97c134\") " pod="openstack/nova-scheduler-0" Mar 19 09:50:25 crc kubenswrapper[4835]: I0319 09:50:25.774682 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6bb0461-be40-4d05-893b-c9b3cc97c134-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f6bb0461-be40-4d05-893b-c9b3cc97c134\") " pod="openstack/nova-scheduler-0" Mar 19 09:50:25 crc kubenswrapper[4835]: I0319 09:50:25.807542 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54qbf\" (UniqueName: \"kubernetes.io/projected/f6bb0461-be40-4d05-893b-c9b3cc97c134-kube-api-access-54qbf\") pod \"nova-scheduler-0\" (UID: \"f6bb0461-be40-4d05-893b-c9b3cc97c134\") " pod="openstack/nova-scheduler-0" Mar 19 09:50:25 crc kubenswrapper[4835]: I0319 09:50:25.837276 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.163927 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-d85wk" Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.275168 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bad8f4bf-8deb-49f9-9bbc-a72db22918cd-combined-ca-bundle\") pod \"bad8f4bf-8deb-49f9-9bbc-a72db22918cd\" (UID: \"bad8f4bf-8deb-49f9-9bbc-a72db22918cd\") " Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.275449 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jv9hp\" (UniqueName: \"kubernetes.io/projected/bad8f4bf-8deb-49f9-9bbc-a72db22918cd-kube-api-access-jv9hp\") pod \"bad8f4bf-8deb-49f9-9bbc-a72db22918cd\" (UID: \"bad8f4bf-8deb-49f9-9bbc-a72db22918cd\") " Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.275580 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bad8f4bf-8deb-49f9-9bbc-a72db22918cd-scripts\") pod \"bad8f4bf-8deb-49f9-9bbc-a72db22918cd\" (UID: \"bad8f4bf-8deb-49f9-9bbc-a72db22918cd\") " Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.275652 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bad8f4bf-8deb-49f9-9bbc-a72db22918cd-config-data\") pod \"bad8f4bf-8deb-49f9-9bbc-a72db22918cd\" (UID: \"bad8f4bf-8deb-49f9-9bbc-a72db22918cd\") " Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.283373 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bad8f4bf-8deb-49f9-9bbc-a72db22918cd-kube-api-access-jv9hp" (OuterVolumeSpecName: "kube-api-access-jv9hp") pod "bad8f4bf-8deb-49f9-9bbc-a72db22918cd" (UID: 
"bad8f4bf-8deb-49f9-9bbc-a72db22918cd"). InnerVolumeSpecName "kube-api-access-jv9hp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.283689 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bad8f4bf-8deb-49f9-9bbc-a72db22918cd-scripts" (OuterVolumeSpecName: "scripts") pod "bad8f4bf-8deb-49f9-9bbc-a72db22918cd" (UID: "bad8f4bf-8deb-49f9-9bbc-a72db22918cd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.318672 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bad8f4bf-8deb-49f9-9bbc-a72db22918cd-config-data" (OuterVolumeSpecName: "config-data") pod "bad8f4bf-8deb-49f9-9bbc-a72db22918cd" (UID: "bad8f4bf-8deb-49f9-9bbc-a72db22918cd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.329017 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bad8f4bf-8deb-49f9-9bbc-a72db22918cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bad8f4bf-8deb-49f9-9bbc-a72db22918cd" (UID: "bad8f4bf-8deb-49f9-9bbc-a72db22918cd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.329101 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.381818 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jv9hp\" (UniqueName: \"kubernetes.io/projected/bad8f4bf-8deb-49f9-9bbc-a72db22918cd-kube-api-access-jv9hp\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.381894 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bad8f4bf-8deb-49f9-9bbc-a72db22918cd-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.381939 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bad8f4bf-8deb-49f9-9bbc-a72db22918cd-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.381957 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bad8f4bf-8deb-49f9-9bbc-a72db22918cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.441816 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cb07c1b-0e6d-4511-ad45-910d6184955f" path="/var/lib/kubelet/pods/9cb07c1b-0e6d-4511-ad45-910d6184955f/volumes" Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.451511 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c70af041-db0f-4afa-ba0a-bdfee01d5865" path="/var/lib/kubelet/pods/c70af041-db0f-4afa-ba0a-bdfee01d5865/volumes" Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.453203 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"38a5570d-749f-4582-a565-c3e5c19c0c5b","Type":"ContainerStarted","Data":"3b699eab2b26f03d5aa71cf89fcc98ca5253b8b9de4dc21fae96bb75faf099f0"} Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.467338 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-d85wk" event={"ID":"bad8f4bf-8deb-49f9-9bbc-a72db22918cd","Type":"ContainerDied","Data":"7f03613b4c68f92714e1f48048ed4ed8fe9a139d9aec3375a0bf15cb723af2ea"} Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.467379 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f03613b4c68f92714e1f48048ed4ed8fe9a139d9aec3375a0bf15cb723af2ea" Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.467377 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-d85wk" Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.472948 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1" containerName="aodh-api" containerID="cri-o://adc5d7c2b66c221d0c916644b0727b87690fa775f05fa48bd92ba903f0b9aabf" gracePeriod=30 Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.473199 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1","Type":"ContainerStarted","Data":"c30856a682f79fafcc2499a2795a035e1bce1af1b9a64aa38ab07530a2d46d0b"} Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.475074 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1" containerName="aodh-notifier" containerID="cri-o://da7defb6a8d0db3b519ca0428186dd8b848d699da13a2f7660ea692b54bdf642" gracePeriod=30 Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.475227 4835 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/aodh-0" podUID="7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1" containerName="aodh-listener" containerID="cri-o://c30856a682f79fafcc2499a2795a035e1bce1af1b9a64aa38ab07530a2d46d0b" gracePeriod=30 Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.475309 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1" containerName="aodh-evaluator" containerID="cri-o://16f4ed4345a8836d91b8d99c7107598c9e21972bd841c7dd8a1f17bd53fdeacd" gracePeriod=30 Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.526148 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 19 09:50:26 crc kubenswrapper[4835]: E0319 09:50:26.526671 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bad8f4bf-8deb-49f9-9bbc-a72db22918cd" containerName="nova-cell1-conductor-db-sync" Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.526685 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="bad8f4bf-8deb-49f9-9bbc-a72db22918cd" containerName="nova-cell1-conductor-db-sync" Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.526958 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="bad8f4bf-8deb-49f9-9bbc-a72db22918cd" containerName="nova-cell1-conductor-db-sync" Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.527677 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.530320 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.555715 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.557953 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.070104754 podStartE2EDuration="11.557934288s" podCreationTimestamp="2026-03-19 09:50:15 +0000 UTC" firstStartedPulling="2026-03-19 09:50:16.277833453 +0000 UTC m=+1671.126432040" lastFinishedPulling="2026-03-19 09:50:25.765662987 +0000 UTC m=+1680.614261574" observedRunningTime="2026-03-19 09:50:26.515496989 +0000 UTC m=+1681.364095576" watchObservedRunningTime="2026-03-19 09:50:26.557934288 +0000 UTC m=+1681.406532875" Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.603503 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.691353 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9j8r\" (UniqueName: \"kubernetes.io/projected/67089e56-05fd-46f8-b595-6ece6f03b14f-kube-api-access-v9j8r\") pod \"nova-cell1-conductor-0\" (UID: \"67089e56-05fd-46f8-b595-6ece6f03b14f\") " pod="openstack/nova-cell1-conductor-0" Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.691520 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67089e56-05fd-46f8-b595-6ece6f03b14f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"67089e56-05fd-46f8-b595-6ece6f03b14f\") " pod="openstack/nova-cell1-conductor-0" Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 
09:50:26.691602 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67089e56-05fd-46f8-b595-6ece6f03b14f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"67089e56-05fd-46f8-b595-6ece6f03b14f\") " pod="openstack/nova-cell1-conductor-0" Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.795228 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9j8r\" (UniqueName: \"kubernetes.io/projected/67089e56-05fd-46f8-b595-6ece6f03b14f-kube-api-access-v9j8r\") pod \"nova-cell1-conductor-0\" (UID: \"67089e56-05fd-46f8-b595-6ece6f03b14f\") " pod="openstack/nova-cell1-conductor-0" Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.795503 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67089e56-05fd-46f8-b595-6ece6f03b14f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"67089e56-05fd-46f8-b595-6ece6f03b14f\") " pod="openstack/nova-cell1-conductor-0" Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.795718 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67089e56-05fd-46f8-b595-6ece6f03b14f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"67089e56-05fd-46f8-b595-6ece6f03b14f\") " pod="openstack/nova-cell1-conductor-0" Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.801197 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67089e56-05fd-46f8-b595-6ece6f03b14f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"67089e56-05fd-46f8-b595-6ece6f03b14f\") " pod="openstack/nova-cell1-conductor-0" Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.801774 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67089e56-05fd-46f8-b595-6ece6f03b14f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"67089e56-05fd-46f8-b595-6ece6f03b14f\") " pod="openstack/nova-cell1-conductor-0" Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.846281 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9j8r\" (UniqueName: \"kubernetes.io/projected/67089e56-05fd-46f8-b595-6ece6f03b14f-kube-api-access-v9j8r\") pod \"nova-cell1-conductor-0\" (UID: \"67089e56-05fd-46f8-b595-6ece6f03b14f\") " pod="openstack/nova-cell1-conductor-0" Mar 19 09:50:26 crc kubenswrapper[4835]: I0319 09:50:26.852723 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.202876 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.203276 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.428673 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.493444 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f6bb0461-be40-4d05-893b-c9b3cc97c134","Type":"ContainerStarted","Data":"2009c8301b5ab6fc93a8cc2b9fa8ab86878b9b32dec14bd4965dccd1991aedb5"} Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.493483 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f6bb0461-be40-4d05-893b-c9b3cc97c134","Type":"ContainerStarted","Data":"2ea6a9fff14e1ee8a78c09af7eaf845529d521b17215cf033968867f7959641d"} Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.498525 4835 generic.go:334] "Generic (PLEG): container finished" podID="7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1" containerID="da7defb6a8d0db3b519ca0428186dd8b848d699da13a2f7660ea692b54bdf642" exitCode=0 Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.498557 4835 generic.go:334] "Generic (PLEG): container finished" podID="7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1" containerID="16f4ed4345a8836d91b8d99c7107598c9e21972bd841c7dd8a1f17bd53fdeacd" exitCode=0 Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.498565 4835 generic.go:334] "Generic (PLEG): container finished" podID="7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1" containerID="adc5d7c2b66c221d0c916644b0727b87690fa775f05fa48bd92ba903f0b9aabf" exitCode=0 Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.498609 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1","Type":"ContainerDied","Data":"da7defb6a8d0db3b519ca0428186dd8b848d699da13a2f7660ea692b54bdf642"} Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.498635 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1","Type":"ContainerDied","Data":"16f4ed4345a8836d91b8d99c7107598c9e21972bd841c7dd8a1f17bd53fdeacd"} Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.498643 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1","Type":"ContainerDied","Data":"adc5d7c2b66c221d0c916644b0727b87690fa775f05fa48bd92ba903f0b9aabf"} Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.518947 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c41cc34a-6fbb-45a1-b39d-309f669e1ce9-combined-ca-bundle\") pod \"c41cc34a-6fbb-45a1-b39d-309f669e1ce9\" (UID: \"c41cc34a-6fbb-45a1-b39d-309f669e1ce9\") " Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.519249 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c41cc34a-6fbb-45a1-b39d-309f669e1ce9-logs\") pod \"c41cc34a-6fbb-45a1-b39d-309f669e1ce9\" (UID: \"c41cc34a-6fbb-45a1-b39d-309f669e1ce9\") " Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.519533 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgrfd\" (UniqueName: \"kubernetes.io/projected/c41cc34a-6fbb-45a1-b39d-309f669e1ce9-kube-api-access-dgrfd\") pod \"c41cc34a-6fbb-45a1-b39d-309f669e1ce9\" (UID: \"c41cc34a-6fbb-45a1-b39d-309f669e1ce9\") " Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.519827 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c41cc34a-6fbb-45a1-b39d-309f669e1ce9-config-data\") pod \"c41cc34a-6fbb-45a1-b39d-309f669e1ce9\" (UID: \"c41cc34a-6fbb-45a1-b39d-309f669e1ce9\") " Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.520255 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/c41cc34a-6fbb-45a1-b39d-309f669e1ce9-logs" (OuterVolumeSpecName: "logs") pod "c41cc34a-6fbb-45a1-b39d-309f669e1ce9" (UID: "c41cc34a-6fbb-45a1-b39d-309f669e1ce9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.521219 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c41cc34a-6fbb-45a1-b39d-309f669e1ce9-logs\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.521509 4835 generic.go:334] "Generic (PLEG): container finished" podID="c41cc34a-6fbb-45a1-b39d-309f669e1ce9" containerID="5cc2987af60cd4e21558373fdcf7abd7a62c5f61a6e4f7c8d92fdd3ee73a1a6c" exitCode=0 Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.521578 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c41cc34a-6fbb-45a1-b39d-309f669e1ce9","Type":"ContainerDied","Data":"5cc2987af60cd4e21558373fdcf7abd7a62c5f61a6e4f7c8d92fdd3ee73a1a6c"} Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.521604 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c41cc34a-6fbb-45a1-b39d-309f669e1ce9","Type":"ContainerDied","Data":"287b756f475fa92802468099f881cf878e73f5c85536f8c33277d8faf4543a24"} Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.521620 4835 scope.go:117] "RemoveContainer" containerID="5cc2987af60cd4e21558373fdcf7abd7a62c5f61a6e4f7c8d92fdd3ee73a1a6c" Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.521764 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.521828 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.521708366 podStartE2EDuration="2.521708366s" podCreationTimestamp="2026-03-19 09:50:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:50:27.514866474 +0000 UTC m=+1682.363465081" watchObservedRunningTime="2026-03-19 09:50:27.521708366 +0000 UTC m=+1682.370306953" Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.528036 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c41cc34a-6fbb-45a1-b39d-309f669e1ce9-kube-api-access-dgrfd" (OuterVolumeSpecName: "kube-api-access-dgrfd") pod "c41cc34a-6fbb-45a1-b39d-309f669e1ce9" (UID: "c41cc34a-6fbb-45a1-b39d-309f669e1ce9"). InnerVolumeSpecName "kube-api-access-dgrfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.539565 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38a5570d-749f-4582-a565-c3e5c19c0c5b","Type":"ContainerStarted","Data":"988ff005552cfd2cb7758b8302c82cbd8d2a0a9b884b404508700f509122972e"} Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.586877 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c41cc34a-6fbb-45a1-b39d-309f669e1ce9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c41cc34a-6fbb-45a1-b39d-309f669e1ce9" (UID: "c41cc34a-6fbb-45a1-b39d-309f669e1ce9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.597657 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.622690 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c41cc34a-6fbb-45a1-b39d-309f669e1ce9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.622718 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgrfd\" (UniqueName: \"kubernetes.io/projected/c41cc34a-6fbb-45a1-b39d-309f669e1ce9-kube-api-access-dgrfd\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.633929 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c41cc34a-6fbb-45a1-b39d-309f669e1ce9-config-data" (OuterVolumeSpecName: "config-data") pod "c41cc34a-6fbb-45a1-b39d-309f669e1ce9" (UID: "c41cc34a-6fbb-45a1-b39d-309f669e1ce9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.654716 4835 scope.go:117] "RemoveContainer" containerID="99312d22ce11a44a903904ff2f210aa4f80d087c0e91f43b338f444120884237" Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.683752 4835 scope.go:117] "RemoveContainer" containerID="5cc2987af60cd4e21558373fdcf7abd7a62c5f61a6e4f7c8d92fdd3ee73a1a6c" Mar 19 09:50:27 crc kubenswrapper[4835]: E0319 09:50:27.684145 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cc2987af60cd4e21558373fdcf7abd7a62c5f61a6e4f7c8d92fdd3ee73a1a6c\": container with ID starting with 5cc2987af60cd4e21558373fdcf7abd7a62c5f61a6e4f7c8d92fdd3ee73a1a6c not found: ID does not exist" containerID="5cc2987af60cd4e21558373fdcf7abd7a62c5f61a6e4f7c8d92fdd3ee73a1a6c" Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.684183 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cc2987af60cd4e21558373fdcf7abd7a62c5f61a6e4f7c8d92fdd3ee73a1a6c"} err="failed to get container status \"5cc2987af60cd4e21558373fdcf7abd7a62c5f61a6e4f7c8d92fdd3ee73a1a6c\": rpc error: code = NotFound desc = could not find container \"5cc2987af60cd4e21558373fdcf7abd7a62c5f61a6e4f7c8d92fdd3ee73a1a6c\": container with ID starting with 5cc2987af60cd4e21558373fdcf7abd7a62c5f61a6e4f7c8d92fdd3ee73a1a6c not found: ID does not exist" Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.684208 4835 scope.go:117] "RemoveContainer" containerID="99312d22ce11a44a903904ff2f210aa4f80d087c0e91f43b338f444120884237" Mar 19 09:50:27 crc kubenswrapper[4835]: E0319 09:50:27.684613 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99312d22ce11a44a903904ff2f210aa4f80d087c0e91f43b338f444120884237\": container with ID starting with 
99312d22ce11a44a903904ff2f210aa4f80d087c0e91f43b338f444120884237 not found: ID does not exist" containerID="99312d22ce11a44a903904ff2f210aa4f80d087c0e91f43b338f444120884237" Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.684647 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99312d22ce11a44a903904ff2f210aa4f80d087c0e91f43b338f444120884237"} err="failed to get container status \"99312d22ce11a44a903904ff2f210aa4f80d087c0e91f43b338f444120884237\": rpc error: code = NotFound desc = could not find container \"99312d22ce11a44a903904ff2f210aa4f80d087c0e91f43b338f444120884237\": container with ID starting with 99312d22ce11a44a903904ff2f210aa4f80d087c0e91f43b338f444120884237 not found: ID does not exist" Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.724566 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c41cc34a-6fbb-45a1-b39d-309f669e1ce9-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.867688 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.881962 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.898471 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 19 09:50:27 crc kubenswrapper[4835]: E0319 09:50:27.899306 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c41cc34a-6fbb-45a1-b39d-309f669e1ce9" containerName="nova-api-api" Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.899321 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c41cc34a-6fbb-45a1-b39d-309f669e1ce9" containerName="nova-api-api" Mar 19 09:50:27 crc kubenswrapper[4835]: E0319 09:50:27.899338 4835 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c41cc34a-6fbb-45a1-b39d-309f669e1ce9" containerName="nova-api-log" Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.899360 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c41cc34a-6fbb-45a1-b39d-309f669e1ce9" containerName="nova-api-log" Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.899676 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="c41cc34a-6fbb-45a1-b39d-309f669e1ce9" containerName="nova-api-api" Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.899692 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="c41cc34a-6fbb-45a1-b39d-309f669e1ce9" containerName="nova-api-log" Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.901682 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.904802 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 19 09:50:27 crc kubenswrapper[4835]: I0319 09:50:27.938804 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 09:50:28 crc kubenswrapper[4835]: I0319 09:50:28.031590 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6x6m\" (UniqueName: \"kubernetes.io/projected/3c84b5cc-ce4f-4610-9806-9cec7262787a-kube-api-access-k6x6m\") pod \"nova-api-0\" (UID: \"3c84b5cc-ce4f-4610-9806-9cec7262787a\") " pod="openstack/nova-api-0" Mar 19 09:50:28 crc kubenswrapper[4835]: I0319 09:50:28.031672 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c84b5cc-ce4f-4610-9806-9cec7262787a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3c84b5cc-ce4f-4610-9806-9cec7262787a\") " pod="openstack/nova-api-0" Mar 19 09:50:28 crc kubenswrapper[4835]: I0319 09:50:28.031760 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c84b5cc-ce4f-4610-9806-9cec7262787a-logs\") pod \"nova-api-0\" (UID: \"3c84b5cc-ce4f-4610-9806-9cec7262787a\") " pod="openstack/nova-api-0" Mar 19 09:50:28 crc kubenswrapper[4835]: I0319 09:50:28.031859 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c84b5cc-ce4f-4610-9806-9cec7262787a-config-data\") pod \"nova-api-0\" (UID: \"3c84b5cc-ce4f-4610-9806-9cec7262787a\") " pod="openstack/nova-api-0" Mar 19 09:50:28 crc kubenswrapper[4835]: I0319 09:50:28.133956 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c84b5cc-ce4f-4610-9806-9cec7262787a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3c84b5cc-ce4f-4610-9806-9cec7262787a\") " pod="openstack/nova-api-0" Mar 19 09:50:28 crc kubenswrapper[4835]: I0319 09:50:28.134101 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c84b5cc-ce4f-4610-9806-9cec7262787a-logs\") pod \"nova-api-0\" (UID: \"3c84b5cc-ce4f-4610-9806-9cec7262787a\") " pod="openstack/nova-api-0" Mar 19 09:50:28 crc kubenswrapper[4835]: I0319 09:50:28.134290 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c84b5cc-ce4f-4610-9806-9cec7262787a-config-data\") pod \"nova-api-0\" (UID: \"3c84b5cc-ce4f-4610-9806-9cec7262787a\") " pod="openstack/nova-api-0" Mar 19 09:50:28 crc kubenswrapper[4835]: I0319 09:50:28.134383 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6x6m\" (UniqueName: \"kubernetes.io/projected/3c84b5cc-ce4f-4610-9806-9cec7262787a-kube-api-access-k6x6m\") pod \"nova-api-0\" (UID: \"3c84b5cc-ce4f-4610-9806-9cec7262787a\") " 
pod="openstack/nova-api-0" Mar 19 09:50:28 crc kubenswrapper[4835]: I0319 09:50:28.135959 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c84b5cc-ce4f-4610-9806-9cec7262787a-logs\") pod \"nova-api-0\" (UID: \"3c84b5cc-ce4f-4610-9806-9cec7262787a\") " pod="openstack/nova-api-0" Mar 19 09:50:28 crc kubenswrapper[4835]: I0319 09:50:28.140553 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c84b5cc-ce4f-4610-9806-9cec7262787a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3c84b5cc-ce4f-4610-9806-9cec7262787a\") " pod="openstack/nova-api-0" Mar 19 09:50:28 crc kubenswrapper[4835]: I0319 09:50:28.141242 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c84b5cc-ce4f-4610-9806-9cec7262787a-config-data\") pod \"nova-api-0\" (UID: \"3c84b5cc-ce4f-4610-9806-9cec7262787a\") " pod="openstack/nova-api-0" Mar 19 09:50:28 crc kubenswrapper[4835]: I0319 09:50:28.156203 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6x6m\" (UniqueName: \"kubernetes.io/projected/3c84b5cc-ce4f-4610-9806-9cec7262787a-kube-api-access-k6x6m\") pod \"nova-api-0\" (UID: \"3c84b5cc-ce4f-4610-9806-9cec7262787a\") " pod="openstack/nova-api-0" Mar 19 09:50:28 crc kubenswrapper[4835]: I0319 09:50:28.235306 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 09:50:28 crc kubenswrapper[4835]: I0319 09:50:28.448701 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c41cc34a-6fbb-45a1-b39d-309f669e1ce9" path="/var/lib/kubelet/pods/c41cc34a-6fbb-45a1-b39d-309f669e1ce9/volumes" Mar 19 09:50:28 crc kubenswrapper[4835]: I0319 09:50:28.562643 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"67089e56-05fd-46f8-b595-6ece6f03b14f","Type":"ContainerStarted","Data":"fde5dc42b61dc08c73bee72917fc1d7529091056a3b003c64cb5d82a73faf31d"} Mar 19 09:50:28 crc kubenswrapper[4835]: I0319 09:50:28.563043 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"67089e56-05fd-46f8-b595-6ece6f03b14f","Type":"ContainerStarted","Data":"fccba6a9855259c3d0e1dd47854ec7092eb07336fd277fb3a207695a93b20f54"} Mar 19 09:50:28 crc kubenswrapper[4835]: I0319 09:50:28.563455 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 19 09:50:28 crc kubenswrapper[4835]: I0319 09:50:28.567822 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38a5570d-749f-4582-a565-c3e5c19c0c5b","Type":"ContainerStarted","Data":"dfe5d50a045e8a1627b742562d4d2641deb2a46c45a59a04e93819c072f984ca"} Mar 19 09:50:28 crc kubenswrapper[4835]: I0319 09:50:28.584013 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.583989985 podStartE2EDuration="2.583989985s" podCreationTimestamp="2026-03-19 09:50:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:50:28.583724958 +0000 UTC m=+1683.432323545" watchObservedRunningTime="2026-03-19 09:50:28.583989985 +0000 UTC m=+1683.432588572" Mar 19 09:50:28 crc kubenswrapper[4835]: 
I0319 09:50:28.904034 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 09:50:28 crc kubenswrapper[4835]: W0319 09:50:28.911385 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c84b5cc_ce4f_4610_9806_9cec7262787a.slice/crio-3422cfcb2a9f9656cd869e087ad0822d9ddd04266e969c49ed9087fba75dd97e WatchSource:0}: Error finding container 3422cfcb2a9f9656cd869e087ad0822d9ddd04266e969c49ed9087fba75dd97e: Status 404 returned error can't find the container with id 3422cfcb2a9f9656cd869e087ad0822d9ddd04266e969c49ed9087fba75dd97e Mar 19 09:50:29 crc kubenswrapper[4835]: I0319 09:50:29.604675 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38a5570d-749f-4582-a565-c3e5c19c0c5b","Type":"ContainerStarted","Data":"0c9357e6d435c38844390b66d76ed5c1e617423a746ed54027c7e31fcf81e1a4"} Mar 19 09:50:29 crc kubenswrapper[4835]: I0319 09:50:29.608340 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3c84b5cc-ce4f-4610-9806-9cec7262787a","Type":"ContainerStarted","Data":"5d68c230ffb508443999c33fecbc992682cb12796eacbeaa1c3804c7d14a9de6"} Mar 19 09:50:29 crc kubenswrapper[4835]: I0319 09:50:29.608393 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3c84b5cc-ce4f-4610-9806-9cec7262787a","Type":"ContainerStarted","Data":"6d3cd462c53175b75456ea29f660546e395e00d6a824fd548f37eb37f37e0d10"} Mar 19 09:50:29 crc kubenswrapper[4835]: I0319 09:50:29.608416 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3c84b5cc-ce4f-4610-9806-9cec7262787a","Type":"ContainerStarted","Data":"3422cfcb2a9f9656cd869e087ad0822d9ddd04266e969c49ed9087fba75dd97e"} Mar 19 09:50:29 crc kubenswrapper[4835]: I0319 09:50:29.657252 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" 
podStartSLOduration=2.657229587 podStartE2EDuration="2.657229587s" podCreationTimestamp="2026-03-19 09:50:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:50:29.631617985 +0000 UTC m=+1684.480216582" watchObservedRunningTime="2026-03-19 09:50:29.657229587 +0000 UTC m=+1684.505828174" Mar 19 09:50:30 crc kubenswrapper[4835]: I0319 09:50:30.838296 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 19 09:50:31 crc kubenswrapper[4835]: I0319 09:50:31.635206 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38a5570d-749f-4582-a565-c3e5c19c0c5b","Type":"ContainerStarted","Data":"ac5a653b2cd1d3bdaab77409de3b666882360e69c8e86902fca5c810d6769788"} Mar 19 09:50:31 crc kubenswrapper[4835]: I0319 09:50:31.636214 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 09:50:31 crc kubenswrapper[4835]: I0319 09:50:31.666509 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.264349584 podStartE2EDuration="7.666484476s" podCreationTimestamp="2026-03-19 09:50:24 +0000 UTC" firstStartedPulling="2026-03-19 09:50:26.328963843 +0000 UTC m=+1681.177562430" lastFinishedPulling="2026-03-19 09:50:30.731098735 +0000 UTC m=+1685.579697322" observedRunningTime="2026-03-19 09:50:31.66286229 +0000 UTC m=+1686.511460907" watchObservedRunningTime="2026-03-19 09:50:31.666484476 +0000 UTC m=+1686.515083063" Mar 19 09:50:34 crc kubenswrapper[4835]: I0319 09:50:34.151459 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 19 09:50:34 crc kubenswrapper[4835]: I0319 09:50:34.152244 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 19 09:50:34 crc 
kubenswrapper[4835]: I0319 09:50:34.402666 4835 scope.go:117] "RemoveContainer" containerID="d93f2f0fef5a3fe52d6e4aab02e5290ac85405643bc520caaef82b7b23fd8ee3" Mar 19 09:50:34 crc kubenswrapper[4835]: E0319 09:50:34.403167 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 09:50:35 crc kubenswrapper[4835]: I0319 09:50:35.165994 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c2efa60f-c31d-4e06-9a21-fc066054bc8d" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.3:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:50:35 crc kubenswrapper[4835]: I0319 09:50:35.165997 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c2efa60f-c31d-4e06-9a21-fc066054bc8d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.3:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:50:35 crc kubenswrapper[4835]: I0319 09:50:35.838776 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 19 09:50:35 crc kubenswrapper[4835]: I0319 09:50:35.869435 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 19 09:50:36 crc kubenswrapper[4835]: I0319 09:50:36.727885 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 19 09:50:36 crc kubenswrapper[4835]: I0319 09:50:36.885363 4835 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 19 09:50:38 crc kubenswrapper[4835]: I0319 09:50:38.237629 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 09:50:38 crc kubenswrapper[4835]: I0319 09:50:38.237974 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 09:50:39 crc kubenswrapper[4835]: I0319 09:50:39.319920 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3c84b5cc-ce4f-4610-9806-9cec7262787a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.7:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 09:50:39 crc kubenswrapper[4835]: I0319 09:50:39.319920 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3c84b5cc-ce4f-4610-9806-9cec7262787a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.7:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 09:50:42 crc kubenswrapper[4835]: I0319 09:50:42.150728 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 19 09:50:42 crc kubenswrapper[4835]: I0319 09:50:42.152028 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 19 09:50:44 crc kubenswrapper[4835]: I0319 09:50:44.155961 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 19 09:50:44 crc kubenswrapper[4835]: I0319 09:50:44.157593 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 19 09:50:44 crc kubenswrapper[4835]: I0319 09:50:44.161415 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 19 09:50:44 crc 
kubenswrapper[4835]: I0319 09:50:44.809023 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 19 09:50:46 crc kubenswrapper[4835]: I0319 09:50:46.238257 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 09:50:46 crc kubenswrapper[4835]: I0319 09:50:46.238571 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 09:50:46 crc kubenswrapper[4835]: I0319 09:50:46.533353 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:50:46 crc kubenswrapper[4835]: I0319 09:50:46.618886 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpwbj\" (UniqueName: \"kubernetes.io/projected/0b2ecdd4-520f-43bc-9b83-aafa999c5dbe-kube-api-access-kpwbj\") pod \"0b2ecdd4-520f-43bc-9b83-aafa999c5dbe\" (UID: \"0b2ecdd4-520f-43bc-9b83-aafa999c5dbe\") " Mar 19 09:50:46 crc kubenswrapper[4835]: I0319 09:50:46.618939 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b2ecdd4-520f-43bc-9b83-aafa999c5dbe-combined-ca-bundle\") pod \"0b2ecdd4-520f-43bc-9b83-aafa999c5dbe\" (UID: \"0b2ecdd4-520f-43bc-9b83-aafa999c5dbe\") " Mar 19 09:50:46 crc kubenswrapper[4835]: I0319 09:50:46.619093 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b2ecdd4-520f-43bc-9b83-aafa999c5dbe-config-data\") pod \"0b2ecdd4-520f-43bc-9b83-aafa999c5dbe\" (UID: \"0b2ecdd4-520f-43bc-9b83-aafa999c5dbe\") " Mar 19 09:50:46 crc kubenswrapper[4835]: I0319 09:50:46.625401 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b2ecdd4-520f-43bc-9b83-aafa999c5dbe-kube-api-access-kpwbj" (OuterVolumeSpecName: "kube-api-access-kpwbj") pod 
"0b2ecdd4-520f-43bc-9b83-aafa999c5dbe" (UID: "0b2ecdd4-520f-43bc-9b83-aafa999c5dbe"). InnerVolumeSpecName "kube-api-access-kpwbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:50:46 crc kubenswrapper[4835]: I0319 09:50:46.660288 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b2ecdd4-520f-43bc-9b83-aafa999c5dbe-config-data" (OuterVolumeSpecName: "config-data") pod "0b2ecdd4-520f-43bc-9b83-aafa999c5dbe" (UID: "0b2ecdd4-520f-43bc-9b83-aafa999c5dbe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:46 crc kubenswrapper[4835]: I0319 09:50:46.672697 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b2ecdd4-520f-43bc-9b83-aafa999c5dbe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b2ecdd4-520f-43bc-9b83-aafa999c5dbe" (UID: "0b2ecdd4-520f-43bc-9b83-aafa999c5dbe"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:46 crc kubenswrapper[4835]: I0319 09:50:46.722077 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpwbj\" (UniqueName: \"kubernetes.io/projected/0b2ecdd4-520f-43bc-9b83-aafa999c5dbe-kube-api-access-kpwbj\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:46 crc kubenswrapper[4835]: I0319 09:50:46.722118 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b2ecdd4-520f-43bc-9b83-aafa999c5dbe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:46 crc kubenswrapper[4835]: I0319 09:50:46.722126 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b2ecdd4-520f-43bc-9b83-aafa999c5dbe-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:46 crc kubenswrapper[4835]: I0319 09:50:46.827661 4835 generic.go:334] "Generic (PLEG): container finished" podID="0b2ecdd4-520f-43bc-9b83-aafa999c5dbe" containerID="05e455e50ad1afda46c218fe6cc469815cdbe961a5efb688d1f5a1fc57ba7053" exitCode=137 Mar 19 09:50:46 crc kubenswrapper[4835]: I0319 09:50:46.827711 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:50:46 crc kubenswrapper[4835]: I0319 09:50:46.827782 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0b2ecdd4-520f-43bc-9b83-aafa999c5dbe","Type":"ContainerDied","Data":"05e455e50ad1afda46c218fe6cc469815cdbe961a5efb688d1f5a1fc57ba7053"} Mar 19 09:50:46 crc kubenswrapper[4835]: I0319 09:50:46.827829 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0b2ecdd4-520f-43bc-9b83-aafa999c5dbe","Type":"ContainerDied","Data":"c6fb972a28216fd76aa3cd1d45f9e4032b164f8043273871e1b38462e563a420"} Mar 19 09:50:46 crc kubenswrapper[4835]: I0319 09:50:46.827855 4835 scope.go:117] "RemoveContainer" containerID="05e455e50ad1afda46c218fe6cc469815cdbe961a5efb688d1f5a1fc57ba7053" Mar 19 09:50:46 crc kubenswrapper[4835]: I0319 09:50:46.867727 4835 scope.go:117] "RemoveContainer" containerID="05e455e50ad1afda46c218fe6cc469815cdbe961a5efb688d1f5a1fc57ba7053" Mar 19 09:50:46 crc kubenswrapper[4835]: E0319 09:50:46.868294 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05e455e50ad1afda46c218fe6cc469815cdbe961a5efb688d1f5a1fc57ba7053\": container with ID starting with 05e455e50ad1afda46c218fe6cc469815cdbe961a5efb688d1f5a1fc57ba7053 not found: ID does not exist" containerID="05e455e50ad1afda46c218fe6cc469815cdbe961a5efb688d1f5a1fc57ba7053" Mar 19 09:50:46 crc kubenswrapper[4835]: I0319 09:50:46.868346 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05e455e50ad1afda46c218fe6cc469815cdbe961a5efb688d1f5a1fc57ba7053"} err="failed to get container status \"05e455e50ad1afda46c218fe6cc469815cdbe961a5efb688d1f5a1fc57ba7053\": rpc error: code = NotFound desc = could not find container \"05e455e50ad1afda46c218fe6cc469815cdbe961a5efb688d1f5a1fc57ba7053\": container with ID starting with 
05e455e50ad1afda46c218fe6cc469815cdbe961a5efb688d1f5a1fc57ba7053 not found: ID does not exist" Mar 19 09:50:46 crc kubenswrapper[4835]: I0319 09:50:46.873545 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 09:50:46 crc kubenswrapper[4835]: I0319 09:50:46.917085 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 09:50:46 crc kubenswrapper[4835]: I0319 09:50:46.932699 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 09:50:46 crc kubenswrapper[4835]: E0319 09:50:46.934043 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b2ecdd4-520f-43bc-9b83-aafa999c5dbe" containerName="nova-cell1-novncproxy-novncproxy" Mar 19 09:50:46 crc kubenswrapper[4835]: I0319 09:50:46.934070 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b2ecdd4-520f-43bc-9b83-aafa999c5dbe" containerName="nova-cell1-novncproxy-novncproxy" Mar 19 09:50:46 crc kubenswrapper[4835]: I0319 09:50:46.934361 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b2ecdd4-520f-43bc-9b83-aafa999c5dbe" containerName="nova-cell1-novncproxy-novncproxy" Mar 19 09:50:46 crc kubenswrapper[4835]: I0319 09:50:46.936299 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:50:46 crc kubenswrapper[4835]: I0319 09:50:46.938639 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 19 09:50:46 crc kubenswrapper[4835]: I0319 09:50:46.938906 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 19 09:50:46 crc kubenswrapper[4835]: I0319 09:50:46.939025 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 19 09:50:46 crc kubenswrapper[4835]: I0319 09:50:46.949265 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 09:50:47 crc kubenswrapper[4835]: I0319 09:50:47.031457 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a392e53b-b7fb-4a1a-a93f-8c5bbe49abe2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a392e53b-b7fb-4a1a-a93f-8c5bbe49abe2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:50:47 crc kubenswrapper[4835]: I0319 09:50:47.031619 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a392e53b-b7fb-4a1a-a93f-8c5bbe49abe2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a392e53b-b7fb-4a1a-a93f-8c5bbe49abe2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:50:47 crc kubenswrapper[4835]: I0319 09:50:47.031651 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a392e53b-b7fb-4a1a-a93f-8c5bbe49abe2-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a392e53b-b7fb-4a1a-a93f-8c5bbe49abe2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:50:47 crc 
kubenswrapper[4835]: I0319 09:50:47.031778 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a392e53b-b7fb-4a1a-a93f-8c5bbe49abe2-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a392e53b-b7fb-4a1a-a93f-8c5bbe49abe2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:50:47 crc kubenswrapper[4835]: I0319 09:50:47.031866 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6862\" (UniqueName: \"kubernetes.io/projected/a392e53b-b7fb-4a1a-a93f-8c5bbe49abe2-kube-api-access-w6862\") pod \"nova-cell1-novncproxy-0\" (UID: \"a392e53b-b7fb-4a1a-a93f-8c5bbe49abe2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:50:47 crc kubenswrapper[4835]: I0319 09:50:47.134612 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a392e53b-b7fb-4a1a-a93f-8c5bbe49abe2-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a392e53b-b7fb-4a1a-a93f-8c5bbe49abe2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:50:47 crc kubenswrapper[4835]: I0319 09:50:47.134789 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6862\" (UniqueName: \"kubernetes.io/projected/a392e53b-b7fb-4a1a-a93f-8c5bbe49abe2-kube-api-access-w6862\") pod \"nova-cell1-novncproxy-0\" (UID: \"a392e53b-b7fb-4a1a-a93f-8c5bbe49abe2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:50:47 crc kubenswrapper[4835]: I0319 09:50:47.134889 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a392e53b-b7fb-4a1a-a93f-8c5bbe49abe2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a392e53b-b7fb-4a1a-a93f-8c5bbe49abe2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:50:47 crc kubenswrapper[4835]: 
I0319 09:50:47.135061 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a392e53b-b7fb-4a1a-a93f-8c5bbe49abe2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a392e53b-b7fb-4a1a-a93f-8c5bbe49abe2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:50:47 crc kubenswrapper[4835]: I0319 09:50:47.135104 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a392e53b-b7fb-4a1a-a93f-8c5bbe49abe2-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a392e53b-b7fb-4a1a-a93f-8c5bbe49abe2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:50:47 crc kubenswrapper[4835]: I0319 09:50:47.138553 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a392e53b-b7fb-4a1a-a93f-8c5bbe49abe2-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a392e53b-b7fb-4a1a-a93f-8c5bbe49abe2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:50:47 crc kubenswrapper[4835]: I0319 09:50:47.138996 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a392e53b-b7fb-4a1a-a93f-8c5bbe49abe2-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a392e53b-b7fb-4a1a-a93f-8c5bbe49abe2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:50:47 crc kubenswrapper[4835]: I0319 09:50:47.139423 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a392e53b-b7fb-4a1a-a93f-8c5bbe49abe2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a392e53b-b7fb-4a1a-a93f-8c5bbe49abe2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:50:47 crc kubenswrapper[4835]: I0319 09:50:47.149209 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a392e53b-b7fb-4a1a-a93f-8c5bbe49abe2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a392e53b-b7fb-4a1a-a93f-8c5bbe49abe2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:50:47 crc kubenswrapper[4835]: I0319 09:50:47.154307 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6862\" (UniqueName: \"kubernetes.io/projected/a392e53b-b7fb-4a1a-a93f-8c5bbe49abe2-kube-api-access-w6862\") pod \"nova-cell1-novncproxy-0\" (UID: \"a392e53b-b7fb-4a1a-a93f-8c5bbe49abe2\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:50:47 crc kubenswrapper[4835]: I0319 09:50:47.259316 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:50:47 crc kubenswrapper[4835]: I0319 09:50:47.750688 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 09:50:47 crc kubenswrapper[4835]: W0319 09:50:47.755453 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda392e53b_b7fb_4a1a_a93f_8c5bbe49abe2.slice/crio-e7ecd9d50feece3ac5686f1014545a763ffb5a39ab8804fb871bcc8a9d592622 WatchSource:0}: Error finding container e7ecd9d50feece3ac5686f1014545a763ffb5a39ab8804fb871bcc8a9d592622: Status 404 returned error can't find the container with id e7ecd9d50feece3ac5686f1014545a763ffb5a39ab8804fb871bcc8a9d592622 Mar 19 09:50:47 crc kubenswrapper[4835]: I0319 09:50:47.841420 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a392e53b-b7fb-4a1a-a93f-8c5bbe49abe2","Type":"ContainerStarted","Data":"e7ecd9d50feece3ac5686f1014545a763ffb5a39ab8804fb871bcc8a9d592622"} Mar 19 09:50:48 crc kubenswrapper[4835]: I0319 09:50:48.242600 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 19 09:50:48 crc 
kubenswrapper[4835]: I0319 09:50:48.244829 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 19 09:50:48 crc kubenswrapper[4835]: I0319 09:50:48.247712 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 19 09:50:48 crc kubenswrapper[4835]: I0319 09:50:48.402424 4835 scope.go:117] "RemoveContainer" containerID="d93f2f0fef5a3fe52d6e4aab02e5290ac85405643bc520caaef82b7b23fd8ee3" Mar 19 09:50:48 crc kubenswrapper[4835]: E0319 09:50:48.403823 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 09:50:48 crc kubenswrapper[4835]: I0319 09:50:48.416753 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b2ecdd4-520f-43bc-9b83-aafa999c5dbe" path="/var/lib/kubelet/pods/0b2ecdd4-520f-43bc-9b83-aafa999c5dbe/volumes" Mar 19 09:50:48 crc kubenswrapper[4835]: I0319 09:50:48.852678 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a392e53b-b7fb-4a1a-a93f-8c5bbe49abe2","Type":"ContainerStarted","Data":"5e504d922f3d3b4411016f90a6aadf98f946986d35a00ebf43eced2f864c3b5f"} Mar 19 09:50:48 crc kubenswrapper[4835]: I0319 09:50:48.857040 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 19 09:50:48 crc kubenswrapper[4835]: I0319 09:50:48.893370 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.8933515549999997 podStartE2EDuration="2.893351555s" podCreationTimestamp="2026-03-19 09:50:46 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:50:48.87547185 +0000 UTC m=+1703.724070467" watchObservedRunningTime="2026-03-19 09:50:48.893351555 +0000 UTC m=+1703.741950142" Mar 19 09:50:49 crc kubenswrapper[4835]: I0319 09:50:49.124816 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-52r8x"] Mar 19 09:50:49 crc kubenswrapper[4835]: I0319 09:50:49.127206 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-52r8x" Mar 19 09:50:49 crc kubenswrapper[4835]: I0319 09:50:49.157485 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-52r8x"] Mar 19 09:50:49 crc kubenswrapper[4835]: I0319 09:50:49.197644 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c128061-db8c-4a2a-84a7-38c892394c54-config\") pod \"dnsmasq-dns-6b7bbf7cf9-52r8x\" (UID: \"0c128061-db8c-4a2a-84a7-38c892394c54\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-52r8x" Mar 19 09:50:49 crc kubenswrapper[4835]: I0319 09:50:49.197718 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jr77\" (UniqueName: \"kubernetes.io/projected/0c128061-db8c-4a2a-84a7-38c892394c54-kube-api-access-6jr77\") pod \"dnsmasq-dns-6b7bbf7cf9-52r8x\" (UID: \"0c128061-db8c-4a2a-84a7-38c892394c54\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-52r8x" Mar 19 09:50:49 crc kubenswrapper[4835]: I0319 09:50:49.197844 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0c128061-db8c-4a2a-84a7-38c892394c54-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-52r8x\" (UID: \"0c128061-db8c-4a2a-84a7-38c892394c54\") " 
pod="openstack/dnsmasq-dns-6b7bbf7cf9-52r8x" Mar 19 09:50:49 crc kubenswrapper[4835]: I0319 09:50:49.197876 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0c128061-db8c-4a2a-84a7-38c892394c54-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-52r8x\" (UID: \"0c128061-db8c-4a2a-84a7-38c892394c54\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-52r8x" Mar 19 09:50:49 crc kubenswrapper[4835]: I0319 09:50:49.198005 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0c128061-db8c-4a2a-84a7-38c892394c54-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-52r8x\" (UID: \"0c128061-db8c-4a2a-84a7-38c892394c54\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-52r8x" Mar 19 09:50:49 crc kubenswrapper[4835]: I0319 09:50:49.198119 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c128061-db8c-4a2a-84a7-38c892394c54-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-52r8x\" (UID: \"0c128061-db8c-4a2a-84a7-38c892394c54\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-52r8x" Mar 19 09:50:49 crc kubenswrapper[4835]: I0319 09:50:49.300118 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c128061-db8c-4a2a-84a7-38c892394c54-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-52r8x\" (UID: \"0c128061-db8c-4a2a-84a7-38c892394c54\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-52r8x" Mar 19 09:50:49 crc kubenswrapper[4835]: I0319 09:50:49.300234 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c128061-db8c-4a2a-84a7-38c892394c54-config\") pod \"dnsmasq-dns-6b7bbf7cf9-52r8x\" (UID: \"0c128061-db8c-4a2a-84a7-38c892394c54\") " 
pod="openstack/dnsmasq-dns-6b7bbf7cf9-52r8x" Mar 19 09:50:49 crc kubenswrapper[4835]: I0319 09:50:49.300284 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jr77\" (UniqueName: \"kubernetes.io/projected/0c128061-db8c-4a2a-84a7-38c892394c54-kube-api-access-6jr77\") pod \"dnsmasq-dns-6b7bbf7cf9-52r8x\" (UID: \"0c128061-db8c-4a2a-84a7-38c892394c54\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-52r8x" Mar 19 09:50:49 crc kubenswrapper[4835]: I0319 09:50:49.300314 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0c128061-db8c-4a2a-84a7-38c892394c54-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-52r8x\" (UID: \"0c128061-db8c-4a2a-84a7-38c892394c54\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-52r8x" Mar 19 09:50:49 crc kubenswrapper[4835]: I0319 09:50:49.300361 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0c128061-db8c-4a2a-84a7-38c892394c54-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-52r8x\" (UID: \"0c128061-db8c-4a2a-84a7-38c892394c54\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-52r8x" Mar 19 09:50:49 crc kubenswrapper[4835]: I0319 09:50:49.300513 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0c128061-db8c-4a2a-84a7-38c892394c54-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-52r8x\" (UID: \"0c128061-db8c-4a2a-84a7-38c892394c54\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-52r8x" Mar 19 09:50:49 crc kubenswrapper[4835]: I0319 09:50:49.301294 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c128061-db8c-4a2a-84a7-38c892394c54-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-52r8x\" (UID: \"0c128061-db8c-4a2a-84a7-38c892394c54\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-52r8x" Mar 19 
09:50:49 crc kubenswrapper[4835]: I0319 09:50:49.301459 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c128061-db8c-4a2a-84a7-38c892394c54-config\") pod \"dnsmasq-dns-6b7bbf7cf9-52r8x\" (UID: \"0c128061-db8c-4a2a-84a7-38c892394c54\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-52r8x" Mar 19 09:50:49 crc kubenswrapper[4835]: I0319 09:50:49.301519 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0c128061-db8c-4a2a-84a7-38c892394c54-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-52r8x\" (UID: \"0c128061-db8c-4a2a-84a7-38c892394c54\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-52r8x" Mar 19 09:50:49 crc kubenswrapper[4835]: I0319 09:50:49.301626 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0c128061-db8c-4a2a-84a7-38c892394c54-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-52r8x\" (UID: \"0c128061-db8c-4a2a-84a7-38c892394c54\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-52r8x" Mar 19 09:50:49 crc kubenswrapper[4835]: I0319 09:50:49.301853 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0c128061-db8c-4a2a-84a7-38c892394c54-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-52r8x\" (UID: \"0c128061-db8c-4a2a-84a7-38c892394c54\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-52r8x" Mar 19 09:50:49 crc kubenswrapper[4835]: I0319 09:50:49.324813 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jr77\" (UniqueName: \"kubernetes.io/projected/0c128061-db8c-4a2a-84a7-38c892394c54-kube-api-access-6jr77\") pod \"dnsmasq-dns-6b7bbf7cf9-52r8x\" (UID: \"0c128061-db8c-4a2a-84a7-38c892394c54\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-52r8x" Mar 19 09:50:49 crc kubenswrapper[4835]: I0319 09:50:49.461137 4835 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-52r8x" Mar 19 09:50:49 crc kubenswrapper[4835]: I0319 09:50:49.974027 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-52r8x"] Mar 19 09:50:50 crc kubenswrapper[4835]: I0319 09:50:50.392709 4835 scope.go:117] "RemoveContainer" containerID="0764a3055bee00d259054081e109654843a781a4f83ab67185d4d6c8e3e3f6c3" Mar 19 09:50:50 crc kubenswrapper[4835]: I0319 09:50:50.898645 4835 generic.go:334] "Generic (PLEG): container finished" podID="0c128061-db8c-4a2a-84a7-38c892394c54" containerID="8f44ba384351f457566a7505c99e0f53aebfb47044d3b29d5fe83c9a67b7eb6c" exitCode=0 Mar 19 09:50:50 crc kubenswrapper[4835]: I0319 09:50:50.925808 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-52r8x" event={"ID":"0c128061-db8c-4a2a-84a7-38c892394c54","Type":"ContainerDied","Data":"8f44ba384351f457566a7505c99e0f53aebfb47044d3b29d5fe83c9a67b7eb6c"} Mar 19 09:50:50 crc kubenswrapper[4835]: I0319 09:50:50.925858 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-52r8x" event={"ID":"0c128061-db8c-4a2a-84a7-38c892394c54","Type":"ContainerStarted","Data":"55d6383525884326a1e3ad8dd5292eef162f6b6c5a5641d2c666b2cd31e3cac0"} Mar 19 09:50:51 crc kubenswrapper[4835]: I0319 09:50:51.777609 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 09:50:51 crc kubenswrapper[4835]: I0319 09:50:51.894242 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:50:51 crc kubenswrapper[4835]: I0319 09:50:51.894917 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="38a5570d-749f-4582-a565-c3e5c19c0c5b" containerName="ceilometer-central-agent" containerID="cri-o://988ff005552cfd2cb7758b8302c82cbd8d2a0a9b884b404508700f509122972e" gracePeriod=30 Mar 19 09:50:51 crc 
kubenswrapper[4835]: I0319 09:50:51.894955 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="38a5570d-749f-4582-a565-c3e5c19c0c5b" containerName="proxy-httpd" containerID="cri-o://ac5a653b2cd1d3bdaab77409de3b666882360e69c8e86902fca5c810d6769788" gracePeriod=30 Mar 19 09:50:51 crc kubenswrapper[4835]: I0319 09:50:51.894981 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="38a5570d-749f-4582-a565-c3e5c19c0c5b" containerName="sg-core" containerID="cri-o://0c9357e6d435c38844390b66d76ed5c1e617423a746ed54027c7e31fcf81e1a4" gracePeriod=30 Mar 19 09:50:51 crc kubenswrapper[4835]: I0319 09:50:51.895061 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="38a5570d-749f-4582-a565-c3e5c19c0c5b" containerName="ceilometer-notification-agent" containerID="cri-o://dfe5d50a045e8a1627b742562d4d2641deb2a46c45a59a04e93819c072f984ca" gracePeriod=30 Mar 19 09:50:51 crc kubenswrapper[4835]: I0319 09:50:51.913716 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-52r8x" event={"ID":"0c128061-db8c-4a2a-84a7-38c892394c54","Type":"ContainerStarted","Data":"1ae2a04a8a094edc99eb24f5c8dafe349f2fcf34c3a75e7ce6ed42af9bf3ace4"} Mar 19 09:50:51 crc kubenswrapper[4835]: I0319 09:50:51.913847 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3c84b5cc-ce4f-4610-9806-9cec7262787a" containerName="nova-api-log" containerID="cri-o://6d3cd462c53175b75456ea29f660546e395e00d6a824fd548f37eb37f37e0d10" gracePeriod=30 Mar 19 09:50:51 crc kubenswrapper[4835]: I0319 09:50:51.914725 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3c84b5cc-ce4f-4610-9806-9cec7262787a" containerName="nova-api-api" 
containerID="cri-o://5d68c230ffb508443999c33fecbc992682cb12796eacbeaa1c3804c7d14a9de6" gracePeriod=30 Mar 19 09:50:51 crc kubenswrapper[4835]: I0319 09:50:51.944316 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7bbf7cf9-52r8x" podStartSLOduration=2.944291696 podStartE2EDuration="2.944291696s" podCreationTimestamp="2026-03-19 09:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:50:51.93655331 +0000 UTC m=+1706.785151887" watchObservedRunningTime="2026-03-19 09:50:51.944291696 +0000 UTC m=+1706.792890283" Mar 19 09:50:51 crc kubenswrapper[4835]: I0319 09:50:51.996589 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="38a5570d-749f-4582-a565-c3e5c19c0c5b" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.1.4:3000/\": read tcp 10.217.0.2:49602->10.217.1.4:3000: read: connection reset by peer" Mar 19 09:50:52 crc kubenswrapper[4835]: I0319 09:50:52.260317 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:50:52 crc kubenswrapper[4835]: I0319 09:50:52.926165 4835 generic.go:334] "Generic (PLEG): container finished" podID="38a5570d-749f-4582-a565-c3e5c19c0c5b" containerID="ac5a653b2cd1d3bdaab77409de3b666882360e69c8e86902fca5c810d6769788" exitCode=0 Mar 19 09:50:52 crc kubenswrapper[4835]: I0319 09:50:52.926462 4835 generic.go:334] "Generic (PLEG): container finished" podID="38a5570d-749f-4582-a565-c3e5c19c0c5b" containerID="0c9357e6d435c38844390b66d76ed5c1e617423a746ed54027c7e31fcf81e1a4" exitCode=2 Mar 19 09:50:52 crc kubenswrapper[4835]: I0319 09:50:52.926470 4835 generic.go:334] "Generic (PLEG): container finished" podID="38a5570d-749f-4582-a565-c3e5c19c0c5b" containerID="988ff005552cfd2cb7758b8302c82cbd8d2a0a9b884b404508700f509122972e" exitCode=0 Mar 19 09:50:52 crc 
kubenswrapper[4835]: I0319 09:50:52.926244 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38a5570d-749f-4582-a565-c3e5c19c0c5b","Type":"ContainerDied","Data":"ac5a653b2cd1d3bdaab77409de3b666882360e69c8e86902fca5c810d6769788"} Mar 19 09:50:52 crc kubenswrapper[4835]: I0319 09:50:52.926539 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38a5570d-749f-4582-a565-c3e5c19c0c5b","Type":"ContainerDied","Data":"0c9357e6d435c38844390b66d76ed5c1e617423a746ed54027c7e31fcf81e1a4"} Mar 19 09:50:52 crc kubenswrapper[4835]: I0319 09:50:52.926555 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38a5570d-749f-4582-a565-c3e5c19c0c5b","Type":"ContainerDied","Data":"988ff005552cfd2cb7758b8302c82cbd8d2a0a9b884b404508700f509122972e"} Mar 19 09:50:52 crc kubenswrapper[4835]: I0319 09:50:52.928236 4835 generic.go:334] "Generic (PLEG): container finished" podID="3c84b5cc-ce4f-4610-9806-9cec7262787a" containerID="6d3cd462c53175b75456ea29f660546e395e00d6a824fd548f37eb37f37e0d10" exitCode=143 Mar 19 09:50:52 crc kubenswrapper[4835]: I0319 09:50:52.928388 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3c84b5cc-ce4f-4610-9806-9cec7262787a","Type":"ContainerDied","Data":"6d3cd462c53175b75456ea29f660546e395e00d6a824fd548f37eb37f37e0d10"} Mar 19 09:50:52 crc kubenswrapper[4835]: I0319 09:50:52.928513 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7bbf7cf9-52r8x" Mar 19 09:50:54 crc kubenswrapper[4835]: I0319 09:50:54.100500 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="9cb07c1b-0e6d-4511-ad45-910d6184955f" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.244:3000/\": dial tcp 10.217.0.244:3000: i/o timeout (Client.Timeout exceeded while awaiting headers)" Mar 19 09:50:54 crc 
kubenswrapper[4835]: I0319 09:50:54.775731 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 09:50:54 crc kubenswrapper[4835]: I0319 09:50:54.835333 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38a5570d-749f-4582-a565-c3e5c19c0c5b-scripts\") pod \"38a5570d-749f-4582-a565-c3e5c19c0c5b\" (UID: \"38a5570d-749f-4582-a565-c3e5c19c0c5b\") " Mar 19 09:50:54 crc kubenswrapper[4835]: I0319 09:50:54.835972 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gb4lt\" (UniqueName: \"kubernetes.io/projected/38a5570d-749f-4582-a565-c3e5c19c0c5b-kube-api-access-gb4lt\") pod \"38a5570d-749f-4582-a565-c3e5c19c0c5b\" (UID: \"38a5570d-749f-4582-a565-c3e5c19c0c5b\") " Mar 19 09:50:54 crc kubenswrapper[4835]: I0319 09:50:54.836031 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38a5570d-749f-4582-a565-c3e5c19c0c5b-run-httpd\") pod \"38a5570d-749f-4582-a565-c3e5c19c0c5b\" (UID: \"38a5570d-749f-4582-a565-c3e5c19c0c5b\") " Mar 19 09:50:54 crc kubenswrapper[4835]: I0319 09:50:54.836289 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38a5570d-749f-4582-a565-c3e5c19c0c5b-combined-ca-bundle\") pod \"38a5570d-749f-4582-a565-c3e5c19c0c5b\" (UID: \"38a5570d-749f-4582-a565-c3e5c19c0c5b\") " Mar 19 09:50:54 crc kubenswrapper[4835]: I0319 09:50:54.836340 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38a5570d-749f-4582-a565-c3e5c19c0c5b-sg-core-conf-yaml\") pod \"38a5570d-749f-4582-a565-c3e5c19c0c5b\" (UID: \"38a5570d-749f-4582-a565-c3e5c19c0c5b\") " Mar 19 09:50:54 crc kubenswrapper[4835]: I0319 09:50:54.836373 4835 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38a5570d-749f-4582-a565-c3e5c19c0c5b-config-data\") pod \"38a5570d-749f-4582-a565-c3e5c19c0c5b\" (UID: \"38a5570d-749f-4582-a565-c3e5c19c0c5b\") " Mar 19 09:50:54 crc kubenswrapper[4835]: I0319 09:50:54.836446 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38a5570d-749f-4582-a565-c3e5c19c0c5b-log-httpd\") pod \"38a5570d-749f-4582-a565-c3e5c19c0c5b\" (UID: \"38a5570d-749f-4582-a565-c3e5c19c0c5b\") " Mar 19 09:50:54 crc kubenswrapper[4835]: I0319 09:50:54.836431 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38a5570d-749f-4582-a565-c3e5c19c0c5b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "38a5570d-749f-4582-a565-c3e5c19c0c5b" (UID: "38a5570d-749f-4582-a565-c3e5c19c0c5b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:50:54 crc kubenswrapper[4835]: I0319 09:50:54.840061 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38a5570d-749f-4582-a565-c3e5c19c0c5b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "38a5570d-749f-4582-a565-c3e5c19c0c5b" (UID: "38a5570d-749f-4582-a565-c3e5c19c0c5b"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:50:54 crc kubenswrapper[4835]: I0319 09:50:54.841863 4835 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38a5570d-749f-4582-a565-c3e5c19c0c5b-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:54 crc kubenswrapper[4835]: I0319 09:50:54.841904 4835 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38a5570d-749f-4582-a565-c3e5c19c0c5b-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:54 crc kubenswrapper[4835]: I0319 09:50:54.853082 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38a5570d-749f-4582-a565-c3e5c19c0c5b-kube-api-access-gb4lt" (OuterVolumeSpecName: "kube-api-access-gb4lt") pod "38a5570d-749f-4582-a565-c3e5c19c0c5b" (UID: "38a5570d-749f-4582-a565-c3e5c19c0c5b"). InnerVolumeSpecName "kube-api-access-gb4lt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:50:54 crc kubenswrapper[4835]: I0319 09:50:54.881941 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38a5570d-749f-4582-a565-c3e5c19c0c5b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "38a5570d-749f-4582-a565-c3e5c19c0c5b" (UID: "38a5570d-749f-4582-a565-c3e5c19c0c5b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:54 crc kubenswrapper[4835]: I0319 09:50:54.889579 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38a5570d-749f-4582-a565-c3e5c19c0c5b-scripts" (OuterVolumeSpecName: "scripts") pod "38a5570d-749f-4582-a565-c3e5c19c0c5b" (UID: "38a5570d-749f-4582-a565-c3e5c19c0c5b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:54 crc kubenswrapper[4835]: I0319 09:50:54.944613 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gb4lt\" (UniqueName: \"kubernetes.io/projected/38a5570d-749f-4582-a565-c3e5c19c0c5b-kube-api-access-gb4lt\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:54 crc kubenswrapper[4835]: I0319 09:50:54.944645 4835 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38a5570d-749f-4582-a565-c3e5c19c0c5b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:54 crc kubenswrapper[4835]: I0319 09:50:54.944654 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38a5570d-749f-4582-a565-c3e5c19c0c5b-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:54 crc kubenswrapper[4835]: I0319 09:50:54.953517 4835 generic.go:334] "Generic (PLEG): container finished" podID="38a5570d-749f-4582-a565-c3e5c19c0c5b" containerID="dfe5d50a045e8a1627b742562d4d2641deb2a46c45a59a04e93819c072f984ca" exitCode=0 Mar 19 09:50:54 crc kubenswrapper[4835]: I0319 09:50:54.953643 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38a5570d-749f-4582-a565-c3e5c19c0c5b","Type":"ContainerDied","Data":"dfe5d50a045e8a1627b742562d4d2641deb2a46c45a59a04e93819c072f984ca"} Mar 19 09:50:54 crc kubenswrapper[4835]: I0319 09:50:54.953670 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38a5570d-749f-4582-a565-c3e5c19c0c5b","Type":"ContainerDied","Data":"3b699eab2b26f03d5aa71cf89fcc98ca5253b8b9de4dc21fae96bb75faf099f0"} Mar 19 09:50:54 crc kubenswrapper[4835]: I0319 09:50:54.953688 4835 scope.go:117] "RemoveContainer" containerID="ac5a653b2cd1d3bdaab77409de3b666882360e69c8e86902fca5c810d6769788" Mar 19 09:50:54 crc kubenswrapper[4835]: I0319 09:50:54.953958 4835 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 09:50:54 crc kubenswrapper[4835]: I0319 09:50:54.962823 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38a5570d-749f-4582-a565-c3e5c19c0c5b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38a5570d-749f-4582-a565-c3e5c19c0c5b" (UID: "38a5570d-749f-4582-a565-c3e5c19c0c5b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.001449 4835 scope.go:117] "RemoveContainer" containerID="0c9357e6d435c38844390b66d76ed5c1e617423a746ed54027c7e31fcf81e1a4" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.003480 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38a5570d-749f-4582-a565-c3e5c19c0c5b-config-data" (OuterVolumeSpecName: "config-data") pod "38a5570d-749f-4582-a565-c3e5c19c0c5b" (UID: "38a5570d-749f-4582-a565-c3e5c19c0c5b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.024939 4835 scope.go:117] "RemoveContainer" containerID="dfe5d50a045e8a1627b742562d4d2641deb2a46c45a59a04e93819c072f984ca" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.046580 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38a5570d-749f-4582-a565-c3e5c19c0c5b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.046616 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38a5570d-749f-4582-a565-c3e5c19c0c5b-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.047571 4835 scope.go:117] "RemoveContainer" containerID="988ff005552cfd2cb7758b8302c82cbd8d2a0a9b884b404508700f509122972e" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.073365 4835 scope.go:117] "RemoveContainer" containerID="ac5a653b2cd1d3bdaab77409de3b666882360e69c8e86902fca5c810d6769788" Mar 19 09:50:55 crc kubenswrapper[4835]: E0319 09:50:55.074064 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac5a653b2cd1d3bdaab77409de3b666882360e69c8e86902fca5c810d6769788\": container with ID starting with ac5a653b2cd1d3bdaab77409de3b666882360e69c8e86902fca5c810d6769788 not found: ID does not exist" containerID="ac5a653b2cd1d3bdaab77409de3b666882360e69c8e86902fca5c810d6769788" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.074117 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac5a653b2cd1d3bdaab77409de3b666882360e69c8e86902fca5c810d6769788"} err="failed to get container status \"ac5a653b2cd1d3bdaab77409de3b666882360e69c8e86902fca5c810d6769788\": rpc error: code = NotFound desc = could not find container 
\"ac5a653b2cd1d3bdaab77409de3b666882360e69c8e86902fca5c810d6769788\": container with ID starting with ac5a653b2cd1d3bdaab77409de3b666882360e69c8e86902fca5c810d6769788 not found: ID does not exist" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.074151 4835 scope.go:117] "RemoveContainer" containerID="0c9357e6d435c38844390b66d76ed5c1e617423a746ed54027c7e31fcf81e1a4" Mar 19 09:50:55 crc kubenswrapper[4835]: E0319 09:50:55.074610 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c9357e6d435c38844390b66d76ed5c1e617423a746ed54027c7e31fcf81e1a4\": container with ID starting with 0c9357e6d435c38844390b66d76ed5c1e617423a746ed54027c7e31fcf81e1a4 not found: ID does not exist" containerID="0c9357e6d435c38844390b66d76ed5c1e617423a746ed54027c7e31fcf81e1a4" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.074643 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c9357e6d435c38844390b66d76ed5c1e617423a746ed54027c7e31fcf81e1a4"} err="failed to get container status \"0c9357e6d435c38844390b66d76ed5c1e617423a746ed54027c7e31fcf81e1a4\": rpc error: code = NotFound desc = could not find container \"0c9357e6d435c38844390b66d76ed5c1e617423a746ed54027c7e31fcf81e1a4\": container with ID starting with 0c9357e6d435c38844390b66d76ed5c1e617423a746ed54027c7e31fcf81e1a4 not found: ID does not exist" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.074662 4835 scope.go:117] "RemoveContainer" containerID="dfe5d50a045e8a1627b742562d4d2641deb2a46c45a59a04e93819c072f984ca" Mar 19 09:50:55 crc kubenswrapper[4835]: E0319 09:50:55.075081 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfe5d50a045e8a1627b742562d4d2641deb2a46c45a59a04e93819c072f984ca\": container with ID starting with dfe5d50a045e8a1627b742562d4d2641deb2a46c45a59a04e93819c072f984ca not found: ID does not exist" 
containerID="dfe5d50a045e8a1627b742562d4d2641deb2a46c45a59a04e93819c072f984ca" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.075125 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfe5d50a045e8a1627b742562d4d2641deb2a46c45a59a04e93819c072f984ca"} err="failed to get container status \"dfe5d50a045e8a1627b742562d4d2641deb2a46c45a59a04e93819c072f984ca\": rpc error: code = NotFound desc = could not find container \"dfe5d50a045e8a1627b742562d4d2641deb2a46c45a59a04e93819c072f984ca\": container with ID starting with dfe5d50a045e8a1627b742562d4d2641deb2a46c45a59a04e93819c072f984ca not found: ID does not exist" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.075157 4835 scope.go:117] "RemoveContainer" containerID="988ff005552cfd2cb7758b8302c82cbd8d2a0a9b884b404508700f509122972e" Mar 19 09:50:55 crc kubenswrapper[4835]: E0319 09:50:55.075714 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"988ff005552cfd2cb7758b8302c82cbd8d2a0a9b884b404508700f509122972e\": container with ID starting with 988ff005552cfd2cb7758b8302c82cbd8d2a0a9b884b404508700f509122972e not found: ID does not exist" containerID="988ff005552cfd2cb7758b8302c82cbd8d2a0a9b884b404508700f509122972e" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.075754 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"988ff005552cfd2cb7758b8302c82cbd8d2a0a9b884b404508700f509122972e"} err="failed to get container status \"988ff005552cfd2cb7758b8302c82cbd8d2a0a9b884b404508700f509122972e\": rpc error: code = NotFound desc = could not find container \"988ff005552cfd2cb7758b8302c82cbd8d2a0a9b884b404508700f509122972e\": container with ID starting with 988ff005552cfd2cb7758b8302c82cbd8d2a0a9b884b404508700f509122972e not found: ID does not exist" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.306810 4835 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.318129 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.340950 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:50:55 crc kubenswrapper[4835]: E0319 09:50:55.341661 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38a5570d-749f-4582-a565-c3e5c19c0c5b" containerName="ceilometer-notification-agent" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.341689 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="38a5570d-749f-4582-a565-c3e5c19c0c5b" containerName="ceilometer-notification-agent" Mar 19 09:50:55 crc kubenswrapper[4835]: E0319 09:50:55.341777 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38a5570d-749f-4582-a565-c3e5c19c0c5b" containerName="sg-core" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.341786 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="38a5570d-749f-4582-a565-c3e5c19c0c5b" containerName="sg-core" Mar 19 09:50:55 crc kubenswrapper[4835]: E0319 09:50:55.341812 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38a5570d-749f-4582-a565-c3e5c19c0c5b" containerName="proxy-httpd" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.341819 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="38a5570d-749f-4582-a565-c3e5c19c0c5b" containerName="proxy-httpd" Mar 19 09:50:55 crc kubenswrapper[4835]: E0319 09:50:55.341846 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38a5570d-749f-4582-a565-c3e5c19c0c5b" containerName="ceilometer-central-agent" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.341854 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="38a5570d-749f-4582-a565-c3e5c19c0c5b" containerName="ceilometer-central-agent" Mar 19 09:50:55 crc 
kubenswrapper[4835]: I0319 09:50:55.342129 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="38a5570d-749f-4582-a565-c3e5c19c0c5b" containerName="ceilometer-notification-agent" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.342172 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="38a5570d-749f-4582-a565-c3e5c19c0c5b" containerName="ceilometer-central-agent" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.342193 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="38a5570d-749f-4582-a565-c3e5c19c0c5b" containerName="sg-core" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.342204 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="38a5570d-749f-4582-a565-c3e5c19c0c5b" containerName="proxy-httpd" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.344543 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.347287 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.347674 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.353683 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.461606 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ad121ad3-667b-4b8a-a2ef-29e24a85c34a\") " pod="openstack/ceilometer-0" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.462034 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-config-data\") pod \"ceilometer-0\" (UID: \"ad121ad3-667b-4b8a-a2ef-29e24a85c34a\") " pod="openstack/ceilometer-0" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.462517 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbfj2\" (UniqueName: \"kubernetes.io/projected/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-kube-api-access-rbfj2\") pod \"ceilometer-0\" (UID: \"ad121ad3-667b-4b8a-a2ef-29e24a85c34a\") " pod="openstack/ceilometer-0" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.462647 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-run-httpd\") pod \"ceilometer-0\" (UID: \"ad121ad3-667b-4b8a-a2ef-29e24a85c34a\") " pod="openstack/ceilometer-0" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.462707 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-log-httpd\") pod \"ceilometer-0\" (UID: \"ad121ad3-667b-4b8a-a2ef-29e24a85c34a\") " pod="openstack/ceilometer-0" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.462730 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-scripts\") pod \"ceilometer-0\" (UID: \"ad121ad3-667b-4b8a-a2ef-29e24a85c34a\") " pod="openstack/ceilometer-0" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.462863 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"ad121ad3-667b-4b8a-a2ef-29e24a85c34a\") " pod="openstack/ceilometer-0" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.565423 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbfj2\" (UniqueName: \"kubernetes.io/projected/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-kube-api-access-rbfj2\") pod \"ceilometer-0\" (UID: \"ad121ad3-667b-4b8a-a2ef-29e24a85c34a\") " pod="openstack/ceilometer-0" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.565808 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-run-httpd\") pod \"ceilometer-0\" (UID: \"ad121ad3-667b-4b8a-a2ef-29e24a85c34a\") " pod="openstack/ceilometer-0" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.566374 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-log-httpd\") pod \"ceilometer-0\" (UID: \"ad121ad3-667b-4b8a-a2ef-29e24a85c34a\") " pod="openstack/ceilometer-0" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.566717 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-scripts\") pod \"ceilometer-0\" (UID: \"ad121ad3-667b-4b8a-a2ef-29e24a85c34a\") " pod="openstack/ceilometer-0" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.566676 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-log-httpd\") pod \"ceilometer-0\" (UID: \"ad121ad3-667b-4b8a-a2ef-29e24a85c34a\") " pod="openstack/ceilometer-0" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.566292 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-run-httpd\") pod \"ceilometer-0\" (UID: \"ad121ad3-667b-4b8a-a2ef-29e24a85c34a\") " pod="openstack/ceilometer-0" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.567682 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ad121ad3-667b-4b8a-a2ef-29e24a85c34a\") " pod="openstack/ceilometer-0" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.567919 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ad121ad3-667b-4b8a-a2ef-29e24a85c34a\") " pod="openstack/ceilometer-0" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.568013 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-config-data\") pod \"ceilometer-0\" (UID: \"ad121ad3-667b-4b8a-a2ef-29e24a85c34a\") " pod="openstack/ceilometer-0" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.579321 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ad121ad3-667b-4b8a-a2ef-29e24a85c34a\") " pod="openstack/ceilometer-0" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.579437 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ad121ad3-667b-4b8a-a2ef-29e24a85c34a\") " pod="openstack/ceilometer-0" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 
09:50:55.579541 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-scripts\") pod \"ceilometer-0\" (UID: \"ad121ad3-667b-4b8a-a2ef-29e24a85c34a\") " pod="openstack/ceilometer-0" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.579666 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-config-data\") pod \"ceilometer-0\" (UID: \"ad121ad3-667b-4b8a-a2ef-29e24a85c34a\") " pod="openstack/ceilometer-0" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.582371 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbfj2\" (UniqueName: \"kubernetes.io/projected/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-kube-api-access-rbfj2\") pod \"ceilometer-0\" (UID: \"ad121ad3-667b-4b8a-a2ef-29e24a85c34a\") " pod="openstack/ceilometer-0" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.646233 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.719362 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.772258 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c84b5cc-ce4f-4610-9806-9cec7262787a-logs\") pod \"3c84b5cc-ce4f-4610-9806-9cec7262787a\" (UID: \"3c84b5cc-ce4f-4610-9806-9cec7262787a\") " Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.772367 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c84b5cc-ce4f-4610-9806-9cec7262787a-combined-ca-bundle\") pod \"3c84b5cc-ce4f-4610-9806-9cec7262787a\" (UID: \"3c84b5cc-ce4f-4610-9806-9cec7262787a\") " Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.772478 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6x6m\" (UniqueName: \"kubernetes.io/projected/3c84b5cc-ce4f-4610-9806-9cec7262787a-kube-api-access-k6x6m\") pod \"3c84b5cc-ce4f-4610-9806-9cec7262787a\" (UID: \"3c84b5cc-ce4f-4610-9806-9cec7262787a\") " Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.772579 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c84b5cc-ce4f-4610-9806-9cec7262787a-config-data\") pod \"3c84b5cc-ce4f-4610-9806-9cec7262787a\" (UID: \"3c84b5cc-ce4f-4610-9806-9cec7262787a\") " Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.773394 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c84b5cc-ce4f-4610-9806-9cec7262787a-logs" (OuterVolumeSpecName: "logs") pod "3c84b5cc-ce4f-4610-9806-9cec7262787a" (UID: "3c84b5cc-ce4f-4610-9806-9cec7262787a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.774003 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c84b5cc-ce4f-4610-9806-9cec7262787a-logs\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.781335 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c84b5cc-ce4f-4610-9806-9cec7262787a-kube-api-access-k6x6m" (OuterVolumeSpecName: "kube-api-access-k6x6m") pod "3c84b5cc-ce4f-4610-9806-9cec7262787a" (UID: "3c84b5cc-ce4f-4610-9806-9cec7262787a"). InnerVolumeSpecName "kube-api-access-k6x6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.812923 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c84b5cc-ce4f-4610-9806-9cec7262787a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c84b5cc-ce4f-4610-9806-9cec7262787a" (UID: "3c84b5cc-ce4f-4610-9806-9cec7262787a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.833868 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c84b5cc-ce4f-4610-9806-9cec7262787a-config-data" (OuterVolumeSpecName: "config-data") pod "3c84b5cc-ce4f-4610-9806-9cec7262787a" (UID: "3c84b5cc-ce4f-4610-9806-9cec7262787a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.876004 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c84b5cc-ce4f-4610-9806-9cec7262787a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.876270 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6x6m\" (UniqueName: \"kubernetes.io/projected/3c84b5cc-ce4f-4610-9806-9cec7262787a-kube-api-access-k6x6m\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.876285 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c84b5cc-ce4f-4610-9806-9cec7262787a-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.976132 4835 generic.go:334] "Generic (PLEG): container finished" podID="3c84b5cc-ce4f-4610-9806-9cec7262787a" containerID="5d68c230ffb508443999c33fecbc992682cb12796eacbeaa1c3804c7d14a9de6" exitCode=0 Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.976257 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.976426 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3c84b5cc-ce4f-4610-9806-9cec7262787a","Type":"ContainerDied","Data":"5d68c230ffb508443999c33fecbc992682cb12796eacbeaa1c3804c7d14a9de6"} Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.976490 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3c84b5cc-ce4f-4610-9806-9cec7262787a","Type":"ContainerDied","Data":"3422cfcb2a9f9656cd869e087ad0822d9ddd04266e969c49ed9087fba75dd97e"} Mar 19 09:50:55 crc kubenswrapper[4835]: I0319 09:50:55.976513 4835 scope.go:117] "RemoveContainer" containerID="5d68c230ffb508443999c33fecbc992682cb12796eacbeaa1c3804c7d14a9de6" Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.014820 4835 scope.go:117] "RemoveContainer" containerID="6d3cd462c53175b75456ea29f660546e395e00d6a824fd548f37eb37f37e0d10" Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.022104 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.036112 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.056892 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 19 09:50:56 crc kubenswrapper[4835]: E0319 09:50:56.058925 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c84b5cc-ce4f-4610-9806-9cec7262787a" containerName="nova-api-log" Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.059002 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c84b5cc-ce4f-4610-9806-9cec7262787a" containerName="nova-api-log" Mar 19 09:50:56 crc kubenswrapper[4835]: E0319 09:50:56.059061 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c84b5cc-ce4f-4610-9806-9cec7262787a" 
containerName="nova-api-api" Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.059072 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c84b5cc-ce4f-4610-9806-9cec7262787a" containerName="nova-api-api" Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.059440 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c84b5cc-ce4f-4610-9806-9cec7262787a" containerName="nova-api-log" Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.059467 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c84b5cc-ce4f-4610-9806-9cec7262787a" containerName="nova-api-api" Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.060037 4835 scope.go:117] "RemoveContainer" containerID="5d68c230ffb508443999c33fecbc992682cb12796eacbeaa1c3804c7d14a9de6" Mar 19 09:50:56 crc kubenswrapper[4835]: E0319 09:50:56.060473 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d68c230ffb508443999c33fecbc992682cb12796eacbeaa1c3804c7d14a9de6\": container with ID starting with 5d68c230ffb508443999c33fecbc992682cb12796eacbeaa1c3804c7d14a9de6 not found: ID does not exist" containerID="5d68c230ffb508443999c33fecbc992682cb12796eacbeaa1c3804c7d14a9de6" Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.060515 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d68c230ffb508443999c33fecbc992682cb12796eacbeaa1c3804c7d14a9de6"} err="failed to get container status \"5d68c230ffb508443999c33fecbc992682cb12796eacbeaa1c3804c7d14a9de6\": rpc error: code = NotFound desc = could not find container \"5d68c230ffb508443999c33fecbc992682cb12796eacbeaa1c3804c7d14a9de6\": container with ID starting with 5d68c230ffb508443999c33fecbc992682cb12796eacbeaa1c3804c7d14a9de6 not found: ID does not exist" Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.060537 4835 scope.go:117] "RemoveContainer" 
containerID="6d3cd462c53175b75456ea29f660546e395e00d6a824fd548f37eb37f37e0d10" Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.061978 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 09:50:56 crc kubenswrapper[4835]: E0319 09:50:56.065124 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d3cd462c53175b75456ea29f660546e395e00d6a824fd548f37eb37f37e0d10\": container with ID starting with 6d3cd462c53175b75456ea29f660546e395e00d6a824fd548f37eb37f37e0d10 not found: ID does not exist" containerID="6d3cd462c53175b75456ea29f660546e395e00d6a824fd548f37eb37f37e0d10" Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.065169 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d3cd462c53175b75456ea29f660546e395e00d6a824fd548f37eb37f37e0d10"} err="failed to get container status \"6d3cd462c53175b75456ea29f660546e395e00d6a824fd548f37eb37f37e0d10\": rpc error: code = NotFound desc = could not find container \"6d3cd462c53175b75456ea29f660546e395e00d6a824fd548f37eb37f37e0d10\": container with ID starting with 6d3cd462c53175b75456ea29f660546e395e00d6a824fd548f37eb37f37e0d10 not found: ID does not exist" Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.066492 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.066605 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.066668 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.072114 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 
09:50:56.190062 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68785834-c43e-449a-a139-7e319bfaa958-public-tls-certs\") pod \"nova-api-0\" (UID: \"68785834-c43e-449a-a139-7e319bfaa958\") " pod="openstack/nova-api-0" Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.190134 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68785834-c43e-449a-a139-7e319bfaa958-logs\") pod \"nova-api-0\" (UID: \"68785834-c43e-449a-a139-7e319bfaa958\") " pod="openstack/nova-api-0" Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.190250 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqslf\" (UniqueName: \"kubernetes.io/projected/68785834-c43e-449a-a139-7e319bfaa958-kube-api-access-kqslf\") pod \"nova-api-0\" (UID: \"68785834-c43e-449a-a139-7e319bfaa958\") " pod="openstack/nova-api-0" Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.190529 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68785834-c43e-449a-a139-7e319bfaa958-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"68785834-c43e-449a-a139-7e319bfaa958\") " pod="openstack/nova-api-0" Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.190819 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68785834-c43e-449a-a139-7e319bfaa958-config-data\") pod \"nova-api-0\" (UID: \"68785834-c43e-449a-a139-7e319bfaa958\") " pod="openstack/nova-api-0" Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.191076 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/68785834-c43e-449a-a139-7e319bfaa958-internal-tls-certs\") pod \"nova-api-0\" (UID: \"68785834-c43e-449a-a139-7e319bfaa958\") " pod="openstack/nova-api-0" Mar 19 09:50:56 crc kubenswrapper[4835]: W0319 09:50:56.280995 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad121ad3_667b_4b8a_a2ef_29e24a85c34a.slice/crio-586cb92e76279c21ef7b018fa794298224915b1f0e6feab5468ec4fded318b56 WatchSource:0}: Error finding container 586cb92e76279c21ef7b018fa794298224915b1f0e6feab5468ec4fded318b56: Status 404 returned error can't find the container with id 586cb92e76279c21ef7b018fa794298224915b1f0e6feab5468ec4fded318b56 Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.285441 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.293067 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68785834-c43e-449a-a139-7e319bfaa958-config-data\") pod \"nova-api-0\" (UID: \"68785834-c43e-449a-a139-7e319bfaa958\") " pod="openstack/nova-api-0" Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.293168 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68785834-c43e-449a-a139-7e319bfaa958-internal-tls-certs\") pod \"nova-api-0\" (UID: \"68785834-c43e-449a-a139-7e319bfaa958\") " pod="openstack/nova-api-0" Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.293199 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68785834-c43e-449a-a139-7e319bfaa958-public-tls-certs\") pod \"nova-api-0\" (UID: \"68785834-c43e-449a-a139-7e319bfaa958\") " pod="openstack/nova-api-0" Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.293233 
4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68785834-c43e-449a-a139-7e319bfaa958-logs\") pod \"nova-api-0\" (UID: \"68785834-c43e-449a-a139-7e319bfaa958\") " pod="openstack/nova-api-0" Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.293272 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqslf\" (UniqueName: \"kubernetes.io/projected/68785834-c43e-449a-a139-7e319bfaa958-kube-api-access-kqslf\") pod \"nova-api-0\" (UID: \"68785834-c43e-449a-a139-7e319bfaa958\") " pod="openstack/nova-api-0" Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.293368 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68785834-c43e-449a-a139-7e319bfaa958-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"68785834-c43e-449a-a139-7e319bfaa958\") " pod="openstack/nova-api-0" Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.293883 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68785834-c43e-449a-a139-7e319bfaa958-logs\") pod \"nova-api-0\" (UID: \"68785834-c43e-449a-a139-7e319bfaa958\") " pod="openstack/nova-api-0" Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.298530 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68785834-c43e-449a-a139-7e319bfaa958-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"68785834-c43e-449a-a139-7e319bfaa958\") " pod="openstack/nova-api-0" Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.298916 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68785834-c43e-449a-a139-7e319bfaa958-public-tls-certs\") pod \"nova-api-0\" (UID: \"68785834-c43e-449a-a139-7e319bfaa958\") " 
pod="openstack/nova-api-0" Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.299429 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68785834-c43e-449a-a139-7e319bfaa958-config-data\") pod \"nova-api-0\" (UID: \"68785834-c43e-449a-a139-7e319bfaa958\") " pod="openstack/nova-api-0" Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.299782 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68785834-c43e-449a-a139-7e319bfaa958-internal-tls-certs\") pod \"nova-api-0\" (UID: \"68785834-c43e-449a-a139-7e319bfaa958\") " pod="openstack/nova-api-0" Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.311996 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqslf\" (UniqueName: \"kubernetes.io/projected/68785834-c43e-449a-a139-7e319bfaa958-kube-api-access-kqslf\") pod \"nova-api-0\" (UID: \"68785834-c43e-449a-a139-7e319bfaa958\") " pod="openstack/nova-api-0" Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.398573 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.422082 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38a5570d-749f-4582-a565-c3e5c19c0c5b" path="/var/lib/kubelet/pods/38a5570d-749f-4582-a565-c3e5c19c0c5b/volumes" Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.422866 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c84b5cc-ce4f-4610-9806-9cec7262787a" path="/var/lib/kubelet/pods/3c84b5cc-ce4f-4610-9806-9cec7262787a/volumes" Mar 19 09:50:56 crc kubenswrapper[4835]: I0319 09:50:56.912301 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 09:50:56 crc kubenswrapper[4835]: W0319 09:50:56.915785 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68785834_c43e_449a_a139_7e319bfaa958.slice/crio-710179a0b23aa283945ffa8be88894e8ca38b5fc06136bed4e60048166c4ea64 WatchSource:0}: Error finding container 710179a0b23aa283945ffa8be88894e8ca38b5fc06136bed4e60048166c4ea64: Status 404 returned error can't find the container with id 710179a0b23aa283945ffa8be88894e8ca38b5fc06136bed4e60048166c4ea64 Mar 19 09:50:57 crc kubenswrapper[4835]: I0319 09:50:57.003347 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad121ad3-667b-4b8a-a2ef-29e24a85c34a","Type":"ContainerStarted","Data":"1a41f6899c4b9fe9b6b992a8c8b14c598d3f7c5513569f7f32486d8cdfb3e9c3"} Mar 19 09:50:57 crc kubenswrapper[4835]: I0319 09:50:57.003396 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad121ad3-667b-4b8a-a2ef-29e24a85c34a","Type":"ContainerStarted","Data":"586cb92e76279c21ef7b018fa794298224915b1f0e6feab5468ec4fded318b56"} Mar 19 09:50:57 crc kubenswrapper[4835]: I0319 09:50:57.009422 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1","Type":"ContainerDied","Data":"c30856a682f79fafcc2499a2795a035e1bce1af1b9a64aa38ab07530a2d46d0b"} Mar 19 09:50:57 crc kubenswrapper[4835]: I0319 09:50:57.009489 4835 generic.go:334] "Generic (PLEG): container finished" podID="7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1" containerID="c30856a682f79fafcc2499a2795a035e1bce1af1b9a64aa38ab07530a2d46d0b" exitCode=137 Mar 19 09:50:57 crc kubenswrapper[4835]: I0319 09:50:57.016536 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"68785834-c43e-449a-a139-7e319bfaa958","Type":"ContainerStarted","Data":"710179a0b23aa283945ffa8be88894e8ca38b5fc06136bed4e60048166c4ea64"} Mar 19 09:50:57 crc kubenswrapper[4835]: I0319 09:50:57.259997 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:50:57 crc kubenswrapper[4835]: I0319 09:50:57.289571 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:50:57 crc kubenswrapper[4835]: I0319 09:50:57.708903 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 19 09:50:57 crc kubenswrapper[4835]: I0319 09:50:57.768652 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1-config-data\") pod \"7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1\" (UID: \"7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1\") " Mar 19 09:50:57 crc kubenswrapper[4835]: I0319 09:50:57.768948 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1-scripts\") pod \"7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1\" (UID: \"7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1\") " Mar 19 09:50:57 crc kubenswrapper[4835]: I0319 09:50:57.769044 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cr7rp\" (UniqueName: \"kubernetes.io/projected/7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1-kube-api-access-cr7rp\") pod \"7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1\" (UID: \"7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1\") " Mar 19 09:50:57 crc kubenswrapper[4835]: I0319 09:50:57.769087 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1-combined-ca-bundle\") pod \"7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1\" (UID: \"7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1\") " Mar 19 09:50:57 crc kubenswrapper[4835]: I0319 09:50:57.780331 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1-kube-api-access-cr7rp" (OuterVolumeSpecName: "kube-api-access-cr7rp") pod "7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1" (UID: "7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1"). InnerVolumeSpecName "kube-api-access-cr7rp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:50:57 crc kubenswrapper[4835]: I0319 09:50:57.782996 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1-scripts" (OuterVolumeSpecName: "scripts") pod "7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1" (UID: "7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:57 crc kubenswrapper[4835]: I0319 09:50:57.871568 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:57 crc kubenswrapper[4835]: I0319 09:50:57.871605 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cr7rp\" (UniqueName: \"kubernetes.io/projected/7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1-kube-api-access-cr7rp\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:57 crc kubenswrapper[4835]: I0319 09:50:57.953705 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1-config-data" (OuterVolumeSpecName: "config-data") pod "7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1" (UID: "7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:57 crc kubenswrapper[4835]: I0319 09:50:57.966468 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1" (UID: "7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:50:57 crc kubenswrapper[4835]: I0319 09:50:57.974357 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:57 crc kubenswrapper[4835]: I0319 09:50:57.974400 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.030596 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad121ad3-667b-4b8a-a2ef-29e24a85c34a","Type":"ContainerStarted","Data":"fb4ae9fffc327fd8461a8905f2d3c462ae630d627246cff005daac50646b323a"} Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.033328 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1","Type":"ContainerDied","Data":"c3035c3d797987dc426d823101c7047e77b8a1eef81851b27513d63110a899bb"} Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.033416 4835 scope.go:117] "RemoveContainer" containerID="c30856a682f79fafcc2499a2795a035e1bce1af1b9a64aa38ab07530a2d46d0b" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.033606 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.052108 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"68785834-c43e-449a-a139-7e319bfaa958","Type":"ContainerStarted","Data":"137d6de81bc70fdd5e2a8bc3c1cd57619f849b134667c0709d7b87541570ceae"} Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.052247 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"68785834-c43e-449a-a139-7e319bfaa958","Type":"ContainerStarted","Data":"183e270f0a4b7455c63e8606035068f346d5b14cc376addd9580d0a57304a81b"} Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.077983 4835 scope.go:117] "RemoveContainer" containerID="da7defb6a8d0db3b519ca0428186dd8b848d699da13a2f7660ea692b54bdf642" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.081566 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.084559 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.08453916 podStartE2EDuration="2.08453916s" podCreationTimestamp="2026-03-19 09:50:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:50:58.074699968 +0000 UTC m=+1712.923298555" watchObservedRunningTime="2026-03-19 09:50:58.08453916 +0000 UTC m=+1712.933137747" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.106342 4835 scope.go:117] "RemoveContainer" containerID="16f4ed4345a8836d91b8d99c7107598c9e21972bd841c7dd8a1f17bd53fdeacd" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.118656 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.138210 4835 scope.go:117] "RemoveContainer" 
containerID="adc5d7c2b66c221d0c916644b0727b87690fa775f05fa48bd92ba903f0b9aabf" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.148856 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.178908 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 19 09:50:58 crc kubenswrapper[4835]: E0319 09:50:58.179823 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1" containerName="aodh-api" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.179850 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1" containerName="aodh-api" Mar 19 09:50:58 crc kubenswrapper[4835]: E0319 09:50:58.179908 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1" containerName="aodh-notifier" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.179918 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1" containerName="aodh-notifier" Mar 19 09:50:58 crc kubenswrapper[4835]: E0319 09:50:58.179936 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1" containerName="aodh-evaluator" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.179945 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1" containerName="aodh-evaluator" Mar 19 09:50:58 crc kubenswrapper[4835]: E0319 09:50:58.179989 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1" containerName="aodh-listener" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.180015 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1" containerName="aodh-listener" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.180304 4835 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1" containerName="aodh-evaluator" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.180336 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1" containerName="aodh-api" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.180357 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1" containerName="aodh-notifier" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.180368 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1" containerName="aodh-listener" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.183501 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.189422 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.189920 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-fxchn" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.190461 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.190682 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.190927 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.240159 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.287425 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3a0e12a-0212-4988-8c7a-7d466864887b-public-tls-certs\") pod \"aodh-0\" (UID: \"e3a0e12a-0212-4988-8c7a-7d466864887b\") " pod="openstack/aodh-0" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.287503 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3a0e12a-0212-4988-8c7a-7d466864887b-internal-tls-certs\") pod \"aodh-0\" (UID: \"e3a0e12a-0212-4988-8c7a-7d466864887b\") " pod="openstack/aodh-0" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.287533 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwhkw\" (UniqueName: \"kubernetes.io/projected/e3a0e12a-0212-4988-8c7a-7d466864887b-kube-api-access-nwhkw\") pod \"aodh-0\" (UID: \"e3a0e12a-0212-4988-8c7a-7d466864887b\") " pod="openstack/aodh-0" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.287571 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a0e12a-0212-4988-8c7a-7d466864887b-combined-ca-bundle\") pod \"aodh-0\" (UID: \"e3a0e12a-0212-4988-8c7a-7d466864887b\") " pod="openstack/aodh-0" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.287630 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3a0e12a-0212-4988-8c7a-7d466864887b-config-data\") pod \"aodh-0\" (UID: \"e3a0e12a-0212-4988-8c7a-7d466864887b\") " pod="openstack/aodh-0" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.287871 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e3a0e12a-0212-4988-8c7a-7d466864887b-scripts\") pod \"aodh-0\" (UID: \"e3a0e12a-0212-4988-8c7a-7d466864887b\") " pod="openstack/aodh-0" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.390872 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3a0e12a-0212-4988-8c7a-7d466864887b-public-tls-certs\") pod \"aodh-0\" (UID: \"e3a0e12a-0212-4988-8c7a-7d466864887b\") " pod="openstack/aodh-0" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.391315 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3a0e12a-0212-4988-8c7a-7d466864887b-internal-tls-certs\") pod \"aodh-0\" (UID: \"e3a0e12a-0212-4988-8c7a-7d466864887b\") " pod="openstack/aodh-0" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.391364 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwhkw\" (UniqueName: \"kubernetes.io/projected/e3a0e12a-0212-4988-8c7a-7d466864887b-kube-api-access-nwhkw\") pod \"aodh-0\" (UID: \"e3a0e12a-0212-4988-8c7a-7d466864887b\") " pod="openstack/aodh-0" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.391394 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a0e12a-0212-4988-8c7a-7d466864887b-combined-ca-bundle\") pod \"aodh-0\" (UID: \"e3a0e12a-0212-4988-8c7a-7d466864887b\") " pod="openstack/aodh-0" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.391439 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3a0e12a-0212-4988-8c7a-7d466864887b-config-data\") pod \"aodh-0\" (UID: \"e3a0e12a-0212-4988-8c7a-7d466864887b\") " pod="openstack/aodh-0" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.391627 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3a0e12a-0212-4988-8c7a-7d466864887b-scripts\") pod \"aodh-0\" (UID: \"e3a0e12a-0212-4988-8c7a-7d466864887b\") " pod="openstack/aodh-0" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.395633 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3a0e12a-0212-4988-8c7a-7d466864887b-scripts\") pod \"aodh-0\" (UID: \"e3a0e12a-0212-4988-8c7a-7d466864887b\") " pod="openstack/aodh-0" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.399108 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3a0e12a-0212-4988-8c7a-7d466864887b-public-tls-certs\") pod \"aodh-0\" (UID: \"e3a0e12a-0212-4988-8c7a-7d466864887b\") " pod="openstack/aodh-0" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.399276 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a0e12a-0212-4988-8c7a-7d466864887b-combined-ca-bundle\") pod \"aodh-0\" (UID: \"e3a0e12a-0212-4988-8c7a-7d466864887b\") " pod="openstack/aodh-0" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.400743 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3a0e12a-0212-4988-8c7a-7d466864887b-config-data\") pod \"aodh-0\" (UID: \"e3a0e12a-0212-4988-8c7a-7d466864887b\") " pod="openstack/aodh-0" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.418070 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1" path="/var/lib/kubelet/pods/7fb13de1-3e5d-4eeb-b0f7-d8f4e3cb44d1/volumes" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.429762 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-bv9wk"] Mar 19 09:50:58 crc 
kubenswrapper[4835]: I0319 09:50:58.432655 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-bv9wk" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.434453 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3a0e12a-0212-4988-8c7a-7d466864887b-internal-tls-certs\") pod \"aodh-0\" (UID: \"e3a0e12a-0212-4988-8c7a-7d466864887b\") " pod="openstack/aodh-0" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.437232 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.437400 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.443771 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-bv9wk"] Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.445416 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwhkw\" (UniqueName: \"kubernetes.io/projected/e3a0e12a-0212-4988-8c7a-7d466864887b-kube-api-access-nwhkw\") pod \"aodh-0\" (UID: \"e3a0e12a-0212-4988-8c7a-7d466864887b\") " pod="openstack/aodh-0" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.493284 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0149d19-3745-411a-94de-c66590906efb-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-bv9wk\" (UID: \"d0149d19-3745-411a-94de-c66590906efb\") " pod="openstack/nova-cell1-cell-mapping-bv9wk" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.493393 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d0149d19-3745-411a-94de-c66590906efb-config-data\") pod \"nova-cell1-cell-mapping-bv9wk\" (UID: \"d0149d19-3745-411a-94de-c66590906efb\") " pod="openstack/nova-cell1-cell-mapping-bv9wk" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.493497 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd5vv\" (UniqueName: \"kubernetes.io/projected/d0149d19-3745-411a-94de-c66590906efb-kube-api-access-jd5vv\") pod \"nova-cell1-cell-mapping-bv9wk\" (UID: \"d0149d19-3745-411a-94de-c66590906efb\") " pod="openstack/nova-cell1-cell-mapping-bv9wk" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.493556 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0149d19-3745-411a-94de-c66590906efb-scripts\") pod \"nova-cell1-cell-mapping-bv9wk\" (UID: \"d0149d19-3745-411a-94de-c66590906efb\") " pod="openstack/nova-cell1-cell-mapping-bv9wk" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.534216 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.595434 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0149d19-3745-411a-94de-c66590906efb-config-data\") pod \"nova-cell1-cell-mapping-bv9wk\" (UID: \"d0149d19-3745-411a-94de-c66590906efb\") " pod="openstack/nova-cell1-cell-mapping-bv9wk" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.595669 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd5vv\" (UniqueName: \"kubernetes.io/projected/d0149d19-3745-411a-94de-c66590906efb-kube-api-access-jd5vv\") pod \"nova-cell1-cell-mapping-bv9wk\" (UID: \"d0149d19-3745-411a-94de-c66590906efb\") " pod="openstack/nova-cell1-cell-mapping-bv9wk" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.595804 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0149d19-3745-411a-94de-c66590906efb-scripts\") pod \"nova-cell1-cell-mapping-bv9wk\" (UID: \"d0149d19-3745-411a-94de-c66590906efb\") " pod="openstack/nova-cell1-cell-mapping-bv9wk" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.596193 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0149d19-3745-411a-94de-c66590906efb-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-bv9wk\" (UID: \"d0149d19-3745-411a-94de-c66590906efb\") " pod="openstack/nova-cell1-cell-mapping-bv9wk" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.600169 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0149d19-3745-411a-94de-c66590906efb-config-data\") pod \"nova-cell1-cell-mapping-bv9wk\" (UID: \"d0149d19-3745-411a-94de-c66590906efb\") " pod="openstack/nova-cell1-cell-mapping-bv9wk" Mar 19 09:50:58 crc 
kubenswrapper[4835]: I0319 09:50:58.606914 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0149d19-3745-411a-94de-c66590906efb-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-bv9wk\" (UID: \"d0149d19-3745-411a-94de-c66590906efb\") " pod="openstack/nova-cell1-cell-mapping-bv9wk" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.607241 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0149d19-3745-411a-94de-c66590906efb-scripts\") pod \"nova-cell1-cell-mapping-bv9wk\" (UID: \"d0149d19-3745-411a-94de-c66590906efb\") " pod="openstack/nova-cell1-cell-mapping-bv9wk" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.619306 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd5vv\" (UniqueName: \"kubernetes.io/projected/d0149d19-3745-411a-94de-c66590906efb-kube-api-access-jd5vv\") pod \"nova-cell1-cell-mapping-bv9wk\" (UID: \"d0149d19-3745-411a-94de-c66590906efb\") " pod="openstack/nova-cell1-cell-mapping-bv9wk" Mar 19 09:50:58 crc kubenswrapper[4835]: I0319 09:50:58.721235 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-bv9wk" Mar 19 09:50:59 crc kubenswrapper[4835]: I0319 09:50:59.064146 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad121ad3-667b-4b8a-a2ef-29e24a85c34a","Type":"ContainerStarted","Data":"9c1ed7172f8fb64afc212c193834b28adb10a130a8519197c976b7e05cc32887"} Mar 19 09:50:59 crc kubenswrapper[4835]: I0319 09:50:59.105935 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 19 09:50:59 crc kubenswrapper[4835]: W0319 09:50:59.236849 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0149d19_3745_411a_94de_c66590906efb.slice/crio-c96a4f8781e86d48084c3a2abc1a9b2a53bcfc518d5d7a7843ef5ca9cf8b5414 WatchSource:0}: Error finding container c96a4f8781e86d48084c3a2abc1a9b2a53bcfc518d5d7a7843ef5ca9cf8b5414: Status 404 returned error can't find the container with id c96a4f8781e86d48084c3a2abc1a9b2a53bcfc518d5d7a7843ef5ca9cf8b5414 Mar 19 09:50:59 crc kubenswrapper[4835]: I0319 09:50:59.237994 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-bv9wk"] Mar 19 09:50:59 crc kubenswrapper[4835]: I0319 09:50:59.464320 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7bbf7cf9-52r8x" Mar 19 09:50:59 crc kubenswrapper[4835]: I0319 09:50:59.524622 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-vkfwv"] Mar 19 09:50:59 crc kubenswrapper[4835]: I0319 09:50:59.525172 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9b86998b5-vkfwv" podUID="b139b769-9c57-4ef8-b917-06fc8a1a2aca" containerName="dnsmasq-dns" containerID="cri-o://7fdce34f0b15010cf36afd31b2bf5dccd827fd202633a8b77615e64ee2be71ec" gracePeriod=10 Mar 19 09:51:00 crc kubenswrapper[4835]: I0319 09:51:00.092408 4835 
generic.go:334] "Generic (PLEG): container finished" podID="b139b769-9c57-4ef8-b917-06fc8a1a2aca" containerID="7fdce34f0b15010cf36afd31b2bf5dccd827fd202633a8b77615e64ee2be71ec" exitCode=0 Mar 19 09:51:00 crc kubenswrapper[4835]: I0319 09:51:00.092474 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-vkfwv" event={"ID":"b139b769-9c57-4ef8-b917-06fc8a1a2aca","Type":"ContainerDied","Data":"7fdce34f0b15010cf36afd31b2bf5dccd827fd202633a8b77615e64ee2be71ec"} Mar 19 09:51:00 crc kubenswrapper[4835]: I0319 09:51:00.101128 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e3a0e12a-0212-4988-8c7a-7d466864887b","Type":"ContainerStarted","Data":"39167dd947c82f81bf51730d8a4ca1ad7191ef88c351d2338f695a4a1e817e9d"} Mar 19 09:51:00 crc kubenswrapper[4835]: I0319 09:51:00.105131 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-bv9wk" event={"ID":"d0149d19-3745-411a-94de-c66590906efb","Type":"ContainerStarted","Data":"f3005d3fa14e778be60ab66e97afec5229741a9a4db633769ac228a4a9b32318"} Mar 19 09:51:00 crc kubenswrapper[4835]: I0319 09:51:00.105171 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-bv9wk" event={"ID":"d0149d19-3745-411a-94de-c66590906efb","Type":"ContainerStarted","Data":"c96a4f8781e86d48084c3a2abc1a9b2a53bcfc518d5d7a7843ef5ca9cf8b5414"} Mar 19 09:51:00 crc kubenswrapper[4835]: I0319 09:51:00.131103 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-bv9wk" podStartSLOduration=2.131082923 podStartE2EDuration="2.131082923s" podCreationTimestamp="2026-03-19 09:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:51:00.121872408 +0000 UTC m=+1714.970471095" watchObservedRunningTime="2026-03-19 09:51:00.131082923 +0000 UTC m=+1714.979681510" Mar 
19 09:51:00 crc kubenswrapper[4835]: I0319 09:51:00.162034 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-vkfwv" Mar 19 09:51:00 crc kubenswrapper[4835]: I0319 09:51:00.292728 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b139b769-9c57-4ef8-b917-06fc8a1a2aca-ovsdbserver-sb\") pod \"b139b769-9c57-4ef8-b917-06fc8a1a2aca\" (UID: \"b139b769-9c57-4ef8-b917-06fc8a1a2aca\") " Mar 19 09:51:00 crc kubenswrapper[4835]: I0319 09:51:00.293196 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b139b769-9c57-4ef8-b917-06fc8a1a2aca-dns-svc\") pod \"b139b769-9c57-4ef8-b917-06fc8a1a2aca\" (UID: \"b139b769-9c57-4ef8-b917-06fc8a1a2aca\") " Mar 19 09:51:00 crc kubenswrapper[4835]: I0319 09:51:00.293238 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b139b769-9c57-4ef8-b917-06fc8a1a2aca-dns-swift-storage-0\") pod \"b139b769-9c57-4ef8-b917-06fc8a1a2aca\" (UID: \"b139b769-9c57-4ef8-b917-06fc8a1a2aca\") " Mar 19 09:51:00 crc kubenswrapper[4835]: I0319 09:51:00.293307 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b139b769-9c57-4ef8-b917-06fc8a1a2aca-config\") pod \"b139b769-9c57-4ef8-b917-06fc8a1a2aca\" (UID: \"b139b769-9c57-4ef8-b917-06fc8a1a2aca\") " Mar 19 09:51:00 crc kubenswrapper[4835]: I0319 09:51:00.293360 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b139b769-9c57-4ef8-b917-06fc8a1a2aca-ovsdbserver-nb\") pod \"b139b769-9c57-4ef8-b917-06fc8a1a2aca\" (UID: \"b139b769-9c57-4ef8-b917-06fc8a1a2aca\") " Mar 19 09:51:00 crc kubenswrapper[4835]: I0319 09:51:00.293404 
4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2vg9\" (UniqueName: \"kubernetes.io/projected/b139b769-9c57-4ef8-b917-06fc8a1a2aca-kube-api-access-b2vg9\") pod \"b139b769-9c57-4ef8-b917-06fc8a1a2aca\" (UID: \"b139b769-9c57-4ef8-b917-06fc8a1a2aca\") " Mar 19 09:51:00 crc kubenswrapper[4835]: I0319 09:51:00.304507 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b139b769-9c57-4ef8-b917-06fc8a1a2aca-kube-api-access-b2vg9" (OuterVolumeSpecName: "kube-api-access-b2vg9") pod "b139b769-9c57-4ef8-b917-06fc8a1a2aca" (UID: "b139b769-9c57-4ef8-b917-06fc8a1a2aca"). InnerVolumeSpecName "kube-api-access-b2vg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:51:00 crc kubenswrapper[4835]: I0319 09:51:00.371085 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b139b769-9c57-4ef8-b917-06fc8a1a2aca-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b139b769-9c57-4ef8-b917-06fc8a1a2aca" (UID: "b139b769-9c57-4ef8-b917-06fc8a1a2aca"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:51:00 crc kubenswrapper[4835]: I0319 09:51:00.393209 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b139b769-9c57-4ef8-b917-06fc8a1a2aca-config" (OuterVolumeSpecName: "config") pod "b139b769-9c57-4ef8-b917-06fc8a1a2aca" (UID: "b139b769-9c57-4ef8-b917-06fc8a1a2aca"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:51:00 crc kubenswrapper[4835]: I0319 09:51:00.399097 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b139b769-9c57-4ef8-b917-06fc8a1a2aca-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:51:00 crc kubenswrapper[4835]: I0319 09:51:00.399134 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2vg9\" (UniqueName: \"kubernetes.io/projected/b139b769-9c57-4ef8-b917-06fc8a1a2aca-kube-api-access-b2vg9\") on node \"crc\" DevicePath \"\"" Mar 19 09:51:00 crc kubenswrapper[4835]: I0319 09:51:00.399148 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b139b769-9c57-4ef8-b917-06fc8a1a2aca-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 09:51:00 crc kubenswrapper[4835]: I0319 09:51:00.407209 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b139b769-9c57-4ef8-b917-06fc8a1a2aca-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b139b769-9c57-4ef8-b917-06fc8a1a2aca" (UID: "b139b769-9c57-4ef8-b917-06fc8a1a2aca"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:51:00 crc kubenswrapper[4835]: I0319 09:51:00.417227 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b139b769-9c57-4ef8-b917-06fc8a1a2aca-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b139b769-9c57-4ef8-b917-06fc8a1a2aca" (UID: "b139b769-9c57-4ef8-b917-06fc8a1a2aca"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:51:00 crc kubenswrapper[4835]: I0319 09:51:00.433047 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b139b769-9c57-4ef8-b917-06fc8a1a2aca-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b139b769-9c57-4ef8-b917-06fc8a1a2aca" (UID: "b139b769-9c57-4ef8-b917-06fc8a1a2aca"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:51:00 crc kubenswrapper[4835]: I0319 09:51:00.501526 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b139b769-9c57-4ef8-b917-06fc8a1a2aca-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 09:51:00 crc kubenswrapper[4835]: I0319 09:51:00.501572 4835 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b139b769-9c57-4ef8-b917-06fc8a1a2aca-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 09:51:00 crc kubenswrapper[4835]: I0319 09:51:00.501584 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b139b769-9c57-4ef8-b917-06fc8a1a2aca-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 09:51:01 crc kubenswrapper[4835]: I0319 09:51:01.119626 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-vkfwv" event={"ID":"b139b769-9c57-4ef8-b917-06fc8a1a2aca","Type":"ContainerDied","Data":"c434046edafd05669346b25e2bbbf0c853f32d389c61d268e718f36393e7ece8"} Mar 19 09:51:01 crc kubenswrapper[4835]: I0319 09:51:01.119944 4835 scope.go:117] "RemoveContainer" containerID="7fdce34f0b15010cf36afd31b2bf5dccd827fd202633a8b77615e64ee2be71ec" Mar 19 09:51:01 crc kubenswrapper[4835]: I0319 09:51:01.119648 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-vkfwv" Mar 19 09:51:01 crc kubenswrapper[4835]: I0319 09:51:01.123136 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e3a0e12a-0212-4988-8c7a-7d466864887b","Type":"ContainerStarted","Data":"a5a1af55b1fa7fb11130cc41302d05f94330b6621f6b0a878733eb89d5a76b8b"} Mar 19 09:51:01 crc kubenswrapper[4835]: I0319 09:51:01.133139 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad121ad3-667b-4b8a-a2ef-29e24a85c34a","Type":"ContainerStarted","Data":"c17a1fbebb5fbf5951b8313b8eb7bad23b3847c2ed27a257bd5d75e0c460a2af"} Mar 19 09:51:01 crc kubenswrapper[4835]: I0319 09:51:01.133597 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 09:51:01 crc kubenswrapper[4835]: I0319 09:51:01.169805 4835 scope.go:117] "RemoveContainer" containerID="cc4c66b25d2e1ef20810f9468b887be6093cc74e2e5ef52d7173dcc87d8342aa" Mar 19 09:51:01 crc kubenswrapper[4835]: I0319 09:51:01.183767 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.054912109 podStartE2EDuration="6.183717156s" podCreationTimestamp="2026-03-19 09:50:55 +0000 UTC" firstStartedPulling="2026-03-19 09:50:56.283593126 +0000 UTC m=+1711.132191713" lastFinishedPulling="2026-03-19 09:51:00.412398172 +0000 UTC m=+1715.260996760" observedRunningTime="2026-03-19 09:51:01.162643376 +0000 UTC m=+1716.011241983" watchObservedRunningTime="2026-03-19 09:51:01.183717156 +0000 UTC m=+1716.032315743" Mar 19 09:51:01 crc kubenswrapper[4835]: I0319 09:51:01.193932 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-vkfwv"] Mar 19 09:51:01 crc kubenswrapper[4835]: I0319 09:51:01.207829 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-vkfwv"] Mar 19 09:51:01 crc kubenswrapper[4835]: I0319 09:51:01.402803 
4835 scope.go:117] "RemoveContainer" containerID="d93f2f0fef5a3fe52d6e4aab02e5290ac85405643bc520caaef82b7b23fd8ee3" Mar 19 09:51:01 crc kubenswrapper[4835]: E0319 09:51:01.403078 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 09:51:02 crc kubenswrapper[4835]: I0319 09:51:02.148366 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e3a0e12a-0212-4988-8c7a-7d466864887b","Type":"ContainerStarted","Data":"0abec8912e1e3cf610978870c8382031d5df50d077218567a796c1f72aff6f06"} Mar 19 09:51:02 crc kubenswrapper[4835]: I0319 09:51:02.148720 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e3a0e12a-0212-4988-8c7a-7d466864887b","Type":"ContainerStarted","Data":"0a792066ff864b6c9eadfa0d11e82cba678fa3742c38680dcd693050695f7d86"} Mar 19 09:51:02 crc kubenswrapper[4835]: I0319 09:51:02.448718 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b139b769-9c57-4ef8-b917-06fc8a1a2aca" path="/var/lib/kubelet/pods/b139b769-9c57-4ef8-b917-06fc8a1a2aca/volumes" Mar 19 09:51:03 crc kubenswrapper[4835]: I0319 09:51:03.164287 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e3a0e12a-0212-4988-8c7a-7d466864887b","Type":"ContainerStarted","Data":"e1f5e7c55c5ae5d9b0f812f5e32b130ace81f80ae20441972c84e26135e992a0"} Mar 19 09:51:03 crc kubenswrapper[4835]: I0319 09:51:03.196470 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.805264389 podStartE2EDuration="5.196448899s" podCreationTimestamp="2026-03-19 
09:50:58 +0000 UTC" firstStartedPulling="2026-03-19 09:50:59.091699402 +0000 UTC m=+1713.940297989" lastFinishedPulling="2026-03-19 09:51:02.482883912 +0000 UTC m=+1717.331482499" observedRunningTime="2026-03-19 09:51:03.192304288 +0000 UTC m=+1718.040902895" watchObservedRunningTime="2026-03-19 09:51:03.196448899 +0000 UTC m=+1718.045047496" Mar 19 09:51:06 crc kubenswrapper[4835]: E0319 09:51:06.272687 4835 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0149d19_3745_411a_94de_c66590906efb.slice/crio-f3005d3fa14e778be60ab66e97afec5229741a9a4db633769ac228a4a9b32318.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0149d19_3745_411a_94de_c66590906efb.slice/crio-conmon-f3005d3fa14e778be60ab66e97afec5229741a9a4db633769ac228a4a9b32318.scope\": RecentStats: unable to find data in memory cache]" Mar 19 09:51:06 crc kubenswrapper[4835]: I0319 09:51:06.399448 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 09:51:06 crc kubenswrapper[4835]: I0319 09:51:06.399498 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 09:51:06 crc kubenswrapper[4835]: I0319 09:51:06.489531 4835 generic.go:334] "Generic (PLEG): container finished" podID="d0149d19-3745-411a-94de-c66590906efb" containerID="f3005d3fa14e778be60ab66e97afec5229741a9a4db633769ac228a4a9b32318" exitCode=0 Mar 19 09:51:06 crc kubenswrapper[4835]: I0319 09:51:06.489580 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-bv9wk" event={"ID":"d0149d19-3745-411a-94de-c66590906efb","Type":"ContainerDied","Data":"f3005d3fa14e778be60ab66e97afec5229741a9a4db633769ac228a4a9b32318"} Mar 19 09:51:07 crc kubenswrapper[4835]: I0319 09:51:07.410900 4835 prober.go:107] "Probe 
failed" probeType="Startup" pod="openstack/nova-api-0" podUID="68785834-c43e-449a-a139-7e319bfaa958" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.11:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:51:07 crc kubenswrapper[4835]: I0319 09:51:07.410954 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="68785834-c43e-449a-a139-7e319bfaa958" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.11:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:51:08 crc kubenswrapper[4835]: I0319 09:51:08.029477 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-bv9wk" Mar 19 09:51:08 crc kubenswrapper[4835]: I0319 09:51:08.080119 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0149d19-3745-411a-94de-c66590906efb-config-data\") pod \"d0149d19-3745-411a-94de-c66590906efb\" (UID: \"d0149d19-3745-411a-94de-c66590906efb\") " Mar 19 09:51:08 crc kubenswrapper[4835]: I0319 09:51:08.080180 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0149d19-3745-411a-94de-c66590906efb-scripts\") pod \"d0149d19-3745-411a-94de-c66590906efb\" (UID: \"d0149d19-3745-411a-94de-c66590906efb\") " Mar 19 09:51:08 crc kubenswrapper[4835]: I0319 09:51:08.080266 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0149d19-3745-411a-94de-c66590906efb-combined-ca-bundle\") pod \"d0149d19-3745-411a-94de-c66590906efb\" (UID: \"d0149d19-3745-411a-94de-c66590906efb\") " Mar 19 09:51:08 crc kubenswrapper[4835]: I0319 09:51:08.080376 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-jd5vv\" (UniqueName: \"kubernetes.io/projected/d0149d19-3745-411a-94de-c66590906efb-kube-api-access-jd5vv\") pod \"d0149d19-3745-411a-94de-c66590906efb\" (UID: \"d0149d19-3745-411a-94de-c66590906efb\") " Mar 19 09:51:08 crc kubenswrapper[4835]: I0319 09:51:08.086817 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0149d19-3745-411a-94de-c66590906efb-kube-api-access-jd5vv" (OuterVolumeSpecName: "kube-api-access-jd5vv") pod "d0149d19-3745-411a-94de-c66590906efb" (UID: "d0149d19-3745-411a-94de-c66590906efb"). InnerVolumeSpecName "kube-api-access-jd5vv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:51:08 crc kubenswrapper[4835]: I0319 09:51:08.100128 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0149d19-3745-411a-94de-c66590906efb-scripts" (OuterVolumeSpecName: "scripts") pod "d0149d19-3745-411a-94de-c66590906efb" (UID: "d0149d19-3745-411a-94de-c66590906efb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:51:08 crc kubenswrapper[4835]: I0319 09:51:08.121677 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0149d19-3745-411a-94de-c66590906efb-config-data" (OuterVolumeSpecName: "config-data") pod "d0149d19-3745-411a-94de-c66590906efb" (UID: "d0149d19-3745-411a-94de-c66590906efb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:51:08 crc kubenswrapper[4835]: I0319 09:51:08.137088 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0149d19-3745-411a-94de-c66590906efb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0149d19-3745-411a-94de-c66590906efb" (UID: "d0149d19-3745-411a-94de-c66590906efb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:51:08 crc kubenswrapper[4835]: I0319 09:51:08.182822 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0149d19-3745-411a-94de-c66590906efb-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:51:08 crc kubenswrapper[4835]: I0319 09:51:08.182854 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0149d19-3745-411a-94de-c66590906efb-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:51:08 crc kubenswrapper[4835]: I0319 09:51:08.182867 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0149d19-3745-411a-94de-c66590906efb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:51:08 crc kubenswrapper[4835]: I0319 09:51:08.182879 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jd5vv\" (UniqueName: \"kubernetes.io/projected/d0149d19-3745-411a-94de-c66590906efb-kube-api-access-jd5vv\") on node \"crc\" DevicePath \"\"" Mar 19 09:51:08 crc kubenswrapper[4835]: I0319 09:51:08.526007 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-bv9wk" event={"ID":"d0149d19-3745-411a-94de-c66590906efb","Type":"ContainerDied","Data":"c96a4f8781e86d48084c3a2abc1a9b2a53bcfc518d5d7a7843ef5ca9cf8b5414"} Mar 19 09:51:08 crc kubenswrapper[4835]: I0319 09:51:08.526062 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c96a4f8781e86d48084c3a2abc1a9b2a53bcfc518d5d7a7843ef5ca9cf8b5414" Mar 19 09:51:08 crc kubenswrapper[4835]: I0319 09:51:08.526148 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-bv9wk" Mar 19 09:51:08 crc kubenswrapper[4835]: I0319 09:51:08.805819 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 09:51:08 crc kubenswrapper[4835]: I0319 09:51:08.806108 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f6bb0461-be40-4d05-893b-c9b3cc97c134" containerName="nova-scheduler-scheduler" containerID="cri-o://2009c8301b5ab6fc93a8cc2b9fa8ab86878b9b32dec14bd4965dccd1991aedb5" gracePeriod=30 Mar 19 09:51:08 crc kubenswrapper[4835]: I0319 09:51:08.825253 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 09:51:08 crc kubenswrapper[4835]: I0319 09:51:08.825611 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="68785834-c43e-449a-a139-7e319bfaa958" containerName="nova-api-log" containerID="cri-o://183e270f0a4b7455c63e8606035068f346d5b14cc376addd9580d0a57304a81b" gracePeriod=30 Mar 19 09:51:08 crc kubenswrapper[4835]: I0319 09:51:08.825870 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="68785834-c43e-449a-a139-7e319bfaa958" containerName="nova-api-api" containerID="cri-o://137d6de81bc70fdd5e2a8bc3c1cd57619f849b134667c0709d7b87541570ceae" gracePeriod=30 Mar 19 09:51:08 crc kubenswrapper[4835]: I0319 09:51:08.870683 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 09:51:08 crc kubenswrapper[4835]: I0319 09:51:08.871323 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c2efa60f-c31d-4e06-9a21-fc066054bc8d" containerName="nova-metadata-log" containerID="cri-o://7284e6b6fab45bcaa06a3923fd49d73b55785568ea4bacddd0ab41c842ed77e6" gracePeriod=30 Mar 19 09:51:08 crc kubenswrapper[4835]: I0319 09:51:08.871875 4835 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c2efa60f-c31d-4e06-9a21-fc066054bc8d" containerName="nova-metadata-metadata" containerID="cri-o://1bc4d60be8da07a45ae1397a34e240c6d48252301fb68ac1491ddfa827c767f1" gracePeriod=30 Mar 19 09:51:09 crc kubenswrapper[4835]: I0319 09:51:09.542911 4835 generic.go:334] "Generic (PLEG): container finished" podID="c2efa60f-c31d-4e06-9a21-fc066054bc8d" containerID="7284e6b6fab45bcaa06a3923fd49d73b55785568ea4bacddd0ab41c842ed77e6" exitCode=143 Mar 19 09:51:09 crc kubenswrapper[4835]: I0319 09:51:09.543019 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c2efa60f-c31d-4e06-9a21-fc066054bc8d","Type":"ContainerDied","Data":"7284e6b6fab45bcaa06a3923fd49d73b55785568ea4bacddd0ab41c842ed77e6"} Mar 19 09:51:09 crc kubenswrapper[4835]: I0319 09:51:09.546313 4835 generic.go:334] "Generic (PLEG): container finished" podID="68785834-c43e-449a-a139-7e319bfaa958" containerID="183e270f0a4b7455c63e8606035068f346d5b14cc376addd9580d0a57304a81b" exitCode=143 Mar 19 09:51:09 crc kubenswrapper[4835]: I0319 09:51:09.546369 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"68785834-c43e-449a-a139-7e319bfaa958","Type":"ContainerDied","Data":"183e270f0a4b7455c63e8606035068f346d5b14cc376addd9580d0a57304a81b"} Mar 19 09:51:10 crc kubenswrapper[4835]: E0319 09:51:10.839682 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2009c8301b5ab6fc93a8cc2b9fa8ab86878b9b32dec14bd4965dccd1991aedb5 is running failed: container process not found" containerID="2009c8301b5ab6fc93a8cc2b9fa8ab86878b9b32dec14bd4965dccd1991aedb5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 09:51:10 crc kubenswrapper[4835]: E0319 09:51:10.840753 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: 
code = NotFound desc = container is not created or running: checking if PID of 2009c8301b5ab6fc93a8cc2b9fa8ab86878b9b32dec14bd4965dccd1991aedb5 is running failed: container process not found" containerID="2009c8301b5ab6fc93a8cc2b9fa8ab86878b9b32dec14bd4965dccd1991aedb5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 09:51:10 crc kubenswrapper[4835]: E0319 09:51:10.841213 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2009c8301b5ab6fc93a8cc2b9fa8ab86878b9b32dec14bd4965dccd1991aedb5 is running failed: container process not found" containerID="2009c8301b5ab6fc93a8cc2b9fa8ab86878b9b32dec14bd4965dccd1991aedb5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 09:51:10 crc kubenswrapper[4835]: E0319 09:51:10.841245 4835 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2009c8301b5ab6fc93a8cc2b9fa8ab86878b9b32dec14bd4965dccd1991aedb5 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="f6bb0461-be40-4d05-893b-c9b3cc97c134" containerName="nova-scheduler-scheduler" Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.387522 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.471343 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6bb0461-be40-4d05-893b-c9b3cc97c134-config-data\") pod \"f6bb0461-be40-4d05-893b-c9b3cc97c134\" (UID: \"f6bb0461-be40-4d05-893b-c9b3cc97c134\") " Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.471384 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54qbf\" (UniqueName: \"kubernetes.io/projected/f6bb0461-be40-4d05-893b-c9b3cc97c134-kube-api-access-54qbf\") pod \"f6bb0461-be40-4d05-893b-c9b3cc97c134\" (UID: \"f6bb0461-be40-4d05-893b-c9b3cc97c134\") " Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.471579 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6bb0461-be40-4d05-893b-c9b3cc97c134-combined-ca-bundle\") pod \"f6bb0461-be40-4d05-893b-c9b3cc97c134\" (UID: \"f6bb0461-be40-4d05-893b-c9b3cc97c134\") " Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.480067 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6bb0461-be40-4d05-893b-c9b3cc97c134-kube-api-access-54qbf" (OuterVolumeSpecName: "kube-api-access-54qbf") pod "f6bb0461-be40-4d05-893b-c9b3cc97c134" (UID: "f6bb0461-be40-4d05-893b-c9b3cc97c134"). InnerVolumeSpecName "kube-api-access-54qbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.504480 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6bb0461-be40-4d05-893b-c9b3cc97c134-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6bb0461-be40-4d05-893b-c9b3cc97c134" (UID: "f6bb0461-be40-4d05-893b-c9b3cc97c134"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.506660 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6bb0461-be40-4d05-893b-c9b3cc97c134-config-data" (OuterVolumeSpecName: "config-data") pod "f6bb0461-be40-4d05-893b-c9b3cc97c134" (UID: "f6bb0461-be40-4d05-893b-c9b3cc97c134"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.570928 4835 generic.go:334] "Generic (PLEG): container finished" podID="f6bb0461-be40-4d05-893b-c9b3cc97c134" containerID="2009c8301b5ab6fc93a8cc2b9fa8ab86878b9b32dec14bd4965dccd1991aedb5" exitCode=0 Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.570987 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f6bb0461-be40-4d05-893b-c9b3cc97c134","Type":"ContainerDied","Data":"2009c8301b5ab6fc93a8cc2b9fa8ab86878b9b32dec14bd4965dccd1991aedb5"} Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.571023 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f6bb0461-be40-4d05-893b-c9b3cc97c134","Type":"ContainerDied","Data":"2ea6a9fff14e1ee8a78c09af7eaf845529d521b17215cf033968867f7959641d"} Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.571048 4835 scope.go:117] "RemoveContainer" containerID="2009c8301b5ab6fc93a8cc2b9fa8ab86878b9b32dec14bd4965dccd1991aedb5" Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.571230 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.579480 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6bb0461-be40-4d05-893b-c9b3cc97c134-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.579515 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6bb0461-be40-4d05-893b-c9b3cc97c134-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.579533 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54qbf\" (UniqueName: \"kubernetes.io/projected/f6bb0461-be40-4d05-893b-c9b3cc97c134-kube-api-access-54qbf\") on node \"crc\" DevicePath \"\"" Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.602028 4835 scope.go:117] "RemoveContainer" containerID="2009c8301b5ab6fc93a8cc2b9fa8ab86878b9b32dec14bd4965dccd1991aedb5" Mar 19 09:51:11 crc kubenswrapper[4835]: E0319 09:51:11.602493 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2009c8301b5ab6fc93a8cc2b9fa8ab86878b9b32dec14bd4965dccd1991aedb5\": container with ID starting with 2009c8301b5ab6fc93a8cc2b9fa8ab86878b9b32dec14bd4965dccd1991aedb5 not found: ID does not exist" containerID="2009c8301b5ab6fc93a8cc2b9fa8ab86878b9b32dec14bd4965dccd1991aedb5" Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.602541 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2009c8301b5ab6fc93a8cc2b9fa8ab86878b9b32dec14bd4965dccd1991aedb5"} err="failed to get container status \"2009c8301b5ab6fc93a8cc2b9fa8ab86878b9b32dec14bd4965dccd1991aedb5\": rpc error: code = NotFound desc = could not find container \"2009c8301b5ab6fc93a8cc2b9fa8ab86878b9b32dec14bd4965dccd1991aedb5\": container with ID 
starting with 2009c8301b5ab6fc93a8cc2b9fa8ab86878b9b32dec14bd4965dccd1991aedb5 not found: ID does not exist" Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.620165 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.638179 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.656593 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 09:51:11 crc kubenswrapper[4835]: E0319 09:51:11.657194 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0149d19-3745-411a-94de-c66590906efb" containerName="nova-manage" Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.657216 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0149d19-3745-411a-94de-c66590906efb" containerName="nova-manage" Mar 19 09:51:11 crc kubenswrapper[4835]: E0319 09:51:11.657236 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6bb0461-be40-4d05-893b-c9b3cc97c134" containerName="nova-scheduler-scheduler" Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.657242 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6bb0461-be40-4d05-893b-c9b3cc97c134" containerName="nova-scheduler-scheduler" Mar 19 09:51:11 crc kubenswrapper[4835]: E0319 09:51:11.657259 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b139b769-9c57-4ef8-b917-06fc8a1a2aca" containerName="init" Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.657264 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="b139b769-9c57-4ef8-b917-06fc8a1a2aca" containerName="init" Mar 19 09:51:11 crc kubenswrapper[4835]: E0319 09:51:11.657273 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b139b769-9c57-4ef8-b917-06fc8a1a2aca" containerName="dnsmasq-dns" Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 
09:51:11.657279 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="b139b769-9c57-4ef8-b917-06fc8a1a2aca" containerName="dnsmasq-dns" Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.657503 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6bb0461-be40-4d05-893b-c9b3cc97c134" containerName="nova-scheduler-scheduler" Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.657526 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0149d19-3745-411a-94de-c66590906efb" containerName="nova-manage" Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.657539 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="b139b769-9c57-4ef8-b917-06fc8a1a2aca" containerName="dnsmasq-dns" Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.658382 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.660491 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.672137 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.681155 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab42da98-c45d-473c-b86c-acba11787d21-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ab42da98-c45d-473c-b86c-acba11787d21\") " pod="openstack/nova-scheduler-0" Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.681244 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s49zm\" (UniqueName: \"kubernetes.io/projected/ab42da98-c45d-473c-b86c-acba11787d21-kube-api-access-s49zm\") pod \"nova-scheduler-0\" (UID: \"ab42da98-c45d-473c-b86c-acba11787d21\") " 
pod="openstack/nova-scheduler-0" Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.681357 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab42da98-c45d-473c-b86c-acba11787d21-config-data\") pod \"nova-scheduler-0\" (UID: \"ab42da98-c45d-473c-b86c-acba11787d21\") " pod="openstack/nova-scheduler-0" Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.783613 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab42da98-c45d-473c-b86c-acba11787d21-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ab42da98-c45d-473c-b86c-acba11787d21\") " pod="openstack/nova-scheduler-0" Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.783717 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s49zm\" (UniqueName: \"kubernetes.io/projected/ab42da98-c45d-473c-b86c-acba11787d21-kube-api-access-s49zm\") pod \"nova-scheduler-0\" (UID: \"ab42da98-c45d-473c-b86c-acba11787d21\") " pod="openstack/nova-scheduler-0" Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.783808 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab42da98-c45d-473c-b86c-acba11787d21-config-data\") pod \"nova-scheduler-0\" (UID: \"ab42da98-c45d-473c-b86c-acba11787d21\") " pod="openstack/nova-scheduler-0" Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.792375 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab42da98-c45d-473c-b86c-acba11787d21-config-data\") pod \"nova-scheduler-0\" (UID: \"ab42da98-c45d-473c-b86c-acba11787d21\") " pod="openstack/nova-scheduler-0" Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.793225 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab42da98-c45d-473c-b86c-acba11787d21-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ab42da98-c45d-473c-b86c-acba11787d21\") " pod="openstack/nova-scheduler-0" Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.799625 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s49zm\" (UniqueName: \"kubernetes.io/projected/ab42da98-c45d-473c-b86c-acba11787d21-kube-api-access-s49zm\") pod \"nova-scheduler-0\" (UID: \"ab42da98-c45d-473c-b86c-acba11787d21\") " pod="openstack/nova-scheduler-0" Mar 19 09:51:11 crc kubenswrapper[4835]: I0319 09:51:11.977771 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 09:51:12 crc kubenswrapper[4835]: I0319 09:51:12.417080 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6bb0461-be40-4d05-893b-c9b3cc97c134" path="/var/lib/kubelet/pods/f6bb0461-be40-4d05-893b-c9b3cc97c134/volumes" Mar 19 09:51:12 crc kubenswrapper[4835]: I0319 09:51:12.477191 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 09:51:12 crc kubenswrapper[4835]: W0319 09:51:12.477694 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab42da98_c45d_473c_b86c_acba11787d21.slice/crio-298881cdfd2a26f04b238bd1d34c98d019d5a85919e008a41b46f1a05728b93f WatchSource:0}: Error finding container 298881cdfd2a26f04b238bd1d34c98d019d5a85919e008a41b46f1a05728b93f: Status 404 returned error can't find the container with id 298881cdfd2a26f04b238bd1d34c98d019d5a85919e008a41b46f1a05728b93f Mar 19 09:51:12 crc kubenswrapper[4835]: I0319 09:51:12.584311 4835 generic.go:334] "Generic (PLEG): container finished" podID="c2efa60f-c31d-4e06-9a21-fc066054bc8d" containerID="1bc4d60be8da07a45ae1397a34e240c6d48252301fb68ac1491ddfa827c767f1" exitCode=0 Mar 19 09:51:12 crc 
kubenswrapper[4835]: I0319 09:51:12.584413 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c2efa60f-c31d-4e06-9a21-fc066054bc8d","Type":"ContainerDied","Data":"1bc4d60be8da07a45ae1397a34e240c6d48252301fb68ac1491ddfa827c767f1"} Mar 19 09:51:12 crc kubenswrapper[4835]: I0319 09:51:12.585876 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ab42da98-c45d-473c-b86c-acba11787d21","Type":"ContainerStarted","Data":"298881cdfd2a26f04b238bd1d34c98d019d5a85919e008a41b46f1a05728b93f"} Mar 19 09:51:12 crc kubenswrapper[4835]: I0319 09:51:12.589180 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 09:51:12 crc kubenswrapper[4835]: I0319 09:51:12.706804 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2efa60f-c31d-4e06-9a21-fc066054bc8d-nova-metadata-tls-certs\") pod \"c2efa60f-c31d-4e06-9a21-fc066054bc8d\" (UID: \"c2efa60f-c31d-4e06-9a21-fc066054bc8d\") " Mar 19 09:51:12 crc kubenswrapper[4835]: I0319 09:51:12.706897 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2efa60f-c31d-4e06-9a21-fc066054bc8d-config-data\") pod \"c2efa60f-c31d-4e06-9a21-fc066054bc8d\" (UID: \"c2efa60f-c31d-4e06-9a21-fc066054bc8d\") " Mar 19 09:51:12 crc kubenswrapper[4835]: I0319 09:51:12.707092 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fbws\" (UniqueName: \"kubernetes.io/projected/c2efa60f-c31d-4e06-9a21-fc066054bc8d-kube-api-access-5fbws\") pod \"c2efa60f-c31d-4e06-9a21-fc066054bc8d\" (UID: \"c2efa60f-c31d-4e06-9a21-fc066054bc8d\") " Mar 19 09:51:12 crc kubenswrapper[4835]: I0319 09:51:12.707243 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/c2efa60f-c31d-4e06-9a21-fc066054bc8d-logs\") pod \"c2efa60f-c31d-4e06-9a21-fc066054bc8d\" (UID: \"c2efa60f-c31d-4e06-9a21-fc066054bc8d\") " Mar 19 09:51:12 crc kubenswrapper[4835]: I0319 09:51:12.707338 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2efa60f-c31d-4e06-9a21-fc066054bc8d-combined-ca-bundle\") pod \"c2efa60f-c31d-4e06-9a21-fc066054bc8d\" (UID: \"c2efa60f-c31d-4e06-9a21-fc066054bc8d\") " Mar 19 09:51:12 crc kubenswrapper[4835]: I0319 09:51:12.721215 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2efa60f-c31d-4e06-9a21-fc066054bc8d-logs" (OuterVolumeSpecName: "logs") pod "c2efa60f-c31d-4e06-9a21-fc066054bc8d" (UID: "c2efa60f-c31d-4e06-9a21-fc066054bc8d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:51:12 crc kubenswrapper[4835]: I0319 09:51:12.726951 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2efa60f-c31d-4e06-9a21-fc066054bc8d-kube-api-access-5fbws" (OuterVolumeSpecName: "kube-api-access-5fbws") pod "c2efa60f-c31d-4e06-9a21-fc066054bc8d" (UID: "c2efa60f-c31d-4e06-9a21-fc066054bc8d"). InnerVolumeSpecName "kube-api-access-5fbws". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:51:12 crc kubenswrapper[4835]: I0319 09:51:12.758319 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2efa60f-c31d-4e06-9a21-fc066054bc8d-config-data" (OuterVolumeSpecName: "config-data") pod "c2efa60f-c31d-4e06-9a21-fc066054bc8d" (UID: "c2efa60f-c31d-4e06-9a21-fc066054bc8d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:51:12 crc kubenswrapper[4835]: I0319 09:51:12.758437 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2efa60f-c31d-4e06-9a21-fc066054bc8d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2efa60f-c31d-4e06-9a21-fc066054bc8d" (UID: "c2efa60f-c31d-4e06-9a21-fc066054bc8d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:51:12 crc kubenswrapper[4835]: I0319 09:51:12.788041 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2efa60f-c31d-4e06-9a21-fc066054bc8d-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "c2efa60f-c31d-4e06-9a21-fc066054bc8d" (UID: "c2efa60f-c31d-4e06-9a21-fc066054bc8d"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:51:12 crc kubenswrapper[4835]: I0319 09:51:12.818620 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2efa60f-c31d-4e06-9a21-fc066054bc8d-logs\") on node \"crc\" DevicePath \"\"" Mar 19 09:51:12 crc kubenswrapper[4835]: I0319 09:51:12.818664 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2efa60f-c31d-4e06-9a21-fc066054bc8d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:51:12 crc kubenswrapper[4835]: I0319 09:51:12.818681 4835 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2efa60f-c31d-4e06-9a21-fc066054bc8d-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 09:51:12 crc kubenswrapper[4835]: I0319 09:51:12.818692 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2efa60f-c31d-4e06-9a21-fc066054bc8d-config-data\") on 
node \"crc\" DevicePath \"\"" Mar 19 09:51:12 crc kubenswrapper[4835]: I0319 09:51:12.818703 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fbws\" (UniqueName: \"kubernetes.io/projected/c2efa60f-c31d-4e06-9a21-fc066054bc8d-kube-api-access-5fbws\") on node \"crc\" DevicePath \"\"" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.403886 4835 scope.go:117] "RemoveContainer" containerID="d93f2f0fef5a3fe52d6e4aab02e5290ac85405643bc520caaef82b7b23fd8ee3" Mar 19 09:51:13 crc kubenswrapper[4835]: E0319 09:51:13.404505 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.603580 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ab42da98-c45d-473c-b86c-acba11787d21","Type":"ContainerStarted","Data":"7ff151d877e79850b93b2bbb34d0e00f676764025e7d8d63043eefea02856a76"} Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.605929 4835 generic.go:334] "Generic (PLEG): container finished" podID="68785834-c43e-449a-a139-7e319bfaa958" containerID="137d6de81bc70fdd5e2a8bc3c1cd57619f849b134667c0709d7b87541570ceae" exitCode=0 Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.605988 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"68785834-c43e-449a-a139-7e319bfaa958","Type":"ContainerDied","Data":"137d6de81bc70fdd5e2a8bc3c1cd57619f849b134667c0709d7b87541570ceae"} Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.606009 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"68785834-c43e-449a-a139-7e319bfaa958","Type":"ContainerDied","Data":"710179a0b23aa283945ffa8be88894e8ca38b5fc06136bed4e60048166c4ea64"} Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.606022 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="710179a0b23aa283945ffa8be88894e8ca38b5fc06136bed4e60048166c4ea64" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.608056 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c2efa60f-c31d-4e06-9a21-fc066054bc8d","Type":"ContainerDied","Data":"a9a923edcfcf2a1ab8601d82382705749fe34df2c3ebb535cce319d88d6df562"} Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.608092 4835 scope.go:117] "RemoveContainer" containerID="1bc4d60be8da07a45ae1397a34e240c6d48252301fb68ac1491ddfa827c767f1" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.608210 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.631104 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.631070666 podStartE2EDuration="2.631070666s" podCreationTimestamp="2026-03-19 09:51:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:51:13.624030839 +0000 UTC m=+1728.472629446" watchObservedRunningTime="2026-03-19 09:51:13.631070666 +0000 UTC m=+1728.479669293" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.709062 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.726428 4835 scope.go:117] "RemoveContainer" containerID="7284e6b6fab45bcaa06a3923fd49d73b55785568ea4bacddd0ab41c842ed77e6" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.729159 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.750070 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.761948 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68785834-c43e-449a-a139-7e319bfaa958-config-data\") pod \"68785834-c43e-449a-a139-7e319bfaa958\" (UID: \"68785834-c43e-449a-a139-7e319bfaa958\") " Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.762111 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68785834-c43e-449a-a139-7e319bfaa958-public-tls-certs\") pod \"68785834-c43e-449a-a139-7e319bfaa958\" (UID: \"68785834-c43e-449a-a139-7e319bfaa958\") " Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.762238 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68785834-c43e-449a-a139-7e319bfaa958-logs\") pod \"68785834-c43e-449a-a139-7e319bfaa958\" (UID: \"68785834-c43e-449a-a139-7e319bfaa958\") " Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.762327 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68785834-c43e-449a-a139-7e319bfaa958-internal-tls-certs\") pod \"68785834-c43e-449a-a139-7e319bfaa958\" (UID: \"68785834-c43e-449a-a139-7e319bfaa958\") " Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.762380 4835 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqslf\" (UniqueName: \"kubernetes.io/projected/68785834-c43e-449a-a139-7e319bfaa958-kube-api-access-kqslf\") pod \"68785834-c43e-449a-a139-7e319bfaa958\" (UID: \"68785834-c43e-449a-a139-7e319bfaa958\") " Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.762405 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68785834-c43e-449a-a139-7e319bfaa958-combined-ca-bundle\") pod \"68785834-c43e-449a-a139-7e319bfaa958\" (UID: \"68785834-c43e-449a-a139-7e319bfaa958\") " Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.769548 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68785834-c43e-449a-a139-7e319bfaa958-logs" (OuterVolumeSpecName: "logs") pod "68785834-c43e-449a-a139-7e319bfaa958" (UID: "68785834-c43e-449a-a139-7e319bfaa958"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.789945 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68785834-c43e-449a-a139-7e319bfaa958-kube-api-access-kqslf" (OuterVolumeSpecName: "kube-api-access-kqslf") pod "68785834-c43e-449a-a139-7e319bfaa958" (UID: "68785834-c43e-449a-a139-7e319bfaa958"). InnerVolumeSpecName "kube-api-access-kqslf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.809904 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 19 09:51:13 crc kubenswrapper[4835]: E0319 09:51:13.810530 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68785834-c43e-449a-a139-7e319bfaa958" containerName="nova-api-log" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.810550 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="68785834-c43e-449a-a139-7e319bfaa958" containerName="nova-api-log" Mar 19 09:51:13 crc kubenswrapper[4835]: E0319 09:51:13.810581 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2efa60f-c31d-4e06-9a21-fc066054bc8d" containerName="nova-metadata-log" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.810589 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2efa60f-c31d-4e06-9a21-fc066054bc8d" containerName="nova-metadata-log" Mar 19 09:51:13 crc kubenswrapper[4835]: E0319 09:51:13.810676 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68785834-c43e-449a-a139-7e319bfaa958" containerName="nova-api-api" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.810686 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="68785834-c43e-449a-a139-7e319bfaa958" containerName="nova-api-api" Mar 19 09:51:13 crc kubenswrapper[4835]: E0319 09:51:13.810702 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2efa60f-c31d-4e06-9a21-fc066054bc8d" containerName="nova-metadata-metadata" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.810710 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2efa60f-c31d-4e06-9a21-fc066054bc8d" containerName="nova-metadata-metadata" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.811019 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="68785834-c43e-449a-a139-7e319bfaa958" containerName="nova-api-log" 
Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.811055 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="68785834-c43e-449a-a139-7e319bfaa958" containerName="nova-api-api" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.811068 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2efa60f-c31d-4e06-9a21-fc066054bc8d" containerName="nova-metadata-log" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.811088 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2efa60f-c31d-4e06-9a21-fc066054bc8d" containerName="nova-metadata-metadata" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.812514 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.816324 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.817853 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.818873 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68785834-c43e-449a-a139-7e319bfaa958-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68785834-c43e-449a-a139-7e319bfaa958" (UID: "68785834-c43e-449a-a139-7e319bfaa958"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.819830 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68785834-c43e-449a-a139-7e319bfaa958-config-data" (OuterVolumeSpecName: "config-data") pod "68785834-c43e-449a-a139-7e319bfaa958" (UID: "68785834-c43e-449a-a139-7e319bfaa958"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.863304 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.865456 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9459cf8a-fea8-4f9b-97bd-79333e608f41-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9459cf8a-fea8-4f9b-97bd-79333e608f41\") " pod="openstack/nova-metadata-0" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.865562 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9459cf8a-fea8-4f9b-97bd-79333e608f41-logs\") pod \"nova-metadata-0\" (UID: \"9459cf8a-fea8-4f9b-97bd-79333e608f41\") " pod="openstack/nova-metadata-0" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.865673 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs6w9\" (UniqueName: \"kubernetes.io/projected/9459cf8a-fea8-4f9b-97bd-79333e608f41-kube-api-access-rs6w9\") pod \"nova-metadata-0\" (UID: \"9459cf8a-fea8-4f9b-97bd-79333e608f41\") " pod="openstack/nova-metadata-0" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.865719 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9459cf8a-fea8-4f9b-97bd-79333e608f41-config-data\") pod \"nova-metadata-0\" (UID: \"9459cf8a-fea8-4f9b-97bd-79333e608f41\") " pod="openstack/nova-metadata-0" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.865905 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9459cf8a-fea8-4f9b-97bd-79333e608f41-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9459cf8a-fea8-4f9b-97bd-79333e608f41\") " pod="openstack/nova-metadata-0" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.866097 4835 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68785834-c43e-449a-a139-7e319bfaa958-logs\") on node \"crc\" DevicePath \"\"" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.866114 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqslf\" (UniqueName: \"kubernetes.io/projected/68785834-c43e-449a-a139-7e319bfaa958-kube-api-access-kqslf\") on node \"crc\" DevicePath \"\"" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.866128 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68785834-c43e-449a-a139-7e319bfaa958-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.866140 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68785834-c43e-449a-a139-7e319bfaa958-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.878616 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68785834-c43e-449a-a139-7e319bfaa958-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "68785834-c43e-449a-a139-7e319bfaa958" (UID: "68785834-c43e-449a-a139-7e319bfaa958"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.897583 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68785834-c43e-449a-a139-7e319bfaa958-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "68785834-c43e-449a-a139-7e319bfaa958" (UID: "68785834-c43e-449a-a139-7e319bfaa958"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.967717 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9459cf8a-fea8-4f9b-97bd-79333e608f41-logs\") pod \"nova-metadata-0\" (UID: \"9459cf8a-fea8-4f9b-97bd-79333e608f41\") " pod="openstack/nova-metadata-0" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.967816 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs6w9\" (UniqueName: \"kubernetes.io/projected/9459cf8a-fea8-4f9b-97bd-79333e608f41-kube-api-access-rs6w9\") pod \"nova-metadata-0\" (UID: \"9459cf8a-fea8-4f9b-97bd-79333e608f41\") " pod="openstack/nova-metadata-0" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.967851 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9459cf8a-fea8-4f9b-97bd-79333e608f41-config-data\") pod \"nova-metadata-0\" (UID: \"9459cf8a-fea8-4f9b-97bd-79333e608f41\") " pod="openstack/nova-metadata-0" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.967940 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9459cf8a-fea8-4f9b-97bd-79333e608f41-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9459cf8a-fea8-4f9b-97bd-79333e608f41\") " pod="openstack/nova-metadata-0" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.968052 
4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9459cf8a-fea8-4f9b-97bd-79333e608f41-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9459cf8a-fea8-4f9b-97bd-79333e608f41\") " pod="openstack/nova-metadata-0" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.968140 4835 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68785834-c43e-449a-a139-7e319bfaa958-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.968163 4835 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68785834-c43e-449a-a139-7e319bfaa958-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.968253 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9459cf8a-fea8-4f9b-97bd-79333e608f41-logs\") pod \"nova-metadata-0\" (UID: \"9459cf8a-fea8-4f9b-97bd-79333e608f41\") " pod="openstack/nova-metadata-0" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.972375 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9459cf8a-fea8-4f9b-97bd-79333e608f41-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9459cf8a-fea8-4f9b-97bd-79333e608f41\") " pod="openstack/nova-metadata-0" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.972455 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9459cf8a-fea8-4f9b-97bd-79333e608f41-config-data\") pod \"nova-metadata-0\" (UID: \"9459cf8a-fea8-4f9b-97bd-79333e608f41\") " pod="openstack/nova-metadata-0" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.972513 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9459cf8a-fea8-4f9b-97bd-79333e608f41-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9459cf8a-fea8-4f9b-97bd-79333e608f41\") " pod="openstack/nova-metadata-0" Mar 19 09:51:13 crc kubenswrapper[4835]: I0319 09:51:13.982146 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs6w9\" (UniqueName: \"kubernetes.io/projected/9459cf8a-fea8-4f9b-97bd-79333e608f41-kube-api-access-rs6w9\") pod \"nova-metadata-0\" (UID: \"9459cf8a-fea8-4f9b-97bd-79333e608f41\") " pod="openstack/nova-metadata-0" Mar 19 09:51:14 crc kubenswrapper[4835]: I0319 09:51:14.176539 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 09:51:14 crc kubenswrapper[4835]: I0319 09:51:14.429049 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2efa60f-c31d-4e06-9a21-fc066054bc8d" path="/var/lib/kubelet/pods/c2efa60f-c31d-4e06-9a21-fc066054bc8d/volumes" Mar 19 09:51:14 crc kubenswrapper[4835]: I0319 09:51:14.621427 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 09:51:14 crc kubenswrapper[4835]: I0319 09:51:14.658379 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 09:51:14 crc kubenswrapper[4835]: I0319 09:51:14.678204 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 19 09:51:14 crc kubenswrapper[4835]: I0319 09:51:14.692935 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 19 09:51:14 crc kubenswrapper[4835]: I0319 09:51:14.694939 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 09:51:14 crc kubenswrapper[4835]: I0319 09:51:14.699623 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 19 09:51:14 crc kubenswrapper[4835]: I0319 09:51:14.699892 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 19 09:51:14 crc kubenswrapper[4835]: I0319 09:51:14.700083 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 19 09:51:14 crc kubenswrapper[4835]: I0319 09:51:14.703440 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 09:51:14 crc kubenswrapper[4835]: W0319 09:51:14.731922 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9459cf8a_fea8_4f9b_97bd_79333e608f41.slice/crio-7ee60009088692aeb66f6b50a0c7a479a5c17d930cf9cb33d1380dadee810b85 WatchSource:0}: Error finding container 7ee60009088692aeb66f6b50a0c7a479a5c17d930cf9cb33d1380dadee810b85: Status 404 returned error can't find the container with id 7ee60009088692aeb66f6b50a0c7a479a5c17d930cf9cb33d1380dadee810b85 Mar 19 09:51:14 crc kubenswrapper[4835]: I0319 09:51:14.745321 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 09:51:14 crc kubenswrapper[4835]: I0319 09:51:14.804287 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48aa7e56-6bac-4cbd-be4b-13961be604b5-config-data\") pod \"nova-api-0\" (UID: \"48aa7e56-6bac-4cbd-be4b-13961be604b5\") " pod="openstack/nova-api-0" Mar 19 09:51:14 crc kubenswrapper[4835]: I0319 09:51:14.804361 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/48aa7e56-6bac-4cbd-be4b-13961be604b5-logs\") pod \"nova-api-0\" (UID: \"48aa7e56-6bac-4cbd-be4b-13961be604b5\") " pod="openstack/nova-api-0" Mar 19 09:51:14 crc kubenswrapper[4835]: I0319 09:51:14.804677 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48aa7e56-6bac-4cbd-be4b-13961be604b5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"48aa7e56-6bac-4cbd-be4b-13961be604b5\") " pod="openstack/nova-api-0" Mar 19 09:51:14 crc kubenswrapper[4835]: I0319 09:51:14.804810 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xslm\" (UniqueName: \"kubernetes.io/projected/48aa7e56-6bac-4cbd-be4b-13961be604b5-kube-api-access-8xslm\") pod \"nova-api-0\" (UID: \"48aa7e56-6bac-4cbd-be4b-13961be604b5\") " pod="openstack/nova-api-0" Mar 19 09:51:14 crc kubenswrapper[4835]: I0319 09:51:14.804847 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48aa7e56-6bac-4cbd-be4b-13961be604b5-internal-tls-certs\") pod \"nova-api-0\" (UID: \"48aa7e56-6bac-4cbd-be4b-13961be604b5\") " pod="openstack/nova-api-0" Mar 19 09:51:14 crc kubenswrapper[4835]: I0319 09:51:14.805081 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48aa7e56-6bac-4cbd-be4b-13961be604b5-public-tls-certs\") pod \"nova-api-0\" (UID: \"48aa7e56-6bac-4cbd-be4b-13961be604b5\") " pod="openstack/nova-api-0" Mar 19 09:51:14 crc kubenswrapper[4835]: I0319 09:51:14.907125 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48aa7e56-6bac-4cbd-be4b-13961be604b5-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"48aa7e56-6bac-4cbd-be4b-13961be604b5\") " pod="openstack/nova-api-0" Mar 19 09:51:14 crc kubenswrapper[4835]: I0319 09:51:14.907537 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xslm\" (UniqueName: \"kubernetes.io/projected/48aa7e56-6bac-4cbd-be4b-13961be604b5-kube-api-access-8xslm\") pod \"nova-api-0\" (UID: \"48aa7e56-6bac-4cbd-be4b-13961be604b5\") " pod="openstack/nova-api-0" Mar 19 09:51:14 crc kubenswrapper[4835]: I0319 09:51:14.907572 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48aa7e56-6bac-4cbd-be4b-13961be604b5-internal-tls-certs\") pod \"nova-api-0\" (UID: \"48aa7e56-6bac-4cbd-be4b-13961be604b5\") " pod="openstack/nova-api-0" Mar 19 09:51:14 crc kubenswrapper[4835]: I0319 09:51:14.908057 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48aa7e56-6bac-4cbd-be4b-13961be604b5-public-tls-certs\") pod \"nova-api-0\" (UID: \"48aa7e56-6bac-4cbd-be4b-13961be604b5\") " pod="openstack/nova-api-0" Mar 19 09:51:14 crc kubenswrapper[4835]: I0319 09:51:14.908586 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48aa7e56-6bac-4cbd-be4b-13961be604b5-config-data\") pod \"nova-api-0\" (UID: \"48aa7e56-6bac-4cbd-be4b-13961be604b5\") " pod="openstack/nova-api-0" Mar 19 09:51:14 crc kubenswrapper[4835]: I0319 09:51:14.908640 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48aa7e56-6bac-4cbd-be4b-13961be604b5-logs\") pod \"nova-api-0\" (UID: \"48aa7e56-6bac-4cbd-be4b-13961be604b5\") " pod="openstack/nova-api-0" Mar 19 09:51:14 crc kubenswrapper[4835]: I0319 09:51:14.908959 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/48aa7e56-6bac-4cbd-be4b-13961be604b5-logs\") pod \"nova-api-0\" (UID: \"48aa7e56-6bac-4cbd-be4b-13961be604b5\") " pod="openstack/nova-api-0" Mar 19 09:51:14 crc kubenswrapper[4835]: I0319 09:51:14.913394 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48aa7e56-6bac-4cbd-be4b-13961be604b5-public-tls-certs\") pod \"nova-api-0\" (UID: \"48aa7e56-6bac-4cbd-be4b-13961be604b5\") " pod="openstack/nova-api-0" Mar 19 09:51:14 crc kubenswrapper[4835]: I0319 09:51:14.913461 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48aa7e56-6bac-4cbd-be4b-13961be604b5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"48aa7e56-6bac-4cbd-be4b-13961be604b5\") " pod="openstack/nova-api-0" Mar 19 09:51:14 crc kubenswrapper[4835]: I0319 09:51:14.914691 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48aa7e56-6bac-4cbd-be4b-13961be604b5-config-data\") pod \"nova-api-0\" (UID: \"48aa7e56-6bac-4cbd-be4b-13961be604b5\") " pod="openstack/nova-api-0" Mar 19 09:51:14 crc kubenswrapper[4835]: I0319 09:51:14.926930 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48aa7e56-6bac-4cbd-be4b-13961be604b5-internal-tls-certs\") pod \"nova-api-0\" (UID: \"48aa7e56-6bac-4cbd-be4b-13961be604b5\") " pod="openstack/nova-api-0" Mar 19 09:51:14 crc kubenswrapper[4835]: I0319 09:51:14.935851 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xslm\" (UniqueName: \"kubernetes.io/projected/48aa7e56-6bac-4cbd-be4b-13961be604b5-kube-api-access-8xslm\") pod \"nova-api-0\" (UID: \"48aa7e56-6bac-4cbd-be4b-13961be604b5\") " pod="openstack/nova-api-0" Mar 19 09:51:15 crc kubenswrapper[4835]: I0319 09:51:15.024537 4835 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 09:51:15 crc kubenswrapper[4835]: I0319 09:51:15.502500 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 09:51:15 crc kubenswrapper[4835]: I0319 09:51:15.632468 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"48aa7e56-6bac-4cbd-be4b-13961be604b5","Type":"ContainerStarted","Data":"ae9375a74e9797b7904c5d21fef6e2d0374abbbd24cdca917f5b53e7330b7acb"} Mar 19 09:51:15 crc kubenswrapper[4835]: I0319 09:51:15.634979 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9459cf8a-fea8-4f9b-97bd-79333e608f41","Type":"ContainerStarted","Data":"556e0a48e365a22362778f139e130675e5ded6f1e1f4295f9a294e8a90a4f456"} Mar 19 09:51:15 crc kubenswrapper[4835]: I0319 09:51:15.635021 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9459cf8a-fea8-4f9b-97bd-79333e608f41","Type":"ContainerStarted","Data":"e93373c596a6ddb3ca75a695de425465f727892cfe1c2d7785204cfb7412d63f"} Mar 19 09:51:15 crc kubenswrapper[4835]: I0319 09:51:15.635031 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9459cf8a-fea8-4f9b-97bd-79333e608f41","Type":"ContainerStarted","Data":"7ee60009088692aeb66f6b50a0c7a479a5c17d930cf9cb33d1380dadee810b85"} Mar 19 09:51:15 crc kubenswrapper[4835]: I0319 09:51:15.652402 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.652383517 podStartE2EDuration="2.652383517s" podCreationTimestamp="2026-03-19 09:51:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:51:15.649590703 +0000 UTC m=+1730.498189310" watchObservedRunningTime="2026-03-19 09:51:15.652383517 +0000 UTC m=+1730.500982104" Mar 19 09:51:16 crc 
kubenswrapper[4835]: I0319 09:51:16.443626 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68785834-c43e-449a-a139-7e319bfaa958" path="/var/lib/kubelet/pods/68785834-c43e-449a-a139-7e319bfaa958/volumes" Mar 19 09:51:16 crc kubenswrapper[4835]: I0319 09:51:16.650439 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"48aa7e56-6bac-4cbd-be4b-13961be604b5","Type":"ContainerStarted","Data":"df8932feb2e85306f148b48befc87deee48550f6d2105d0b4c29a34a9ca84e92"} Mar 19 09:51:16 crc kubenswrapper[4835]: I0319 09:51:16.650494 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"48aa7e56-6bac-4cbd-be4b-13961be604b5","Type":"ContainerStarted","Data":"d465d78d52698a6f2cbf9d7aa63ccc593f35a526996806685bbf5801b83259cf"} Mar 19 09:51:16 crc kubenswrapper[4835]: I0319 09:51:16.691688 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.691665244 podStartE2EDuration="2.691665244s" podCreationTimestamp="2026-03-19 09:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:51:16.673887702 +0000 UTC m=+1731.522486309" watchObservedRunningTime="2026-03-19 09:51:16.691665244 +0000 UTC m=+1731.540263841" Mar 19 09:51:16 crc kubenswrapper[4835]: I0319 09:51:16.978053 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 19 09:51:21 crc kubenswrapper[4835]: I0319 09:51:21.978785 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 19 09:51:22 crc kubenswrapper[4835]: I0319 09:51:22.009622 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 19 09:51:22 crc kubenswrapper[4835]: I0319 09:51:22.751899 4835 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 19 09:51:24 crc kubenswrapper[4835]: I0319 09:51:24.178164 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 19 09:51:24 crc kubenswrapper[4835]: I0319 09:51:24.179123 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 19 09:51:25 crc kubenswrapper[4835]: I0319 09:51:25.025455 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 09:51:25 crc kubenswrapper[4835]: I0319 09:51:25.025728 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 09:51:25 crc kubenswrapper[4835]: I0319 09:51:25.196932 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9459cf8a-fea8-4f9b-97bd-79333e608f41" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.15:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:51:25 crc kubenswrapper[4835]: I0319 09:51:25.196976 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9459cf8a-fea8-4f9b-97bd-79333e608f41" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.15:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:51:25 crc kubenswrapper[4835]: I0319 09:51:25.776455 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 19 09:51:26 crc kubenswrapper[4835]: I0319 09:51:26.080164 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="48aa7e56-6bac-4cbd-be4b-13961be604b5" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.16:8774/\": net/http: request canceled (Client.Timeout exceeded while 
awaiting headers)" Mar 19 09:51:26 crc kubenswrapper[4835]: I0319 09:51:26.080586 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="48aa7e56-6bac-4cbd-be4b-13961be604b5" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.16:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 09:51:28 crc kubenswrapper[4835]: I0319 09:51:28.402259 4835 scope.go:117] "RemoveContainer" containerID="d93f2f0fef5a3fe52d6e4aab02e5290ac85405643bc520caaef82b7b23fd8ee3" Mar 19 09:51:28 crc kubenswrapper[4835]: E0319 09:51:28.402651 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 09:51:30 crc kubenswrapper[4835]: I0319 09:51:30.696333 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 09:51:30 crc kubenswrapper[4835]: I0319 09:51:30.696870 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="4ace457f-965b-4e28-9f09-0c49bc9df9f7" containerName="kube-state-metrics" containerID="cri-o://5f0c4b09db26d1f1d4e3e16d80865c015ec28f5d0a76a0b62971a6f663cee0f2" gracePeriod=30 Mar 19 09:51:30 crc kubenswrapper[4835]: I0319 09:51:30.813172 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 19 09:51:30 crc kubenswrapper[4835]: I0319 09:51:30.813816 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mysqld-exporter-0" podUID="677d6ad0-5fac-4b26-af6e-ed13a984e2ba" containerName="mysqld-exporter" 
containerID="cri-o://b09873d0b1b00b68f7ca0dc7fd991975e4195021197e9a408a71e4e1381a6bac" gracePeriod=30 Mar 19 09:51:30 crc kubenswrapper[4835]: I0319 09:51:30.838509 4835 generic.go:334] "Generic (PLEG): container finished" podID="4ace457f-965b-4e28-9f09-0c49bc9df9f7" containerID="5f0c4b09db26d1f1d4e3e16d80865c015ec28f5d0a76a0b62971a6f663cee0f2" exitCode=2 Mar 19 09:51:30 crc kubenswrapper[4835]: I0319 09:51:30.838564 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4ace457f-965b-4e28-9f09-0c49bc9df9f7","Type":"ContainerDied","Data":"5f0c4b09db26d1f1d4e3e16d80865c015ec28f5d0a76a0b62971a6f663cee0f2"} Mar 19 09:51:31 crc kubenswrapper[4835]: I0319 09:51:31.406294 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 19 09:51:31 crc kubenswrapper[4835]: I0319 09:51:31.518511 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rl66\" (UniqueName: \"kubernetes.io/projected/4ace457f-965b-4e28-9f09-0c49bc9df9f7-kube-api-access-8rl66\") pod \"4ace457f-965b-4e28-9f09-0c49bc9df9f7\" (UID: \"4ace457f-965b-4e28-9f09-0c49bc9df9f7\") " Mar 19 09:51:31 crc kubenswrapper[4835]: I0319 09:51:31.527872 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ace457f-965b-4e28-9f09-0c49bc9df9f7-kube-api-access-8rl66" (OuterVolumeSpecName: "kube-api-access-8rl66") pod "4ace457f-965b-4e28-9f09-0c49bc9df9f7" (UID: "4ace457f-965b-4e28-9f09-0c49bc9df9f7"). InnerVolumeSpecName "kube-api-access-8rl66". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:51:31 crc kubenswrapper[4835]: I0319 09:51:31.543369 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 19 09:51:31 crc kubenswrapper[4835]: I0319 09:51:31.626488 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rl66\" (UniqueName: \"kubernetes.io/projected/4ace457f-965b-4e28-9f09-0c49bc9df9f7-kube-api-access-8rl66\") on node \"crc\" DevicePath \"\"" Mar 19 09:51:31 crc kubenswrapper[4835]: I0319 09:51:31.747548 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/677d6ad0-5fac-4b26-af6e-ed13a984e2ba-combined-ca-bundle\") pod \"677d6ad0-5fac-4b26-af6e-ed13a984e2ba\" (UID: \"677d6ad0-5fac-4b26-af6e-ed13a984e2ba\") " Mar 19 09:51:31 crc kubenswrapper[4835]: I0319 09:51:31.747879 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx27d\" (UniqueName: \"kubernetes.io/projected/677d6ad0-5fac-4b26-af6e-ed13a984e2ba-kube-api-access-kx27d\") pod \"677d6ad0-5fac-4b26-af6e-ed13a984e2ba\" (UID: \"677d6ad0-5fac-4b26-af6e-ed13a984e2ba\") " Mar 19 09:51:31 crc kubenswrapper[4835]: I0319 09:51:31.748030 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/677d6ad0-5fac-4b26-af6e-ed13a984e2ba-config-data\") pod \"677d6ad0-5fac-4b26-af6e-ed13a984e2ba\" (UID: \"677d6ad0-5fac-4b26-af6e-ed13a984e2ba\") " Mar 19 09:51:31 crc kubenswrapper[4835]: I0319 09:51:31.764420 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/677d6ad0-5fac-4b26-af6e-ed13a984e2ba-kube-api-access-kx27d" (OuterVolumeSpecName: "kube-api-access-kx27d") pod "677d6ad0-5fac-4b26-af6e-ed13a984e2ba" (UID: "677d6ad0-5fac-4b26-af6e-ed13a984e2ba"). InnerVolumeSpecName "kube-api-access-kx27d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:51:31 crc kubenswrapper[4835]: I0319 09:51:31.819887 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/677d6ad0-5fac-4b26-af6e-ed13a984e2ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "677d6ad0-5fac-4b26-af6e-ed13a984e2ba" (UID: "677d6ad0-5fac-4b26-af6e-ed13a984e2ba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:51:31 crc kubenswrapper[4835]: I0319 09:51:31.851067 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/677d6ad0-5fac-4b26-af6e-ed13a984e2ba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:51:31 crc kubenswrapper[4835]: I0319 09:51:31.851109 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kx27d\" (UniqueName: \"kubernetes.io/projected/677d6ad0-5fac-4b26-af6e-ed13a984e2ba-kube-api-access-kx27d\") on node \"crc\" DevicePath \"\"" Mar 19 09:51:31 crc kubenswrapper[4835]: I0319 09:51:31.852282 4835 generic.go:334] "Generic (PLEG): container finished" podID="677d6ad0-5fac-4b26-af6e-ed13a984e2ba" containerID="b09873d0b1b00b68f7ca0dc7fd991975e4195021197e9a408a71e4e1381a6bac" exitCode=2 Mar 19 09:51:31 crc kubenswrapper[4835]: I0319 09:51:31.852434 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 19 09:51:31 crc kubenswrapper[4835]: I0319 09:51:31.853080 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"677d6ad0-5fac-4b26-af6e-ed13a984e2ba","Type":"ContainerDied","Data":"b09873d0b1b00b68f7ca0dc7fd991975e4195021197e9a408a71e4e1381a6bac"} Mar 19 09:51:31 crc kubenswrapper[4835]: I0319 09:51:31.853109 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"677d6ad0-5fac-4b26-af6e-ed13a984e2ba","Type":"ContainerDied","Data":"9677cb795b2569c7616deb8b63a9d052c8ae72bdb2735d98eb0ec48e07ed8bb7"} Mar 19 09:51:31 crc kubenswrapper[4835]: I0319 09:51:31.853126 4835 scope.go:117] "RemoveContainer" containerID="b09873d0b1b00b68f7ca0dc7fd991975e4195021197e9a408a71e4e1381a6bac" Mar 19 09:51:31 crc kubenswrapper[4835]: I0319 09:51:31.857326 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4ace457f-965b-4e28-9f09-0c49bc9df9f7","Type":"ContainerDied","Data":"2d1f7a25dea9c939643054ea34124f3d9d9b0f826c10f18bb82f484810626888"} Mar 19 09:51:31 crc kubenswrapper[4835]: I0319 09:51:31.857421 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 19 09:51:31 crc kubenswrapper[4835]: I0319 09:51:31.870716 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/677d6ad0-5fac-4b26-af6e-ed13a984e2ba-config-data" (OuterVolumeSpecName: "config-data") pod "677d6ad0-5fac-4b26-af6e-ed13a984e2ba" (UID: "677d6ad0-5fac-4b26-af6e-ed13a984e2ba"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:51:31 crc kubenswrapper[4835]: I0319 09:51:31.953777 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/677d6ad0-5fac-4b26-af6e-ed13a984e2ba-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:51:31 crc kubenswrapper[4835]: I0319 09:51:31.978214 4835 scope.go:117] "RemoveContainer" containerID="b09873d0b1b00b68f7ca0dc7fd991975e4195021197e9a408a71e4e1381a6bac" Mar 19 09:51:31 crc kubenswrapper[4835]: E0319 09:51:31.978685 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b09873d0b1b00b68f7ca0dc7fd991975e4195021197e9a408a71e4e1381a6bac\": container with ID starting with b09873d0b1b00b68f7ca0dc7fd991975e4195021197e9a408a71e4e1381a6bac not found: ID does not exist" containerID="b09873d0b1b00b68f7ca0dc7fd991975e4195021197e9a408a71e4e1381a6bac" Mar 19 09:51:31 crc kubenswrapper[4835]: I0319 09:51:31.978727 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b09873d0b1b00b68f7ca0dc7fd991975e4195021197e9a408a71e4e1381a6bac"} err="failed to get container status \"b09873d0b1b00b68f7ca0dc7fd991975e4195021197e9a408a71e4e1381a6bac\": rpc error: code = NotFound desc = could not find container \"b09873d0b1b00b68f7ca0dc7fd991975e4195021197e9a408a71e4e1381a6bac\": container with ID starting with b09873d0b1b00b68f7ca0dc7fd991975e4195021197e9a408a71e4e1381a6bac not found: ID does not exist" Mar 19 09:51:31 crc kubenswrapper[4835]: I0319 09:51:31.978770 4835 scope.go:117] "RemoveContainer" containerID="5f0c4b09db26d1f1d4e3e16d80865c015ec28f5d0a76a0b62971a6f663cee0f2" Mar 19 09:51:31 crc kubenswrapper[4835]: I0319 09:51:31.989117 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.000746 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/kube-state-metrics-0"] Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.026936 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 09:51:32 crc kubenswrapper[4835]: E0319 09:51:32.027520 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ace457f-965b-4e28-9f09-0c49bc9df9f7" containerName="kube-state-metrics" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.027535 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ace457f-965b-4e28-9f09-0c49bc9df9f7" containerName="kube-state-metrics" Mar 19 09:51:32 crc kubenswrapper[4835]: E0319 09:51:32.027548 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="677d6ad0-5fac-4b26-af6e-ed13a984e2ba" containerName="mysqld-exporter" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.027554 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="677d6ad0-5fac-4b26-af6e-ed13a984e2ba" containerName="mysqld-exporter" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.027799 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ace457f-965b-4e28-9f09-0c49bc9df9f7" containerName="kube-state-metrics" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.027835 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="677d6ad0-5fac-4b26-af6e-ed13a984e2ba" containerName="mysqld-exporter" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.028793 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.032401 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.032575 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.037685 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.177370 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.177704 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.185476 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84a60e76-5b98-4cdd-b2ea-849d4fcbc215-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"84a60e76-5b98-4cdd-b2ea-849d4fcbc215\") " pod="openstack/kube-state-metrics-0" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.185642 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/84a60e76-5b98-4cdd-b2ea-849d4fcbc215-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"84a60e76-5b98-4cdd-b2ea-849d4fcbc215\") " pod="openstack/kube-state-metrics-0" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.185687 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/84a60e76-5b98-4cdd-b2ea-849d4fcbc215-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"84a60e76-5b98-4cdd-b2ea-849d4fcbc215\") " pod="openstack/kube-state-metrics-0" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.186767 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mszrl\" (UniqueName: \"kubernetes.io/projected/84a60e76-5b98-4cdd-b2ea-849d4fcbc215-kube-api-access-mszrl\") pod \"kube-state-metrics-0\" (UID: \"84a60e76-5b98-4cdd-b2ea-849d4fcbc215\") " pod="openstack/kube-state-metrics-0" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.203060 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.214759 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.230240 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.232257 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.242803 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.250104 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.250391 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-mysqld-exporter-svc" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.289854 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84a60e76-5b98-4cdd-b2ea-849d4fcbc215-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"84a60e76-5b98-4cdd-b2ea-849d4fcbc215\") " pod="openstack/kube-state-metrics-0" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.289953 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b815f8f1-d671-416f-adb1-4001dc6891a3-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"b815f8f1-d671-416f-adb1-4001dc6891a3\") " pod="openstack/mysqld-exporter-0" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.289982 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b815f8f1-d671-416f-adb1-4001dc6891a3-config-data\") pod \"mysqld-exporter-0\" (UID: \"b815f8f1-d671-416f-adb1-4001dc6891a3\") " pod="openstack/mysqld-exporter-0" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.290008 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/84a60e76-5b98-4cdd-b2ea-849d4fcbc215-kube-state-metrics-tls-certs\") pod 
\"kube-state-metrics-0\" (UID: \"84a60e76-5b98-4cdd-b2ea-849d4fcbc215\") " pod="openstack/kube-state-metrics-0" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.290033 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/84a60e76-5b98-4cdd-b2ea-849d4fcbc215-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"84a60e76-5b98-4cdd-b2ea-849d4fcbc215\") " pod="openstack/kube-state-metrics-0" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.290110 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mszrl\" (UniqueName: \"kubernetes.io/projected/84a60e76-5b98-4cdd-b2ea-849d4fcbc215-kube-api-access-mszrl\") pod \"kube-state-metrics-0\" (UID: \"84a60e76-5b98-4cdd-b2ea-849d4fcbc215\") " pod="openstack/kube-state-metrics-0" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.290166 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr4zv\" (UniqueName: \"kubernetes.io/projected/b815f8f1-d671-416f-adb1-4001dc6891a3-kube-api-access-qr4zv\") pod \"mysqld-exporter-0\" (UID: \"b815f8f1-d671-416f-adb1-4001dc6891a3\") " pod="openstack/mysqld-exporter-0" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.290219 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/b815f8f1-d671-416f-adb1-4001dc6891a3-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"b815f8f1-d671-416f-adb1-4001dc6891a3\") " pod="openstack/mysqld-exporter-0" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.296390 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84a60e76-5b98-4cdd-b2ea-849d4fcbc215-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: 
\"84a60e76-5b98-4cdd-b2ea-849d4fcbc215\") " pod="openstack/kube-state-metrics-0" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.296471 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/84a60e76-5b98-4cdd-b2ea-849d4fcbc215-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"84a60e76-5b98-4cdd-b2ea-849d4fcbc215\") " pod="openstack/kube-state-metrics-0" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.314040 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mszrl\" (UniqueName: \"kubernetes.io/projected/84a60e76-5b98-4cdd-b2ea-849d4fcbc215-kube-api-access-mszrl\") pod \"kube-state-metrics-0\" (UID: \"84a60e76-5b98-4cdd-b2ea-849d4fcbc215\") " pod="openstack/kube-state-metrics-0" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.314906 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/84a60e76-5b98-4cdd-b2ea-849d4fcbc215-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"84a60e76-5b98-4cdd-b2ea-849d4fcbc215\") " pod="openstack/kube-state-metrics-0" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.392144 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b815f8f1-d671-416f-adb1-4001dc6891a3-config-data\") pod \"mysqld-exporter-0\" (UID: \"b815f8f1-d671-416f-adb1-4001dc6891a3\") " pod="openstack/mysqld-exporter-0" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.392291 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr4zv\" (UniqueName: \"kubernetes.io/projected/b815f8f1-d671-416f-adb1-4001dc6891a3-kube-api-access-qr4zv\") pod \"mysqld-exporter-0\" (UID: \"b815f8f1-d671-416f-adb1-4001dc6891a3\") " pod="openstack/mysqld-exporter-0" Mar 19 09:51:32 
crc kubenswrapper[4835]: I0319 09:51:32.392338 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/b815f8f1-d671-416f-adb1-4001dc6891a3-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"b815f8f1-d671-416f-adb1-4001dc6891a3\") " pod="openstack/mysqld-exporter-0" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.392453 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b815f8f1-d671-416f-adb1-4001dc6891a3-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"b815f8f1-d671-416f-adb1-4001dc6891a3\") " pod="openstack/mysqld-exporter-0" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.397317 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b815f8f1-d671-416f-adb1-4001dc6891a3-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"b815f8f1-d671-416f-adb1-4001dc6891a3\") " pod="openstack/mysqld-exporter-0" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.404701 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.408339 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b815f8f1-d671-416f-adb1-4001dc6891a3-config-data\") pod \"mysqld-exporter-0\" (UID: \"b815f8f1-d671-416f-adb1-4001dc6891a3\") " pod="openstack/mysqld-exporter-0" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.409028 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/b815f8f1-d671-416f-adb1-4001dc6891a3-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"b815f8f1-d671-416f-adb1-4001dc6891a3\") " pod="openstack/mysqld-exporter-0" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.414317 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr4zv\" (UniqueName: \"kubernetes.io/projected/b815f8f1-d671-416f-adb1-4001dc6891a3-kube-api-access-qr4zv\") pod \"mysqld-exporter-0\" (UID: \"b815f8f1-d671-416f-adb1-4001dc6891a3\") " pod="openstack/mysqld-exporter-0" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.431497 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ace457f-965b-4e28-9f09-0c49bc9df9f7" path="/var/lib/kubelet/pods/4ace457f-965b-4e28-9f09-0c49bc9df9f7/volumes" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.432125 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="677d6ad0-5fac-4b26-af6e-ed13a984e2ba" path="/var/lib/kubelet/pods/677d6ad0-5fac-4b26-af6e-ed13a984e2ba/volumes" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.555100 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 19 09:51:32 crc kubenswrapper[4835]: I0319 09:51:32.890778 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 09:51:33 crc kubenswrapper[4835]: I0319 09:51:33.025093 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 09:51:33 crc kubenswrapper[4835]: I0319 09:51:33.025917 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 09:51:33 crc kubenswrapper[4835]: I0319 09:51:33.052052 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 19 09:51:33 crc kubenswrapper[4835]: W0319 09:51:33.057072 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb815f8f1_d671_416f_adb1_4001dc6891a3.slice/crio-f909be7ccc3fae69008d92a16d7b81ad0c435cd71be92e27b13f2ac3a9660abe WatchSource:0}: Error finding container f909be7ccc3fae69008d92a16d7b81ad0c435cd71be92e27b13f2ac3a9660abe: Status 404 returned error can't find the container with id f909be7ccc3fae69008d92a16d7b81ad0c435cd71be92e27b13f2ac3a9660abe Mar 19 09:51:33 crc kubenswrapper[4835]: I0319 09:51:33.399979 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:51:33 crc kubenswrapper[4835]: I0319 09:51:33.400230 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ad121ad3-667b-4b8a-a2ef-29e24a85c34a" containerName="ceilometer-central-agent" containerID="cri-o://1a41f6899c4b9fe9b6b992a8c8b14c598d3f7c5513569f7f32486d8cdfb3e9c3" gracePeriod=30 Mar 19 09:51:33 crc kubenswrapper[4835]: I0319 09:51:33.400494 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ad121ad3-667b-4b8a-a2ef-29e24a85c34a" containerName="proxy-httpd" 
containerID="cri-o://c17a1fbebb5fbf5951b8313b8eb7bad23b3847c2ed27a257bd5d75e0c460a2af" gracePeriod=30 Mar 19 09:51:33 crc kubenswrapper[4835]: I0319 09:51:33.400558 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ad121ad3-667b-4b8a-a2ef-29e24a85c34a" containerName="ceilometer-notification-agent" containerID="cri-o://fb4ae9fffc327fd8461a8905f2d3c462ae630d627246cff005daac50646b323a" gracePeriod=30 Mar 19 09:51:33 crc kubenswrapper[4835]: I0319 09:51:33.400593 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ad121ad3-667b-4b8a-a2ef-29e24a85c34a" containerName="sg-core" containerID="cri-o://9c1ed7172f8fb64afc212c193834b28adb10a130a8519197c976b7e05cc32887" gracePeriod=30 Mar 19 09:51:33 crc kubenswrapper[4835]: I0319 09:51:33.897084 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"84a60e76-5b98-4cdd-b2ea-849d4fcbc215","Type":"ContainerStarted","Data":"f5e366d88e5c47699c18f5303a035527820bf98f2285d39be13a681b97d1105c"} Mar 19 09:51:33 crc kubenswrapper[4835]: I0319 09:51:33.897591 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"84a60e76-5b98-4cdd-b2ea-849d4fcbc215","Type":"ContainerStarted","Data":"d7d44b54a484412688c8b15248d5b88324664cf234b58287519b5a38f5cd3eb3"} Mar 19 09:51:33 crc kubenswrapper[4835]: I0319 09:51:33.897615 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 19 09:51:33 crc kubenswrapper[4835]: I0319 09:51:33.900175 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"b815f8f1-d671-416f-adb1-4001dc6891a3","Type":"ContainerStarted","Data":"355467d3b4b4d4d86813ae3ac92098ed0ea980d724ef051f0fcc79239299ea20"} Mar 19 09:51:33 crc kubenswrapper[4835]: I0319 09:51:33.900201 4835 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/mysqld-exporter-0" event={"ID":"b815f8f1-d671-416f-adb1-4001dc6891a3","Type":"ContainerStarted","Data":"f909be7ccc3fae69008d92a16d7b81ad0c435cd71be92e27b13f2ac3a9660abe"} Mar 19 09:51:33 crc kubenswrapper[4835]: I0319 09:51:33.903734 4835 generic.go:334] "Generic (PLEG): container finished" podID="ad121ad3-667b-4b8a-a2ef-29e24a85c34a" containerID="c17a1fbebb5fbf5951b8313b8eb7bad23b3847c2ed27a257bd5d75e0c460a2af" exitCode=0 Mar 19 09:51:33 crc kubenswrapper[4835]: I0319 09:51:33.903779 4835 generic.go:334] "Generic (PLEG): container finished" podID="ad121ad3-667b-4b8a-a2ef-29e24a85c34a" containerID="9c1ed7172f8fb64afc212c193834b28adb10a130a8519197c976b7e05cc32887" exitCode=2 Mar 19 09:51:33 crc kubenswrapper[4835]: I0319 09:51:33.903780 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad121ad3-667b-4b8a-a2ef-29e24a85c34a","Type":"ContainerDied","Data":"c17a1fbebb5fbf5951b8313b8eb7bad23b3847c2ed27a257bd5d75e0c460a2af"} Mar 19 09:51:33 crc kubenswrapper[4835]: I0319 09:51:33.903819 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad121ad3-667b-4b8a-a2ef-29e24a85c34a","Type":"ContainerDied","Data":"9c1ed7172f8fb64afc212c193834b28adb10a130a8519197c976b7e05cc32887"} Mar 19 09:51:33 crc kubenswrapper[4835]: I0319 09:51:33.903845 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad121ad3-667b-4b8a-a2ef-29e24a85c34a","Type":"ContainerDied","Data":"1a41f6899c4b9fe9b6b992a8c8b14c598d3f7c5513569f7f32486d8cdfb3e9c3"} Mar 19 09:51:33 crc kubenswrapper[4835]: I0319 09:51:33.903786 4835 generic.go:334] "Generic (PLEG): container finished" podID="ad121ad3-667b-4b8a-a2ef-29e24a85c34a" containerID="1a41f6899c4b9fe9b6b992a8c8b14c598d3f7c5513569f7f32486d8cdfb3e9c3" exitCode=0 Mar 19 09:51:33 crc kubenswrapper[4835]: I0319 09:51:33.941341 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/kube-state-metrics-0" podStartSLOduration=1.5496359119999998 podStartE2EDuration="1.94132096s" podCreationTimestamp="2026-03-19 09:51:32 +0000 UTC" firstStartedPulling="2026-03-19 09:51:32.906934972 +0000 UTC m=+1747.755533559" lastFinishedPulling="2026-03-19 09:51:33.29862002 +0000 UTC m=+1748.147218607" observedRunningTime="2026-03-19 09:51:33.908917477 +0000 UTC m=+1748.757516054" watchObservedRunningTime="2026-03-19 09:51:33.94132096 +0000 UTC m=+1748.789919547" Mar 19 09:51:33 crc kubenswrapper[4835]: I0319 09:51:33.948386 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=1.470979089 podStartE2EDuration="1.948368597s" podCreationTimestamp="2026-03-19 09:51:32 +0000 UTC" firstStartedPulling="2026-03-19 09:51:33.062030952 +0000 UTC m=+1747.910629539" lastFinishedPulling="2026-03-19 09:51:33.53942046 +0000 UTC m=+1748.388019047" observedRunningTime="2026-03-19 09:51:33.929962407 +0000 UTC m=+1748.778560994" watchObservedRunningTime="2026-03-19 09:51:33.948368597 +0000 UTC m=+1748.796967184" Mar 19 09:51:34 crc kubenswrapper[4835]: I0319 09:51:34.201328 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 19 09:51:34 crc kubenswrapper[4835]: I0319 09:51:34.255281 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 19 09:51:34 crc kubenswrapper[4835]: I0319 09:51:34.264400 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 19 09:51:34 crc kubenswrapper[4835]: I0319 09:51:34.919230 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 19 09:51:35 crc kubenswrapper[4835]: I0319 09:51:35.034421 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 19 09:51:35 crc kubenswrapper[4835]: I0319 
09:51:35.034508 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 19 09:51:35 crc kubenswrapper[4835]: I0319 09:51:35.049196 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 19 09:51:35 crc kubenswrapper[4835]: I0319 09:51:35.051020 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 19 09:51:36 crc kubenswrapper[4835]: I0319 09:51:36.937436 4835 generic.go:334] "Generic (PLEG): container finished" podID="ad121ad3-667b-4b8a-a2ef-29e24a85c34a" containerID="fb4ae9fffc327fd8461a8905f2d3c462ae630d627246cff005daac50646b323a" exitCode=0 Mar 19 09:51:36 crc kubenswrapper[4835]: I0319 09:51:36.939079 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad121ad3-667b-4b8a-a2ef-29e24a85c34a","Type":"ContainerDied","Data":"fb4ae9fffc327fd8461a8905f2d3c462ae630d627246cff005daac50646b323a"} Mar 19 09:51:37 crc kubenswrapper[4835]: I0319 09:51:37.289808 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 09:51:37 crc kubenswrapper[4835]: I0319 09:51:37.419111 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-run-httpd\") pod \"ad121ad3-667b-4b8a-a2ef-29e24a85c34a\" (UID: \"ad121ad3-667b-4b8a-a2ef-29e24a85c34a\") " Mar 19 09:51:37 crc kubenswrapper[4835]: I0319 09:51:37.419574 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ad121ad3-667b-4b8a-a2ef-29e24a85c34a" (UID: "ad121ad3-667b-4b8a-a2ef-29e24a85c34a"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:51:37 crc kubenswrapper[4835]: I0319 09:51:37.419848 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-config-data\") pod \"ad121ad3-667b-4b8a-a2ef-29e24a85c34a\" (UID: \"ad121ad3-667b-4b8a-a2ef-29e24a85c34a\") " Mar 19 09:51:37 crc kubenswrapper[4835]: I0319 09:51:37.419898 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-log-httpd\") pod \"ad121ad3-667b-4b8a-a2ef-29e24a85c34a\" (UID: \"ad121ad3-667b-4b8a-a2ef-29e24a85c34a\") " Mar 19 09:51:37 crc kubenswrapper[4835]: I0319 09:51:37.420131 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-sg-core-conf-yaml\") pod \"ad121ad3-667b-4b8a-a2ef-29e24a85c34a\" (UID: \"ad121ad3-667b-4b8a-a2ef-29e24a85c34a\") " Mar 19 09:51:37 crc kubenswrapper[4835]: I0319 09:51:37.420245 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ad121ad3-667b-4b8a-a2ef-29e24a85c34a" (UID: "ad121ad3-667b-4b8a-a2ef-29e24a85c34a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:51:37 crc kubenswrapper[4835]: I0319 09:51:37.420449 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbfj2\" (UniqueName: \"kubernetes.io/projected/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-kube-api-access-rbfj2\") pod \"ad121ad3-667b-4b8a-a2ef-29e24a85c34a\" (UID: \"ad121ad3-667b-4b8a-a2ef-29e24a85c34a\") " Mar 19 09:51:37 crc kubenswrapper[4835]: I0319 09:51:37.420503 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-scripts\") pod \"ad121ad3-667b-4b8a-a2ef-29e24a85c34a\" (UID: \"ad121ad3-667b-4b8a-a2ef-29e24a85c34a\") " Mar 19 09:51:37 crc kubenswrapper[4835]: I0319 09:51:37.420537 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-combined-ca-bundle\") pod \"ad121ad3-667b-4b8a-a2ef-29e24a85c34a\" (UID: \"ad121ad3-667b-4b8a-a2ef-29e24a85c34a\") " Mar 19 09:51:37 crc kubenswrapper[4835]: I0319 09:51:37.421465 4835 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 09:51:37 crc kubenswrapper[4835]: I0319 09:51:37.421483 4835 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 09:51:37 crc kubenswrapper[4835]: I0319 09:51:37.425715 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-scripts" (OuterVolumeSpecName: "scripts") pod "ad121ad3-667b-4b8a-a2ef-29e24a85c34a" (UID: "ad121ad3-667b-4b8a-a2ef-29e24a85c34a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:51:37 crc kubenswrapper[4835]: I0319 09:51:37.430877 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-kube-api-access-rbfj2" (OuterVolumeSpecName: "kube-api-access-rbfj2") pod "ad121ad3-667b-4b8a-a2ef-29e24a85c34a" (UID: "ad121ad3-667b-4b8a-a2ef-29e24a85c34a"). InnerVolumeSpecName "kube-api-access-rbfj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:51:37 crc kubenswrapper[4835]: I0319 09:51:37.469964 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ad121ad3-667b-4b8a-a2ef-29e24a85c34a" (UID: "ad121ad3-667b-4b8a-a2ef-29e24a85c34a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:51:37 crc kubenswrapper[4835]: I0319 09:51:37.524560 4835 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 09:51:37 crc kubenswrapper[4835]: I0319 09:51:37.524663 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbfj2\" (UniqueName: \"kubernetes.io/projected/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-kube-api-access-rbfj2\") on node \"crc\" DevicePath \"\"" Mar 19 09:51:37 crc kubenswrapper[4835]: I0319 09:51:37.524726 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:51:37 crc kubenswrapper[4835]: I0319 09:51:37.530936 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-combined-ca-bundle" 
(OuterVolumeSpecName: "combined-ca-bundle") pod "ad121ad3-667b-4b8a-a2ef-29e24a85c34a" (UID: "ad121ad3-667b-4b8a-a2ef-29e24a85c34a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:51:37 crc kubenswrapper[4835]: I0319 09:51:37.584549 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-config-data" (OuterVolumeSpecName: "config-data") pod "ad121ad3-667b-4b8a-a2ef-29e24a85c34a" (UID: "ad121ad3-667b-4b8a-a2ef-29e24a85c34a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:51:37 crc kubenswrapper[4835]: I0319 09:51:37.629528 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:51:37 crc kubenswrapper[4835]: I0319 09:51:37.630276 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad121ad3-667b-4b8a-a2ef-29e24a85c34a-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:51:37 crc kubenswrapper[4835]: I0319 09:51:37.952106 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad121ad3-667b-4b8a-a2ef-29e24a85c34a","Type":"ContainerDied","Data":"586cb92e76279c21ef7b018fa794298224915b1f0e6feab5468ec4fded318b56"} Mar 19 09:51:37 crc kubenswrapper[4835]: I0319 09:51:37.952171 4835 scope.go:117] "RemoveContainer" containerID="c17a1fbebb5fbf5951b8313b8eb7bad23b3847c2ed27a257bd5d75e0c460a2af" Mar 19 09:51:37 crc kubenswrapper[4835]: I0319 09:51:37.952234 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 09:51:37 crc kubenswrapper[4835]: I0319 09:51:37.973110 4835 scope.go:117] "RemoveContainer" containerID="9c1ed7172f8fb64afc212c193834b28adb10a130a8519197c976b7e05cc32887" Mar 19 09:51:37 crc kubenswrapper[4835]: I0319 09:51:37.996211 4835 scope.go:117] "RemoveContainer" containerID="fb4ae9fffc327fd8461a8905f2d3c462ae630d627246cff005daac50646b323a" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.009181 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.033837 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.039585 4835 scope.go:117] "RemoveContainer" containerID="1a41f6899c4b9fe9b6b992a8c8b14c598d3f7c5513569f7f32486d8cdfb3e9c3" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.050611 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:51:38 crc kubenswrapper[4835]: E0319 09:51:38.051318 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad121ad3-667b-4b8a-a2ef-29e24a85c34a" containerName="ceilometer-central-agent" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.051341 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad121ad3-667b-4b8a-a2ef-29e24a85c34a" containerName="ceilometer-central-agent" Mar 19 09:51:38 crc kubenswrapper[4835]: E0319 09:51:38.051393 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad121ad3-667b-4b8a-a2ef-29e24a85c34a" containerName="sg-core" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.051403 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad121ad3-667b-4b8a-a2ef-29e24a85c34a" containerName="sg-core" Mar 19 09:51:38 crc kubenswrapper[4835]: E0319 09:51:38.051424 4835 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ad121ad3-667b-4b8a-a2ef-29e24a85c34a" containerName="ceilometer-notification-agent" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.051434 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad121ad3-667b-4b8a-a2ef-29e24a85c34a" containerName="ceilometer-notification-agent" Mar 19 09:51:38 crc kubenswrapper[4835]: E0319 09:51:38.051451 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad121ad3-667b-4b8a-a2ef-29e24a85c34a" containerName="proxy-httpd" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.051460 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad121ad3-667b-4b8a-a2ef-29e24a85c34a" containerName="proxy-httpd" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.051732 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad121ad3-667b-4b8a-a2ef-29e24a85c34a" containerName="proxy-httpd" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.051796 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad121ad3-667b-4b8a-a2ef-29e24a85c34a" containerName="sg-core" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.051817 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad121ad3-667b-4b8a-a2ef-29e24a85c34a" containerName="ceilometer-central-agent" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.051834 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad121ad3-667b-4b8a-a2ef-29e24a85c34a" containerName="ceilometer-notification-agent" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.054718 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.071085 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.076553 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.076881 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.081246 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.244396 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a2d577-440d-4c21-a116-b2c0b8a3ea22-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\") " pod="openstack/ceilometer-0" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.244555 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05a2d577-440d-4c21-a116-b2c0b8a3ea22-log-httpd\") pod \"ceilometer-0\" (UID: \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\") " pod="openstack/ceilometer-0" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.244735 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05a2d577-440d-4c21-a116-b2c0b8a3ea22-scripts\") pod \"ceilometer-0\" (UID: \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\") " pod="openstack/ceilometer-0" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.244878 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/05a2d577-440d-4c21-a116-b2c0b8a3ea22-config-data\") pod \"ceilometer-0\" (UID: \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\") " pod="openstack/ceilometer-0" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.245097 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/05a2d577-440d-4c21-a116-b2c0b8a3ea22-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\") " pod="openstack/ceilometer-0" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.245334 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll55p\" (UniqueName: \"kubernetes.io/projected/05a2d577-440d-4c21-a116-b2c0b8a3ea22-kube-api-access-ll55p\") pod \"ceilometer-0\" (UID: \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\") " pod="openstack/ceilometer-0" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.245664 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/05a2d577-440d-4c21-a116-b2c0b8a3ea22-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\") " pod="openstack/ceilometer-0" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.245728 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05a2d577-440d-4c21-a116-b2c0b8a3ea22-run-httpd\") pod \"ceilometer-0\" (UID: \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\") " pod="openstack/ceilometer-0" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.347706 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05a2d577-440d-4c21-a116-b2c0b8a3ea22-scripts\") pod \"ceilometer-0\" (UID: 
\"05a2d577-440d-4c21-a116-b2c0b8a3ea22\") " pod="openstack/ceilometer-0" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.347783 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05a2d577-440d-4c21-a116-b2c0b8a3ea22-config-data\") pod \"ceilometer-0\" (UID: \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\") " pod="openstack/ceilometer-0" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.347841 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/05a2d577-440d-4c21-a116-b2c0b8a3ea22-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\") " pod="openstack/ceilometer-0" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.347901 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll55p\" (UniqueName: \"kubernetes.io/projected/05a2d577-440d-4c21-a116-b2c0b8a3ea22-kube-api-access-ll55p\") pod \"ceilometer-0\" (UID: \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\") " pod="openstack/ceilometer-0" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.347982 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/05a2d577-440d-4c21-a116-b2c0b8a3ea22-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\") " pod="openstack/ceilometer-0" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.348010 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05a2d577-440d-4c21-a116-b2c0b8a3ea22-run-httpd\") pod \"ceilometer-0\" (UID: \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\") " pod="openstack/ceilometer-0" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.348077 4835 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a2d577-440d-4c21-a116-b2c0b8a3ea22-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\") " pod="openstack/ceilometer-0" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.348111 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05a2d577-440d-4c21-a116-b2c0b8a3ea22-log-httpd\") pod \"ceilometer-0\" (UID: \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\") " pod="openstack/ceilometer-0" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.348613 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05a2d577-440d-4c21-a116-b2c0b8a3ea22-log-httpd\") pod \"ceilometer-0\" (UID: \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\") " pod="openstack/ceilometer-0" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.348658 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05a2d577-440d-4c21-a116-b2c0b8a3ea22-run-httpd\") pod \"ceilometer-0\" (UID: \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\") " pod="openstack/ceilometer-0" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.353034 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/05a2d577-440d-4c21-a116-b2c0b8a3ea22-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\") " pod="openstack/ceilometer-0" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.353207 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/05a2d577-440d-4c21-a116-b2c0b8a3ea22-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\") " pod="openstack/ceilometer-0" Mar 19 09:51:38 crc 
kubenswrapper[4835]: I0319 09:51:38.353626 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a2d577-440d-4c21-a116-b2c0b8a3ea22-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\") " pod="openstack/ceilometer-0" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.354258 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05a2d577-440d-4c21-a116-b2c0b8a3ea22-config-data\") pod \"ceilometer-0\" (UID: \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\") " pod="openstack/ceilometer-0" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.357586 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05a2d577-440d-4c21-a116-b2c0b8a3ea22-scripts\") pod \"ceilometer-0\" (UID: \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\") " pod="openstack/ceilometer-0" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.368218 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll55p\" (UniqueName: \"kubernetes.io/projected/05a2d577-440d-4c21-a116-b2c0b8a3ea22-kube-api-access-ll55p\") pod \"ceilometer-0\" (UID: \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\") " pod="openstack/ceilometer-0" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.419218 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad121ad3-667b-4b8a-a2ef-29e24a85c34a" path="/var/lib/kubelet/pods/ad121ad3-667b-4b8a-a2ef-29e24a85c34a/volumes" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.420021 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.915142 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:51:38 crc kubenswrapper[4835]: W0319 09:51:38.917922 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05a2d577_440d_4c21_a116_b2c0b8a3ea22.slice/crio-8da24a03a0ea7156561b27953a792616bfcfd73100b6d8b28841edad85b95644 WatchSource:0}: Error finding container 8da24a03a0ea7156561b27953a792616bfcfd73100b6d8b28841edad85b95644: Status 404 returned error can't find the container with id 8da24a03a0ea7156561b27953a792616bfcfd73100b6d8b28841edad85b95644 Mar 19 09:51:38 crc kubenswrapper[4835]: I0319 09:51:38.965337 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05a2d577-440d-4c21-a116-b2c0b8a3ea22","Type":"ContainerStarted","Data":"8da24a03a0ea7156561b27953a792616bfcfd73100b6d8b28841edad85b95644"} Mar 19 09:51:39 crc kubenswrapper[4835]: I0319 09:51:39.975977 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05a2d577-440d-4c21-a116-b2c0b8a3ea22","Type":"ContainerStarted","Data":"9080fb70dfc4b73b78c5944931b325904e2bf917d593d1d419564d952f42dc1b"} Mar 19 09:51:40 crc kubenswrapper[4835]: I0319 09:51:40.994685 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05a2d577-440d-4c21-a116-b2c0b8a3ea22","Type":"ContainerStarted","Data":"8ecda1adf1e9f852c333af35ed124d26599f20c864699926809b1821ea3b0da0"} Mar 19 09:51:42 crc kubenswrapper[4835]: I0319 09:51:42.016869 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05a2d577-440d-4c21-a116-b2c0b8a3ea22","Type":"ContainerStarted","Data":"6d94406a06511b97bf75570b8e908356e6be0b7ed53ac60c9e9bef65c9fbccad"} Mar 19 09:51:42 crc kubenswrapper[4835]: I0319 
09:51:42.405286 4835 scope.go:117] "RemoveContainer" containerID="d93f2f0fef5a3fe52d6e4aab02e5290ac85405643bc520caaef82b7b23fd8ee3" Mar 19 09:51:42 crc kubenswrapper[4835]: E0319 09:51:42.405676 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 09:51:42 crc kubenswrapper[4835]: I0319 09:51:42.417990 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 19 09:51:44 crc kubenswrapper[4835]: I0319 09:51:44.049262 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05a2d577-440d-4c21-a116-b2c0b8a3ea22","Type":"ContainerStarted","Data":"eef3cce60984612508585826afafc847fbba101ef69156aac4e6215285b64ded"} Mar 19 09:51:44 crc kubenswrapper[4835]: I0319 09:51:44.049761 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 09:51:44 crc kubenswrapper[4835]: I0319 09:51:44.085573 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.757184061 podStartE2EDuration="6.085554699s" podCreationTimestamp="2026-03-19 09:51:38 +0000 UTC" firstStartedPulling="2026-03-19 09:51:38.920520367 +0000 UTC m=+1753.769118964" lastFinishedPulling="2026-03-19 09:51:43.248891015 +0000 UTC m=+1758.097489602" observedRunningTime="2026-03-19 09:51:44.08448156 +0000 UTC m=+1758.933080187" watchObservedRunningTime="2026-03-19 09:51:44.085554699 +0000 UTC m=+1758.934153286" Mar 19 09:51:54 crc kubenswrapper[4835]: I0319 09:51:54.402808 4835 scope.go:117] "RemoveContainer" 
containerID="d93f2f0fef5a3fe52d6e4aab02e5290ac85405643bc520caaef82b7b23fd8ee3" Mar 19 09:51:54 crc kubenswrapper[4835]: E0319 09:51:54.403583 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 09:52:00 crc kubenswrapper[4835]: I0319 09:52:00.142651 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565232-7xdjv"] Mar 19 09:52:00 crc kubenswrapper[4835]: I0319 09:52:00.144879 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565232-7xdjv" Mar 19 09:52:00 crc kubenswrapper[4835]: I0319 09:52:00.146961 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 09:52:00 crc kubenswrapper[4835]: I0319 09:52:00.147292 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 09:52:00 crc kubenswrapper[4835]: I0319 09:52:00.162257 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565232-7xdjv"] Mar 19 09:52:00 crc kubenswrapper[4835]: I0319 09:52:00.205511 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 09:52:00 crc kubenswrapper[4835]: I0319 09:52:00.308442 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lzxt\" (UniqueName: \"kubernetes.io/projected/5a88ab97-e853-45f0-9c23-61182401b914-kube-api-access-2lzxt\") pod \"auto-csr-approver-29565232-7xdjv\" (UID: 
\"5a88ab97-e853-45f0-9c23-61182401b914\") " pod="openshift-infra/auto-csr-approver-29565232-7xdjv" Mar 19 09:52:00 crc kubenswrapper[4835]: I0319 09:52:00.410359 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lzxt\" (UniqueName: \"kubernetes.io/projected/5a88ab97-e853-45f0-9c23-61182401b914-kube-api-access-2lzxt\") pod \"auto-csr-approver-29565232-7xdjv\" (UID: \"5a88ab97-e853-45f0-9c23-61182401b914\") " pod="openshift-infra/auto-csr-approver-29565232-7xdjv" Mar 19 09:52:00 crc kubenswrapper[4835]: I0319 09:52:00.447504 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lzxt\" (UniqueName: \"kubernetes.io/projected/5a88ab97-e853-45f0-9c23-61182401b914-kube-api-access-2lzxt\") pod \"auto-csr-approver-29565232-7xdjv\" (UID: \"5a88ab97-e853-45f0-9c23-61182401b914\") " pod="openshift-infra/auto-csr-approver-29565232-7xdjv" Mar 19 09:52:00 crc kubenswrapper[4835]: I0319 09:52:00.522361 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565232-7xdjv" Mar 19 09:52:01 crc kubenswrapper[4835]: I0319 09:52:01.038416 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565232-7xdjv"] Mar 19 09:52:01 crc kubenswrapper[4835]: W0319 09:52:01.041024 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a88ab97_e853_45f0_9c23_61182401b914.slice/crio-2bcee43218568e0ad4d33b46ddc37721e46f0a6d2f2249e0860639f5d11a5d7a WatchSource:0}: Error finding container 2bcee43218568e0ad4d33b46ddc37721e46f0a6d2f2249e0860639f5d11a5d7a: Status 404 returned error can't find the container with id 2bcee43218568e0ad4d33b46ddc37721e46f0a6d2f2249e0860639f5d11a5d7a Mar 19 09:52:01 crc kubenswrapper[4835]: I0319 09:52:01.043472 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 09:52:01 crc kubenswrapper[4835]: I0319 09:52:01.240425 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565232-7xdjv" event={"ID":"5a88ab97-e853-45f0-9c23-61182401b914","Type":"ContainerStarted","Data":"2bcee43218568e0ad4d33b46ddc37721e46f0a6d2f2249e0860639f5d11a5d7a"} Mar 19 09:52:03 crc kubenswrapper[4835]: I0319 09:52:03.261184 4835 generic.go:334] "Generic (PLEG): container finished" podID="5a88ab97-e853-45f0-9c23-61182401b914" containerID="c00c6a6b8c36f4b1827d9a58180745faadbfa9291e59d4f2dd61dd77df2365b1" exitCode=0 Mar 19 09:52:03 crc kubenswrapper[4835]: I0319 09:52:03.261656 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565232-7xdjv" event={"ID":"5a88ab97-e853-45f0-9c23-61182401b914","Type":"ContainerDied","Data":"c00c6a6b8c36f4b1827d9a58180745faadbfa9291e59d4f2dd61dd77df2365b1"} Mar 19 09:52:04 crc kubenswrapper[4835]: I0319 09:52:04.683765 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565232-7xdjv" Mar 19 09:52:04 crc kubenswrapper[4835]: I0319 09:52:04.832971 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lzxt\" (UniqueName: \"kubernetes.io/projected/5a88ab97-e853-45f0-9c23-61182401b914-kube-api-access-2lzxt\") pod \"5a88ab97-e853-45f0-9c23-61182401b914\" (UID: \"5a88ab97-e853-45f0-9c23-61182401b914\") " Mar 19 09:52:04 crc kubenswrapper[4835]: I0319 09:52:04.838286 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a88ab97-e853-45f0-9c23-61182401b914-kube-api-access-2lzxt" (OuterVolumeSpecName: "kube-api-access-2lzxt") pod "5a88ab97-e853-45f0-9c23-61182401b914" (UID: "5a88ab97-e853-45f0-9c23-61182401b914"). InnerVolumeSpecName "kube-api-access-2lzxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:52:04 crc kubenswrapper[4835]: I0319 09:52:04.936363 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lzxt\" (UniqueName: \"kubernetes.io/projected/5a88ab97-e853-45f0-9c23-61182401b914-kube-api-access-2lzxt\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:05 crc kubenswrapper[4835]: I0319 09:52:05.284200 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565232-7xdjv" event={"ID":"5a88ab97-e853-45f0-9c23-61182401b914","Type":"ContainerDied","Data":"2bcee43218568e0ad4d33b46ddc37721e46f0a6d2f2249e0860639f5d11a5d7a"} Mar 19 09:52:05 crc kubenswrapper[4835]: I0319 09:52:05.284250 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565232-7xdjv" Mar 19 09:52:05 crc kubenswrapper[4835]: I0319 09:52:05.284250 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bcee43218568e0ad4d33b46ddc37721e46f0a6d2f2249e0860639f5d11a5d7a" Mar 19 09:52:05 crc kubenswrapper[4835]: I0319 09:52:05.756673 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565226-dh25m"] Mar 19 09:52:05 crc kubenswrapper[4835]: I0319 09:52:05.767240 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565226-dh25m"] Mar 19 09:52:06 crc kubenswrapper[4835]: I0319 09:52:06.416587 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb9581e3-765f-47c2-817f-233dd6d968fd" path="/var/lib/kubelet/pods/fb9581e3-765f-47c2-817f-233dd6d968fd/volumes" Mar 19 09:52:07 crc kubenswrapper[4835]: I0319 09:52:07.402428 4835 scope.go:117] "RemoveContainer" containerID="d93f2f0fef5a3fe52d6e4aab02e5290ac85405643bc520caaef82b7b23fd8ee3" Mar 19 09:52:07 crc kubenswrapper[4835]: E0319 09:52:07.404397 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 09:52:08 crc kubenswrapper[4835]: I0319 09:52:08.431963 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 19 09:52:20 crc kubenswrapper[4835]: I0319 09:52:20.065452 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-4jmpv"] Mar 19 09:52:20 crc kubenswrapper[4835]: I0319 09:52:20.086895 4835 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/heat-db-sync-4jmpv"] Mar 19 09:52:20 crc kubenswrapper[4835]: I0319 09:52:20.183756 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-ck565"] Mar 19 09:52:20 crc kubenswrapper[4835]: E0319 09:52:20.184668 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a88ab97-e853-45f0-9c23-61182401b914" containerName="oc" Mar 19 09:52:20 crc kubenswrapper[4835]: I0319 09:52:20.184691 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a88ab97-e853-45f0-9c23-61182401b914" containerName="oc" Mar 19 09:52:20 crc kubenswrapper[4835]: I0319 09:52:20.185045 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a88ab97-e853-45f0-9c23-61182401b914" containerName="oc" Mar 19 09:52:20 crc kubenswrapper[4835]: I0319 09:52:20.186070 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-ck565" Mar 19 09:52:20 crc kubenswrapper[4835]: I0319 09:52:20.223970 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-ck565"] Mar 19 09:52:20 crc kubenswrapper[4835]: I0319 09:52:20.245949 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a5db77c-86cf-4661-8fe4-be1b865884a2-combined-ca-bundle\") pod \"heat-db-sync-ck565\" (UID: \"9a5db77c-86cf-4661-8fe4-be1b865884a2\") " pod="openstack/heat-db-sync-ck565" Mar 19 09:52:20 crc kubenswrapper[4835]: I0319 09:52:20.246046 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbfm7\" (UniqueName: \"kubernetes.io/projected/9a5db77c-86cf-4661-8fe4-be1b865884a2-kube-api-access-wbfm7\") pod \"heat-db-sync-ck565\" (UID: \"9a5db77c-86cf-4661-8fe4-be1b865884a2\") " pod="openstack/heat-db-sync-ck565" Mar 19 09:52:20 crc kubenswrapper[4835]: I0319 09:52:20.246126 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a5db77c-86cf-4661-8fe4-be1b865884a2-config-data\") pod \"heat-db-sync-ck565\" (UID: \"9a5db77c-86cf-4661-8fe4-be1b865884a2\") " pod="openstack/heat-db-sync-ck565" Mar 19 09:52:20 crc kubenswrapper[4835]: I0319 09:52:20.347979 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a5db77c-86cf-4661-8fe4-be1b865884a2-combined-ca-bundle\") pod \"heat-db-sync-ck565\" (UID: \"9a5db77c-86cf-4661-8fe4-be1b865884a2\") " pod="openstack/heat-db-sync-ck565" Mar 19 09:52:20 crc kubenswrapper[4835]: I0319 09:52:20.348088 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbfm7\" (UniqueName: \"kubernetes.io/projected/9a5db77c-86cf-4661-8fe4-be1b865884a2-kube-api-access-wbfm7\") pod \"heat-db-sync-ck565\" (UID: \"9a5db77c-86cf-4661-8fe4-be1b865884a2\") " pod="openstack/heat-db-sync-ck565" Mar 19 09:52:20 crc kubenswrapper[4835]: I0319 09:52:20.348182 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a5db77c-86cf-4661-8fe4-be1b865884a2-config-data\") pod \"heat-db-sync-ck565\" (UID: \"9a5db77c-86cf-4661-8fe4-be1b865884a2\") " pod="openstack/heat-db-sync-ck565" Mar 19 09:52:20 crc kubenswrapper[4835]: I0319 09:52:20.354163 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a5db77c-86cf-4661-8fe4-be1b865884a2-combined-ca-bundle\") pod \"heat-db-sync-ck565\" (UID: \"9a5db77c-86cf-4661-8fe4-be1b865884a2\") " pod="openstack/heat-db-sync-ck565" Mar 19 09:52:20 crc kubenswrapper[4835]: I0319 09:52:20.354448 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9a5db77c-86cf-4661-8fe4-be1b865884a2-config-data\") pod \"heat-db-sync-ck565\" (UID: \"9a5db77c-86cf-4661-8fe4-be1b865884a2\") " pod="openstack/heat-db-sync-ck565" Mar 19 09:52:20 crc kubenswrapper[4835]: I0319 09:52:20.367241 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbfm7\" (UniqueName: \"kubernetes.io/projected/9a5db77c-86cf-4661-8fe4-be1b865884a2-kube-api-access-wbfm7\") pod \"heat-db-sync-ck565\" (UID: \"9a5db77c-86cf-4661-8fe4-be1b865884a2\") " pod="openstack/heat-db-sync-ck565" Mar 19 09:52:20 crc kubenswrapper[4835]: I0319 09:52:20.417256 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89dd21d1-34a1-4d91-a7cb-32840eda818e" path="/var/lib/kubelet/pods/89dd21d1-34a1-4d91-a7cb-32840eda818e/volumes" Mar 19 09:52:20 crc kubenswrapper[4835]: I0319 09:52:20.524776 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-ck565" Mar 19 09:52:21 crc kubenswrapper[4835]: I0319 09:52:21.026676 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-ck565"] Mar 19 09:52:21 crc kubenswrapper[4835]: I0319 09:52:21.402015 4835 scope.go:117] "RemoveContainer" containerID="d93f2f0fef5a3fe52d6e4aab02e5290ac85405643bc520caaef82b7b23fd8ee3" Mar 19 09:52:21 crc kubenswrapper[4835]: E0319 09:52:21.402360 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 09:52:21 crc kubenswrapper[4835]: I0319 09:52:21.530192 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-ck565" 
event={"ID":"9a5db77c-86cf-4661-8fe4-be1b865884a2","Type":"ContainerStarted","Data":"435f5286ffa2f37667a008205e63a1708eb607c34c58e2c58ccb75601cf25c3e"} Mar 19 09:52:22 crc kubenswrapper[4835]: I0319 09:52:22.062515 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 19 09:52:22 crc kubenswrapper[4835]: I0319 09:52:22.527734 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 09:52:22 crc kubenswrapper[4835]: I0319 09:52:22.528041 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="05a2d577-440d-4c21-a116-b2c0b8a3ea22" containerName="ceilometer-central-agent" containerID="cri-o://9080fb70dfc4b73b78c5944931b325904e2bf917d593d1d419564d952f42dc1b" gracePeriod=30 Mar 19 09:52:22 crc kubenswrapper[4835]: I0319 09:52:22.528474 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="05a2d577-440d-4c21-a116-b2c0b8a3ea22" containerName="sg-core" containerID="cri-o://6d94406a06511b97bf75570b8e908356e6be0b7ed53ac60c9e9bef65c9fbccad" gracePeriod=30 Mar 19 09:52:22 crc kubenswrapper[4835]: I0319 09:52:22.528451 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="05a2d577-440d-4c21-a116-b2c0b8a3ea22" containerName="proxy-httpd" containerID="cri-o://eef3cce60984612508585826afafc847fbba101ef69156aac4e6215285b64ded" gracePeriod=30 Mar 19 09:52:22 crc kubenswrapper[4835]: I0319 09:52:22.528519 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="05a2d577-440d-4c21-a116-b2c0b8a3ea22" containerName="ceilometer-notification-agent" containerID="cri-o://8ecda1adf1e9f852c333af35ed124d26599f20c864699926809b1821ea3b0da0" gracePeriod=30 Mar 19 09:52:23 crc kubenswrapper[4835]: I0319 09:52:23.312342 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 09:52:23 crc kubenswrapper[4835]: I0319 09:52:23.567352 4835 generic.go:334] "Generic (PLEG): container finished" podID="05a2d577-440d-4c21-a116-b2c0b8a3ea22" containerID="eef3cce60984612508585826afafc847fbba101ef69156aac4e6215285b64ded" exitCode=0 Mar 19 09:52:23 crc kubenswrapper[4835]: I0319 09:52:23.567722 4835 generic.go:334] "Generic (PLEG): container finished" podID="05a2d577-440d-4c21-a116-b2c0b8a3ea22" containerID="6d94406a06511b97bf75570b8e908356e6be0b7ed53ac60c9e9bef65c9fbccad" exitCode=2 Mar 19 09:52:23 crc kubenswrapper[4835]: I0319 09:52:23.567792 4835 generic.go:334] "Generic (PLEG): container finished" podID="05a2d577-440d-4c21-a116-b2c0b8a3ea22" containerID="9080fb70dfc4b73b78c5944931b325904e2bf917d593d1d419564d952f42dc1b" exitCode=0 Mar 19 09:52:23 crc kubenswrapper[4835]: I0319 09:52:23.567821 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05a2d577-440d-4c21-a116-b2c0b8a3ea22","Type":"ContainerDied","Data":"eef3cce60984612508585826afafc847fbba101ef69156aac4e6215285b64ded"} Mar 19 09:52:23 crc kubenswrapper[4835]: I0319 09:52:23.567852 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05a2d577-440d-4c21-a116-b2c0b8a3ea22","Type":"ContainerDied","Data":"6d94406a06511b97bf75570b8e908356e6be0b7ed53ac60c9e9bef65c9fbccad"} Mar 19 09:52:23 crc kubenswrapper[4835]: I0319 09:52:23.567867 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05a2d577-440d-4c21-a116-b2c0b8a3ea22","Type":"ContainerDied","Data":"9080fb70dfc4b73b78c5944931b325904e2bf917d593d1d419564d952f42dc1b"} Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.111235 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.275920 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05a2d577-440d-4c21-a116-b2c0b8a3ea22-config-data\") pod \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\" (UID: \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\") " Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.276013 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05a2d577-440d-4c21-a116-b2c0b8a3ea22-run-httpd\") pod \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\" (UID: \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\") " Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.276073 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ll55p\" (UniqueName: \"kubernetes.io/projected/05a2d577-440d-4c21-a116-b2c0b8a3ea22-kube-api-access-ll55p\") pod \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\" (UID: \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\") " Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.276247 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a2d577-440d-4c21-a116-b2c0b8a3ea22-combined-ca-bundle\") pod \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\" (UID: \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\") " Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.276300 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/05a2d577-440d-4c21-a116-b2c0b8a3ea22-ceilometer-tls-certs\") pod \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\" (UID: \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\") " Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.276373 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/05a2d577-440d-4c21-a116-b2c0b8a3ea22-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "05a2d577-440d-4c21-a116-b2c0b8a3ea22" (UID: "05a2d577-440d-4c21-a116-b2c0b8a3ea22"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.276403 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05a2d577-440d-4c21-a116-b2c0b8a3ea22-scripts\") pod \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\" (UID: \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\") " Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.276473 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05a2d577-440d-4c21-a116-b2c0b8a3ea22-log-httpd\") pod \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\" (UID: \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\") " Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.276533 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/05a2d577-440d-4c21-a116-b2c0b8a3ea22-sg-core-conf-yaml\") pod \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\" (UID: \"05a2d577-440d-4c21-a116-b2c0b8a3ea22\") " Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.277390 4835 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05a2d577-440d-4c21-a116-b2c0b8a3ea22-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.281275 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05a2d577-440d-4c21-a116-b2c0b8a3ea22-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "05a2d577-440d-4c21-a116-b2c0b8a3ea22" (UID: "05a2d577-440d-4c21-a116-b2c0b8a3ea22"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.291440 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05a2d577-440d-4c21-a116-b2c0b8a3ea22-scripts" (OuterVolumeSpecName: "scripts") pod "05a2d577-440d-4c21-a116-b2c0b8a3ea22" (UID: "05a2d577-440d-4c21-a116-b2c0b8a3ea22"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.291540 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05a2d577-440d-4c21-a116-b2c0b8a3ea22-kube-api-access-ll55p" (OuterVolumeSpecName: "kube-api-access-ll55p") pod "05a2d577-440d-4c21-a116-b2c0b8a3ea22" (UID: "05a2d577-440d-4c21-a116-b2c0b8a3ea22"). InnerVolumeSpecName "kube-api-access-ll55p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.370088 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05a2d577-440d-4c21-a116-b2c0b8a3ea22-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "05a2d577-440d-4c21-a116-b2c0b8a3ea22" (UID: "05a2d577-440d-4c21-a116-b2c0b8a3ea22"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.379292 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05a2d577-440d-4c21-a116-b2c0b8a3ea22-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.379325 4835 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/05a2d577-440d-4c21-a116-b2c0b8a3ea22-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.379337 4835 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/05a2d577-440d-4c21-a116-b2c0b8a3ea22-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.379350 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ll55p\" (UniqueName: \"kubernetes.io/projected/05a2d577-440d-4c21-a116-b2c0b8a3ea22-kube-api-access-ll55p\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.393570 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05a2d577-440d-4c21-a116-b2c0b8a3ea22-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05a2d577-440d-4c21-a116-b2c0b8a3ea22" (UID: "05a2d577-440d-4c21-a116-b2c0b8a3ea22"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.457578 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05a2d577-440d-4c21-a116-b2c0b8a3ea22-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "05a2d577-440d-4c21-a116-b2c0b8a3ea22" (UID: "05a2d577-440d-4c21-a116-b2c0b8a3ea22"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.461828 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05a2d577-440d-4c21-a116-b2c0b8a3ea22-config-data" (OuterVolumeSpecName: "config-data") pod "05a2d577-440d-4c21-a116-b2c0b8a3ea22" (UID: "05a2d577-440d-4c21-a116-b2c0b8a3ea22"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.482715 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a2d577-440d-4c21-a116-b2c0b8a3ea22-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.482781 4835 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/05a2d577-440d-4c21-a116-b2c0b8a3ea22-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.482796 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05a2d577-440d-4c21-a116-b2c0b8a3ea22-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.590524 4835 generic.go:334] "Generic (PLEG): container finished" podID="05a2d577-440d-4c21-a116-b2c0b8a3ea22" containerID="8ecda1adf1e9f852c333af35ed124d26599f20c864699926809b1821ea3b0da0" exitCode=0 Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.590572 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.590584 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05a2d577-440d-4c21-a116-b2c0b8a3ea22","Type":"ContainerDied","Data":"8ecda1adf1e9f852c333af35ed124d26599f20c864699926809b1821ea3b0da0"}
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.590615 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"05a2d577-440d-4c21-a116-b2c0b8a3ea22","Type":"ContainerDied","Data":"8da24a03a0ea7156561b27953a792616bfcfd73100b6d8b28841edad85b95644"}
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.590632 4835 scope.go:117] "RemoveContainer" containerID="eef3cce60984612508585826afafc847fbba101ef69156aac4e6215285b64ded"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.639261 4835 scope.go:117] "RemoveContainer" containerID="6d94406a06511b97bf75570b8e908356e6be0b7ed53ac60c9e9bef65c9fbccad"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.644473 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.669178 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.691789 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 19 09:52:24 crc kubenswrapper[4835]: E0319 09:52:24.692465 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05a2d577-440d-4c21-a116-b2c0b8a3ea22" containerName="ceilometer-notification-agent"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.692485 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="05a2d577-440d-4c21-a116-b2c0b8a3ea22" containerName="ceilometer-notification-agent"
Mar 19 09:52:24 crc kubenswrapper[4835]: E0319 09:52:24.692522 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05a2d577-440d-4c21-a116-b2c0b8a3ea22" containerName="ceilometer-central-agent"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.692530 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="05a2d577-440d-4c21-a116-b2c0b8a3ea22" containerName="ceilometer-central-agent"
Mar 19 09:52:24 crc kubenswrapper[4835]: E0319 09:52:24.692544 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05a2d577-440d-4c21-a116-b2c0b8a3ea22" containerName="proxy-httpd"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.692551 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="05a2d577-440d-4c21-a116-b2c0b8a3ea22" containerName="proxy-httpd"
Mar 19 09:52:24 crc kubenswrapper[4835]: E0319 09:52:24.692573 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05a2d577-440d-4c21-a116-b2c0b8a3ea22" containerName="sg-core"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.692579 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="05a2d577-440d-4c21-a116-b2c0b8a3ea22" containerName="sg-core"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.692851 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="05a2d577-440d-4c21-a116-b2c0b8a3ea22" containerName="sg-core"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.692867 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="05a2d577-440d-4c21-a116-b2c0b8a3ea22" containerName="proxy-httpd"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.692877 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="05a2d577-440d-4c21-a116-b2c0b8a3ea22" containerName="ceilometer-notification-agent"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.692898 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="05a2d577-440d-4c21-a116-b2c0b8a3ea22" containerName="ceilometer-central-agent"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.695007 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.701230 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.701496 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.701632 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.724012 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.780003 4835 scope.go:117] "RemoveContainer" containerID="8ecda1adf1e9f852c333af35ed124d26599f20c864699926809b1821ea3b0da0"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.795568 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2687a33-8ba7-4a21-9906-84de49945433-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f2687a33-8ba7-4a21-9906-84de49945433\") " pod="openstack/ceilometer-0"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.795644 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2687a33-8ba7-4a21-9906-84de49945433-scripts\") pod \"ceilometer-0\" (UID: \"f2687a33-8ba7-4a21-9906-84de49945433\") " pod="openstack/ceilometer-0"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.795867 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2687a33-8ba7-4a21-9906-84de49945433-log-httpd\") pod \"ceilometer-0\" (UID: \"f2687a33-8ba7-4a21-9906-84de49945433\") " pod="openstack/ceilometer-0"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.796104 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2687a33-8ba7-4a21-9906-84de49945433-config-data\") pod \"ceilometer-0\" (UID: \"f2687a33-8ba7-4a21-9906-84de49945433\") " pod="openstack/ceilometer-0"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.796252 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wfsf\" (UniqueName: \"kubernetes.io/projected/f2687a33-8ba7-4a21-9906-84de49945433-kube-api-access-7wfsf\") pod \"ceilometer-0\" (UID: \"f2687a33-8ba7-4a21-9906-84de49945433\") " pod="openstack/ceilometer-0"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.796353 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f2687a33-8ba7-4a21-9906-84de49945433-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f2687a33-8ba7-4a21-9906-84de49945433\") " pod="openstack/ceilometer-0"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.796410 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2687a33-8ba7-4a21-9906-84de49945433-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f2687a33-8ba7-4a21-9906-84de49945433\") " pod="openstack/ceilometer-0"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.796813 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2687a33-8ba7-4a21-9906-84de49945433-run-httpd\") pod \"ceilometer-0\" (UID: \"f2687a33-8ba7-4a21-9906-84de49945433\") " pod="openstack/ceilometer-0"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.834974 4835 scope.go:117] "RemoveContainer" containerID="9080fb70dfc4b73b78c5944931b325904e2bf917d593d1d419564d952f42dc1b"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.867181 4835 scope.go:117] "RemoveContainer" containerID="eef3cce60984612508585826afafc847fbba101ef69156aac4e6215285b64ded"
Mar 19 09:52:24 crc kubenswrapper[4835]: E0319 09:52:24.867617 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eef3cce60984612508585826afafc847fbba101ef69156aac4e6215285b64ded\": container with ID starting with eef3cce60984612508585826afafc847fbba101ef69156aac4e6215285b64ded not found: ID does not exist" containerID="eef3cce60984612508585826afafc847fbba101ef69156aac4e6215285b64ded"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.867657 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eef3cce60984612508585826afafc847fbba101ef69156aac4e6215285b64ded"} err="failed to get container status \"eef3cce60984612508585826afafc847fbba101ef69156aac4e6215285b64ded\": rpc error: code = NotFound desc = could not find container \"eef3cce60984612508585826afafc847fbba101ef69156aac4e6215285b64ded\": container with ID starting with eef3cce60984612508585826afafc847fbba101ef69156aac4e6215285b64ded not found: ID does not exist"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.867683 4835 scope.go:117] "RemoveContainer" containerID="6d94406a06511b97bf75570b8e908356e6be0b7ed53ac60c9e9bef65c9fbccad"
Mar 19 09:52:24 crc kubenswrapper[4835]: E0319 09:52:24.868091 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d94406a06511b97bf75570b8e908356e6be0b7ed53ac60c9e9bef65c9fbccad\": container with ID starting with 6d94406a06511b97bf75570b8e908356e6be0b7ed53ac60c9e9bef65c9fbccad not found: ID does not exist" containerID="6d94406a06511b97bf75570b8e908356e6be0b7ed53ac60c9e9bef65c9fbccad"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.868115 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d94406a06511b97bf75570b8e908356e6be0b7ed53ac60c9e9bef65c9fbccad"} err="failed to get container status \"6d94406a06511b97bf75570b8e908356e6be0b7ed53ac60c9e9bef65c9fbccad\": rpc error: code = NotFound desc = could not find container \"6d94406a06511b97bf75570b8e908356e6be0b7ed53ac60c9e9bef65c9fbccad\": container with ID starting with 6d94406a06511b97bf75570b8e908356e6be0b7ed53ac60c9e9bef65c9fbccad not found: ID does not exist"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.868130 4835 scope.go:117] "RemoveContainer" containerID="8ecda1adf1e9f852c333af35ed124d26599f20c864699926809b1821ea3b0da0"
Mar 19 09:52:24 crc kubenswrapper[4835]: E0319 09:52:24.869460 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ecda1adf1e9f852c333af35ed124d26599f20c864699926809b1821ea3b0da0\": container with ID starting with 8ecda1adf1e9f852c333af35ed124d26599f20c864699926809b1821ea3b0da0 not found: ID does not exist" containerID="8ecda1adf1e9f852c333af35ed124d26599f20c864699926809b1821ea3b0da0"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.869485 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ecda1adf1e9f852c333af35ed124d26599f20c864699926809b1821ea3b0da0"} err="failed to get container status \"8ecda1adf1e9f852c333af35ed124d26599f20c864699926809b1821ea3b0da0\": rpc error: code = NotFound desc = could not find container \"8ecda1adf1e9f852c333af35ed124d26599f20c864699926809b1821ea3b0da0\": container with ID starting with 8ecda1adf1e9f852c333af35ed124d26599f20c864699926809b1821ea3b0da0 not found: ID does not exist"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.869498 4835 scope.go:117] "RemoveContainer" containerID="9080fb70dfc4b73b78c5944931b325904e2bf917d593d1d419564d952f42dc1b"
Mar 19 09:52:24 crc kubenswrapper[4835]: E0319 09:52:24.870463 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9080fb70dfc4b73b78c5944931b325904e2bf917d593d1d419564d952f42dc1b\": container with ID starting with 9080fb70dfc4b73b78c5944931b325904e2bf917d593d1d419564d952f42dc1b not found: ID does not exist" containerID="9080fb70dfc4b73b78c5944931b325904e2bf917d593d1d419564d952f42dc1b"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.870489 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9080fb70dfc4b73b78c5944931b325904e2bf917d593d1d419564d952f42dc1b"} err="failed to get container status \"9080fb70dfc4b73b78c5944931b325904e2bf917d593d1d419564d952f42dc1b\": rpc error: code = NotFound desc = could not find container \"9080fb70dfc4b73b78c5944931b325904e2bf917d593d1d419564d952f42dc1b\": container with ID starting with 9080fb70dfc4b73b78c5944931b325904e2bf917d593d1d419564d952f42dc1b not found: ID does not exist"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.899702 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2687a33-8ba7-4a21-9906-84de49945433-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f2687a33-8ba7-4a21-9906-84de49945433\") " pod="openstack/ceilometer-0"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.899783 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2687a33-8ba7-4a21-9906-84de49945433-scripts\") pod \"ceilometer-0\" (UID: \"f2687a33-8ba7-4a21-9906-84de49945433\") " pod="openstack/ceilometer-0"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.899940 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2687a33-8ba7-4a21-9906-84de49945433-log-httpd\") pod \"ceilometer-0\" (UID: \"f2687a33-8ba7-4a21-9906-84de49945433\") " pod="openstack/ceilometer-0"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.900027 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2687a33-8ba7-4a21-9906-84de49945433-config-data\") pod \"ceilometer-0\" (UID: \"f2687a33-8ba7-4a21-9906-84de49945433\") " pod="openstack/ceilometer-0"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.900083 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wfsf\" (UniqueName: \"kubernetes.io/projected/f2687a33-8ba7-4a21-9906-84de49945433-kube-api-access-7wfsf\") pod \"ceilometer-0\" (UID: \"f2687a33-8ba7-4a21-9906-84de49945433\") " pod="openstack/ceilometer-0"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.900128 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f2687a33-8ba7-4a21-9906-84de49945433-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f2687a33-8ba7-4a21-9906-84de49945433\") " pod="openstack/ceilometer-0"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.900176 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2687a33-8ba7-4a21-9906-84de49945433-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f2687a33-8ba7-4a21-9906-84de49945433\") " pod="openstack/ceilometer-0"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.900689 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2687a33-8ba7-4a21-9906-84de49945433-run-httpd\") pod \"ceilometer-0\" (UID: \"f2687a33-8ba7-4a21-9906-84de49945433\") " pod="openstack/ceilometer-0"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.902008 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2687a33-8ba7-4a21-9906-84de49945433-run-httpd\") pod \"ceilometer-0\" (UID: \"f2687a33-8ba7-4a21-9906-84de49945433\") " pod="openstack/ceilometer-0"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.904536 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2687a33-8ba7-4a21-9906-84de49945433-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f2687a33-8ba7-4a21-9906-84de49945433\") " pod="openstack/ceilometer-0"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.904696 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f2687a33-8ba7-4a21-9906-84de49945433-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f2687a33-8ba7-4a21-9906-84de49945433\") " pod="openstack/ceilometer-0"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.904868 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f2687a33-8ba7-4a21-9906-84de49945433-log-httpd\") pod \"ceilometer-0\" (UID: \"f2687a33-8ba7-4a21-9906-84de49945433\") " pod="openstack/ceilometer-0"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.905282 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2687a33-8ba7-4a21-9906-84de49945433-scripts\") pod \"ceilometer-0\" (UID: \"f2687a33-8ba7-4a21-9906-84de49945433\") " pod="openstack/ceilometer-0"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.906415 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2687a33-8ba7-4a21-9906-84de49945433-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f2687a33-8ba7-4a21-9906-84de49945433\") " pod="openstack/ceilometer-0"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.919696 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2687a33-8ba7-4a21-9906-84de49945433-config-data\") pod \"ceilometer-0\" (UID: \"f2687a33-8ba7-4a21-9906-84de49945433\") " pod="openstack/ceilometer-0"
Mar 19 09:52:24 crc kubenswrapper[4835]: I0319 09:52:24.929115 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wfsf\" (UniqueName: \"kubernetes.io/projected/f2687a33-8ba7-4a21-9906-84de49945433-kube-api-access-7wfsf\") pod \"ceilometer-0\" (UID: \"f2687a33-8ba7-4a21-9906-84de49945433\") " pod="openstack/ceilometer-0"
Mar 19 09:52:25 crc kubenswrapper[4835]: I0319 09:52:25.061585 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 19 09:52:25 crc kubenswrapper[4835]: I0319 09:52:25.671560 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 19 09:52:26 crc kubenswrapper[4835]: I0319 09:52:26.425038 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05a2d577-440d-4c21-a116-b2c0b8a3ea22" path="/var/lib/kubelet/pods/05a2d577-440d-4c21-a116-b2c0b8a3ea22/volumes"
Mar 19 09:52:26 crc kubenswrapper[4835]: I0319 09:52:26.634425 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2687a33-8ba7-4a21-9906-84de49945433","Type":"ContainerStarted","Data":"3f7bb5f7905b9dcdb04622c30f932abd984a77d7433d4dc7ec91fb522579b9b7"}
Mar 19 09:52:26 crc kubenswrapper[4835]: I0319 09:52:26.867420 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-2" podUID="eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185" containerName="rabbitmq" containerID="cri-o://38a3742b4e3223b9426ae4beb4d1a4985646e49408e0f81d4fdca3e372927265" gracePeriod=604796
Mar 19 09:52:29 crc kubenswrapper[4835]: I0319 09:52:29.027269 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="f773ac48-7b51-427e-9c89-34e515bddabb" containerName="rabbitmq" containerID="cri-o://ffb00e2578c5c91c77a01d64a58bb95a80da4d289ffc8ac1338d7ab81af642e6" gracePeriod=604795
Mar 19 09:52:29 crc kubenswrapper[4835]: I0319 09:52:29.708961 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="f773ac48-7b51-427e-9c89-34e515bddabb" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.133:5671: connect: connection refused"
Mar 19 09:52:30 crc kubenswrapper[4835]: I0319 09:52:30.428288 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.136:5671: connect: connection refused"
Mar 19 09:52:33 crc kubenswrapper[4835]: I0319 09:52:33.403431 4835 scope.go:117] "RemoveContainer" containerID="d93f2f0fef5a3fe52d6e4aab02e5290ac85405643bc520caaef82b7b23fd8ee3"
Mar 19 09:52:33 crc kubenswrapper[4835]: E0319 09:52:33.404701 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353"
Mar 19 09:52:33 crc kubenswrapper[4835]: I0319 09:52:33.764065 4835 generic.go:334] "Generic (PLEG): container finished" podID="eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185" containerID="38a3742b4e3223b9426ae4beb4d1a4985646e49408e0f81d4fdca3e372927265" exitCode=0
Mar 19 09:52:33 crc kubenswrapper[4835]: I0319 09:52:33.764145 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185","Type":"ContainerDied","Data":"38a3742b4e3223b9426ae4beb4d1a4985646e49408e0f81d4fdca3e372927265"}
Mar 19 09:52:35 crc kubenswrapper[4835]: I0319 09:52:35.827220 4835 generic.go:334] "Generic (PLEG): container finished" podID="f773ac48-7b51-427e-9c89-34e515bddabb" containerID="ffb00e2578c5c91c77a01d64a58bb95a80da4d289ffc8ac1338d7ab81af642e6" exitCode=0
Mar 19 09:52:35 crc kubenswrapper[4835]: I0319 09:52:35.827306 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f773ac48-7b51-427e-9c89-34e515bddabb","Type":"ContainerDied","Data":"ffb00e2578c5c91c77a01d64a58bb95a80da4d289ffc8ac1338d7ab81af642e6"}
Mar 19 09:52:38 crc kubenswrapper[4835]: I0319 09:52:38.448359 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-v94x8"]
Mar 19 09:52:38 crc kubenswrapper[4835]: I0319 09:52:38.450761 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-v94x8"
Mar 19 09:52:38 crc kubenswrapper[4835]: I0319 09:52:38.454076 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam"
Mar 19 09:52:38 crc kubenswrapper[4835]: I0319 09:52:38.464198 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-v94x8"]
Mar 19 09:52:38 crc kubenswrapper[4835]: I0319 09:52:38.599016 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21edff65-df53-464d-b17f-caa8d2735a4c-config\") pod \"dnsmasq-dns-7d84b4d45c-v94x8\" (UID: \"21edff65-df53-464d-b17f-caa8d2735a4c\") " pod="openstack/dnsmasq-dns-7d84b4d45c-v94x8"
Mar 19 09:52:38 crc kubenswrapper[4835]: I0319 09:52:38.599069 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21edff65-df53-464d-b17f-caa8d2735a4c-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-v94x8\" (UID: \"21edff65-df53-464d-b17f-caa8d2735a4c\") " pod="openstack/dnsmasq-dns-7d84b4d45c-v94x8"
Mar 19 09:52:38 crc kubenswrapper[4835]: I0319 09:52:38.599142 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/21edff65-df53-464d-b17f-caa8d2735a4c-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-v94x8\" (UID: \"21edff65-df53-464d-b17f-caa8d2735a4c\") " pod="openstack/dnsmasq-dns-7d84b4d45c-v94x8"
Mar 19 09:52:38 crc kubenswrapper[4835]: I0319 09:52:38.599215 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21edff65-df53-464d-b17f-caa8d2735a4c-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-v94x8\" (UID: \"21edff65-df53-464d-b17f-caa8d2735a4c\") " pod="openstack/dnsmasq-dns-7d84b4d45c-v94x8"
Mar 19 09:52:38 crc kubenswrapper[4835]: I0319 09:52:38.599271 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21edff65-df53-464d-b17f-caa8d2735a4c-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-v94x8\" (UID: \"21edff65-df53-464d-b17f-caa8d2735a4c\") " pod="openstack/dnsmasq-dns-7d84b4d45c-v94x8"
Mar 19 09:52:38 crc kubenswrapper[4835]: I0319 09:52:38.599334 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21edff65-df53-464d-b17f-caa8d2735a4c-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-v94x8\" (UID: \"21edff65-df53-464d-b17f-caa8d2735a4c\") " pod="openstack/dnsmasq-dns-7d84b4d45c-v94x8"
Mar 19 09:52:38 crc kubenswrapper[4835]: I0319 09:52:38.599397 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6wlr\" (UniqueName: \"kubernetes.io/projected/21edff65-df53-464d-b17f-caa8d2735a4c-kube-api-access-k6wlr\") pod \"dnsmasq-dns-7d84b4d45c-v94x8\" (UID: \"21edff65-df53-464d-b17f-caa8d2735a4c\") " pod="openstack/dnsmasq-dns-7d84b4d45c-v94x8"
Mar 19 09:52:38 crc kubenswrapper[4835]: I0319 09:52:38.701686 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6wlr\" (UniqueName: \"kubernetes.io/projected/21edff65-df53-464d-b17f-caa8d2735a4c-kube-api-access-k6wlr\") pod \"dnsmasq-dns-7d84b4d45c-v94x8\" (UID: \"21edff65-df53-464d-b17f-caa8d2735a4c\") " pod="openstack/dnsmasq-dns-7d84b4d45c-v94x8"
Mar 19 09:52:38 crc kubenswrapper[4835]: I0319 09:52:38.701893 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21edff65-df53-464d-b17f-caa8d2735a4c-config\") pod \"dnsmasq-dns-7d84b4d45c-v94x8\" (UID: \"21edff65-df53-464d-b17f-caa8d2735a4c\") " pod="openstack/dnsmasq-dns-7d84b4d45c-v94x8"
Mar 19 09:52:38 crc kubenswrapper[4835]: I0319 09:52:38.701930 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21edff65-df53-464d-b17f-caa8d2735a4c-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-v94x8\" (UID: \"21edff65-df53-464d-b17f-caa8d2735a4c\") " pod="openstack/dnsmasq-dns-7d84b4d45c-v94x8"
Mar 19 09:52:38 crc kubenswrapper[4835]: I0319 09:52:38.701963 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/21edff65-df53-464d-b17f-caa8d2735a4c-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-v94x8\" (UID: \"21edff65-df53-464d-b17f-caa8d2735a4c\") " pod="openstack/dnsmasq-dns-7d84b4d45c-v94x8"
Mar 19 09:52:38 crc kubenswrapper[4835]: I0319 09:52:38.702036 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21edff65-df53-464d-b17f-caa8d2735a4c-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-v94x8\" (UID: \"21edff65-df53-464d-b17f-caa8d2735a4c\") " pod="openstack/dnsmasq-dns-7d84b4d45c-v94x8"
Mar 19 09:52:38 crc kubenswrapper[4835]: I0319 09:52:38.702077 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21edff65-df53-464d-b17f-caa8d2735a4c-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-v94x8\" (UID: \"21edff65-df53-464d-b17f-caa8d2735a4c\") " pod="openstack/dnsmasq-dns-7d84b4d45c-v94x8"
Mar 19 09:52:38 crc kubenswrapper[4835]: I0319 09:52:38.702138 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21edff65-df53-464d-b17f-caa8d2735a4c-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-v94x8\" (UID: \"21edff65-df53-464d-b17f-caa8d2735a4c\") " pod="openstack/dnsmasq-dns-7d84b4d45c-v94x8"
Mar 19 09:52:38 crc kubenswrapper[4835]: I0319 09:52:38.702920 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21edff65-df53-464d-b17f-caa8d2735a4c-config\") pod \"dnsmasq-dns-7d84b4d45c-v94x8\" (UID: \"21edff65-df53-464d-b17f-caa8d2735a4c\") " pod="openstack/dnsmasq-dns-7d84b4d45c-v94x8"
Mar 19 09:52:38 crc kubenswrapper[4835]: I0319 09:52:38.702963 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21edff65-df53-464d-b17f-caa8d2735a4c-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-v94x8\" (UID: \"21edff65-df53-464d-b17f-caa8d2735a4c\") " pod="openstack/dnsmasq-dns-7d84b4d45c-v94x8"
Mar 19 09:52:38 crc kubenswrapper[4835]: I0319 09:52:38.703007 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/21edff65-df53-464d-b17f-caa8d2735a4c-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-v94x8\" (UID: \"21edff65-df53-464d-b17f-caa8d2735a4c\") " pod="openstack/dnsmasq-dns-7d84b4d45c-v94x8"
Mar 19 09:52:38 crc kubenswrapper[4835]: I0319 09:52:38.703224 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21edff65-df53-464d-b17f-caa8d2735a4c-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-v94x8\" (UID: \"21edff65-df53-464d-b17f-caa8d2735a4c\") " pod="openstack/dnsmasq-dns-7d84b4d45c-v94x8"
Mar 19 09:52:38 crc kubenswrapper[4835]: I0319 09:52:38.703340 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21edff65-df53-464d-b17f-caa8d2735a4c-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-v94x8\" (UID: \"21edff65-df53-464d-b17f-caa8d2735a4c\") " pod="openstack/dnsmasq-dns-7d84b4d45c-v94x8"
Mar 19 09:52:38 crc kubenswrapper[4835]: I0319 09:52:38.704206 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21edff65-df53-464d-b17f-caa8d2735a4c-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-v94x8\" (UID: \"21edff65-df53-464d-b17f-caa8d2735a4c\") " pod="openstack/dnsmasq-dns-7d84b4d45c-v94x8"
Mar 19 09:52:38 crc kubenswrapper[4835]: I0319 09:52:38.734775 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6wlr\" (UniqueName: \"kubernetes.io/projected/21edff65-df53-464d-b17f-caa8d2735a4c-kube-api-access-k6wlr\") pod \"dnsmasq-dns-7d84b4d45c-v94x8\" (UID: \"21edff65-df53-464d-b17f-caa8d2735a4c\") " pod="openstack/dnsmasq-dns-7d84b4d45c-v94x8"
Mar 19 09:52:38 crc kubenswrapper[4835]: I0319 09:52:38.777879 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-v94x8"
Mar 19 09:52:39 crc kubenswrapper[4835]: I0319 09:52:39.709057 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="f773ac48-7b51-427e-9c89-34e515bddabb" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.133:5671: connect: connection refused"
Mar 19 09:52:40 crc kubenswrapper[4835]: E0319 09:52:40.295431 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested"
Mar 19 09:52:40 crc kubenswrapper[4835]: E0319 09:52:40.295494 4835 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested"
Mar 19 09:52:40 crc kubenswrapper[4835]: E0319 09:52:40.295638 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wbfm7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-ck565_openstack(9a5db77c-86cf-4661-8fe4-be1b865884a2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 19 09:52:40 crc kubenswrapper[4835]: E0319 09:52:40.297034 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-ck565" podUID="9a5db77c-86cf-4661-8fe4-be1b865884a2"
Mar 19 09:52:40 crc kubenswrapper[4835]: E0319 09:52:40.897906 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested\\\"\"" pod="openstack/heat-db-sync-ck565" podUID="9a5db77c-86cf-4661-8fe4-be1b865884a2"
Mar 19 09:52:41 crc kubenswrapper[4835]: I0319 09:52:41.779988 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2"
Mar 19 09:52:41 crc kubenswrapper[4835]: I0319 09:52:41.891686 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-pod-info\") pod \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") "
Mar 19 09:52:41 crc kubenswrapper[4835]: I0319 09:52:41.891799 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-rabbitmq-plugins\") pod \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") "
Mar 19 09:52:41 crc kubenswrapper[4835]: I0319 09:52:41.891831 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-config-data\") pod \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") "
Mar 19 09:52:41 crc kubenswrapper[4835]: I0319 09:52:41.891922 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-plugins-conf\") pod \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") "
Mar 19 09:52:41 crc kubenswrapper[4835]: I0319 09:52:41.891954 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-rabbitmq-confd\") pod \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") "
Mar 19 09:52:41 crc kubenswrapper[4835]: I0319 09:52:41.891985 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-rabbitmq-tls\") pod \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") "
Mar 19 09:52:41 crc kubenswrapper[4835]: I0319 09:52:41.892622 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185" (UID: "eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 09:52:41 crc kubenswrapper[4835]: I0319 09:52:41.892915 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-rabbitmq-erlang-cookie\") pod \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") "
Mar 19 09:52:41 crc kubenswrapper[4835]: I0319 09:52:41.892939 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-erlang-cookie-secret\") pod \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") "
Mar 19 09:52:41 crc kubenswrapper[4835]: I0319 09:52:41.892963 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-server-conf\") pod \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") "
Mar 19 09:52:41 crc kubenswrapper[4835]: I0319 09:52:41.892995 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkm2t\" (UniqueName: \"kubernetes.io/projected/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-kube-api-access-dkm2t\") pod \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") "
Mar 19 09:52:41 crc kubenswrapper[4835]: I0319 09:52:41.893234 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185" (UID: "eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185"). InnerVolumeSpecName "rabbitmq-plugins".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:52:41 crc kubenswrapper[4835]: I0319 09:52:41.893578 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c771df4b-4448-4dbe-888f-46134d9ec4c6\") pod \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\" (UID: \"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185\") " Mar 19 09:52:41 crc kubenswrapper[4835]: I0319 09:52:41.893586 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185" (UID: "eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:52:41 crc kubenswrapper[4835]: I0319 09:52:41.894790 4835 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:41 crc kubenswrapper[4835]: I0319 09:52:41.894813 4835 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:41 crc kubenswrapper[4835]: I0319 09:52:41.894824 4835 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:41 crc kubenswrapper[4835]: I0319 09:52:41.925561 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-pod-info" (OuterVolumeSpecName: "pod-info") pod 
"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185" (UID: "eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 19 09:52:41 crc kubenswrapper[4835]: I0319 09:52:41.926466 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185" (UID: "eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:52:41 crc kubenswrapper[4835]: I0319 09:52:41.929673 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-kube-api-access-dkm2t" (OuterVolumeSpecName: "kube-api-access-dkm2t") pod "eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185" (UID: "eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185"). InnerVolumeSpecName "kube-api-access-dkm2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:52:41 crc kubenswrapper[4835]: I0319 09:52:41.933283 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185" (UID: "eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:52:41 crc kubenswrapper[4835]: I0319 09:52:41.941213 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c771df4b-4448-4dbe-888f-46134d9ec4c6" (OuterVolumeSpecName: "persistence") pod "eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185" (UID: "eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185"). InnerVolumeSpecName "pvc-c771df4b-4448-4dbe-888f-46134d9ec4c6". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 09:52:41 crc kubenswrapper[4835]: I0319 09:52:41.942227 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185","Type":"ContainerDied","Data":"c6f9cf4b80284a932ce720794be9f70ef21bf9a5f2e5b9f1f770215c7b530f0c"} Mar 19 09:52:41 crc kubenswrapper[4835]: I0319 09:52:41.942274 4835 scope.go:117] "RemoveContainer" containerID="38a3742b4e3223b9426ae4beb4d1a4985646e49408e0f81d4fdca3e372927265" Mar 19 09:52:41 crc kubenswrapper[4835]: I0319 09:52:41.942419 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 19 09:52:41 crc kubenswrapper[4835]: I0319 09:52:41.983432 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-config-data" (OuterVolumeSpecName: "config-data") pod "eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185" (UID: "eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:52:41 crc kubenswrapper[4835]: I0319 09:52:41.997594 4835 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-pod-info\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:41 crc kubenswrapper[4835]: I0319 09:52:41.997656 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:41 crc kubenswrapper[4835]: I0319 09:52:41.997664 4835 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:41 crc kubenswrapper[4835]: I0319 09:52:41.997675 4835 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:41 crc kubenswrapper[4835]: I0319 09:52:41.997686 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkm2t\" (UniqueName: \"kubernetes.io/projected/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-kube-api-access-dkm2t\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:41 crc kubenswrapper[4835]: I0319 09:52:41.997748 4835 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-c771df4b-4448-4dbe-888f-46134d9ec4c6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c771df4b-4448-4dbe-888f-46134d9ec4c6\") on node \"crc\" " Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.029311 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-server-conf" (OuterVolumeSpecName: "server-conf") pod 
"eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185" (UID: "eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.045257 4835 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.045499 4835 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-c771df4b-4448-4dbe-888f-46134d9ec4c6" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c771df4b-4448-4dbe-888f-46134d9ec4c6") on node "crc" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.092292 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185" (UID: "eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.100286 4835 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.100327 4835 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185-server-conf\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.100340 4835 reconciler_common.go:293] "Volume detached for volume \"pvc-c771df4b-4448-4dbe-888f-46134d9ec4c6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c771df4b-4448-4dbe-888f-46134d9ec4c6\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.303014 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.322320 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.335614 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-2"] Mar 19 09:52:42 crc kubenswrapper[4835]: E0319 09:52:42.336135 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185" containerName="setup-container" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.336156 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185" containerName="setup-container" Mar 19 09:52:42 crc kubenswrapper[4835]: E0319 09:52:42.336203 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185" containerName="rabbitmq" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 
09:52:42.336209 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185" containerName="rabbitmq" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.336419 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185" containerName="rabbitmq" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.337781 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.362285 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.426554 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185" path="/var/lib/kubelet/pods/eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185/volumes" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.512288 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d3ab82b9-82ca-4e95-a3a0-1854edb16b7b-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b\") " pod="openstack/rabbitmq-server-2" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.512452 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d3ab82b9-82ca-4e95-a3a0-1854edb16b7b-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b\") " pod="openstack/rabbitmq-server-2" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.512512 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d3ab82b9-82ca-4e95-a3a0-1854edb16b7b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" 
(UID: \"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b\") " pod="openstack/rabbitmq-server-2" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.513172 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d3ab82b9-82ca-4e95-a3a0-1854edb16b7b-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b\") " pod="openstack/rabbitmq-server-2" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.513347 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d3ab82b9-82ca-4e95-a3a0-1854edb16b7b-pod-info\") pod \"rabbitmq-server-2\" (UID: \"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b\") " pod="openstack/rabbitmq-server-2" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.513389 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c771df4b-4448-4dbe-888f-46134d9ec4c6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c771df4b-4448-4dbe-888f-46134d9ec4c6\") pod \"rabbitmq-server-2\" (UID: \"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b\") " pod="openstack/rabbitmq-server-2" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.514706 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d3ab82b9-82ca-4e95-a3a0-1854edb16b7b-config-data\") pod \"rabbitmq-server-2\" (UID: \"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b\") " pod="openstack/rabbitmq-server-2" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.514785 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bdsv\" (UniqueName: \"kubernetes.io/projected/d3ab82b9-82ca-4e95-a3a0-1854edb16b7b-kube-api-access-9bdsv\") pod \"rabbitmq-server-2\" (UID: 
\"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b\") " pod="openstack/rabbitmq-server-2" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.515167 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d3ab82b9-82ca-4e95-a3a0-1854edb16b7b-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b\") " pod="openstack/rabbitmq-server-2" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.515767 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d3ab82b9-82ca-4e95-a3a0-1854edb16b7b-server-conf\") pod \"rabbitmq-server-2\" (UID: \"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b\") " pod="openstack/rabbitmq-server-2" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.515944 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d3ab82b9-82ca-4e95-a3a0-1854edb16b7b-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b\") " pod="openstack/rabbitmq-server-2" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.531789 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.614110 4835 scope.go:117] "RemoveContainer" containerID="a019a87ad20e80e5d27e97a78500bd57162d944fdb1518383ddc556356c45145" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.617430 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f773ac48-7b51-427e-9c89-34e515bddabb-rabbitmq-confd\") pod \"f773ac48-7b51-427e-9c89-34e515bddabb\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.618543 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73ddda24-4d68-40ee-afda-4c6872e6269f\") pod \"f773ac48-7b51-427e-9c89-34e515bddabb\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.619274 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f773ac48-7b51-427e-9c89-34e515bddabb-rabbitmq-tls\") pod \"f773ac48-7b51-427e-9c89-34e515bddabb\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.619396 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqn26\" (UniqueName: \"kubernetes.io/projected/f773ac48-7b51-427e-9c89-34e515bddabb-kube-api-access-lqn26\") pod \"f773ac48-7b51-427e-9c89-34e515bddabb\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.619467 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f773ac48-7b51-427e-9c89-34e515bddabb-rabbitmq-plugins\") pod \"f773ac48-7b51-427e-9c89-34e515bddabb\" (UID: 
\"f773ac48-7b51-427e-9c89-34e515bddabb\") " Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.619551 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f773ac48-7b51-427e-9c89-34e515bddabb-server-conf\") pod \"f773ac48-7b51-427e-9c89-34e515bddabb\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.619696 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f773ac48-7b51-427e-9c89-34e515bddabb-erlang-cookie-secret\") pod \"f773ac48-7b51-427e-9c89-34e515bddabb\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.619779 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f773ac48-7b51-427e-9c89-34e515bddabb-config-data\") pod \"f773ac48-7b51-427e-9c89-34e515bddabb\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.619861 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f773ac48-7b51-427e-9c89-34e515bddabb-pod-info\") pod \"f773ac48-7b51-427e-9c89-34e515bddabb\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.619895 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f773ac48-7b51-427e-9c89-34e515bddabb-rabbitmq-erlang-cookie\") pod \"f773ac48-7b51-427e-9c89-34e515bddabb\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.619947 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/f773ac48-7b51-427e-9c89-34e515bddabb-plugins-conf\") pod \"f773ac48-7b51-427e-9c89-34e515bddabb\" (UID: \"f773ac48-7b51-427e-9c89-34e515bddabb\") " Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.620900 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f773ac48-7b51-427e-9c89-34e515bddabb-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "f773ac48-7b51-427e-9c89-34e515bddabb" (UID: "f773ac48-7b51-427e-9c89-34e515bddabb"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.623329 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f773ac48-7b51-427e-9c89-34e515bddabb-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "f773ac48-7b51-427e-9c89-34e515bddabb" (UID: "f773ac48-7b51-427e-9c89-34e515bddabb"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.623621 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f773ac48-7b51-427e-9c89-34e515bddabb-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "f773ac48-7b51-427e-9c89-34e515bddabb" (UID: "f773ac48-7b51-427e-9c89-34e515bddabb"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.627508 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d3ab82b9-82ca-4e95-a3a0-1854edb16b7b-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b\") " pod="openstack/rabbitmq-server-2" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.627697 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d3ab82b9-82ca-4e95-a3a0-1854edb16b7b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b\") " pod="openstack/rabbitmq-server-2" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.628180 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d3ab82b9-82ca-4e95-a3a0-1854edb16b7b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b\") " pod="openstack/rabbitmq-server-2" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.629365 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d3ab82b9-82ca-4e95-a3a0-1854edb16b7b-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b\") " pod="openstack/rabbitmq-server-2" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.629546 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f773ac48-7b51-427e-9c89-34e515bddabb-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "f773ac48-7b51-427e-9c89-34e515bddabb" (UID: "f773ac48-7b51-427e-9c89-34e515bddabb"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.630018 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/f773ac48-7b51-427e-9c89-34e515bddabb-pod-info" (OuterVolumeSpecName: "pod-info") pod "f773ac48-7b51-427e-9c89-34e515bddabb" (UID: "f773ac48-7b51-427e-9c89-34e515bddabb"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.631070 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d3ab82b9-82ca-4e95-a3a0-1854edb16b7b-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b\") " pod="openstack/rabbitmq-server-2" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.631421 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d3ab82b9-82ca-4e95-a3a0-1854edb16b7b-pod-info\") pod \"rabbitmq-server-2\" (UID: \"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b\") " pod="openstack/rabbitmq-server-2" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.631587 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c771df4b-4448-4dbe-888f-46134d9ec4c6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c771df4b-4448-4dbe-888f-46134d9ec4c6\") pod \"rabbitmq-server-2\" (UID: \"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b\") " pod="openstack/rabbitmq-server-2" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.638810 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d3ab82b9-82ca-4e95-a3a0-1854edb16b7b-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b\") " pod="openstack/rabbitmq-server-2" 
Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.641896 4835 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.641938 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c771df4b-4448-4dbe-888f-46134d9ec4c6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c771df4b-4448-4dbe-888f-46134d9ec4c6\") pod \"rabbitmq-server-2\" (UID: \"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9f8512cd517073f0d6f4dcbd40cfe670029840b44a1b34da330f087592cacc87/globalmount\"" pod="openstack/rabbitmq-server-2" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.645244 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f773ac48-7b51-427e-9c89-34e515bddabb-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "f773ac48-7b51-427e-9c89-34e515bddabb" (UID: "f773ac48-7b51-427e-9c89-34e515bddabb"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.650498 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d3ab82b9-82ca-4e95-a3a0-1854edb16b7b-pod-info\") pod \"rabbitmq-server-2\" (UID: \"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b\") " pod="openstack/rabbitmq-server-2" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.651128 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d3ab82b9-82ca-4e95-a3a0-1854edb16b7b-config-data\") pod \"rabbitmq-server-2\" (UID: \"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b\") " pod="openstack/rabbitmq-server-2" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.651185 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bdsv\" (UniqueName: \"kubernetes.io/projected/d3ab82b9-82ca-4e95-a3a0-1854edb16b7b-kube-api-access-9bdsv\") pod \"rabbitmq-server-2\" (UID: \"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b\") " pod="openstack/rabbitmq-server-2" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.651210 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d3ab82b9-82ca-4e95-a3a0-1854edb16b7b-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b\") " pod="openstack/rabbitmq-server-2" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.651284 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d3ab82b9-82ca-4e95-a3a0-1854edb16b7b-server-conf\") pod \"rabbitmq-server-2\" (UID: \"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b\") " pod="openstack/rabbitmq-server-2" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.653988 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d3ab82b9-82ca-4e95-a3a0-1854edb16b7b-server-conf\") pod \"rabbitmq-server-2\" (UID: \"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b\") " pod="openstack/rabbitmq-server-2" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.656312 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d3ab82b9-82ca-4e95-a3a0-1854edb16b7b-config-data\") pod \"rabbitmq-server-2\" (UID: \"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b\") " pod="openstack/rabbitmq-server-2" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.656393 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d3ab82b9-82ca-4e95-a3a0-1854edb16b7b-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b\") " pod="openstack/rabbitmq-server-2" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.657019 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d3ab82b9-82ca-4e95-a3a0-1854edb16b7b-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b\") " pod="openstack/rabbitmq-server-2" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.657433 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d3ab82b9-82ca-4e95-a3a0-1854edb16b7b-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b\") " pod="openstack/rabbitmq-server-2" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.657573 4835 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f773ac48-7b51-427e-9c89-34e515bddabb-pod-info\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.657609 
4835 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f773ac48-7b51-427e-9c89-34e515bddabb-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.657622 4835 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f773ac48-7b51-427e-9c89-34e515bddabb-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.657636 4835 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f773ac48-7b51-427e-9c89-34e515bddabb-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.657647 4835 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f773ac48-7b51-427e-9c89-34e515bddabb-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.657658 4835 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f773ac48-7b51-427e-9c89-34e515bddabb-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.666909 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f773ac48-7b51-427e-9c89-34e515bddabb-kube-api-access-lqn26" (OuterVolumeSpecName: "kube-api-access-lqn26") pod "f773ac48-7b51-427e-9c89-34e515bddabb" (UID: "f773ac48-7b51-427e-9c89-34e515bddabb"). InnerVolumeSpecName "kube-api-access-lqn26". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.684921 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d3ab82b9-82ca-4e95-a3a0-1854edb16b7b-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b\") " pod="openstack/rabbitmq-server-2" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.687845 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bdsv\" (UniqueName: \"kubernetes.io/projected/d3ab82b9-82ca-4e95-a3a0-1854edb16b7b-kube-api-access-9bdsv\") pod \"rabbitmq-server-2\" (UID: \"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b\") " pod="openstack/rabbitmq-server-2" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.688677 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d3ab82b9-82ca-4e95-a3a0-1854edb16b7b-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b\") " pod="openstack/rabbitmq-server-2" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.770516 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqn26\" (UniqueName: \"kubernetes.io/projected/f773ac48-7b51-427e-9c89-34e515bddabb-kube-api-access-lqn26\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.782837 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c771df4b-4448-4dbe-888f-46134d9ec4c6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c771df4b-4448-4dbe-888f-46134d9ec4c6\") pod \"rabbitmq-server-2\" (UID: \"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b\") " pod="openstack/rabbitmq-server-2" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.790470 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/f773ac48-7b51-427e-9c89-34e515bddabb-config-data" (OuterVolumeSpecName: "config-data") pod "f773ac48-7b51-427e-9c89-34e515bddabb" (UID: "f773ac48-7b51-427e-9c89-34e515bddabb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.790535 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f773ac48-7b51-427e-9c89-34e515bddabb-server-conf" (OuterVolumeSpecName: "server-conf") pod "f773ac48-7b51-427e-9c89-34e515bddabb" (UID: "f773ac48-7b51-427e-9c89-34e515bddabb"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.797243 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73ddda24-4d68-40ee-afda-4c6872e6269f" (OuterVolumeSpecName: "persistence") pod "f773ac48-7b51-427e-9c89-34e515bddabb" (UID: "f773ac48-7b51-427e-9c89-34e515bddabb"). InnerVolumeSpecName "pvc-73ddda24-4d68-40ee-afda-4c6872e6269f". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.872365 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f773ac48-7b51-427e-9c89-34e515bddabb-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.873090 4835 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-73ddda24-4d68-40ee-afda-4c6872e6269f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73ddda24-4d68-40ee-afda-4c6872e6269f\") on node \"crc\" " Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.873187 4835 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f773ac48-7b51-427e-9c89-34e515bddabb-server-conf\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.960297 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f773ac48-7b51-427e-9c89-34e515bddabb-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "f773ac48-7b51-427e-9c89-34e515bddabb" (UID: "f773ac48-7b51-427e-9c89-34e515bddabb"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.962915 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.972173 4835 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.972496 4835 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-73ddda24-4d68-40ee-afda-4c6872e6269f" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73ddda24-4d68-40ee-afda-4c6872e6269f") on node "crc" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.976603 4835 reconciler_common.go:293] "Volume detached for volume \"pvc-73ddda24-4d68-40ee-afda-4c6872e6269f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73ddda24-4d68-40ee-afda-4c6872e6269f\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.976639 4835 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f773ac48-7b51-427e-9c89-34e515bddabb-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.979434 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f773ac48-7b51-427e-9c89-34e515bddabb","Type":"ContainerDied","Data":"197e9b70bf6f20819e4cdaeab5b67b51287cb78bf25bb687c5b05873f57c96fa"} Mar 19 09:52:42 crc kubenswrapper[4835]: I0319 09:52:42.979575 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.018251 4835 scope.go:117] "RemoveContainer" containerID="ffb00e2578c5c91c77a01d64a58bb95a80da4d289ffc8ac1338d7ab81af642e6" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.110009 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.114614 4835 scope.go:117] "RemoveContainer" containerID="21c03c5c9384d15ca34866e25c6fd7c9610210c0afde02081401ef0199d2e255" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.130098 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.156075 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 09:52:43 crc kubenswrapper[4835]: E0319 09:52:43.157494 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f773ac48-7b51-427e-9c89-34e515bddabb" containerName="setup-container" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.157515 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f773ac48-7b51-427e-9c89-34e515bddabb" containerName="setup-container" Mar 19 09:52:43 crc kubenswrapper[4835]: E0319 09:52:43.157529 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f773ac48-7b51-427e-9c89-34e515bddabb" containerName="rabbitmq" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.157538 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f773ac48-7b51-427e-9c89-34e515bddabb" containerName="rabbitmq" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.157866 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="f773ac48-7b51-427e-9c89-34e515bddabb" containerName="rabbitmq" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.160113 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.163019 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.167325 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-z5r7q" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.167606 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.168016 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.168197 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.168288 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.176820 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.178590 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 19 09:52:43 crc kubenswrapper[4835]: W0319 09:52:43.204943 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21edff65_df53_464d_b17f_caa8d2735a4c.slice/crio-d94f4ee40da5ee680107de593f95e02be5c871cae875169e85a3d27f3ec96952 WatchSource:0}: Error finding container d94f4ee40da5ee680107de593f95e02be5c871cae875169e85a3d27f3ec96952: Status 404 returned error can't find the container with id d94f4ee40da5ee680107de593f95e02be5c871cae875169e85a3d27f3ec96952 Mar 19 09:52:43 
crc kubenswrapper[4835]: I0319 09:52:43.223844 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-v94x8"] Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.326163 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6f9fe928-2ee4-486d-a8c6-692169a02f42-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f9fe928-2ee4-486d-a8c6-692169a02f42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.326598 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6f9fe928-2ee4-486d-a8c6-692169a02f42-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f9fe928-2ee4-486d-a8c6-692169a02f42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.326630 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6f9fe928-2ee4-486d-a8c6-692169a02f42-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f9fe928-2ee4-486d-a8c6-692169a02f42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.326684 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6f9fe928-2ee4-486d-a8c6-692169a02f42-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f9fe928-2ee4-486d-a8c6-692169a02f42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.326791 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-73ddda24-4d68-40ee-afda-4c6872e6269f\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73ddda24-4d68-40ee-afda-4c6872e6269f\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f9fe928-2ee4-486d-a8c6-692169a02f42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.326824 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnc96\" (UniqueName: \"kubernetes.io/projected/6f9fe928-2ee4-486d-a8c6-692169a02f42-kube-api-access-vnc96\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f9fe928-2ee4-486d-a8c6-692169a02f42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.326867 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f9fe928-2ee4-486d-a8c6-692169a02f42-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f9fe928-2ee4-486d-a8c6-692169a02f42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.326937 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6f9fe928-2ee4-486d-a8c6-692169a02f42-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f9fe928-2ee4-486d-a8c6-692169a02f42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.327027 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6f9fe928-2ee4-486d-a8c6-692169a02f42-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f9fe928-2ee4-486d-a8c6-692169a02f42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.327070 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6f9fe928-2ee4-486d-a8c6-692169a02f42-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f9fe928-2ee4-486d-a8c6-692169a02f42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.327146 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6f9fe928-2ee4-486d-a8c6-692169a02f42-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f9fe928-2ee4-486d-a8c6-692169a02f42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.428232 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6f9fe928-2ee4-486d-a8c6-692169a02f42-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f9fe928-2ee4-486d-a8c6-692169a02f42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.428296 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-73ddda24-4d68-40ee-afda-4c6872e6269f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73ddda24-4d68-40ee-afda-4c6872e6269f\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f9fe928-2ee4-486d-a8c6-692169a02f42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.428322 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnc96\" (UniqueName: \"kubernetes.io/projected/6f9fe928-2ee4-486d-a8c6-692169a02f42-kube-api-access-vnc96\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f9fe928-2ee4-486d-a8c6-692169a02f42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.428349 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/6f9fe928-2ee4-486d-a8c6-692169a02f42-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f9fe928-2ee4-486d-a8c6-692169a02f42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.428394 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6f9fe928-2ee4-486d-a8c6-692169a02f42-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f9fe928-2ee4-486d-a8c6-692169a02f42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.428443 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6f9fe928-2ee4-486d-a8c6-692169a02f42-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f9fe928-2ee4-486d-a8c6-692169a02f42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.428470 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6f9fe928-2ee4-486d-a8c6-692169a02f42-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f9fe928-2ee4-486d-a8c6-692169a02f42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.428489 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6f9fe928-2ee4-486d-a8c6-692169a02f42-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f9fe928-2ee4-486d-a8c6-692169a02f42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.428557 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6f9fe928-2ee4-486d-a8c6-692169a02f42-rabbitmq-tls\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"6f9fe928-2ee4-486d-a8c6-692169a02f42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.428605 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6f9fe928-2ee4-486d-a8c6-692169a02f42-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f9fe928-2ee4-486d-a8c6-692169a02f42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.428629 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6f9fe928-2ee4-486d-a8c6-692169a02f42-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f9fe928-2ee4-486d-a8c6-692169a02f42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.428701 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6f9fe928-2ee4-486d-a8c6-692169a02f42-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f9fe928-2ee4-486d-a8c6-692169a02f42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.429024 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6f9fe928-2ee4-486d-a8c6-692169a02f42-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f9fe928-2ee4-486d-a8c6-692169a02f42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.429836 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f9fe928-2ee4-486d-a8c6-692169a02f42-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f9fe928-2ee4-486d-a8c6-692169a02f42\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.430375 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6f9fe928-2ee4-486d-a8c6-692169a02f42-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f9fe928-2ee4-486d-a8c6-692169a02f42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.430610 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6f9fe928-2ee4-486d-a8c6-692169a02f42-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f9fe928-2ee4-486d-a8c6-692169a02f42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.430963 4835 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.430992 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-73ddda24-4d68-40ee-afda-4c6872e6269f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73ddda24-4d68-40ee-afda-4c6872e6269f\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f9fe928-2ee4-486d-a8c6-692169a02f42\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/65fb8dd4e3e5bb0348221bd046f190ee81283e3e9c70b8d7f16685b3389c205f/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.433029 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6f9fe928-2ee4-486d-a8c6-692169a02f42-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f9fe928-2ee4-486d-a8c6-692169a02f42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 
09:52:43.433122 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6f9fe928-2ee4-486d-a8c6-692169a02f42-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f9fe928-2ee4-486d-a8c6-692169a02f42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.433655 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6f9fe928-2ee4-486d-a8c6-692169a02f42-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f9fe928-2ee4-486d-a8c6-692169a02f42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.435899 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6f9fe928-2ee4-486d-a8c6-692169a02f42-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f9fe928-2ee4-486d-a8c6-692169a02f42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.443073 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnc96\" (UniqueName: \"kubernetes.io/projected/6f9fe928-2ee4-486d-a8c6-692169a02f42-kube-api-access-vnc96\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f9fe928-2ee4-486d-a8c6-692169a02f42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.502992 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-73ddda24-4d68-40ee-afda-4c6872e6269f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73ddda24-4d68-40ee-afda-4c6872e6269f\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f9fe928-2ee4-486d-a8c6-692169a02f42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.513343 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.593245 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 19 09:52:43 crc kubenswrapper[4835]: W0319 09:52:43.623396 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3ab82b9_82ca_4e95_a3a0_1854edb16b7b.slice/crio-0468a409d262e2436fafc627475d4b2c2155b32ab5e5d9087923bbdffe141de1 WatchSource:0}: Error finding container 0468a409d262e2436fafc627475d4b2c2155b32ab5e5d9087923bbdffe141de1: Status 404 returned error can't find the container with id 0468a409d262e2436fafc627475d4b2c2155b32ab5e5d9087923bbdffe141de1 Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.999016 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2687a33-8ba7-4a21-9906-84de49945433","Type":"ContainerStarted","Data":"a37e3111e2b99d99ea470c4f5e1d51c1f65c0bcfbedf6265863290e7a123168a"} Mar 19 09:52:43 crc kubenswrapper[4835]: I0319 09:52:43.999273 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2687a33-8ba7-4a21-9906-84de49945433","Type":"ContainerStarted","Data":"110aa42509437eb4825da61f2dedc7f9dc1d657aa826faf30fd91d9e8f4a59d1"} Mar 19 09:52:44 crc kubenswrapper[4835]: I0319 09:52:44.002968 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b","Type":"ContainerStarted","Data":"0468a409d262e2436fafc627475d4b2c2155b32ab5e5d9087923bbdffe141de1"} Mar 19 09:52:44 crc kubenswrapper[4835]: I0319 09:52:44.004454 4835 generic.go:334] "Generic (PLEG): container finished" podID="21edff65-df53-464d-b17f-caa8d2735a4c" containerID="5fe9b67bdd38670c1bf425363d8129b9618de6cd46b8e69eaac40ca043792626" exitCode=0 Mar 19 09:52:44 crc kubenswrapper[4835]: I0319 09:52:44.004488 4835 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-v94x8" event={"ID":"21edff65-df53-464d-b17f-caa8d2735a4c","Type":"ContainerDied","Data":"5fe9b67bdd38670c1bf425363d8129b9618de6cd46b8e69eaac40ca043792626"} Mar 19 09:52:44 crc kubenswrapper[4835]: I0319 09:52:44.004510 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-v94x8" event={"ID":"21edff65-df53-464d-b17f-caa8d2735a4c","Type":"ContainerStarted","Data":"d94f4ee40da5ee680107de593f95e02be5c871cae875169e85a3d27f3ec96952"} Mar 19 09:52:44 crc kubenswrapper[4835]: W0319 09:52:44.011402 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f9fe928_2ee4_486d_a8c6_692169a02f42.slice/crio-01d9aca873d7ef10d90fa411d66fee88ce36bafdbe4d068cd0ba74b91aa40811 WatchSource:0}: Error finding container 01d9aca873d7ef10d90fa411d66fee88ce36bafdbe4d068cd0ba74b91aa40811: Status 404 returned error can't find the container with id 01d9aca873d7ef10d90fa411d66fee88ce36bafdbe4d068cd0ba74b91aa40811 Mar 19 09:52:44 crc kubenswrapper[4835]: I0319 09:52:44.018958 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 09:52:44 crc kubenswrapper[4835]: I0319 09:52:44.421702 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f773ac48-7b51-427e-9c89-34e515bddabb" path="/var/lib/kubelet/pods/f773ac48-7b51-427e-9c89-34e515bddabb/volumes" Mar 19 09:52:45 crc kubenswrapper[4835]: I0319 09:52:45.017254 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-v94x8" event={"ID":"21edff65-df53-464d-b17f-caa8d2735a4c","Type":"ContainerStarted","Data":"2e755264b58933ddf9c1c2c1d2a1582bc30553d968ee2611e11f648599b2c68d"} Mar 19 09:52:45 crc kubenswrapper[4835]: I0319 09:52:45.017597 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d84b4d45c-v94x8" Mar 19 09:52:45 
crc kubenswrapper[4835]: I0319 09:52:45.018969 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6f9fe928-2ee4-486d-a8c6-692169a02f42","Type":"ContainerStarted","Data":"01d9aca873d7ef10d90fa411d66fee88ce36bafdbe4d068cd0ba74b91aa40811"} Mar 19 09:52:45 crc kubenswrapper[4835]: I0319 09:52:45.068470 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d84b4d45c-v94x8" podStartSLOduration=7.068442774 podStartE2EDuration="7.068442774s" podCreationTimestamp="2026-03-19 09:52:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:52:45.056201574 +0000 UTC m=+1819.904800161" watchObservedRunningTime="2026-03-19 09:52:45.068442774 +0000 UTC m=+1819.917041371" Mar 19 09:52:45 crc kubenswrapper[4835]: I0319 09:52:45.428983 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="eb64fa11-313c-4a5f-9f8f-0b1f3ebf7185" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.136:5671: i/o timeout" Mar 19 09:52:46 crc kubenswrapper[4835]: I0319 09:52:46.032447 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2687a33-8ba7-4a21-9906-84de49945433","Type":"ContainerStarted","Data":"831ed57db88dee11af1ef73522ccbbe89e622636ab0e2609faee59924c114d3a"} Mar 19 09:52:46 crc kubenswrapper[4835]: I0319 09:52:46.412945 4835 scope.go:117] "RemoveContainer" containerID="d93f2f0fef5a3fe52d6e4aab02e5290ac85405643bc520caaef82b7b23fd8ee3" Mar 19 09:52:46 crc kubenswrapper[4835]: E0319 09:52:46.413326 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 09:52:48 crc kubenswrapper[4835]: I0319 09:52:48.059773 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2687a33-8ba7-4a21-9906-84de49945433","Type":"ContainerStarted","Data":"2876bc1c9fa7e14c6093c48d7a8292f1d708c123d134c93fe19d2b92d2a6ca68"} Mar 19 09:52:48 crc kubenswrapper[4835]: I0319 09:52:48.060337 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 09:52:48 crc kubenswrapper[4835]: I0319 09:52:48.062369 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b","Type":"ContainerStarted","Data":"3200830f5c81ee49c292ec8df18a65606719b93a4cc0a20fc03fc610ed5b65fa"} Mar 19 09:52:48 crc kubenswrapper[4835]: I0319 09:52:48.065627 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6f9fe928-2ee4-486d-a8c6-692169a02f42","Type":"ContainerStarted","Data":"2ebbf5795d38add7bab364c06c63adba1f9c657df4d0aebe2aa451145a796230"} Mar 19 09:52:48 crc kubenswrapper[4835]: I0319 09:52:48.089582 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.638268336 podStartE2EDuration="24.089559477s" podCreationTimestamp="2026-03-19 09:52:24 +0000 UTC" firstStartedPulling="2026-03-19 09:52:25.692710064 +0000 UTC m=+1800.541308651" lastFinishedPulling="2026-03-19 09:52:47.144001205 +0000 UTC m=+1821.992599792" observedRunningTime="2026-03-19 09:52:48.086109885 +0000 UTC m=+1822.934708512" watchObservedRunningTime="2026-03-19 09:52:48.089559477 +0000 UTC m=+1822.938158074" Mar 19 09:52:50 crc kubenswrapper[4835]: I0319 09:52:50.882449 4835 scope.go:117] 
"RemoveContainer" containerID="df9838883e700d81230f5a239c45931c4f3e81bf7b9efe1db7e3363ab93184eb" Mar 19 09:52:50 crc kubenswrapper[4835]: I0319 09:52:50.932151 4835 scope.go:117] "RemoveContainer" containerID="34f6c9d3b2fab0858fb310e67b666ba734c54affa7c126ef4036c0e119670a7d" Mar 19 09:52:50 crc kubenswrapper[4835]: I0319 09:52:50.974828 4835 scope.go:117] "RemoveContainer" containerID="6f87370d06faf202422cfcf749c8b5a7c7ebf90d15961bba40f1d378d1b30ec7" Mar 19 09:52:51 crc kubenswrapper[4835]: I0319 09:52:51.080190 4835 scope.go:117] "RemoveContainer" containerID="a9a4096a50aaca1e5dd81597596eb85a6d8c157de3485cff6980053ff8d6a8a0" Mar 19 09:52:53 crc kubenswrapper[4835]: I0319 09:52:53.780585 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d84b4d45c-v94x8" Mar 19 09:52:53 crc kubenswrapper[4835]: I0319 09:52:53.872827 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-52r8x"] Mar 19 09:52:53 crc kubenswrapper[4835]: I0319 09:52:53.873495 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7bbf7cf9-52r8x" podUID="0c128061-db8c-4a2a-84a7-38c892394c54" containerName="dnsmasq-dns" containerID="cri-o://1ae2a04a8a094edc99eb24f5c8dafe349f2fcf34c3a75e7ce6ed42af9bf3ace4" gracePeriod=10 Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.025411 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-ln7sh"] Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.027537 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f6df4f56c-ln7sh" Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.058217 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-ln7sh"] Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.146042 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/508dcf13-80b0-4ac1-ba8e-42070d0fe929-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-ln7sh\" (UID: \"508dcf13-80b0-4ac1-ba8e-42070d0fe929\") " pod="openstack/dnsmasq-dns-6f6df4f56c-ln7sh" Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.146657 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq4wx\" (UniqueName: \"kubernetes.io/projected/508dcf13-80b0-4ac1-ba8e-42070d0fe929-kube-api-access-kq4wx\") pod \"dnsmasq-dns-6f6df4f56c-ln7sh\" (UID: \"508dcf13-80b0-4ac1-ba8e-42070d0fe929\") " pod="openstack/dnsmasq-dns-6f6df4f56c-ln7sh" Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.146810 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/508dcf13-80b0-4ac1-ba8e-42070d0fe929-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-ln7sh\" (UID: \"508dcf13-80b0-4ac1-ba8e-42070d0fe929\") " pod="openstack/dnsmasq-dns-6f6df4f56c-ln7sh" Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.146926 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/508dcf13-80b0-4ac1-ba8e-42070d0fe929-config\") pod \"dnsmasq-dns-6f6df4f56c-ln7sh\" (UID: \"508dcf13-80b0-4ac1-ba8e-42070d0fe929\") " pod="openstack/dnsmasq-dns-6f6df4f56c-ln7sh" Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.147038 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/508dcf13-80b0-4ac1-ba8e-42070d0fe929-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-ln7sh\" (UID: \"508dcf13-80b0-4ac1-ba8e-42070d0fe929\") " pod="openstack/dnsmasq-dns-6f6df4f56c-ln7sh" Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.147113 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/508dcf13-80b0-4ac1-ba8e-42070d0fe929-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-ln7sh\" (UID: \"508dcf13-80b0-4ac1-ba8e-42070d0fe929\") " pod="openstack/dnsmasq-dns-6f6df4f56c-ln7sh" Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.147256 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/508dcf13-80b0-4ac1-ba8e-42070d0fe929-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-ln7sh\" (UID: \"508dcf13-80b0-4ac1-ba8e-42070d0fe929\") " pod="openstack/dnsmasq-dns-6f6df4f56c-ln7sh" Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.156853 4835 generic.go:334] "Generic (PLEG): container finished" podID="0c128061-db8c-4a2a-84a7-38c892394c54" containerID="1ae2a04a8a094edc99eb24f5c8dafe349f2fcf34c3a75e7ce6ed42af9bf3ace4" exitCode=0 Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.156900 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-52r8x" event={"ID":"0c128061-db8c-4a2a-84a7-38c892394c54","Type":"ContainerDied","Data":"1ae2a04a8a094edc99eb24f5c8dafe349f2fcf34c3a75e7ce6ed42af9bf3ace4"} Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.249844 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/508dcf13-80b0-4ac1-ba8e-42070d0fe929-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-ln7sh\" (UID: 
\"508dcf13-80b0-4ac1-ba8e-42070d0fe929\") " pod="openstack/dnsmasq-dns-6f6df4f56c-ln7sh" Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.249925 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/508dcf13-80b0-4ac1-ba8e-42070d0fe929-config\") pod \"dnsmasq-dns-6f6df4f56c-ln7sh\" (UID: \"508dcf13-80b0-4ac1-ba8e-42070d0fe929\") " pod="openstack/dnsmasq-dns-6f6df4f56c-ln7sh" Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.249955 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/508dcf13-80b0-4ac1-ba8e-42070d0fe929-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-ln7sh\" (UID: \"508dcf13-80b0-4ac1-ba8e-42070d0fe929\") " pod="openstack/dnsmasq-dns-6f6df4f56c-ln7sh" Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.250029 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/508dcf13-80b0-4ac1-ba8e-42070d0fe929-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-ln7sh\" (UID: \"508dcf13-80b0-4ac1-ba8e-42070d0fe929\") " pod="openstack/dnsmasq-dns-6f6df4f56c-ln7sh" Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.250213 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/508dcf13-80b0-4ac1-ba8e-42070d0fe929-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-ln7sh\" (UID: \"508dcf13-80b0-4ac1-ba8e-42070d0fe929\") " pod="openstack/dnsmasq-dns-6f6df4f56c-ln7sh" Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.250352 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/508dcf13-80b0-4ac1-ba8e-42070d0fe929-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-ln7sh\" (UID: \"508dcf13-80b0-4ac1-ba8e-42070d0fe929\") " 
pod="openstack/dnsmasq-dns-6f6df4f56c-ln7sh" Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.250496 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq4wx\" (UniqueName: \"kubernetes.io/projected/508dcf13-80b0-4ac1-ba8e-42070d0fe929-kube-api-access-kq4wx\") pod \"dnsmasq-dns-6f6df4f56c-ln7sh\" (UID: \"508dcf13-80b0-4ac1-ba8e-42070d0fe929\") " pod="openstack/dnsmasq-dns-6f6df4f56c-ln7sh" Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.251806 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/508dcf13-80b0-4ac1-ba8e-42070d0fe929-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-ln7sh\" (UID: \"508dcf13-80b0-4ac1-ba8e-42070d0fe929\") " pod="openstack/dnsmasq-dns-6f6df4f56c-ln7sh" Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.251863 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/508dcf13-80b0-4ac1-ba8e-42070d0fe929-config\") pod \"dnsmasq-dns-6f6df4f56c-ln7sh\" (UID: \"508dcf13-80b0-4ac1-ba8e-42070d0fe929\") " pod="openstack/dnsmasq-dns-6f6df4f56c-ln7sh" Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.252000 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/508dcf13-80b0-4ac1-ba8e-42070d0fe929-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-ln7sh\" (UID: \"508dcf13-80b0-4ac1-ba8e-42070d0fe929\") " pod="openstack/dnsmasq-dns-6f6df4f56c-ln7sh" Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.252022 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/508dcf13-80b0-4ac1-ba8e-42070d0fe929-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-ln7sh\" (UID: \"508dcf13-80b0-4ac1-ba8e-42070d0fe929\") " pod="openstack/dnsmasq-dns-6f6df4f56c-ln7sh" Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 
09:52:54.252134 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/508dcf13-80b0-4ac1-ba8e-42070d0fe929-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-ln7sh\" (UID: \"508dcf13-80b0-4ac1-ba8e-42070d0fe929\") " pod="openstack/dnsmasq-dns-6f6df4f56c-ln7sh" Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.252348 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/508dcf13-80b0-4ac1-ba8e-42070d0fe929-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-ln7sh\" (UID: \"508dcf13-80b0-4ac1-ba8e-42070d0fe929\") " pod="openstack/dnsmasq-dns-6f6df4f56c-ln7sh" Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.275907 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq4wx\" (UniqueName: \"kubernetes.io/projected/508dcf13-80b0-4ac1-ba8e-42070d0fe929-kube-api-access-kq4wx\") pod \"dnsmasq-dns-6f6df4f56c-ln7sh\" (UID: \"508dcf13-80b0-4ac1-ba8e-42070d0fe929\") " pod="openstack/dnsmasq-dns-6f6df4f56c-ln7sh" Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.381554 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f6df4f56c-ln7sh" Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.551098 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-52r8x" Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.661835 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c128061-db8c-4a2a-84a7-38c892394c54-dns-svc\") pod \"0c128061-db8c-4a2a-84a7-38c892394c54\" (UID: \"0c128061-db8c-4a2a-84a7-38c892394c54\") " Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.662221 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0c128061-db8c-4a2a-84a7-38c892394c54-dns-swift-storage-0\") pod \"0c128061-db8c-4a2a-84a7-38c892394c54\" (UID: \"0c128061-db8c-4a2a-84a7-38c892394c54\") " Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.662254 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0c128061-db8c-4a2a-84a7-38c892394c54-ovsdbserver-sb\") pod \"0c128061-db8c-4a2a-84a7-38c892394c54\" (UID: \"0c128061-db8c-4a2a-84a7-38c892394c54\") " Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.662329 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jr77\" (UniqueName: \"kubernetes.io/projected/0c128061-db8c-4a2a-84a7-38c892394c54-kube-api-access-6jr77\") pod \"0c128061-db8c-4a2a-84a7-38c892394c54\" (UID: \"0c128061-db8c-4a2a-84a7-38c892394c54\") " Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.662443 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c128061-db8c-4a2a-84a7-38c892394c54-config\") pod \"0c128061-db8c-4a2a-84a7-38c892394c54\" (UID: \"0c128061-db8c-4a2a-84a7-38c892394c54\") " Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.662463 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/0c128061-db8c-4a2a-84a7-38c892394c54-ovsdbserver-nb\") pod \"0c128061-db8c-4a2a-84a7-38c892394c54\" (UID: \"0c128061-db8c-4a2a-84a7-38c892394c54\") " Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.667769 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c128061-db8c-4a2a-84a7-38c892394c54-kube-api-access-6jr77" (OuterVolumeSpecName: "kube-api-access-6jr77") pod "0c128061-db8c-4a2a-84a7-38c892394c54" (UID: "0c128061-db8c-4a2a-84a7-38c892394c54"). InnerVolumeSpecName "kube-api-access-6jr77". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.745144 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c128061-db8c-4a2a-84a7-38c892394c54-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0c128061-db8c-4a2a-84a7-38c892394c54" (UID: "0c128061-db8c-4a2a-84a7-38c892394c54"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.754296 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c128061-db8c-4a2a-84a7-38c892394c54-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0c128061-db8c-4a2a-84a7-38c892394c54" (UID: "0c128061-db8c-4a2a-84a7-38c892394c54"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.768805 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jr77\" (UniqueName: \"kubernetes.io/projected/0c128061-db8c-4a2a-84a7-38c892394c54-kube-api-access-6jr77\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.768849 4835 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0c128061-db8c-4a2a-84a7-38c892394c54-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.768863 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0c128061-db8c-4a2a-84a7-38c892394c54-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.770280 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c128061-db8c-4a2a-84a7-38c892394c54-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0c128061-db8c-4a2a-84a7-38c892394c54" (UID: "0c128061-db8c-4a2a-84a7-38c892394c54"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.784256 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c128061-db8c-4a2a-84a7-38c892394c54-config" (OuterVolumeSpecName: "config") pod "0c128061-db8c-4a2a-84a7-38c892394c54" (UID: "0c128061-db8c-4a2a-84a7-38c892394c54"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.809569 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c128061-db8c-4a2a-84a7-38c892394c54-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0c128061-db8c-4a2a-84a7-38c892394c54" (UID: "0c128061-db8c-4a2a-84a7-38c892394c54"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.871869 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c128061-db8c-4a2a-84a7-38c892394c54-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.871908 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0c128061-db8c-4a2a-84a7-38c892394c54-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.871920 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0c128061-db8c-4a2a-84a7-38c892394c54-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:54 crc kubenswrapper[4835]: I0319 09:52:54.939104 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-ln7sh"] Mar 19 09:52:55 crc kubenswrapper[4835]: I0319 09:52:55.169382 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-52r8x" event={"ID":"0c128061-db8c-4a2a-84a7-38c892394c54","Type":"ContainerDied","Data":"55d6383525884326a1e3ad8dd5292eef162f6b6c5a5641d2c666b2cd31e3cac0"} Mar 19 09:52:55 crc kubenswrapper[4835]: I0319 09:52:55.169453 4835 scope.go:117] "RemoveContainer" containerID="1ae2a04a8a094edc99eb24f5c8dafe349f2fcf34c3a75e7ce6ed42af9bf3ace4" Mar 19 09:52:55 crc kubenswrapper[4835]: I0319 09:52:55.170142 4835 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-52r8x" Mar 19 09:52:55 crc kubenswrapper[4835]: I0319 09:52:55.172762 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-ln7sh" event={"ID":"508dcf13-80b0-4ac1-ba8e-42070d0fe929","Type":"ContainerStarted","Data":"c9b5cbc3b20b45ecc26f769d34488a7a8d42e57d4cd41045a10bb6e16710742f"} Mar 19 09:52:55 crc kubenswrapper[4835]: I0319 09:52:55.200222 4835 scope.go:117] "RemoveContainer" containerID="8f44ba384351f457566a7505c99e0f53aebfb47044d3b29d5fe83c9a67b7eb6c" Mar 19 09:52:55 crc kubenswrapper[4835]: I0319 09:52:55.218973 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-52r8x"] Mar 19 09:52:55 crc kubenswrapper[4835]: I0319 09:52:55.234146 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-52r8x"] Mar 19 09:52:56 crc kubenswrapper[4835]: I0319 09:52:56.197396 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-ck565" event={"ID":"9a5db77c-86cf-4661-8fe4-be1b865884a2","Type":"ContainerStarted","Data":"fb85c8e07fa0ec0e6ba5c6374a58cef371ebadc9572cfbc79add1e2cb829d836"} Mar 19 09:52:56 crc kubenswrapper[4835]: I0319 09:52:56.201839 4835 generic.go:334] "Generic (PLEG): container finished" podID="508dcf13-80b0-4ac1-ba8e-42070d0fe929" containerID="ab0cb064d61ef1792ebb0468a6dd1079c0ea040db498fe5e49d6e313ac879f1f" exitCode=0 Mar 19 09:52:56 crc kubenswrapper[4835]: I0319 09:52:56.201879 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-ln7sh" event={"ID":"508dcf13-80b0-4ac1-ba8e-42070d0fe929","Type":"ContainerDied","Data":"ab0cb064d61ef1792ebb0468a6dd1079c0ea040db498fe5e49d6e313ac879f1f"} Mar 19 09:52:56 crc kubenswrapper[4835]: I0319 09:52:56.227140 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-ck565" podStartSLOduration=1.67435131 
podStartE2EDuration="36.227116478s" podCreationTimestamp="2026-03-19 09:52:20 +0000 UTC" firstStartedPulling="2026-03-19 09:52:21.028223628 +0000 UTC m=+1795.876822215" lastFinishedPulling="2026-03-19 09:52:55.580988796 +0000 UTC m=+1830.429587383" observedRunningTime="2026-03-19 09:52:56.221087226 +0000 UTC m=+1831.069685833" watchObservedRunningTime="2026-03-19 09:52:56.227116478 +0000 UTC m=+1831.075715065" Mar 19 09:52:56 crc kubenswrapper[4835]: I0319 09:52:56.473627 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c128061-db8c-4a2a-84a7-38c892394c54" path="/var/lib/kubelet/pods/0c128061-db8c-4a2a-84a7-38c892394c54/volumes" Mar 19 09:52:57 crc kubenswrapper[4835]: I0319 09:52:57.214612 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-ln7sh" event={"ID":"508dcf13-80b0-4ac1-ba8e-42070d0fe929","Type":"ContainerStarted","Data":"b9e8120a52587d5f50e87278768f6e4fb9a9aa03d3cf817ce98b1553f8f2450f"} Mar 19 09:52:57 crc kubenswrapper[4835]: I0319 09:52:57.215111 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f6df4f56c-ln7sh" Mar 19 09:52:57 crc kubenswrapper[4835]: I0319 09:52:57.235714 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f6df4f56c-ln7sh" podStartSLOduration=4.235696943 podStartE2EDuration="4.235696943s" podCreationTimestamp="2026-03-19 09:52:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:52:57.229481796 +0000 UTC m=+1832.078080413" watchObservedRunningTime="2026-03-19 09:52:57.235696943 +0000 UTC m=+1832.084295530" Mar 19 09:52:58 crc kubenswrapper[4835]: I0319 09:52:58.232683 4835 generic.go:334] "Generic (PLEG): container finished" podID="9a5db77c-86cf-4661-8fe4-be1b865884a2" containerID="fb85c8e07fa0ec0e6ba5c6374a58cef371ebadc9572cfbc79add1e2cb829d836" exitCode=0 Mar 19 09:52:58 
crc kubenswrapper[4835]: I0319 09:52:58.232772 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-ck565" event={"ID":"9a5db77c-86cf-4661-8fe4-be1b865884a2","Type":"ContainerDied","Data":"fb85c8e07fa0ec0e6ba5c6374a58cef371ebadc9572cfbc79add1e2cb829d836"} Mar 19 09:52:59 crc kubenswrapper[4835]: I0319 09:52:59.403610 4835 scope.go:117] "RemoveContainer" containerID="d93f2f0fef5a3fe52d6e4aab02e5290ac85405643bc520caaef82b7b23fd8ee3" Mar 19 09:52:59 crc kubenswrapper[4835]: E0319 09:52:59.403992 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 09:52:59 crc kubenswrapper[4835]: I0319 09:52:59.461899 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6b7bbf7cf9-52r8x" podUID="0c128061-db8c-4a2a-84a7-38c892394c54" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.9:5353: i/o timeout" Mar 19 09:52:59 crc kubenswrapper[4835]: I0319 09:52:59.704552 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-ck565" Mar 19 09:52:59 crc kubenswrapper[4835]: I0319 09:52:59.827008 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a5db77c-86cf-4661-8fe4-be1b865884a2-config-data\") pod \"9a5db77c-86cf-4661-8fe4-be1b865884a2\" (UID: \"9a5db77c-86cf-4661-8fe4-be1b865884a2\") " Mar 19 09:52:59 crc kubenswrapper[4835]: I0319 09:52:59.827097 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbfm7\" (UniqueName: \"kubernetes.io/projected/9a5db77c-86cf-4661-8fe4-be1b865884a2-kube-api-access-wbfm7\") pod \"9a5db77c-86cf-4661-8fe4-be1b865884a2\" (UID: \"9a5db77c-86cf-4661-8fe4-be1b865884a2\") " Mar 19 09:52:59 crc kubenswrapper[4835]: I0319 09:52:59.827179 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a5db77c-86cf-4661-8fe4-be1b865884a2-combined-ca-bundle\") pod \"9a5db77c-86cf-4661-8fe4-be1b865884a2\" (UID: \"9a5db77c-86cf-4661-8fe4-be1b865884a2\") " Mar 19 09:52:59 crc kubenswrapper[4835]: I0319 09:52:59.832859 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a5db77c-86cf-4661-8fe4-be1b865884a2-kube-api-access-wbfm7" (OuterVolumeSpecName: "kube-api-access-wbfm7") pod "9a5db77c-86cf-4661-8fe4-be1b865884a2" (UID: "9a5db77c-86cf-4661-8fe4-be1b865884a2"). InnerVolumeSpecName "kube-api-access-wbfm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:52:59 crc kubenswrapper[4835]: I0319 09:52:59.864938 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a5db77c-86cf-4661-8fe4-be1b865884a2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a5db77c-86cf-4661-8fe4-be1b865884a2" (UID: "9a5db77c-86cf-4661-8fe4-be1b865884a2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:52:59 crc kubenswrapper[4835]: I0319 09:52:59.930852 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbfm7\" (UniqueName: \"kubernetes.io/projected/9a5db77c-86cf-4661-8fe4-be1b865884a2-kube-api-access-wbfm7\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:59 crc kubenswrapper[4835]: I0319 09:52:59.930891 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a5db77c-86cf-4661-8fe4-be1b865884a2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:52:59 crc kubenswrapper[4835]: I0319 09:52:59.957149 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a5db77c-86cf-4661-8fe4-be1b865884a2-config-data" (OuterVolumeSpecName: "config-data") pod "9a5db77c-86cf-4661-8fe4-be1b865884a2" (UID: "9a5db77c-86cf-4661-8fe4-be1b865884a2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:00 crc kubenswrapper[4835]: I0319 09:53:00.034305 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a5db77c-86cf-4661-8fe4-be1b865884a2-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:00 crc kubenswrapper[4835]: I0319 09:53:00.255064 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-ck565" event={"ID":"9a5db77c-86cf-4661-8fe4-be1b865884a2","Type":"ContainerDied","Data":"435f5286ffa2f37667a008205e63a1708eb607c34c58e2c58ccb75601cf25c3e"} Mar 19 09:53:00 crc kubenswrapper[4835]: I0319 09:53:00.255373 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="435f5286ffa2f37667a008205e63a1708eb607c34c58e2c58ccb75601cf25c3e" Mar 19 09:53:00 crc kubenswrapper[4835]: I0319 09:53:00.255306 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-ck565" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.207323 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-6d45648c65-jxdbn"] Mar 19 09:53:01 crc kubenswrapper[4835]: E0319 09:53:01.207926 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c128061-db8c-4a2a-84a7-38c892394c54" containerName="init" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.207945 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c128061-db8c-4a2a-84a7-38c892394c54" containerName="init" Mar 19 09:53:01 crc kubenswrapper[4835]: E0319 09:53:01.207998 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c128061-db8c-4a2a-84a7-38c892394c54" containerName="dnsmasq-dns" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.208006 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c128061-db8c-4a2a-84a7-38c892394c54" containerName="dnsmasq-dns" Mar 19 09:53:01 crc kubenswrapper[4835]: E0319 09:53:01.208017 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a5db77c-86cf-4661-8fe4-be1b865884a2" containerName="heat-db-sync" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.208025 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a5db77c-86cf-4661-8fe4-be1b865884a2" containerName="heat-db-sync" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.208310 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a5db77c-86cf-4661-8fe4-be1b865884a2" containerName="heat-db-sync" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.208346 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c128061-db8c-4a2a-84a7-38c892394c54" containerName="dnsmasq-dns" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.210243 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-6d45648c65-jxdbn" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.231953 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6d45648c65-jxdbn"] Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.283920 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-7756684d79-4fxz4"] Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.286066 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7756684d79-4fxz4" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.303106 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-7556f74cf6-zw6cb"] Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.305099 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7556f74cf6-zw6cb" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.354099 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7756684d79-4fxz4"] Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.371162 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7556f74cf6-zw6cb"] Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.372572 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c2ee5e0-f8fe-4e29-a951-e9c8a5817cfa-combined-ca-bundle\") pod \"heat-engine-6d45648c65-jxdbn\" (UID: \"1c2ee5e0-f8fe-4e29-a951-e9c8a5817cfa\") " pod="openstack/heat-engine-6d45648c65-jxdbn" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.372674 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58987da0-27e0-4534-9011-1c1cfd751b68-combined-ca-bundle\") pod \"heat-api-7756684d79-4fxz4\" (UID: 
\"58987da0-27e0-4534-9011-1c1cfd751b68\") " pod="openstack/heat-api-7756684d79-4fxz4" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.372765 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58987da0-27e0-4534-9011-1c1cfd751b68-config-data\") pod \"heat-api-7756684d79-4fxz4\" (UID: \"58987da0-27e0-4534-9011-1c1cfd751b68\") " pod="openstack/heat-api-7756684d79-4fxz4" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.372829 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79cnw\" (UniqueName: \"kubernetes.io/projected/1c2ee5e0-f8fe-4e29-a951-e9c8a5817cfa-kube-api-access-79cnw\") pod \"heat-engine-6d45648c65-jxdbn\" (UID: \"1c2ee5e0-f8fe-4e29-a951-e9c8a5817cfa\") " pod="openstack/heat-engine-6d45648c65-jxdbn" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.373003 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58987da0-27e0-4534-9011-1c1cfd751b68-config-data-custom\") pod \"heat-api-7756684d79-4fxz4\" (UID: \"58987da0-27e0-4534-9011-1c1cfd751b68\") " pod="openstack/heat-api-7756684d79-4fxz4" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.373049 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58987da0-27e0-4534-9011-1c1cfd751b68-internal-tls-certs\") pod \"heat-api-7756684d79-4fxz4\" (UID: \"58987da0-27e0-4534-9011-1c1cfd751b68\") " pod="openstack/heat-api-7756684d79-4fxz4" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.373403 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cc65\" (UniqueName: 
\"kubernetes.io/projected/58987da0-27e0-4534-9011-1c1cfd751b68-kube-api-access-5cc65\") pod \"heat-api-7756684d79-4fxz4\" (UID: \"58987da0-27e0-4534-9011-1c1cfd751b68\") " pod="openstack/heat-api-7756684d79-4fxz4" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.374266 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c2ee5e0-f8fe-4e29-a951-e9c8a5817cfa-config-data\") pod \"heat-engine-6d45648c65-jxdbn\" (UID: \"1c2ee5e0-f8fe-4e29-a951-e9c8a5817cfa\") " pod="openstack/heat-engine-6d45648c65-jxdbn" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.374366 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c2ee5e0-f8fe-4e29-a951-e9c8a5817cfa-config-data-custom\") pod \"heat-engine-6d45648c65-jxdbn\" (UID: \"1c2ee5e0-f8fe-4e29-a951-e9c8a5817cfa\") " pod="openstack/heat-engine-6d45648c65-jxdbn" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.374436 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58987da0-27e0-4534-9011-1c1cfd751b68-public-tls-certs\") pod \"heat-api-7756684d79-4fxz4\" (UID: \"58987da0-27e0-4534-9011-1c1cfd751b68\") " pod="openstack/heat-api-7756684d79-4fxz4" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.476957 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c2ee5e0-f8fe-4e29-a951-e9c8a5817cfa-config-data\") pod \"heat-engine-6d45648c65-jxdbn\" (UID: \"1c2ee5e0-f8fe-4e29-a951-e9c8a5817cfa\") " pod="openstack/heat-engine-6d45648c65-jxdbn" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.477357 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/1c2ee5e0-f8fe-4e29-a951-e9c8a5817cfa-config-data-custom\") pod \"heat-engine-6d45648c65-jxdbn\" (UID: \"1c2ee5e0-f8fe-4e29-a951-e9c8a5817cfa\") " pod="openstack/heat-engine-6d45648c65-jxdbn" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.477388 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/abd86360-252d-42c1-bdb5-814ccfca3bac-config-data-custom\") pod \"heat-cfnapi-7556f74cf6-zw6cb\" (UID: \"abd86360-252d-42c1-bdb5-814ccfca3bac\") " pod="openstack/heat-cfnapi-7556f74cf6-zw6cb" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.477438 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58987da0-27e0-4534-9011-1c1cfd751b68-public-tls-certs\") pod \"heat-api-7756684d79-4fxz4\" (UID: \"58987da0-27e0-4534-9011-1c1cfd751b68\") " pod="openstack/heat-api-7756684d79-4fxz4" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.478294 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abd86360-252d-42c1-bdb5-814ccfca3bac-combined-ca-bundle\") pod \"heat-cfnapi-7556f74cf6-zw6cb\" (UID: \"abd86360-252d-42c1-bdb5-814ccfca3bac\") " pod="openstack/heat-cfnapi-7556f74cf6-zw6cb" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.478340 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abd86360-252d-42c1-bdb5-814ccfca3bac-config-data\") pod \"heat-cfnapi-7556f74cf6-zw6cb\" (UID: \"abd86360-252d-42c1-bdb5-814ccfca3bac\") " pod="openstack/heat-cfnapi-7556f74cf6-zw6cb" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.478367 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/1c2ee5e0-f8fe-4e29-a951-e9c8a5817cfa-combined-ca-bundle\") pod \"heat-engine-6d45648c65-jxdbn\" (UID: \"1c2ee5e0-f8fe-4e29-a951-e9c8a5817cfa\") " pod="openstack/heat-engine-6d45648c65-jxdbn" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.478439 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58987da0-27e0-4534-9011-1c1cfd751b68-combined-ca-bundle\") pod \"heat-api-7756684d79-4fxz4\" (UID: \"58987da0-27e0-4534-9011-1c1cfd751b68\") " pod="openstack/heat-api-7756684d79-4fxz4" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.478518 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58987da0-27e0-4534-9011-1c1cfd751b68-config-data\") pod \"heat-api-7756684d79-4fxz4\" (UID: \"58987da0-27e0-4534-9011-1c1cfd751b68\") " pod="openstack/heat-api-7756684d79-4fxz4" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.478550 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79cnw\" (UniqueName: \"kubernetes.io/projected/1c2ee5e0-f8fe-4e29-a951-e9c8a5817cfa-kube-api-access-79cnw\") pod \"heat-engine-6d45648c65-jxdbn\" (UID: \"1c2ee5e0-f8fe-4e29-a951-e9c8a5817cfa\") " pod="openstack/heat-engine-6d45648c65-jxdbn" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.478594 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/abd86360-252d-42c1-bdb5-814ccfca3bac-internal-tls-certs\") pod \"heat-cfnapi-7556f74cf6-zw6cb\" (UID: \"abd86360-252d-42c1-bdb5-814ccfca3bac\") " pod="openstack/heat-cfnapi-7556f74cf6-zw6cb" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.478645 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/58987da0-27e0-4534-9011-1c1cfd751b68-config-data-custom\") pod \"heat-api-7756684d79-4fxz4\" (UID: \"58987da0-27e0-4534-9011-1c1cfd751b68\") " pod="openstack/heat-api-7756684d79-4fxz4" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.478676 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58987da0-27e0-4534-9011-1c1cfd751b68-internal-tls-certs\") pod \"heat-api-7756684d79-4fxz4\" (UID: \"58987da0-27e0-4534-9011-1c1cfd751b68\") " pod="openstack/heat-api-7756684d79-4fxz4" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.478729 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99m6s\" (UniqueName: \"kubernetes.io/projected/abd86360-252d-42c1-bdb5-814ccfca3bac-kube-api-access-99m6s\") pod \"heat-cfnapi-7556f74cf6-zw6cb\" (UID: \"abd86360-252d-42c1-bdb5-814ccfca3bac\") " pod="openstack/heat-cfnapi-7556f74cf6-zw6cb" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.478852 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abd86360-252d-42c1-bdb5-814ccfca3bac-public-tls-certs\") pod \"heat-cfnapi-7556f74cf6-zw6cb\" (UID: \"abd86360-252d-42c1-bdb5-814ccfca3bac\") " pod="openstack/heat-cfnapi-7556f74cf6-zw6cb" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.478883 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cc65\" (UniqueName: \"kubernetes.io/projected/58987da0-27e0-4534-9011-1c1cfd751b68-kube-api-access-5cc65\") pod \"heat-api-7756684d79-4fxz4\" (UID: \"58987da0-27e0-4534-9011-1c1cfd751b68\") " pod="openstack/heat-api-7756684d79-4fxz4" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.482867 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/58987da0-27e0-4534-9011-1c1cfd751b68-internal-tls-certs\") pod \"heat-api-7756684d79-4fxz4\" (UID: \"58987da0-27e0-4534-9011-1c1cfd751b68\") " pod="openstack/heat-api-7756684d79-4fxz4" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.483288 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58987da0-27e0-4534-9011-1c1cfd751b68-config-data-custom\") pod \"heat-api-7756684d79-4fxz4\" (UID: \"58987da0-27e0-4534-9011-1c1cfd751b68\") " pod="openstack/heat-api-7756684d79-4fxz4" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.484224 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58987da0-27e0-4534-9011-1c1cfd751b68-config-data\") pod \"heat-api-7756684d79-4fxz4\" (UID: \"58987da0-27e0-4534-9011-1c1cfd751b68\") " pod="openstack/heat-api-7756684d79-4fxz4" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.484947 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c2ee5e0-f8fe-4e29-a951-e9c8a5817cfa-config-data-custom\") pod \"heat-engine-6d45648c65-jxdbn\" (UID: \"1c2ee5e0-f8fe-4e29-a951-e9c8a5817cfa\") " pod="openstack/heat-engine-6d45648c65-jxdbn" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.485778 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c2ee5e0-f8fe-4e29-a951-e9c8a5817cfa-config-data\") pod \"heat-engine-6d45648c65-jxdbn\" (UID: \"1c2ee5e0-f8fe-4e29-a951-e9c8a5817cfa\") " pod="openstack/heat-engine-6d45648c65-jxdbn" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.492373 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c2ee5e0-f8fe-4e29-a951-e9c8a5817cfa-combined-ca-bundle\") pod 
\"heat-engine-6d45648c65-jxdbn\" (UID: \"1c2ee5e0-f8fe-4e29-a951-e9c8a5817cfa\") " pod="openstack/heat-engine-6d45648c65-jxdbn" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.492989 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/58987da0-27e0-4534-9011-1c1cfd751b68-public-tls-certs\") pod \"heat-api-7756684d79-4fxz4\" (UID: \"58987da0-27e0-4534-9011-1c1cfd751b68\") " pod="openstack/heat-api-7756684d79-4fxz4" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.495798 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58987da0-27e0-4534-9011-1c1cfd751b68-combined-ca-bundle\") pod \"heat-api-7756684d79-4fxz4\" (UID: \"58987da0-27e0-4534-9011-1c1cfd751b68\") " pod="openstack/heat-api-7756684d79-4fxz4" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.497556 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79cnw\" (UniqueName: \"kubernetes.io/projected/1c2ee5e0-f8fe-4e29-a951-e9c8a5817cfa-kube-api-access-79cnw\") pod \"heat-engine-6d45648c65-jxdbn\" (UID: \"1c2ee5e0-f8fe-4e29-a951-e9c8a5817cfa\") " pod="openstack/heat-engine-6d45648c65-jxdbn" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.497683 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cc65\" (UniqueName: \"kubernetes.io/projected/58987da0-27e0-4534-9011-1c1cfd751b68-kube-api-access-5cc65\") pod \"heat-api-7756684d79-4fxz4\" (UID: \"58987da0-27e0-4534-9011-1c1cfd751b68\") " pod="openstack/heat-api-7756684d79-4fxz4" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.550696 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-6d45648c65-jxdbn" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.588090 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/abd86360-252d-42c1-bdb5-814ccfca3bac-config-data-custom\") pod \"heat-cfnapi-7556f74cf6-zw6cb\" (UID: \"abd86360-252d-42c1-bdb5-814ccfca3bac\") " pod="openstack/heat-cfnapi-7556f74cf6-zw6cb" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.588881 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abd86360-252d-42c1-bdb5-814ccfca3bac-combined-ca-bundle\") pod \"heat-cfnapi-7556f74cf6-zw6cb\" (UID: \"abd86360-252d-42c1-bdb5-814ccfca3bac\") " pod="openstack/heat-cfnapi-7556f74cf6-zw6cb" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.589099 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abd86360-252d-42c1-bdb5-814ccfca3bac-config-data\") pod \"heat-cfnapi-7556f74cf6-zw6cb\" (UID: \"abd86360-252d-42c1-bdb5-814ccfca3bac\") " pod="openstack/heat-cfnapi-7556f74cf6-zw6cb" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.590703 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/abd86360-252d-42c1-bdb5-814ccfca3bac-internal-tls-certs\") pod \"heat-cfnapi-7556f74cf6-zw6cb\" (UID: \"abd86360-252d-42c1-bdb5-814ccfca3bac\") " pod="openstack/heat-cfnapi-7556f74cf6-zw6cb" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.592323 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/abd86360-252d-42c1-bdb5-814ccfca3bac-config-data-custom\") pod \"heat-cfnapi-7556f74cf6-zw6cb\" (UID: \"abd86360-252d-42c1-bdb5-814ccfca3bac\") " 
pod="openstack/heat-cfnapi-7556f74cf6-zw6cb" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.592825 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99m6s\" (UniqueName: \"kubernetes.io/projected/abd86360-252d-42c1-bdb5-814ccfca3bac-kube-api-access-99m6s\") pod \"heat-cfnapi-7556f74cf6-zw6cb\" (UID: \"abd86360-252d-42c1-bdb5-814ccfca3bac\") " pod="openstack/heat-cfnapi-7556f74cf6-zw6cb" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.593160 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abd86360-252d-42c1-bdb5-814ccfca3bac-public-tls-certs\") pod \"heat-cfnapi-7556f74cf6-zw6cb\" (UID: \"abd86360-252d-42c1-bdb5-814ccfca3bac\") " pod="openstack/heat-cfnapi-7556f74cf6-zw6cb" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.595576 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abd86360-252d-42c1-bdb5-814ccfca3bac-combined-ca-bundle\") pod \"heat-cfnapi-7556f74cf6-zw6cb\" (UID: \"abd86360-252d-42c1-bdb5-814ccfca3bac\") " pod="openstack/heat-cfnapi-7556f74cf6-zw6cb" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.602807 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/abd86360-252d-42c1-bdb5-814ccfca3bac-internal-tls-certs\") pod \"heat-cfnapi-7556f74cf6-zw6cb\" (UID: \"abd86360-252d-42c1-bdb5-814ccfca3bac\") " pod="openstack/heat-cfnapi-7556f74cf6-zw6cb" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.609465 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abd86360-252d-42c1-bdb5-814ccfca3bac-public-tls-certs\") pod \"heat-cfnapi-7556f74cf6-zw6cb\" (UID: \"abd86360-252d-42c1-bdb5-814ccfca3bac\") " pod="openstack/heat-cfnapi-7556f74cf6-zw6cb" Mar 19 
09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.614591 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99m6s\" (UniqueName: \"kubernetes.io/projected/abd86360-252d-42c1-bdb5-814ccfca3bac-kube-api-access-99m6s\") pod \"heat-cfnapi-7556f74cf6-zw6cb\" (UID: \"abd86360-252d-42c1-bdb5-814ccfca3bac\") " pod="openstack/heat-cfnapi-7556f74cf6-zw6cb" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.615041 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abd86360-252d-42c1-bdb5-814ccfca3bac-config-data\") pod \"heat-cfnapi-7556f74cf6-zw6cb\" (UID: \"abd86360-252d-42c1-bdb5-814ccfca3bac\") " pod="openstack/heat-cfnapi-7556f74cf6-zw6cb" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.620435 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7756684d79-4fxz4" Mar 19 09:53:01 crc kubenswrapper[4835]: I0319 09:53:01.651709 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-7556f74cf6-zw6cb" Mar 19 09:53:02 crc kubenswrapper[4835]: I0319 09:53:02.144125 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6d45648c65-jxdbn"] Mar 19 09:53:02 crc kubenswrapper[4835]: I0319 09:53:02.309439 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7556f74cf6-zw6cb"] Mar 19 09:53:02 crc kubenswrapper[4835]: I0319 09:53:02.309812 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6d45648c65-jxdbn" event={"ID":"1c2ee5e0-f8fe-4e29-a951-e9c8a5817cfa","Type":"ContainerStarted","Data":"520a2f8c57f5500b5ab2a652d439a27226ad6e80b14d4c0b0d2afffbc9899941"} Mar 19 09:53:02 crc kubenswrapper[4835]: W0319 09:53:02.334715 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabd86360_252d_42c1_bdb5_814ccfca3bac.slice/crio-a2c708b03cc08282f27aeaa8f2e368f90b72b2cfaa29006db185ba16bebeea8f WatchSource:0}: Error finding container a2c708b03cc08282f27aeaa8f2e368f90b72b2cfaa29006db185ba16bebeea8f: Status 404 returned error can't find the container with id a2c708b03cc08282f27aeaa8f2e368f90b72b2cfaa29006db185ba16bebeea8f Mar 19 09:53:02 crc kubenswrapper[4835]: I0319 09:53:02.366146 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7756684d79-4fxz4"] Mar 19 09:53:02 crc kubenswrapper[4835]: W0319 09:53:02.368040 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58987da0_27e0_4534_9011_1c1cfd751b68.slice/crio-abf67651edba7140abdd106f97e169d28fb22c79089bffdec098608451e1302b WatchSource:0}: Error finding container abf67651edba7140abdd106f97e169d28fb22c79089bffdec098608451e1302b: Status 404 returned error can't find the container with id abf67651edba7140abdd106f97e169d28fb22c79089bffdec098608451e1302b Mar 19 09:53:03 crc kubenswrapper[4835]: 
I0319 09:53:03.334070 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6d45648c65-jxdbn" event={"ID":"1c2ee5e0-f8fe-4e29-a951-e9c8a5817cfa","Type":"ContainerStarted","Data":"0b2ddb07f878aaef440d6a260b0063e2bfa20aa1dd0642786b693cf96bd43c6a"} Mar 19 09:53:03 crc kubenswrapper[4835]: I0319 09:53:03.334670 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-6d45648c65-jxdbn" Mar 19 09:53:03 crc kubenswrapper[4835]: I0319 09:53:03.339843 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7556f74cf6-zw6cb" event={"ID":"abd86360-252d-42c1-bdb5-814ccfca3bac","Type":"ContainerStarted","Data":"a2c708b03cc08282f27aeaa8f2e368f90b72b2cfaa29006db185ba16bebeea8f"} Mar 19 09:53:03 crc kubenswrapper[4835]: I0319 09:53:03.353254 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7756684d79-4fxz4" event={"ID":"58987da0-27e0-4534-9011-1c1cfd751b68","Type":"ContainerStarted","Data":"abf67651edba7140abdd106f97e169d28fb22c79089bffdec098608451e1302b"} Mar 19 09:53:03 crc kubenswrapper[4835]: I0319 09:53:03.353660 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-6d45648c65-jxdbn" podStartSLOduration=2.353644796 podStartE2EDuration="2.353644796s" podCreationTimestamp="2026-03-19 09:53:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:53:03.353506502 +0000 UTC m=+1838.202105089" watchObservedRunningTime="2026-03-19 09:53:03.353644796 +0000 UTC m=+1838.202243383" Mar 19 09:53:04 crc kubenswrapper[4835]: I0319 09:53:04.383109 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f6df4f56c-ln7sh" Mar 19 09:53:04 crc kubenswrapper[4835]: I0319 09:53:04.518488 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-v94x8"] Mar 
19 09:53:04 crc kubenswrapper[4835]: I0319 09:53:04.520676 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d84b4d45c-v94x8" podUID="21edff65-df53-464d-b17f-caa8d2735a4c" containerName="dnsmasq-dns" containerID="cri-o://2e755264b58933ddf9c1c2c1d2a1582bc30553d968ee2611e11f648599b2c68d" gracePeriod=10 Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.113636 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-v94x8" Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.265629 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6wlr\" (UniqueName: \"kubernetes.io/projected/21edff65-df53-464d-b17f-caa8d2735a4c-kube-api-access-k6wlr\") pod \"21edff65-df53-464d-b17f-caa8d2735a4c\" (UID: \"21edff65-df53-464d-b17f-caa8d2735a4c\") " Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.265844 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/21edff65-df53-464d-b17f-caa8d2735a4c-openstack-edpm-ipam\") pod \"21edff65-df53-464d-b17f-caa8d2735a4c\" (UID: \"21edff65-df53-464d-b17f-caa8d2735a4c\") " Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.265939 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21edff65-df53-464d-b17f-caa8d2735a4c-ovsdbserver-nb\") pod \"21edff65-df53-464d-b17f-caa8d2735a4c\" (UID: \"21edff65-df53-464d-b17f-caa8d2735a4c\") " Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.266020 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21edff65-df53-464d-b17f-caa8d2735a4c-ovsdbserver-sb\") pod \"21edff65-df53-464d-b17f-caa8d2735a4c\" (UID: \"21edff65-df53-464d-b17f-caa8d2735a4c\") " Mar 19 
09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.266051 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21edff65-df53-464d-b17f-caa8d2735a4c-config\") pod \"21edff65-df53-464d-b17f-caa8d2735a4c\" (UID: \"21edff65-df53-464d-b17f-caa8d2735a4c\") " Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.266091 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21edff65-df53-464d-b17f-caa8d2735a4c-dns-swift-storage-0\") pod \"21edff65-df53-464d-b17f-caa8d2735a4c\" (UID: \"21edff65-df53-464d-b17f-caa8d2735a4c\") " Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.266145 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21edff65-df53-464d-b17f-caa8d2735a4c-dns-svc\") pod \"21edff65-df53-464d-b17f-caa8d2735a4c\" (UID: \"21edff65-df53-464d-b17f-caa8d2735a4c\") " Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.284676 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21edff65-df53-464d-b17f-caa8d2735a4c-kube-api-access-k6wlr" (OuterVolumeSpecName: "kube-api-access-k6wlr") pod "21edff65-df53-464d-b17f-caa8d2735a4c" (UID: "21edff65-df53-464d-b17f-caa8d2735a4c"). InnerVolumeSpecName "kube-api-access-k6wlr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.371518 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6wlr\" (UniqueName: \"kubernetes.io/projected/21edff65-df53-464d-b17f-caa8d2735a4c-kube-api-access-k6wlr\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.409363 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7756684d79-4fxz4" event={"ID":"58987da0-27e0-4534-9011-1c1cfd751b68","Type":"ContainerStarted","Data":"06b6dc1bd9e5f9308caf02c0821331dc104d1ee815f355dc352570006784b309"} Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.432366 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7556f74cf6-zw6cb" event={"ID":"abd86360-252d-42c1-bdb5-814ccfca3bac","Type":"ContainerStarted","Data":"78c1f80abe3e44983509195b6de6b096df483e03db19b03e700950ea80f1cda3"} Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.433484 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-7556f74cf6-zw6cb" Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.450352 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-7756684d79-4fxz4" podStartSLOduration=2.5011213 podStartE2EDuration="4.450333655s" podCreationTimestamp="2026-03-19 09:53:01 +0000 UTC" firstStartedPulling="2026-03-19 09:53:02.371191712 +0000 UTC m=+1837.219790299" lastFinishedPulling="2026-03-19 09:53:04.320404057 +0000 UTC m=+1839.169002654" observedRunningTime="2026-03-19 09:53:05.44716982 +0000 UTC m=+1840.295768427" watchObservedRunningTime="2026-03-19 09:53:05.450333655 +0000 UTC m=+1840.298932232" Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.482449 4835 generic.go:334] "Generic (PLEG): container finished" podID="21edff65-df53-464d-b17f-caa8d2735a4c" 
containerID="2e755264b58933ddf9c1c2c1d2a1582bc30553d968ee2611e11f648599b2c68d" exitCode=0 Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.482496 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-v94x8" event={"ID":"21edff65-df53-464d-b17f-caa8d2735a4c","Type":"ContainerDied","Data":"2e755264b58933ddf9c1c2c1d2a1582bc30553d968ee2611e11f648599b2c68d"} Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.482522 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-v94x8" event={"ID":"21edff65-df53-464d-b17f-caa8d2735a4c","Type":"ContainerDied","Data":"d94f4ee40da5ee680107de593f95e02be5c871cae875169e85a3d27f3ec96952"} Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.482538 4835 scope.go:117] "RemoveContainer" containerID="2e755264b58933ddf9c1c2c1d2a1582bc30553d968ee2611e11f648599b2c68d" Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.482699 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-v94x8" Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.502127 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21edff65-df53-464d-b17f-caa8d2735a4c-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "21edff65-df53-464d-b17f-caa8d2735a4c" (UID: "21edff65-df53-464d-b17f-caa8d2735a4c"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.502987 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21edff65-df53-464d-b17f-caa8d2735a4c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "21edff65-df53-464d-b17f-caa8d2735a4c" (UID: "21edff65-df53-464d-b17f-caa8d2735a4c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.516871 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-7556f74cf6-zw6cb" podStartSLOduration=2.532228037 podStartE2EDuration="4.516850394s" podCreationTimestamp="2026-03-19 09:53:01 +0000 UTC" firstStartedPulling="2026-03-19 09:53:02.337008663 +0000 UTC m=+1837.185607250" lastFinishedPulling="2026-03-19 09:53:04.32163102 +0000 UTC m=+1839.170229607" observedRunningTime="2026-03-19 09:53:05.481890764 +0000 UTC m=+1840.330489351" watchObservedRunningTime="2026-03-19 09:53:05.516850394 +0000 UTC m=+1840.365448981" Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.517714 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21edff65-df53-464d-b17f-caa8d2735a4c-config" (OuterVolumeSpecName: "config") pod "21edff65-df53-464d-b17f-caa8d2735a4c" (UID: "21edff65-df53-464d-b17f-caa8d2735a4c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.524552 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21edff65-df53-464d-b17f-caa8d2735a4c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "21edff65-df53-464d-b17f-caa8d2735a4c" (UID: "21edff65-df53-464d-b17f-caa8d2735a4c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.537550 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21edff65-df53-464d-b17f-caa8d2735a4c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "21edff65-df53-464d-b17f-caa8d2735a4c" (UID: "21edff65-df53-464d-b17f-caa8d2735a4c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.543618 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21edff65-df53-464d-b17f-caa8d2735a4c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "21edff65-df53-464d-b17f-caa8d2735a4c" (UID: "21edff65-df53-464d-b17f-caa8d2735a4c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.579581 4835 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/21edff65-df53-464d-b17f-caa8d2735a4c-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.579617 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21edff65-df53-464d-b17f-caa8d2735a4c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.579626 4835 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21edff65-df53-464d-b17f-caa8d2735a4c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.579636 4835 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21edff65-df53-464d-b17f-caa8d2735a4c-config\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.579647 4835 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21edff65-df53-464d-b17f-caa8d2735a4c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.579657 4835 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/21edff65-df53-464d-b17f-caa8d2735a4c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.663617 4835 scope.go:117] "RemoveContainer" containerID="5fe9b67bdd38670c1bf425363d8129b9618de6cd46b8e69eaac40ca043792626" Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.723273 4835 scope.go:117] "RemoveContainer" containerID="2e755264b58933ddf9c1c2c1d2a1582bc30553d968ee2611e11f648599b2c68d" Mar 19 09:53:05 crc kubenswrapper[4835]: E0319 09:53:05.725429 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e755264b58933ddf9c1c2c1d2a1582bc30553d968ee2611e11f648599b2c68d\": container with ID starting with 2e755264b58933ddf9c1c2c1d2a1582bc30553d968ee2611e11f648599b2c68d not found: ID does not exist" containerID="2e755264b58933ddf9c1c2c1d2a1582bc30553d968ee2611e11f648599b2c68d" Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.725474 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e755264b58933ddf9c1c2c1d2a1582bc30553d968ee2611e11f648599b2c68d"} err="failed to get container status \"2e755264b58933ddf9c1c2c1d2a1582bc30553d968ee2611e11f648599b2c68d\": rpc error: code = NotFound desc = could not find container \"2e755264b58933ddf9c1c2c1d2a1582bc30553d968ee2611e11f648599b2c68d\": container with ID starting with 2e755264b58933ddf9c1c2c1d2a1582bc30553d968ee2611e11f648599b2c68d not found: ID does not exist" Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.725503 4835 scope.go:117] "RemoveContainer" containerID="5fe9b67bdd38670c1bf425363d8129b9618de6cd46b8e69eaac40ca043792626" Mar 19 09:53:05 crc kubenswrapper[4835]: E0319 09:53:05.728837 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fe9b67bdd38670c1bf425363d8129b9618de6cd46b8e69eaac40ca043792626\": container with ID 
starting with 5fe9b67bdd38670c1bf425363d8129b9618de6cd46b8e69eaac40ca043792626 not found: ID does not exist" containerID="5fe9b67bdd38670c1bf425363d8129b9618de6cd46b8e69eaac40ca043792626" Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.728900 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fe9b67bdd38670c1bf425363d8129b9618de6cd46b8e69eaac40ca043792626"} err="failed to get container status \"5fe9b67bdd38670c1bf425363d8129b9618de6cd46b8e69eaac40ca043792626\": rpc error: code = NotFound desc = could not find container \"5fe9b67bdd38670c1bf425363d8129b9618de6cd46b8e69eaac40ca043792626\": container with ID starting with 5fe9b67bdd38670c1bf425363d8129b9618de6cd46b8e69eaac40ca043792626 not found: ID does not exist" Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.829758 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-v94x8"] Mar 19 09:53:05 crc kubenswrapper[4835]: I0319 09:53:05.841355 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-v94x8"] Mar 19 09:53:06 crc kubenswrapper[4835]: I0319 09:53:06.420331 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21edff65-df53-464d-b17f-caa8d2735a4c" path="/var/lib/kubelet/pods/21edff65-df53-464d-b17f-caa8d2735a4c/volumes" Mar 19 09:53:06 crc kubenswrapper[4835]: I0319 09:53:06.503302 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-7756684d79-4fxz4" Mar 19 09:53:11 crc kubenswrapper[4835]: I0319 09:53:11.403063 4835 scope.go:117] "RemoveContainer" containerID="d93f2f0fef5a3fe52d6e4aab02e5290ac85405643bc520caaef82b7b23fd8ee3" Mar 19 09:53:11 crc kubenswrapper[4835]: E0319 09:53:11.403920 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 09:53:13 crc kubenswrapper[4835]: I0319 09:53:13.475131 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-7756684d79-4fxz4" Mar 19 09:53:13 crc kubenswrapper[4835]: I0319 09:53:13.619532 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-755fb9d4bd-8w5k4"] Mar 19 09:53:13 crc kubenswrapper[4835]: I0319 09:53:13.619753 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-755fb9d4bd-8w5k4" podUID="e53a7466-d485-484b-aed3-0ce3b20b27ce" containerName="heat-api" containerID="cri-o://ec46e09d6987c2a1cfe64f2d24b39aab1fe1eb7df289d317198d33e7835d6c82" gracePeriod=60 Mar 19 09:53:13 crc kubenswrapper[4835]: I0319 09:53:13.864681 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p"] Mar 19 09:53:13 crc kubenswrapper[4835]: E0319 09:53:13.865582 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21edff65-df53-464d-b17f-caa8d2735a4c" containerName="dnsmasq-dns" Mar 19 09:53:13 crc kubenswrapper[4835]: I0319 09:53:13.865597 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="21edff65-df53-464d-b17f-caa8d2735a4c" containerName="dnsmasq-dns" Mar 19 09:53:13 crc kubenswrapper[4835]: E0319 09:53:13.865615 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21edff65-df53-464d-b17f-caa8d2735a4c" containerName="init" Mar 19 09:53:13 crc kubenswrapper[4835]: I0319 09:53:13.865621 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="21edff65-df53-464d-b17f-caa8d2735a4c" containerName="init" Mar 19 09:53:13 crc kubenswrapper[4835]: I0319 09:53:13.865857 4835 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="21edff65-df53-464d-b17f-caa8d2735a4c" containerName="dnsmasq-dns" Mar 19 09:53:13 crc kubenswrapper[4835]: I0319 09:53:13.866681 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p" Mar 19 09:53:13 crc kubenswrapper[4835]: I0319 09:53:13.870517 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 09:53:13 crc kubenswrapper[4835]: I0319 09:53:13.870769 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 09:53:13 crc kubenswrapper[4835]: I0319 09:53:13.870881 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ldz2g" Mar 19 09:53:13 crc kubenswrapper[4835]: I0319 09:53:13.871106 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 09:53:13 crc kubenswrapper[4835]: I0319 09:53:13.879032 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p"] Mar 19 09:53:13 crc kubenswrapper[4835]: I0319 09:53:13.933333 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfjlk\" (UniqueName: \"kubernetes.io/projected/2fd23d4d-9b8c-4be8-a671-cefa01a4e341-kube-api-access-lfjlk\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p\" (UID: \"2fd23d4d-9b8c-4be8-a671-cefa01a4e341\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p" Mar 19 09:53:13 crc kubenswrapper[4835]: I0319 09:53:13.933456 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2fd23d4d-9b8c-4be8-a671-cefa01a4e341-ssh-key-openstack-edpm-ipam\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p\" (UID: \"2fd23d4d-9b8c-4be8-a671-cefa01a4e341\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p" Mar 19 09:53:13 crc kubenswrapper[4835]: I0319 09:53:13.933521 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fd23d4d-9b8c-4be8-a671-cefa01a4e341-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p\" (UID: \"2fd23d4d-9b8c-4be8-a671-cefa01a4e341\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p" Mar 19 09:53:13 crc kubenswrapper[4835]: I0319 09:53:13.933697 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fd23d4d-9b8c-4be8-a671-cefa01a4e341-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p\" (UID: \"2fd23d4d-9b8c-4be8-a671-cefa01a4e341\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p" Mar 19 09:53:14 crc kubenswrapper[4835]: I0319 09:53:14.042247 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2fd23d4d-9b8c-4be8-a671-cefa01a4e341-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p\" (UID: \"2fd23d4d-9b8c-4be8-a671-cefa01a4e341\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p" Mar 19 09:53:14 crc kubenswrapper[4835]: I0319 09:53:14.042321 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fd23d4d-9b8c-4be8-a671-cefa01a4e341-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p\" (UID: \"2fd23d4d-9b8c-4be8-a671-cefa01a4e341\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p" Mar 19 
09:53:14 crc kubenswrapper[4835]: I0319 09:53:14.042417 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fd23d4d-9b8c-4be8-a671-cefa01a4e341-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p\" (UID: \"2fd23d4d-9b8c-4be8-a671-cefa01a4e341\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p" Mar 19 09:53:14 crc kubenswrapper[4835]: I0319 09:53:14.042480 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfjlk\" (UniqueName: \"kubernetes.io/projected/2fd23d4d-9b8c-4be8-a671-cefa01a4e341-kube-api-access-lfjlk\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p\" (UID: \"2fd23d4d-9b8c-4be8-a671-cefa01a4e341\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p" Mar 19 09:53:14 crc kubenswrapper[4835]: I0319 09:53:14.049458 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fd23d4d-9b8c-4be8-a671-cefa01a4e341-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p\" (UID: \"2fd23d4d-9b8c-4be8-a671-cefa01a4e341\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p" Mar 19 09:53:14 crc kubenswrapper[4835]: I0319 09:53:14.050887 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2fd23d4d-9b8c-4be8-a671-cefa01a4e341-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p\" (UID: \"2fd23d4d-9b8c-4be8-a671-cefa01a4e341\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p" Mar 19 09:53:14 crc kubenswrapper[4835]: I0319 09:53:14.051373 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2fd23d4d-9b8c-4be8-a671-cefa01a4e341-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p\" (UID: \"2fd23d4d-9b8c-4be8-a671-cefa01a4e341\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p" Mar 19 09:53:14 crc kubenswrapper[4835]: I0319 09:53:14.061170 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfjlk\" (UniqueName: \"kubernetes.io/projected/2fd23d4d-9b8c-4be8-a671-cefa01a4e341-kube-api-access-lfjlk\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p\" (UID: \"2fd23d4d-9b8c-4be8-a671-cefa01a4e341\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p" Mar 19 09:53:14 crc kubenswrapper[4835]: I0319 09:53:14.203209 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p" Mar 19 09:53:14 crc kubenswrapper[4835]: I0319 09:53:14.274003 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-7556f74cf6-zw6cb" Mar 19 09:53:14 crc kubenswrapper[4835]: I0319 09:53:14.340303 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-69459bdfc9-h9kps"] Mar 19 09:53:14 crc kubenswrapper[4835]: I0319 09:53:14.340507 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-69459bdfc9-h9kps" podUID="4384c6fc-0c55-44d3-98b9-b373a66050ee" containerName="heat-cfnapi" containerID="cri-o://71d535ca63306003f9de842e9e7ab4f21245d9aa58264e239d5e78e59c9d9900" gracePeriod=60 Mar 19 09:53:15 crc kubenswrapper[4835]: W0319 09:53:15.166081 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fd23d4d_9b8c_4be8_a671_cefa01a4e341.slice/crio-85ad32ce0110fb1ec018f8e14a556bc0ccaea87c6ed325126a3829e089962931 WatchSource:0}: Error finding container 
85ad32ce0110fb1ec018f8e14a556bc0ccaea87c6ed325126a3829e089962931: Status 404 returned error can't find the container with id 85ad32ce0110fb1ec018f8e14a556bc0ccaea87c6ed325126a3829e089962931 Mar 19 09:53:15 crc kubenswrapper[4835]: I0319 09:53:15.172712 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p"] Mar 19 09:53:15 crc kubenswrapper[4835]: I0319 09:53:15.710413 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p" event={"ID":"2fd23d4d-9b8c-4be8-a671-cefa01a4e341","Type":"ContainerStarted","Data":"85ad32ce0110fb1ec018f8e14a556bc0ccaea87c6ed325126a3829e089962931"} Mar 19 09:53:17 crc kubenswrapper[4835]: I0319 09:53:17.274977 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-755fb9d4bd-8w5k4" podUID="e53a7466-d485-484b-aed3-0ce3b20b27ce" containerName="heat-api" probeResult="failure" output="Get \"https://10.217.0.229:8004/healthcheck\": read tcp 10.217.0.2:41860->10.217.0.229:8004: read: connection reset by peer" Mar 19 09:53:17 crc kubenswrapper[4835]: I0319 09:53:17.500150 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-69459bdfc9-h9kps" podUID="4384c6fc-0c55-44d3-98b9-b373a66050ee" containerName="heat-cfnapi" probeResult="failure" output="Get \"https://10.217.0.228:8000/healthcheck\": read tcp 10.217.0.2:50496->10.217.0.228:8000: read: connection reset by peer" Mar 19 09:53:17 crc kubenswrapper[4835]: I0319 09:53:17.787885 4835 generic.go:334] "Generic (PLEG): container finished" podID="e53a7466-d485-484b-aed3-0ce3b20b27ce" containerID="ec46e09d6987c2a1cfe64f2d24b39aab1fe1eb7df289d317198d33e7835d6c82" exitCode=0 Mar 19 09:53:17 crc kubenswrapper[4835]: I0319 09:53:17.787954 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-755fb9d4bd-8w5k4" 
event={"ID":"e53a7466-d485-484b-aed3-0ce3b20b27ce","Type":"ContainerDied","Data":"ec46e09d6987c2a1cfe64f2d24b39aab1fe1eb7df289d317198d33e7835d6c82"} Mar 19 09:53:17 crc kubenswrapper[4835]: I0319 09:53:17.796081 4835 generic.go:334] "Generic (PLEG): container finished" podID="4384c6fc-0c55-44d3-98b9-b373a66050ee" containerID="71d535ca63306003f9de842e9e7ab4f21245d9aa58264e239d5e78e59c9d9900" exitCode=0 Mar 19 09:53:17 crc kubenswrapper[4835]: I0319 09:53:17.796136 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-69459bdfc9-h9kps" event={"ID":"4384c6fc-0c55-44d3-98b9-b373a66050ee","Type":"ContainerDied","Data":"71d535ca63306003f9de842e9e7ab4f21245d9aa58264e239d5e78e59c9d9900"} Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.007346 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-755fb9d4bd-8w5k4" Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.176951 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e53a7466-d485-484b-aed3-0ce3b20b27ce-config-data\") pod \"e53a7466-d485-484b-aed3-0ce3b20b27ce\" (UID: \"e53a7466-d485-484b-aed3-0ce3b20b27ce\") " Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.177040 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e53a7466-d485-484b-aed3-0ce3b20b27ce-combined-ca-bundle\") pod \"e53a7466-d485-484b-aed3-0ce3b20b27ce\" (UID: \"e53a7466-d485-484b-aed3-0ce3b20b27ce\") " Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.177070 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwq79\" (UniqueName: \"kubernetes.io/projected/e53a7466-d485-484b-aed3-0ce3b20b27ce-kube-api-access-kwq79\") pod \"e53a7466-d485-484b-aed3-0ce3b20b27ce\" (UID: \"e53a7466-d485-484b-aed3-0ce3b20b27ce\") " Mar 19 09:53:18 crc 
kubenswrapper[4835]: I0319 09:53:18.177126 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e53a7466-d485-484b-aed3-0ce3b20b27ce-public-tls-certs\") pod \"e53a7466-d485-484b-aed3-0ce3b20b27ce\" (UID: \"e53a7466-d485-484b-aed3-0ce3b20b27ce\") " Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.177146 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e53a7466-d485-484b-aed3-0ce3b20b27ce-internal-tls-certs\") pod \"e53a7466-d485-484b-aed3-0ce3b20b27ce\" (UID: \"e53a7466-d485-484b-aed3-0ce3b20b27ce\") " Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.177184 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e53a7466-d485-484b-aed3-0ce3b20b27ce-config-data-custom\") pod \"e53a7466-d485-484b-aed3-0ce3b20b27ce\" (UID: \"e53a7466-d485-484b-aed3-0ce3b20b27ce\") " Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.179684 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-69459bdfc9-h9kps" Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.183509 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e53a7466-d485-484b-aed3-0ce3b20b27ce-kube-api-access-kwq79" (OuterVolumeSpecName: "kube-api-access-kwq79") pod "e53a7466-d485-484b-aed3-0ce3b20b27ce" (UID: "e53a7466-d485-484b-aed3-0ce3b20b27ce"). InnerVolumeSpecName "kube-api-access-kwq79". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.183902 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e53a7466-d485-484b-aed3-0ce3b20b27ce-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e53a7466-d485-484b-aed3-0ce3b20b27ce" (UID: "e53a7466-d485-484b-aed3-0ce3b20b27ce"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.239471 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e53a7466-d485-484b-aed3-0ce3b20b27ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e53a7466-d485-484b-aed3-0ce3b20b27ce" (UID: "e53a7466-d485-484b-aed3-0ce3b20b27ce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.281390 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4384c6fc-0c55-44d3-98b9-b373a66050ee-internal-tls-certs\") pod \"4384c6fc-0c55-44d3-98b9-b373a66050ee\" (UID: \"4384c6fc-0c55-44d3-98b9-b373a66050ee\") " Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.281619 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4384c6fc-0c55-44d3-98b9-b373a66050ee-public-tls-certs\") pod \"4384c6fc-0c55-44d3-98b9-b373a66050ee\" (UID: \"4384c6fc-0c55-44d3-98b9-b373a66050ee\") " Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.281710 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4384c6fc-0c55-44d3-98b9-b373a66050ee-config-data\") pod \"4384c6fc-0c55-44d3-98b9-b373a66050ee\" (UID: 
\"4384c6fc-0c55-44d3-98b9-b373a66050ee\") " Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.281835 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v8ml\" (UniqueName: \"kubernetes.io/projected/4384c6fc-0c55-44d3-98b9-b373a66050ee-kube-api-access-9v8ml\") pod \"4384c6fc-0c55-44d3-98b9-b373a66050ee\" (UID: \"4384c6fc-0c55-44d3-98b9-b373a66050ee\") " Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.281860 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4384c6fc-0c55-44d3-98b9-b373a66050ee-combined-ca-bundle\") pod \"4384c6fc-0c55-44d3-98b9-b373a66050ee\" (UID: \"4384c6fc-0c55-44d3-98b9-b373a66050ee\") " Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.281895 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4384c6fc-0c55-44d3-98b9-b373a66050ee-config-data-custom\") pod \"4384c6fc-0c55-44d3-98b9-b373a66050ee\" (UID: \"4384c6fc-0c55-44d3-98b9-b373a66050ee\") " Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.282584 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e53a7466-d485-484b-aed3-0ce3b20b27ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.282598 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwq79\" (UniqueName: \"kubernetes.io/projected/e53a7466-d485-484b-aed3-0ce3b20b27ce-kube-api-access-kwq79\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.282608 4835 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e53a7466-d485-484b-aed3-0ce3b20b27ce-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:18 crc kubenswrapper[4835]: 
I0319 09:53:18.283433 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e53a7466-d485-484b-aed3-0ce3b20b27ce-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e53a7466-d485-484b-aed3-0ce3b20b27ce" (UID: "e53a7466-d485-484b-aed3-0ce3b20b27ce"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.285879 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4384c6fc-0c55-44d3-98b9-b373a66050ee-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4384c6fc-0c55-44d3-98b9-b373a66050ee" (UID: "4384c6fc-0c55-44d3-98b9-b373a66050ee"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.292825 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4384c6fc-0c55-44d3-98b9-b373a66050ee-kube-api-access-9v8ml" (OuterVolumeSpecName: "kube-api-access-9v8ml") pod "4384c6fc-0c55-44d3-98b9-b373a66050ee" (UID: "4384c6fc-0c55-44d3-98b9-b373a66050ee"). InnerVolumeSpecName "kube-api-access-9v8ml". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.312568 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e53a7466-d485-484b-aed3-0ce3b20b27ce-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e53a7466-d485-484b-aed3-0ce3b20b27ce" (UID: "e53a7466-d485-484b-aed3-0ce3b20b27ce"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.320332 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4384c6fc-0c55-44d3-98b9-b373a66050ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4384c6fc-0c55-44d3-98b9-b373a66050ee" (UID: "4384c6fc-0c55-44d3-98b9-b373a66050ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.333138 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e53a7466-d485-484b-aed3-0ce3b20b27ce-config-data" (OuterVolumeSpecName: "config-data") pod "e53a7466-d485-484b-aed3-0ce3b20b27ce" (UID: "e53a7466-d485-484b-aed3-0ce3b20b27ce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.361656 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4384c6fc-0c55-44d3-98b9-b373a66050ee-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4384c6fc-0c55-44d3-98b9-b373a66050ee" (UID: "4384c6fc-0c55-44d3-98b9-b373a66050ee"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.372969 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4384c6fc-0c55-44d3-98b9-b373a66050ee-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4384c6fc-0c55-44d3-98b9-b373a66050ee" (UID: "4384c6fc-0c55-44d3-98b9-b373a66050ee"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.383526 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4384c6fc-0c55-44d3-98b9-b373a66050ee-config-data" (OuterVolumeSpecName: "config-data") pod "4384c6fc-0c55-44d3-98b9-b373a66050ee" (UID: "4384c6fc-0c55-44d3-98b9-b373a66050ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.383826 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4384c6fc-0c55-44d3-98b9-b373a66050ee-config-data\") pod \"4384c6fc-0c55-44d3-98b9-b373a66050ee\" (UID: \"4384c6fc-0c55-44d3-98b9-b373a66050ee\") " Mar 19 09:53:18 crc kubenswrapper[4835]: W0319 09:53:18.384275 4835 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/4384c6fc-0c55-44d3-98b9-b373a66050ee/volumes/kubernetes.io~secret/config-data Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.384288 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4384c6fc-0c55-44d3-98b9-b373a66050ee-config-data" (OuterVolumeSpecName: "config-data") pod "4384c6fc-0c55-44d3-98b9-b373a66050ee" (UID: "4384c6fc-0c55-44d3-98b9-b373a66050ee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.385662 4835 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e53a7466-d485-484b-aed3-0ce3b20b27ce-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.385694 4835 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e53a7466-d485-484b-aed3-0ce3b20b27ce-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.385709 4835 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4384c6fc-0c55-44d3-98b9-b373a66050ee-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.385722 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4384c6fc-0c55-44d3-98b9-b373a66050ee-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.385734 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v8ml\" (UniqueName: \"kubernetes.io/projected/4384c6fc-0c55-44d3-98b9-b373a66050ee-kube-api-access-9v8ml\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.385762 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4384c6fc-0c55-44d3-98b9-b373a66050ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.385773 4835 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4384c6fc-0c55-44d3-98b9-b373a66050ee-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 
09:53:18.385784 4835 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4384c6fc-0c55-44d3-98b9-b373a66050ee-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.385795 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e53a7466-d485-484b-aed3-0ce3b20b27ce-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.810410 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-69459bdfc9-h9kps" event={"ID":"4384c6fc-0c55-44d3-98b9-b373a66050ee","Type":"ContainerDied","Data":"75830308695b4afcc43dbea56b559fd521b3ced17b9087bb0f2095a4e63d74a6"} Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.810812 4835 scope.go:117] "RemoveContainer" containerID="71d535ca63306003f9de842e9e7ab4f21245d9aa58264e239d5e78e59c9d9900" Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.810423 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-69459bdfc9-h9kps" Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.815611 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-755fb9d4bd-8w5k4" event={"ID":"e53a7466-d485-484b-aed3-0ce3b20b27ce","Type":"ContainerDied","Data":"db80ea38094234fd345a093ee534a843c12225e627eb1b73270f32d94a45db29"} Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.815705 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-755fb9d4bd-8w5k4" Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.849583 4835 scope.go:117] "RemoveContainer" containerID="ec46e09d6987c2a1cfe64f2d24b39aab1fe1eb7df289d317198d33e7835d6c82" Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.858449 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-69459bdfc9-h9kps"] Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.872670 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-69459bdfc9-h9kps"] Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.887958 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-755fb9d4bd-8w5k4"] Mar 19 09:53:18 crc kubenswrapper[4835]: I0319 09:53:18.901688 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-755fb9d4bd-8w5k4"] Mar 19 09:53:19 crc kubenswrapper[4835]: I0319 09:53:19.827963 4835 generic.go:334] "Generic (PLEG): container finished" podID="6f9fe928-2ee4-486d-a8c6-692169a02f42" containerID="2ebbf5795d38add7bab364c06c63adba1f9c657df4d0aebe2aa451145a796230" exitCode=0 Mar 19 09:53:19 crc kubenswrapper[4835]: I0319 09:53:19.828063 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6f9fe928-2ee4-486d-a8c6-692169a02f42","Type":"ContainerDied","Data":"2ebbf5795d38add7bab364c06c63adba1f9c657df4d0aebe2aa451145a796230"} Mar 19 09:53:19 crc kubenswrapper[4835]: I0319 09:53:19.831618 4835 generic.go:334] "Generic (PLEG): container finished" podID="d3ab82b9-82ca-4e95-a3a0-1854edb16b7b" containerID="3200830f5c81ee49c292ec8df18a65606719b93a4cc0a20fc03fc610ed5b65fa" exitCode=0 Mar 19 09:53:19 crc kubenswrapper[4835]: I0319 09:53:19.831699 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" 
event={"ID":"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b","Type":"ContainerDied","Data":"3200830f5c81ee49c292ec8df18a65606719b93a4cc0a20fc03fc610ed5b65fa"} Mar 19 09:53:20 crc kubenswrapper[4835]: I0319 09:53:20.414809 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4384c6fc-0c55-44d3-98b9-b373a66050ee" path="/var/lib/kubelet/pods/4384c6fc-0c55-44d3-98b9-b373a66050ee/volumes" Mar 19 09:53:20 crc kubenswrapper[4835]: I0319 09:53:20.415660 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e53a7466-d485-484b-aed3-0ce3b20b27ce" path="/var/lib/kubelet/pods/e53a7466-d485-484b-aed3-0ce3b20b27ce/volumes" Mar 19 09:53:21 crc kubenswrapper[4835]: I0319 09:53:21.597168 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-6d45648c65-jxdbn" Mar 19 09:53:21 crc kubenswrapper[4835]: I0319 09:53:21.677584 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-956b854d8-kqpb6"] Mar 19 09:53:21 crc kubenswrapper[4835]: I0319 09:53:21.678430 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-956b854d8-kqpb6" podUID="be54447e-7aac-4db8-94fc-f10aa4661921" containerName="heat-engine" containerID="cri-o://3db517c22870cf48fc294f690df1329b56ece79b0130a36b23aa4870ce17121e" gracePeriod=60 Mar 19 09:53:23 crc kubenswrapper[4835]: I0319 09:53:23.402282 4835 scope.go:117] "RemoveContainer" containerID="d93f2f0fef5a3fe52d6e4aab02e5290ac85405643bc520caaef82b7b23fd8ee3" Mar 19 09:53:23 crc kubenswrapper[4835]: E0319 09:53:23.402709 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" 
podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 09:53:24 crc kubenswrapper[4835]: E0319 09:53:24.198370 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3db517c22870cf48fc294f690df1329b56ece79b0130a36b23aa4870ce17121e" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 19 09:53:24 crc kubenswrapper[4835]: E0319 09:53:24.200794 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3db517c22870cf48fc294f690df1329b56ece79b0130a36b23aa4870ce17121e" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 19 09:53:24 crc kubenswrapper[4835]: E0319 09:53:24.203364 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3db517c22870cf48fc294f690df1329b56ece79b0130a36b23aa4870ce17121e" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 19 09:53:24 crc kubenswrapper[4835]: E0319 09:53:24.203552 4835 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-956b854d8-kqpb6" podUID="be54447e-7aac-4db8-94fc-f10aa4661921" containerName="heat-engine" Mar 19 09:53:25 crc kubenswrapper[4835]: I0319 09:53:25.090187 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 19 09:53:26 crc kubenswrapper[4835]: I0319 09:53:26.583343 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 09:53:27 crc kubenswrapper[4835]: I0319 09:53:27.931891 4835 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p" event={"ID":"2fd23d4d-9b8c-4be8-a671-cefa01a4e341","Type":"ContainerStarted","Data":"a1be01e72e0bcb5730b8f58bb529c5d6a6d603f5e8fd5d6bcf171261effcf078"} Mar 19 09:53:27 crc kubenswrapper[4835]: I0319 09:53:27.935554 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6f9fe928-2ee4-486d-a8c6-692169a02f42","Type":"ContainerStarted","Data":"faf574a16addc9fed8d36f68604bcb53b7d254dc8a9ea3d8eb8cc93ad9f1d678"} Mar 19 09:53:27 crc kubenswrapper[4835]: I0319 09:53:27.936187 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:53:27 crc kubenswrapper[4835]: I0319 09:53:27.939004 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"d3ab82b9-82ca-4e95-a3a0-1854edb16b7b","Type":"ContainerStarted","Data":"17db44159edb12b5f7ba27918959fe4ce860198464b12ad0f2150d7c55734dbf"} Mar 19 09:53:27 crc kubenswrapper[4835]: I0319 09:53:27.939187 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-2" Mar 19 09:53:28 crc kubenswrapper[4835]: I0319 09:53:28.047870 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=45.047852897 podStartE2EDuration="45.047852897s" podCreationTimestamp="2026-03-19 09:52:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:53:28.042200105 +0000 UTC m=+1862.890798692" watchObservedRunningTime="2026-03-19 09:53:28.047852897 +0000 UTC m=+1862.896451484" Mar 19 09:53:28 crc kubenswrapper[4835]: I0319 09:53:28.048656 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p" 
podStartSLOduration=3.63838027 podStartE2EDuration="15.048650059s" podCreationTimestamp="2026-03-19 09:53:13 +0000 UTC" firstStartedPulling="2026-03-19 09:53:15.16845574 +0000 UTC m=+1850.017054327" lastFinishedPulling="2026-03-19 09:53:26.578725529 +0000 UTC m=+1861.427324116" observedRunningTime="2026-03-19 09:53:27.994865583 +0000 UTC m=+1862.843464190" watchObservedRunningTime="2026-03-19 09:53:28.048650059 +0000 UTC m=+1862.897248646" Mar 19 09:53:28 crc kubenswrapper[4835]: I0319 09:53:28.087244 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-2" podStartSLOduration=46.087219956 podStartE2EDuration="46.087219956s" podCreationTimestamp="2026-03-19 09:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:53:28.074354439 +0000 UTC m=+1862.922953026" watchObservedRunningTime="2026-03-19 09:53:28.087219956 +0000 UTC m=+1862.935818543" Mar 19 09:53:30 crc kubenswrapper[4835]: I0319 09:53:30.294794 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-8xr9d"] Mar 19 09:53:30 crc kubenswrapper[4835]: I0319 09:53:30.306491 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-8xr9d"] Mar 19 09:53:30 crc kubenswrapper[4835]: I0319 09:53:30.396314 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-tp7ss"] Mar 19 09:53:30 crc kubenswrapper[4835]: E0319 09:53:30.397228 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4384c6fc-0c55-44d3-98b9-b373a66050ee" containerName="heat-cfnapi" Mar 19 09:53:30 crc kubenswrapper[4835]: I0319 09:53:30.397254 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="4384c6fc-0c55-44d3-98b9-b373a66050ee" containerName="heat-cfnapi" Mar 19 09:53:30 crc kubenswrapper[4835]: E0319 09:53:30.397310 4835 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e53a7466-d485-484b-aed3-0ce3b20b27ce" containerName="heat-api" Mar 19 09:53:30 crc kubenswrapper[4835]: I0319 09:53:30.397322 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e53a7466-d485-484b-aed3-0ce3b20b27ce" containerName="heat-api" Mar 19 09:53:30 crc kubenswrapper[4835]: I0319 09:53:30.397723 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="4384c6fc-0c55-44d3-98b9-b373a66050ee" containerName="heat-cfnapi" Mar 19 09:53:30 crc kubenswrapper[4835]: I0319 09:53:30.397819 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e53a7466-d485-484b-aed3-0ce3b20b27ce" containerName="heat-api" Mar 19 09:53:30 crc kubenswrapper[4835]: I0319 09:53:30.399235 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-tp7ss" Mar 19 09:53:30 crc kubenswrapper[4835]: I0319 09:53:30.402064 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 19 09:53:30 crc kubenswrapper[4835]: I0319 09:53:30.420569 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e79bbfb4-e971-492f-95b6-fdf2cb5df595" path="/var/lib/kubelet/pods/e79bbfb4-e971-492f-95b6-fdf2cb5df595/volumes" Mar 19 09:53:30 crc kubenswrapper[4835]: I0319 09:53:30.421161 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-tp7ss"] Mar 19 09:53:30 crc kubenswrapper[4835]: I0319 09:53:30.525578 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d23476b0-3b28-4a6f-84ec-4fcd3b3463cb-config-data\") pod \"aodh-db-sync-tp7ss\" (UID: \"d23476b0-3b28-4a6f-84ec-4fcd3b3463cb\") " pod="openstack/aodh-db-sync-tp7ss" Mar 19 09:53:30 crc kubenswrapper[4835]: I0319 09:53:30.530910 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6lnp\" (UniqueName: 
\"kubernetes.io/projected/d23476b0-3b28-4a6f-84ec-4fcd3b3463cb-kube-api-access-z6lnp\") pod \"aodh-db-sync-tp7ss\" (UID: \"d23476b0-3b28-4a6f-84ec-4fcd3b3463cb\") " pod="openstack/aodh-db-sync-tp7ss" Mar 19 09:53:30 crc kubenswrapper[4835]: I0319 09:53:30.531668 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d23476b0-3b28-4a6f-84ec-4fcd3b3463cb-combined-ca-bundle\") pod \"aodh-db-sync-tp7ss\" (UID: \"d23476b0-3b28-4a6f-84ec-4fcd3b3463cb\") " pod="openstack/aodh-db-sync-tp7ss" Mar 19 09:53:30 crc kubenswrapper[4835]: I0319 09:53:30.531825 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d23476b0-3b28-4a6f-84ec-4fcd3b3463cb-scripts\") pod \"aodh-db-sync-tp7ss\" (UID: \"d23476b0-3b28-4a6f-84ec-4fcd3b3463cb\") " pod="openstack/aodh-db-sync-tp7ss" Mar 19 09:53:30 crc kubenswrapper[4835]: I0319 09:53:30.636264 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d23476b0-3b28-4a6f-84ec-4fcd3b3463cb-config-data\") pod \"aodh-db-sync-tp7ss\" (UID: \"d23476b0-3b28-4a6f-84ec-4fcd3b3463cb\") " pod="openstack/aodh-db-sync-tp7ss" Mar 19 09:53:30 crc kubenswrapper[4835]: I0319 09:53:30.636316 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6lnp\" (UniqueName: \"kubernetes.io/projected/d23476b0-3b28-4a6f-84ec-4fcd3b3463cb-kube-api-access-z6lnp\") pod \"aodh-db-sync-tp7ss\" (UID: \"d23476b0-3b28-4a6f-84ec-4fcd3b3463cb\") " pod="openstack/aodh-db-sync-tp7ss" Mar 19 09:53:30 crc kubenswrapper[4835]: I0319 09:53:30.636418 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d23476b0-3b28-4a6f-84ec-4fcd3b3463cb-combined-ca-bundle\") pod \"aodh-db-sync-tp7ss\" 
(UID: \"d23476b0-3b28-4a6f-84ec-4fcd3b3463cb\") " pod="openstack/aodh-db-sync-tp7ss" Mar 19 09:53:30 crc kubenswrapper[4835]: I0319 09:53:30.636481 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d23476b0-3b28-4a6f-84ec-4fcd3b3463cb-scripts\") pod \"aodh-db-sync-tp7ss\" (UID: \"d23476b0-3b28-4a6f-84ec-4fcd3b3463cb\") " pod="openstack/aodh-db-sync-tp7ss" Mar 19 09:53:30 crc kubenswrapper[4835]: I0319 09:53:30.642672 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d23476b0-3b28-4a6f-84ec-4fcd3b3463cb-combined-ca-bundle\") pod \"aodh-db-sync-tp7ss\" (UID: \"d23476b0-3b28-4a6f-84ec-4fcd3b3463cb\") " pod="openstack/aodh-db-sync-tp7ss" Mar 19 09:53:30 crc kubenswrapper[4835]: I0319 09:53:30.643764 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d23476b0-3b28-4a6f-84ec-4fcd3b3463cb-config-data\") pod \"aodh-db-sync-tp7ss\" (UID: \"d23476b0-3b28-4a6f-84ec-4fcd3b3463cb\") " pod="openstack/aodh-db-sync-tp7ss" Mar 19 09:53:30 crc kubenswrapper[4835]: I0319 09:53:30.644689 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d23476b0-3b28-4a6f-84ec-4fcd3b3463cb-scripts\") pod \"aodh-db-sync-tp7ss\" (UID: \"d23476b0-3b28-4a6f-84ec-4fcd3b3463cb\") " pod="openstack/aodh-db-sync-tp7ss" Mar 19 09:53:30 crc kubenswrapper[4835]: I0319 09:53:30.655469 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6lnp\" (UniqueName: \"kubernetes.io/projected/d23476b0-3b28-4a6f-84ec-4fcd3b3463cb-kube-api-access-z6lnp\") pod \"aodh-db-sync-tp7ss\" (UID: \"d23476b0-3b28-4a6f-84ec-4fcd3b3463cb\") " pod="openstack/aodh-db-sync-tp7ss" Mar 19 09:53:30 crc kubenswrapper[4835]: I0319 09:53:30.729405 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-tp7ss" Mar 19 09:53:31 crc kubenswrapper[4835]: W0319 09:53:31.326582 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd23476b0_3b28_4a6f_84ec_4fcd3b3463cb.slice/crio-06e4db25c29c724d3b9a98b813cfc770803a07aa82fbb52b6387d4956a940273 WatchSource:0}: Error finding container 06e4db25c29c724d3b9a98b813cfc770803a07aa82fbb52b6387d4956a940273: Status 404 returned error can't find the container with id 06e4db25c29c724d3b9a98b813cfc770803a07aa82fbb52b6387d4956a940273 Mar 19 09:53:31 crc kubenswrapper[4835]: I0319 09:53:31.328510 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-tp7ss"] Mar 19 09:53:31 crc kubenswrapper[4835]: I0319 09:53:31.987422 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-tp7ss" event={"ID":"d23476b0-3b28-4a6f-84ec-4fcd3b3463cb","Type":"ContainerStarted","Data":"06e4db25c29c724d3b9a98b813cfc770803a07aa82fbb52b6387d4956a940273"} Mar 19 09:53:34 crc kubenswrapper[4835]: E0319 09:53:34.192328 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3db517c22870cf48fc294f690df1329b56ece79b0130a36b23aa4870ce17121e" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 19 09:53:34 crc kubenswrapper[4835]: E0319 09:53:34.194485 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3db517c22870cf48fc294f690df1329b56ece79b0130a36b23aa4870ce17121e" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 19 09:53:34 crc kubenswrapper[4835]: E0319 09:53:34.196854 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = 
command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3db517c22870cf48fc294f690df1329b56ece79b0130a36b23aa4870ce17121e" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 19 09:53:34 crc kubenswrapper[4835]: E0319 09:53:34.196891 4835 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-956b854d8-kqpb6" podUID="be54447e-7aac-4db8-94fc-f10aa4661921" containerName="heat-engine" Mar 19 09:53:38 crc kubenswrapper[4835]: I0319 09:53:38.053650 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-tp7ss" event={"ID":"d23476b0-3b28-4a6f-84ec-4fcd3b3463cb","Type":"ContainerStarted","Data":"67afa5d789dfed720e9368c56b5bd57a4ebd6e51db53c397188a4b44a0ee8b5b"} Mar 19 09:53:38 crc kubenswrapper[4835]: I0319 09:53:38.080419 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-tp7ss" podStartSLOduration=1.817797923 podStartE2EDuration="8.080401876s" podCreationTimestamp="2026-03-19 09:53:30 +0000 UTC" firstStartedPulling="2026-03-19 09:53:31.329346371 +0000 UTC m=+1866.177944958" lastFinishedPulling="2026-03-19 09:53:37.591950324 +0000 UTC m=+1872.440548911" observedRunningTime="2026-03-19 09:53:38.068183667 +0000 UTC m=+1872.916782254" watchObservedRunningTime="2026-03-19 09:53:38.080401876 +0000 UTC m=+1872.929000463" Mar 19 09:53:38 crc kubenswrapper[4835]: I0319 09:53:38.403514 4835 scope.go:117] "RemoveContainer" containerID="d93f2f0fef5a3fe52d6e4aab02e5290ac85405643bc520caaef82b7b23fd8ee3" Mar 19 09:53:38 crc kubenswrapper[4835]: E0319 09:53:38.404178 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 09:53:39 crc kubenswrapper[4835]: I0319 09:53:39.074931 4835 generic.go:334] "Generic (PLEG): container finished" podID="2fd23d4d-9b8c-4be8-a671-cefa01a4e341" containerID="a1be01e72e0bcb5730b8f58bb529c5d6a6d603f5e8fd5d6bcf171261effcf078" exitCode=0 Mar 19 09:53:39 crc kubenswrapper[4835]: I0319 09:53:39.075024 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p" event={"ID":"2fd23d4d-9b8c-4be8-a671-cefa01a4e341","Type":"ContainerDied","Data":"a1be01e72e0bcb5730b8f58bb529c5d6a6d603f5e8fd5d6bcf171261effcf078"} Mar 19 09:53:39 crc kubenswrapper[4835]: I0319 09:53:39.105269 4835 generic.go:334] "Generic (PLEG): container finished" podID="be54447e-7aac-4db8-94fc-f10aa4661921" containerID="3db517c22870cf48fc294f690df1329b56ece79b0130a36b23aa4870ce17121e" exitCode=0 Mar 19 09:53:39 crc kubenswrapper[4835]: I0319 09:53:39.105337 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-956b854d8-kqpb6" event={"ID":"be54447e-7aac-4db8-94fc-f10aa4661921","Type":"ContainerDied","Data":"3db517c22870cf48fc294f690df1329b56ece79b0130a36b23aa4870ce17121e"} Mar 19 09:53:39 crc kubenswrapper[4835]: I0319 09:53:39.456264 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-956b854d8-kqpb6" Mar 19 09:53:39 crc kubenswrapper[4835]: I0319 09:53:39.568396 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thbl9\" (UniqueName: \"kubernetes.io/projected/be54447e-7aac-4db8-94fc-f10aa4661921-kube-api-access-thbl9\") pod \"be54447e-7aac-4db8-94fc-f10aa4661921\" (UID: \"be54447e-7aac-4db8-94fc-f10aa4661921\") " Mar 19 09:53:39 crc kubenswrapper[4835]: I0319 09:53:39.568460 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be54447e-7aac-4db8-94fc-f10aa4661921-combined-ca-bundle\") pod \"be54447e-7aac-4db8-94fc-f10aa4661921\" (UID: \"be54447e-7aac-4db8-94fc-f10aa4661921\") " Mar 19 09:53:39 crc kubenswrapper[4835]: I0319 09:53:39.568824 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be54447e-7aac-4db8-94fc-f10aa4661921-config-data\") pod \"be54447e-7aac-4db8-94fc-f10aa4661921\" (UID: \"be54447e-7aac-4db8-94fc-f10aa4661921\") " Mar 19 09:53:39 crc kubenswrapper[4835]: I0319 09:53:39.568878 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be54447e-7aac-4db8-94fc-f10aa4661921-config-data-custom\") pod \"be54447e-7aac-4db8-94fc-f10aa4661921\" (UID: \"be54447e-7aac-4db8-94fc-f10aa4661921\") " Mar 19 09:53:39 crc kubenswrapper[4835]: I0319 09:53:39.576646 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be54447e-7aac-4db8-94fc-f10aa4661921-kube-api-access-thbl9" (OuterVolumeSpecName: "kube-api-access-thbl9") pod "be54447e-7aac-4db8-94fc-f10aa4661921" (UID: "be54447e-7aac-4db8-94fc-f10aa4661921"). InnerVolumeSpecName "kube-api-access-thbl9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:53:39 crc kubenswrapper[4835]: I0319 09:53:39.577869 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be54447e-7aac-4db8-94fc-f10aa4661921-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "be54447e-7aac-4db8-94fc-f10aa4661921" (UID: "be54447e-7aac-4db8-94fc-f10aa4661921"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:39 crc kubenswrapper[4835]: I0319 09:53:39.603241 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be54447e-7aac-4db8-94fc-f10aa4661921-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be54447e-7aac-4db8-94fc-f10aa4661921" (UID: "be54447e-7aac-4db8-94fc-f10aa4661921"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:39 crc kubenswrapper[4835]: I0319 09:53:39.645451 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be54447e-7aac-4db8-94fc-f10aa4661921-config-data" (OuterVolumeSpecName: "config-data") pod "be54447e-7aac-4db8-94fc-f10aa4661921" (UID: "be54447e-7aac-4db8-94fc-f10aa4661921"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:39 crc kubenswrapper[4835]: I0319 09:53:39.671562 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be54447e-7aac-4db8-94fc-f10aa4661921-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:39 crc kubenswrapper[4835]: I0319 09:53:39.671609 4835 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be54447e-7aac-4db8-94fc-f10aa4661921-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:39 crc kubenswrapper[4835]: I0319 09:53:39.671628 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thbl9\" (UniqueName: \"kubernetes.io/projected/be54447e-7aac-4db8-94fc-f10aa4661921-kube-api-access-thbl9\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:39 crc kubenswrapper[4835]: I0319 09:53:39.671640 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be54447e-7aac-4db8-94fc-f10aa4661921-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:40 crc kubenswrapper[4835]: I0319 09:53:40.118871 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-956b854d8-kqpb6" event={"ID":"be54447e-7aac-4db8-94fc-f10aa4661921","Type":"ContainerDied","Data":"987539963d6ed4a145e4dae3341910ef0609e1b928191d75c1252637409459f4"} Mar 19 09:53:40 crc kubenswrapper[4835]: I0319 09:53:40.119254 4835 scope.go:117] "RemoveContainer" containerID="3db517c22870cf48fc294f690df1329b56ece79b0130a36b23aa4870ce17121e" Mar 19 09:53:40 crc kubenswrapper[4835]: I0319 09:53:40.118919 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-956b854d8-kqpb6" Mar 19 09:53:40 crc kubenswrapper[4835]: I0319 09:53:40.177309 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-956b854d8-kqpb6"] Mar 19 09:53:40 crc kubenswrapper[4835]: I0319 09:53:40.207108 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-956b854d8-kqpb6"] Mar 19 09:53:40 crc kubenswrapper[4835]: I0319 09:53:40.430683 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be54447e-7aac-4db8-94fc-f10aa4661921" path="/var/lib/kubelet/pods/be54447e-7aac-4db8-94fc-f10aa4661921/volumes" Mar 19 09:53:40 crc kubenswrapper[4835]: I0319 09:53:40.686075 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p" Mar 19 09:53:40 crc kubenswrapper[4835]: I0319 09:53:40.795624 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfjlk\" (UniqueName: \"kubernetes.io/projected/2fd23d4d-9b8c-4be8-a671-cefa01a4e341-kube-api-access-lfjlk\") pod \"2fd23d4d-9b8c-4be8-a671-cefa01a4e341\" (UID: \"2fd23d4d-9b8c-4be8-a671-cefa01a4e341\") " Mar 19 09:53:40 crc kubenswrapper[4835]: I0319 09:53:40.795981 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fd23d4d-9b8c-4be8-a671-cefa01a4e341-repo-setup-combined-ca-bundle\") pod \"2fd23d4d-9b8c-4be8-a671-cefa01a4e341\" (UID: \"2fd23d4d-9b8c-4be8-a671-cefa01a4e341\") " Mar 19 09:53:40 crc kubenswrapper[4835]: I0319 09:53:40.796061 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2fd23d4d-9b8c-4be8-a671-cefa01a4e341-ssh-key-openstack-edpm-ipam\") pod \"2fd23d4d-9b8c-4be8-a671-cefa01a4e341\" (UID: \"2fd23d4d-9b8c-4be8-a671-cefa01a4e341\") " Mar 19 
09:53:40 crc kubenswrapper[4835]: I0319 09:53:40.796161 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fd23d4d-9b8c-4be8-a671-cefa01a4e341-inventory\") pod \"2fd23d4d-9b8c-4be8-a671-cefa01a4e341\" (UID: \"2fd23d4d-9b8c-4be8-a671-cefa01a4e341\") " Mar 19 09:53:40 crc kubenswrapper[4835]: I0319 09:53:40.802813 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fd23d4d-9b8c-4be8-a671-cefa01a4e341-kube-api-access-lfjlk" (OuterVolumeSpecName: "kube-api-access-lfjlk") pod "2fd23d4d-9b8c-4be8-a671-cefa01a4e341" (UID: "2fd23d4d-9b8c-4be8-a671-cefa01a4e341"). InnerVolumeSpecName "kube-api-access-lfjlk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:53:40 crc kubenswrapper[4835]: I0319 09:53:40.803324 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fd23d4d-9b8c-4be8-a671-cefa01a4e341-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "2fd23d4d-9b8c-4be8-a671-cefa01a4e341" (UID: "2fd23d4d-9b8c-4be8-a671-cefa01a4e341"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:40 crc kubenswrapper[4835]: I0319 09:53:40.837586 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fd23d4d-9b8c-4be8-a671-cefa01a4e341-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2fd23d4d-9b8c-4be8-a671-cefa01a4e341" (UID: "2fd23d4d-9b8c-4be8-a671-cefa01a4e341"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:40 crc kubenswrapper[4835]: I0319 09:53:40.851069 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fd23d4d-9b8c-4be8-a671-cefa01a4e341-inventory" (OuterVolumeSpecName: "inventory") pod "2fd23d4d-9b8c-4be8-a671-cefa01a4e341" (UID: "2fd23d4d-9b8c-4be8-a671-cefa01a4e341"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:40 crc kubenswrapper[4835]: I0319 09:53:40.899383 4835 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fd23d4d-9b8c-4be8-a671-cefa01a4e341-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:40 crc kubenswrapper[4835]: I0319 09:53:40.899426 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2fd23d4d-9b8c-4be8-a671-cefa01a4e341-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:40 crc kubenswrapper[4835]: I0319 09:53:40.899444 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fd23d4d-9b8c-4be8-a671-cefa01a4e341-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:40 crc kubenswrapper[4835]: I0319 09:53:40.899456 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfjlk\" (UniqueName: \"kubernetes.io/projected/2fd23d4d-9b8c-4be8-a671-cefa01a4e341-kube-api-access-lfjlk\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:41 crc kubenswrapper[4835]: I0319 09:53:41.138201 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p" Mar 19 09:53:41 crc kubenswrapper[4835]: I0319 09:53:41.139018 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p" event={"ID":"2fd23d4d-9b8c-4be8-a671-cefa01a4e341","Type":"ContainerDied","Data":"85ad32ce0110fb1ec018f8e14a556bc0ccaea87c6ed325126a3829e089962931"} Mar 19 09:53:41 crc kubenswrapper[4835]: I0319 09:53:41.139064 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85ad32ce0110fb1ec018f8e14a556bc0ccaea87c6ed325126a3829e089962931" Mar 19 09:53:41 crc kubenswrapper[4835]: I0319 09:53:41.152270 4835 generic.go:334] "Generic (PLEG): container finished" podID="d23476b0-3b28-4a6f-84ec-4fcd3b3463cb" containerID="67afa5d789dfed720e9368c56b5bd57a4ebd6e51db53c397188a4b44a0ee8b5b" exitCode=0 Mar 19 09:53:41 crc kubenswrapper[4835]: I0319 09:53:41.152310 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-tp7ss" event={"ID":"d23476b0-3b28-4a6f-84ec-4fcd3b3463cb","Type":"ContainerDied","Data":"67afa5d789dfed720e9368c56b5bd57a4ebd6e51db53c397188a4b44a0ee8b5b"} Mar 19 09:53:41 crc kubenswrapper[4835]: I0319 09:53:41.177458 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-mhc4n"] Mar 19 09:53:41 crc kubenswrapper[4835]: E0319 09:53:41.177987 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fd23d4d-9b8c-4be8-a671-cefa01a4e341" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 19 09:53:41 crc kubenswrapper[4835]: I0319 09:53:41.178006 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fd23d4d-9b8c-4be8-a671-cefa01a4e341" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 19 09:53:41 crc kubenswrapper[4835]: E0319 09:53:41.178065 4835 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="be54447e-7aac-4db8-94fc-f10aa4661921" containerName="heat-engine" Mar 19 09:53:41 crc kubenswrapper[4835]: I0319 09:53:41.178072 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="be54447e-7aac-4db8-94fc-f10aa4661921" containerName="heat-engine" Mar 19 09:53:41 crc kubenswrapper[4835]: I0319 09:53:41.178510 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fd23d4d-9b8c-4be8-a671-cefa01a4e341" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 19 09:53:41 crc kubenswrapper[4835]: I0319 09:53:41.178544 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="be54447e-7aac-4db8-94fc-f10aa4661921" containerName="heat-engine" Mar 19 09:53:41 crc kubenswrapper[4835]: I0319 09:53:41.179409 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mhc4n" Mar 19 09:53:41 crc kubenswrapper[4835]: I0319 09:53:41.182144 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 09:53:41 crc kubenswrapper[4835]: I0319 09:53:41.185770 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 09:53:41 crc kubenswrapper[4835]: I0319 09:53:41.185880 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 09:53:41 crc kubenswrapper[4835]: I0319 09:53:41.186122 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ldz2g" Mar 19 09:53:41 crc kubenswrapper[4835]: I0319 09:53:41.190441 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-mhc4n"] Mar 19 09:53:41 crc kubenswrapper[4835]: I0319 09:53:41.307858 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-828qd\" 
(UniqueName: \"kubernetes.io/projected/a0976232-95f7-45c1-8025-5adf6861e278-kube-api-access-828qd\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-mhc4n\" (UID: \"a0976232-95f7-45c1-8025-5adf6861e278\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mhc4n" Mar 19 09:53:41 crc kubenswrapper[4835]: I0319 09:53:41.307940 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0976232-95f7-45c1-8025-5adf6861e278-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-mhc4n\" (UID: \"a0976232-95f7-45c1-8025-5adf6861e278\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mhc4n" Mar 19 09:53:41 crc kubenswrapper[4835]: I0319 09:53:41.308549 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a0976232-95f7-45c1-8025-5adf6861e278-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-mhc4n\" (UID: \"a0976232-95f7-45c1-8025-5adf6861e278\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mhc4n" Mar 19 09:53:41 crc kubenswrapper[4835]: I0319 09:53:41.410419 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a0976232-95f7-45c1-8025-5adf6861e278-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-mhc4n\" (UID: \"a0976232-95f7-45c1-8025-5adf6861e278\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mhc4n" Mar 19 09:53:41 crc kubenswrapper[4835]: I0319 09:53:41.410860 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-828qd\" (UniqueName: \"kubernetes.io/projected/a0976232-95f7-45c1-8025-5adf6861e278-kube-api-access-828qd\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-mhc4n\" (UID: 
\"a0976232-95f7-45c1-8025-5adf6861e278\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mhc4n" Mar 19 09:53:41 crc kubenswrapper[4835]: I0319 09:53:41.410915 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0976232-95f7-45c1-8025-5adf6861e278-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-mhc4n\" (UID: \"a0976232-95f7-45c1-8025-5adf6861e278\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mhc4n" Mar 19 09:53:41 crc kubenswrapper[4835]: I0319 09:53:41.414491 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a0976232-95f7-45c1-8025-5adf6861e278-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-mhc4n\" (UID: \"a0976232-95f7-45c1-8025-5adf6861e278\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mhc4n" Mar 19 09:53:41 crc kubenswrapper[4835]: I0319 09:53:41.419357 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0976232-95f7-45c1-8025-5adf6861e278-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-mhc4n\" (UID: \"a0976232-95f7-45c1-8025-5adf6861e278\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mhc4n" Mar 19 09:53:41 crc kubenswrapper[4835]: I0319 09:53:41.435029 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-828qd\" (UniqueName: \"kubernetes.io/projected/a0976232-95f7-45c1-8025-5adf6861e278-kube-api-access-828qd\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-mhc4n\" (UID: \"a0976232-95f7-45c1-8025-5adf6861e278\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mhc4n" Mar 19 09:53:41 crc kubenswrapper[4835]: I0319 09:53:41.497807 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mhc4n" Mar 19 09:53:42 crc kubenswrapper[4835]: I0319 09:53:42.230646 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-mhc4n"] Mar 19 09:53:42 crc kubenswrapper[4835]: I0319 09:53:42.646146 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-tp7ss" Mar 19 09:53:42 crc kubenswrapper[4835]: I0319 09:53:42.767396 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6lnp\" (UniqueName: \"kubernetes.io/projected/d23476b0-3b28-4a6f-84ec-4fcd3b3463cb-kube-api-access-z6lnp\") pod \"d23476b0-3b28-4a6f-84ec-4fcd3b3463cb\" (UID: \"d23476b0-3b28-4a6f-84ec-4fcd3b3463cb\") " Mar 19 09:53:42 crc kubenswrapper[4835]: I0319 09:53:42.767451 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d23476b0-3b28-4a6f-84ec-4fcd3b3463cb-scripts\") pod \"d23476b0-3b28-4a6f-84ec-4fcd3b3463cb\" (UID: \"d23476b0-3b28-4a6f-84ec-4fcd3b3463cb\") " Mar 19 09:53:42 crc kubenswrapper[4835]: I0319 09:53:42.767608 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d23476b0-3b28-4a6f-84ec-4fcd3b3463cb-combined-ca-bundle\") pod \"d23476b0-3b28-4a6f-84ec-4fcd3b3463cb\" (UID: \"d23476b0-3b28-4a6f-84ec-4fcd3b3463cb\") " Mar 19 09:53:42 crc kubenswrapper[4835]: I0319 09:53:42.767666 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d23476b0-3b28-4a6f-84ec-4fcd3b3463cb-config-data\") pod \"d23476b0-3b28-4a6f-84ec-4fcd3b3463cb\" (UID: \"d23476b0-3b28-4a6f-84ec-4fcd3b3463cb\") " Mar 19 09:53:42 crc kubenswrapper[4835]: I0319 09:53:42.776483 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/d23476b0-3b28-4a6f-84ec-4fcd3b3463cb-kube-api-access-z6lnp" (OuterVolumeSpecName: "kube-api-access-z6lnp") pod "d23476b0-3b28-4a6f-84ec-4fcd3b3463cb" (UID: "d23476b0-3b28-4a6f-84ec-4fcd3b3463cb"). InnerVolumeSpecName "kube-api-access-z6lnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:53:42 crc kubenswrapper[4835]: I0319 09:53:42.781782 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d23476b0-3b28-4a6f-84ec-4fcd3b3463cb-scripts" (OuterVolumeSpecName: "scripts") pod "d23476b0-3b28-4a6f-84ec-4fcd3b3463cb" (UID: "d23476b0-3b28-4a6f-84ec-4fcd3b3463cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:42 crc kubenswrapper[4835]: I0319 09:53:42.821998 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d23476b0-3b28-4a6f-84ec-4fcd3b3463cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d23476b0-3b28-4a6f-84ec-4fcd3b3463cb" (UID: "d23476b0-3b28-4a6f-84ec-4fcd3b3463cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:42 crc kubenswrapper[4835]: I0319 09:53:42.823145 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d23476b0-3b28-4a6f-84ec-4fcd3b3463cb-config-data" (OuterVolumeSpecName: "config-data") pod "d23476b0-3b28-4a6f-84ec-4fcd3b3463cb" (UID: "d23476b0-3b28-4a6f-84ec-4fcd3b3463cb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:42 crc kubenswrapper[4835]: I0319 09:53:42.874071 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6lnp\" (UniqueName: \"kubernetes.io/projected/d23476b0-3b28-4a6f-84ec-4fcd3b3463cb-kube-api-access-z6lnp\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:42 crc kubenswrapper[4835]: I0319 09:53:42.874538 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d23476b0-3b28-4a6f-84ec-4fcd3b3463cb-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:42 crc kubenswrapper[4835]: I0319 09:53:42.874621 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d23476b0-3b28-4a6f-84ec-4fcd3b3463cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:42 crc kubenswrapper[4835]: I0319 09:53:42.874707 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d23476b0-3b28-4a6f-84ec-4fcd3b3463cb-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:42 crc kubenswrapper[4835]: I0319 09:53:42.966990 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-2" Mar 19 09:53:43 crc kubenswrapper[4835]: I0319 09:53:43.039441 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 19 09:53:43 crc kubenswrapper[4835]: I0319 09:53:43.194502 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mhc4n" event={"ID":"a0976232-95f7-45c1-8025-5adf6861e278","Type":"ContainerStarted","Data":"acc521056eaf4174f0f0b62652ea0b9ad44ddc445449f845527626e65933a493"} Mar 19 09:53:43 crc kubenswrapper[4835]: I0319 09:53:43.194558 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mhc4n" 
event={"ID":"a0976232-95f7-45c1-8025-5adf6861e278","Type":"ContainerStarted","Data":"9fa3f492944e6234b7d58bf591c44bce9eee49a66f6791a07d700cf2f5ac3bce"} Mar 19 09:53:43 crc kubenswrapper[4835]: I0319 09:53:43.202570 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-tp7ss" event={"ID":"d23476b0-3b28-4a6f-84ec-4fcd3b3463cb","Type":"ContainerDied","Data":"06e4db25c29c724d3b9a98b813cfc770803a07aa82fbb52b6387d4956a940273"} Mar 19 09:53:43 crc kubenswrapper[4835]: I0319 09:53:43.202610 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06e4db25c29c724d3b9a98b813cfc770803a07aa82fbb52b6387d4956a940273" Mar 19 09:53:43 crc kubenswrapper[4835]: I0319 09:53:43.202681 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-tp7ss" Mar 19 09:53:43 crc kubenswrapper[4835]: I0319 09:53:43.239887 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mhc4n" podStartSLOduration=1.71951431 podStartE2EDuration="2.23986257s" podCreationTimestamp="2026-03-19 09:53:41 +0000 UTC" firstStartedPulling="2026-03-19 09:53:42.246115392 +0000 UTC m=+1877.094713979" lastFinishedPulling="2026-03-19 09:53:42.766463652 +0000 UTC m=+1877.615062239" observedRunningTime="2026-03-19 09:53:43.223089519 +0000 UTC m=+1878.071688106" watchObservedRunningTime="2026-03-19 09:53:43.23986257 +0000 UTC m=+1878.088461157" Mar 19 09:53:43 crc kubenswrapper[4835]: I0319 09:53:43.427527 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 19 09:53:43 crc kubenswrapper[4835]: I0319 09:53:43.427891 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="e3a0e12a-0212-4988-8c7a-7d466864887b" containerName="aodh-api" containerID="cri-o://a5a1af55b1fa7fb11130cc41302d05f94330b6621f6b0a878733eb89d5a76b8b" gracePeriod=30 Mar 19 09:53:43 crc 
kubenswrapper[4835]: I0319 09:53:43.428269 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="e3a0e12a-0212-4988-8c7a-7d466864887b" containerName="aodh-notifier" containerID="cri-o://0abec8912e1e3cf610978870c8382031d5df50d077218567a796c1f72aff6f06" gracePeriod=30 Mar 19 09:53:43 crc kubenswrapper[4835]: I0319 09:53:43.428401 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="e3a0e12a-0212-4988-8c7a-7d466864887b" containerName="aodh-listener" containerID="cri-o://e1f5e7c55c5ae5d9b0f812f5e32b130ace81f80ae20441972c84e26135e992a0" gracePeriod=30 Mar 19 09:53:43 crc kubenswrapper[4835]: I0319 09:53:43.428418 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="e3a0e12a-0212-4988-8c7a-7d466864887b" containerName="aodh-evaluator" containerID="cri-o://0a792066ff864b6c9eadfa0d11e82cba678fa3742c38680dcd693050695f7d86" gracePeriod=30 Mar 19 09:53:43 crc kubenswrapper[4835]: I0319 09:53:43.519257 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 19 09:53:44 crc kubenswrapper[4835]: I0319 09:53:44.216980 4835 generic.go:334] "Generic (PLEG): container finished" podID="e3a0e12a-0212-4988-8c7a-7d466864887b" containerID="0a792066ff864b6c9eadfa0d11e82cba678fa3742c38680dcd693050695f7d86" exitCode=0 Mar 19 09:53:44 crc kubenswrapper[4835]: I0319 09:53:44.217344 4835 generic.go:334] "Generic (PLEG): container finished" podID="e3a0e12a-0212-4988-8c7a-7d466864887b" containerID="a5a1af55b1fa7fb11130cc41302d05f94330b6621f6b0a878733eb89d5a76b8b" exitCode=0 Mar 19 09:53:44 crc kubenswrapper[4835]: I0319 09:53:44.217051 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e3a0e12a-0212-4988-8c7a-7d466864887b","Type":"ContainerDied","Data":"0a792066ff864b6c9eadfa0d11e82cba678fa3742c38680dcd693050695f7d86"} Mar 19 09:53:44 crc 
kubenswrapper[4835]: I0319 09:53:44.217460 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e3a0e12a-0212-4988-8c7a-7d466864887b","Type":"ContainerDied","Data":"a5a1af55b1fa7fb11130cc41302d05f94330b6621f6b0a878733eb89d5a76b8b"} Mar 19 09:53:46 crc kubenswrapper[4835]: I0319 09:53:46.242790 4835 generic.go:334] "Generic (PLEG): container finished" podID="e3a0e12a-0212-4988-8c7a-7d466864887b" containerID="0abec8912e1e3cf610978870c8382031d5df50d077218567a796c1f72aff6f06" exitCode=0 Mar 19 09:53:46 crc kubenswrapper[4835]: I0319 09:53:46.242944 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e3a0e12a-0212-4988-8c7a-7d466864887b","Type":"ContainerDied","Data":"0abec8912e1e3cf610978870c8382031d5df50d077218567a796c1f72aff6f06"} Mar 19 09:53:47 crc kubenswrapper[4835]: I0319 09:53:47.254632 4835 generic.go:334] "Generic (PLEG): container finished" podID="a0976232-95f7-45c1-8025-5adf6861e278" containerID="acc521056eaf4174f0f0b62652ea0b9ad44ddc445449f845527626e65933a493" exitCode=0 Mar 19 09:53:47 crc kubenswrapper[4835]: I0319 09:53:47.254719 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mhc4n" event={"ID":"a0976232-95f7-45c1-8025-5adf6861e278","Type":"ContainerDied","Data":"acc521056eaf4174f0f0b62652ea0b9ad44ddc445449f845527626e65933a493"} Mar 19 09:53:47 crc kubenswrapper[4835]: I0319 09:53:47.771905 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-1" podUID="2209a56f-9c2a-45bd-b045-176197bf3bd1" containerName="rabbitmq" containerID="cri-o://6ff9762632f4fd510b6ef15acf4ab11685aa5d46c48d4951d1c8c4ee529d551a" gracePeriod=604796 Mar 19 09:53:48 crc kubenswrapper[4835]: I0319 09:53:48.811993 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mhc4n" Mar 19 09:53:48 crc kubenswrapper[4835]: I0319 09:53:48.932967 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0976232-95f7-45c1-8025-5adf6861e278-inventory\") pod \"a0976232-95f7-45c1-8025-5adf6861e278\" (UID: \"a0976232-95f7-45c1-8025-5adf6861e278\") " Mar 19 09:53:48 crc kubenswrapper[4835]: I0319 09:53:48.933303 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-828qd\" (UniqueName: \"kubernetes.io/projected/a0976232-95f7-45c1-8025-5adf6861e278-kube-api-access-828qd\") pod \"a0976232-95f7-45c1-8025-5adf6861e278\" (UID: \"a0976232-95f7-45c1-8025-5adf6861e278\") " Mar 19 09:53:48 crc kubenswrapper[4835]: I0319 09:53:48.933685 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a0976232-95f7-45c1-8025-5adf6861e278-ssh-key-openstack-edpm-ipam\") pod \"a0976232-95f7-45c1-8025-5adf6861e278\" (UID: \"a0976232-95f7-45c1-8025-5adf6861e278\") " Mar 19 09:53:48 crc kubenswrapper[4835]: I0319 09:53:48.939969 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0976232-95f7-45c1-8025-5adf6861e278-kube-api-access-828qd" (OuterVolumeSpecName: "kube-api-access-828qd") pod "a0976232-95f7-45c1-8025-5adf6861e278" (UID: "a0976232-95f7-45c1-8025-5adf6861e278"). InnerVolumeSpecName "kube-api-access-828qd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:53:48 crc kubenswrapper[4835]: I0319 09:53:48.973819 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0976232-95f7-45c1-8025-5adf6861e278-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a0976232-95f7-45c1-8025-5adf6861e278" (UID: "a0976232-95f7-45c1-8025-5adf6861e278"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:48 crc kubenswrapper[4835]: I0319 09:53:48.974605 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0976232-95f7-45c1-8025-5adf6861e278-inventory" (OuterVolumeSpecName: "inventory") pod "a0976232-95f7-45c1-8025-5adf6861e278" (UID: "a0976232-95f7-45c1-8025-5adf6861e278"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.036569 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a0976232-95f7-45c1-8025-5adf6861e278-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.036609 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0976232-95f7-45c1-8025-5adf6861e278-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.036620 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-828qd\" (UniqueName: \"kubernetes.io/projected/a0976232-95f7-45c1-8025-5adf6861e278-kube-api-access-828qd\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.356259 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mhc4n" 
event={"ID":"a0976232-95f7-45c1-8025-5adf6861e278","Type":"ContainerDied","Data":"9fa3f492944e6234b7d58bf591c44bce9eee49a66f6791a07d700cf2f5ac3bce"} Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.356643 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fa3f492944e6234b7d58bf591c44bce9eee49a66f6791a07d700cf2f5ac3bce" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.356443 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-mhc4n" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.362754 4835 generic.go:334] "Generic (PLEG): container finished" podID="e3a0e12a-0212-4988-8c7a-7d466864887b" containerID="e1f5e7c55c5ae5d9b0f812f5e32b130ace81f80ae20441972c84e26135e992a0" exitCode=0 Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.362776 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"e3a0e12a-0212-4988-8c7a-7d466864887b","Type":"ContainerDied","Data":"e1f5e7c55c5ae5d9b0f812f5e32b130ace81f80ae20441972c84e26135e992a0"} Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.411328 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8"] Mar 19 09:53:49 crc kubenswrapper[4835]: E0319 09:53:49.411861 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d23476b0-3b28-4a6f-84ec-4fcd3b3463cb" containerName="aodh-db-sync" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.411875 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="d23476b0-3b28-4a6f-84ec-4fcd3b3463cb" containerName="aodh-db-sync" Mar 19 09:53:49 crc kubenswrapper[4835]: E0319 09:53:49.411924 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0976232-95f7-45c1-8025-5adf6861e278" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.411931 4835 
state_mem.go:107] "Deleted CPUSet assignment" podUID="a0976232-95f7-45c1-8025-5adf6861e278" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.412144 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="d23476b0-3b28-4a6f-84ec-4fcd3b3463cb" containerName="aodh-db-sync" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.412164 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0976232-95f7-45c1-8025-5adf6861e278" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.413100 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.415702 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.415715 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.416275 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.416278 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ldz2g" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.422401 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8"] Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.486283 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.553021 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63daeb0b-2ace-4747-b331-44ed485faec8-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8\" (UID: \"63daeb0b-2ace-4747-b331-44ed485faec8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.553228 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63daeb0b-2ace-4747-b331-44ed485faec8-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8\" (UID: \"63daeb0b-2ace-4747-b331-44ed485faec8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.553330 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/63daeb0b-2ace-4747-b331-44ed485faec8-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8\" (UID: \"63daeb0b-2ace-4747-b331-44ed485faec8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.553605 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6b5s\" (UniqueName: \"kubernetes.io/projected/63daeb0b-2ace-4747-b331-44ed485faec8-kube-api-access-p6b5s\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8\" (UID: \"63daeb0b-2ace-4747-b331-44ed485faec8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.655399 4835 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3a0e12a-0212-4988-8c7a-7d466864887b-internal-tls-certs\") pod \"e3a0e12a-0212-4988-8c7a-7d466864887b\" (UID: \"e3a0e12a-0212-4988-8c7a-7d466864887b\") " Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.655871 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwhkw\" (UniqueName: \"kubernetes.io/projected/e3a0e12a-0212-4988-8c7a-7d466864887b-kube-api-access-nwhkw\") pod \"e3a0e12a-0212-4988-8c7a-7d466864887b\" (UID: \"e3a0e12a-0212-4988-8c7a-7d466864887b\") " Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.655909 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3a0e12a-0212-4988-8c7a-7d466864887b-scripts\") pod \"e3a0e12a-0212-4988-8c7a-7d466864887b\" (UID: \"e3a0e12a-0212-4988-8c7a-7d466864887b\") " Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.656006 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3a0e12a-0212-4988-8c7a-7d466864887b-public-tls-certs\") pod \"e3a0e12a-0212-4988-8c7a-7d466864887b\" (UID: \"e3a0e12a-0212-4988-8c7a-7d466864887b\") " Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.656257 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3a0e12a-0212-4988-8c7a-7d466864887b-config-data\") pod \"e3a0e12a-0212-4988-8c7a-7d466864887b\" (UID: \"e3a0e12a-0212-4988-8c7a-7d466864887b\") " Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.656295 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a0e12a-0212-4988-8c7a-7d466864887b-combined-ca-bundle\") pod 
\"e3a0e12a-0212-4988-8c7a-7d466864887b\" (UID: \"e3a0e12a-0212-4988-8c7a-7d466864887b\") " Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.656704 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6b5s\" (UniqueName: \"kubernetes.io/projected/63daeb0b-2ace-4747-b331-44ed485faec8-kube-api-access-p6b5s\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8\" (UID: \"63daeb0b-2ace-4747-b331-44ed485faec8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.656830 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63daeb0b-2ace-4747-b331-44ed485faec8-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8\" (UID: \"63daeb0b-2ace-4747-b331-44ed485faec8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.657028 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63daeb0b-2ace-4747-b331-44ed485faec8-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8\" (UID: \"63daeb0b-2ace-4747-b331-44ed485faec8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.657101 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/63daeb0b-2ace-4747-b331-44ed485faec8-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8\" (UID: \"63daeb0b-2ace-4747-b331-44ed485faec8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.661568 4835 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/e3a0e12a-0212-4988-8c7a-7d466864887b-kube-api-access-nwhkw" (OuterVolumeSpecName: "kube-api-access-nwhkw") pod "e3a0e12a-0212-4988-8c7a-7d466864887b" (UID: "e3a0e12a-0212-4988-8c7a-7d466864887b"). InnerVolumeSpecName "kube-api-access-nwhkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.664002 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63daeb0b-2ace-4747-b331-44ed485faec8-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8\" (UID: \"63daeb0b-2ace-4747-b331-44ed485faec8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.670153 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63daeb0b-2ace-4747-b331-44ed485faec8-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8\" (UID: \"63daeb0b-2ace-4747-b331-44ed485faec8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.676324 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3a0e12a-0212-4988-8c7a-7d466864887b-scripts" (OuterVolumeSpecName: "scripts") pod "e3a0e12a-0212-4988-8c7a-7d466864887b" (UID: "e3a0e12a-0212-4988-8c7a-7d466864887b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.676979 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/63daeb0b-2ace-4747-b331-44ed485faec8-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8\" (UID: \"63daeb0b-2ace-4747-b331-44ed485faec8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.677845 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6b5s\" (UniqueName: \"kubernetes.io/projected/63daeb0b-2ace-4747-b331-44ed485faec8-kube-api-access-p6b5s\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8\" (UID: \"63daeb0b-2ace-4747-b331-44ed485faec8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.731031 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3a0e12a-0212-4988-8c7a-7d466864887b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e3a0e12a-0212-4988-8c7a-7d466864887b" (UID: "e3a0e12a-0212-4988-8c7a-7d466864887b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.742325 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.765545 4835 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3a0e12a-0212-4988-8c7a-7d466864887b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.765603 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwhkw\" (UniqueName: \"kubernetes.io/projected/e3a0e12a-0212-4988-8c7a-7d466864887b-kube-api-access-nwhkw\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.765620 4835 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3a0e12a-0212-4988-8c7a-7d466864887b-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.772483 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3a0e12a-0212-4988-8c7a-7d466864887b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e3a0e12a-0212-4988-8c7a-7d466864887b" (UID: "e3a0e12a-0212-4988-8c7a-7d466864887b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.812351 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3a0e12a-0212-4988-8c7a-7d466864887b-config-data" (OuterVolumeSpecName: "config-data") pod "e3a0e12a-0212-4988-8c7a-7d466864887b" (UID: "e3a0e12a-0212-4988-8c7a-7d466864887b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.837821 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3a0e12a-0212-4988-8c7a-7d466864887b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3a0e12a-0212-4988-8c7a-7d466864887b" (UID: "e3a0e12a-0212-4988-8c7a-7d466864887b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.868143 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3a0e12a-0212-4988-8c7a-7d466864887b-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.868180 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a0e12a-0212-4988-8c7a-7d466864887b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:49 crc kubenswrapper[4835]: I0319 09:53:49.868196 4835 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3a0e12a-0212-4988-8c7a-7d466864887b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.119766 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-1" podUID="2209a56f-9c2a-45bd-b045-176197bf3bd1" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.135:5671: connect: connection refused" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.382940 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8"] Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.393593 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"e3a0e12a-0212-4988-8c7a-7d466864887b","Type":"ContainerDied","Data":"39167dd947c82f81bf51730d8a4ca1ad7191ef88c351d2338f695a4a1e817e9d"} Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.393649 4835 scope.go:117] "RemoveContainer" containerID="e1f5e7c55c5ae5d9b0f812f5e32b130ace81f80ae20441972c84e26135e992a0" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.393679 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.461889 4835 scope.go:117] "RemoveContainer" containerID="0abec8912e1e3cf610978870c8382031d5df50d077218567a796c1f72aff6f06" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.468937 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.494198 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.496493 4835 scope.go:117] "RemoveContainer" containerID="0a792066ff864b6c9eadfa0d11e82cba678fa3742c38680dcd693050695f7d86" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.510963 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 19 09:53:50 crc kubenswrapper[4835]: E0319 09:53:50.512907 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3a0e12a-0212-4988-8c7a-7d466864887b" containerName="aodh-evaluator" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.512938 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3a0e12a-0212-4988-8c7a-7d466864887b" containerName="aodh-evaluator" Mar 19 09:53:50 crc kubenswrapper[4835]: E0319 09:53:50.512979 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3a0e12a-0212-4988-8c7a-7d466864887b" containerName="aodh-api" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.512990 4835 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e3a0e12a-0212-4988-8c7a-7d466864887b" containerName="aodh-api" Mar 19 09:53:50 crc kubenswrapper[4835]: E0319 09:53:50.513023 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3a0e12a-0212-4988-8c7a-7d466864887b" containerName="aodh-notifier" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.513033 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3a0e12a-0212-4988-8c7a-7d466864887b" containerName="aodh-notifier" Mar 19 09:53:50 crc kubenswrapper[4835]: E0319 09:53:50.513049 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3a0e12a-0212-4988-8c7a-7d466864887b" containerName="aodh-listener" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.513058 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3a0e12a-0212-4988-8c7a-7d466864887b" containerName="aodh-listener" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.513423 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3a0e12a-0212-4988-8c7a-7d466864887b" containerName="aodh-evaluator" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.513462 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3a0e12a-0212-4988-8c7a-7d466864887b" containerName="aodh-notifier" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.513489 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3a0e12a-0212-4988-8c7a-7d466864887b" containerName="aodh-api" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.513501 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3a0e12a-0212-4988-8c7a-7d466864887b" containerName="aodh-listener" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.515932 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.518325 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.518861 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-fxchn" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.519056 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.519230 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.519375 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.522218 4835 scope.go:117] "RemoveContainer" containerID="a5a1af55b1fa7fb11130cc41302d05f94330b6621f6b0a878733eb89d5a76b8b" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.531010 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.697092 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnrpp\" (UniqueName: \"kubernetes.io/projected/d04558a6-e35c-4376-9893-1cf4865a711b-kube-api-access-hnrpp\") pod \"aodh-0\" (UID: \"d04558a6-e35c-4376-9893-1cf4865a711b\") " pod="openstack/aodh-0" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.697167 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d04558a6-e35c-4376-9893-1cf4865a711b-combined-ca-bundle\") pod \"aodh-0\" (UID: \"d04558a6-e35c-4376-9893-1cf4865a711b\") " pod="openstack/aodh-0" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 
09:53:50.697252 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d04558a6-e35c-4376-9893-1cf4865a711b-public-tls-certs\") pod \"aodh-0\" (UID: \"d04558a6-e35c-4376-9893-1cf4865a711b\") " pod="openstack/aodh-0" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.697330 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d04558a6-e35c-4376-9893-1cf4865a711b-scripts\") pod \"aodh-0\" (UID: \"d04558a6-e35c-4376-9893-1cf4865a711b\") " pod="openstack/aodh-0" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.697441 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d04558a6-e35c-4376-9893-1cf4865a711b-internal-tls-certs\") pod \"aodh-0\" (UID: \"d04558a6-e35c-4376-9893-1cf4865a711b\") " pod="openstack/aodh-0" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.697523 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d04558a6-e35c-4376-9893-1cf4865a711b-config-data\") pod \"aodh-0\" (UID: \"d04558a6-e35c-4376-9893-1cf4865a711b\") " pod="openstack/aodh-0" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.799753 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d04558a6-e35c-4376-9893-1cf4865a711b-scripts\") pod \"aodh-0\" (UID: \"d04558a6-e35c-4376-9893-1cf4865a711b\") " pod="openstack/aodh-0" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.799857 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d04558a6-e35c-4376-9893-1cf4865a711b-internal-tls-certs\") pod \"aodh-0\" (UID: 
\"d04558a6-e35c-4376-9893-1cf4865a711b\") " pod="openstack/aodh-0" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.799930 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d04558a6-e35c-4376-9893-1cf4865a711b-config-data\") pod \"aodh-0\" (UID: \"d04558a6-e35c-4376-9893-1cf4865a711b\") " pod="openstack/aodh-0" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.799974 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnrpp\" (UniqueName: \"kubernetes.io/projected/d04558a6-e35c-4376-9893-1cf4865a711b-kube-api-access-hnrpp\") pod \"aodh-0\" (UID: \"d04558a6-e35c-4376-9893-1cf4865a711b\") " pod="openstack/aodh-0" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.800009 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d04558a6-e35c-4376-9893-1cf4865a711b-combined-ca-bundle\") pod \"aodh-0\" (UID: \"d04558a6-e35c-4376-9893-1cf4865a711b\") " pod="openstack/aodh-0" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.800074 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d04558a6-e35c-4376-9893-1cf4865a711b-public-tls-certs\") pod \"aodh-0\" (UID: \"d04558a6-e35c-4376-9893-1cf4865a711b\") " pod="openstack/aodh-0" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.806399 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d04558a6-e35c-4376-9893-1cf4865a711b-internal-tls-certs\") pod \"aodh-0\" (UID: \"d04558a6-e35c-4376-9893-1cf4865a711b\") " pod="openstack/aodh-0" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.806412 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d04558a6-e35c-4376-9893-1cf4865a711b-public-tls-certs\") pod \"aodh-0\" (UID: \"d04558a6-e35c-4376-9893-1cf4865a711b\") " pod="openstack/aodh-0" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.809602 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d04558a6-e35c-4376-9893-1cf4865a711b-config-data\") pod \"aodh-0\" (UID: \"d04558a6-e35c-4376-9893-1cf4865a711b\") " pod="openstack/aodh-0" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.812309 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d04558a6-e35c-4376-9893-1cf4865a711b-scripts\") pod \"aodh-0\" (UID: \"d04558a6-e35c-4376-9893-1cf4865a711b\") " pod="openstack/aodh-0" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.813596 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d04558a6-e35c-4376-9893-1cf4865a711b-combined-ca-bundle\") pod \"aodh-0\" (UID: \"d04558a6-e35c-4376-9893-1cf4865a711b\") " pod="openstack/aodh-0" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.819399 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnrpp\" (UniqueName: \"kubernetes.io/projected/d04558a6-e35c-4376-9893-1cf4865a711b-kube-api-access-hnrpp\") pod \"aodh-0\" (UID: \"d04558a6-e35c-4376-9893-1cf4865a711b\") " pod="openstack/aodh-0" Mar 19 09:53:50 crc kubenswrapper[4835]: I0319 09:53:50.844242 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 19 09:53:51 crc kubenswrapper[4835]: I0319 09:53:51.325077 4835 scope.go:117] "RemoveContainer" containerID="56bc0f4cbc81b8de82bdb7e1483474926f06bf1097472c7e50752965da149097" Mar 19 09:53:51 crc kubenswrapper[4835]: I0319 09:53:51.338824 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 19 09:53:51 crc kubenswrapper[4835]: I0319 09:53:51.381000 4835 scope.go:117] "RemoveContainer" containerID="9982c40df49c0c62443658109b845382155cff7eb41e05d10c473e87e3de734a" Mar 19 09:53:51 crc kubenswrapper[4835]: I0319 09:53:51.419914 4835 scope.go:117] "RemoveContainer" containerID="091c3a8c428655cd1829849fef315dfcc00ba127038164eb8bce08bf31962482" Mar 19 09:53:51 crc kubenswrapper[4835]: I0319 09:53:51.420178 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8" event={"ID":"63daeb0b-2ace-4747-b331-44ed485faec8","Type":"ContainerStarted","Data":"d722db717ab1ce16e754e423653db8a57b5d52eea72a46a4e35c5ad9fb4f63cd"} Mar 19 09:53:51 crc kubenswrapper[4835]: I0319 09:53:51.420224 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8" event={"ID":"63daeb0b-2ace-4747-b331-44ed485faec8","Type":"ContainerStarted","Data":"02152f61c5771cc0459b8c13514521b1ae156f810f08a5b42ba4248c77f17cad"} Mar 19 09:53:51 crc kubenswrapper[4835]: I0319 09:53:51.434865 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d04558a6-e35c-4376-9893-1cf4865a711b","Type":"ContainerStarted","Data":"936cc961f195c51f7d147aa9a42e516b915d98b8dfdf0ed9571445a13554cb74"} Mar 19 09:53:51 crc kubenswrapper[4835]: I0319 09:53:51.446721 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8" podStartSLOduration=1.8827823399999999 podStartE2EDuration="2.446701442s" 
podCreationTimestamp="2026-03-19 09:53:49 +0000 UTC" firstStartedPulling="2026-03-19 09:53:50.393085946 +0000 UTC m=+1885.241684533" lastFinishedPulling="2026-03-19 09:53:50.957005048 +0000 UTC m=+1885.805603635" observedRunningTime="2026-03-19 09:53:51.446131177 +0000 UTC m=+1886.294729774" watchObservedRunningTime="2026-03-19 09:53:51.446701442 +0000 UTC m=+1886.295300029" Mar 19 09:53:52 crc kubenswrapper[4835]: I0319 09:53:52.418230 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3a0e12a-0212-4988-8c7a-7d466864887b" path="/var/lib/kubelet/pods/e3a0e12a-0212-4988-8c7a-7d466864887b/volumes" Mar 19 09:53:52 crc kubenswrapper[4835]: I0319 09:53:52.495586 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d04558a6-e35c-4376-9893-1cf4865a711b","Type":"ContainerStarted","Data":"91d4a1376123baa5d3b23801bff1e34a8375df990d78810f502ca30ebd5ad90c"} Mar 19 09:53:53 crc kubenswrapper[4835]: I0319 09:53:53.402095 4835 scope.go:117] "RemoveContainer" containerID="d93f2f0fef5a3fe52d6e4aab02e5290ac85405643bc520caaef82b7b23fd8ee3" Mar 19 09:53:53 crc kubenswrapper[4835]: E0319 09:53:53.402968 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 09:53:53 crc kubenswrapper[4835]: I0319 09:53:53.517175 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d04558a6-e35c-4376-9893-1cf4865a711b","Type":"ContainerStarted","Data":"e3fa6c2c54a3ff5bd8fab3d41b67373fc0fd4811a57bd36fcc2ad710f24d51f0"} Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.445376 4835 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.542928 4835 generic.go:334] "Generic (PLEG): container finished" podID="2209a56f-9c2a-45bd-b045-176197bf3bd1" containerID="6ff9762632f4fd510b6ef15acf4ab11685aa5d46c48d4951d1c8c4ee529d551a" exitCode=0 Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.543052 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"2209a56f-9c2a-45bd-b045-176197bf3bd1","Type":"ContainerDied","Data":"6ff9762632f4fd510b6ef15acf4ab11685aa5d46c48d4951d1c8c4ee529d551a"} Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.543086 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"2209a56f-9c2a-45bd-b045-176197bf3bd1","Type":"ContainerDied","Data":"56b21f777d30f6496a8ec04108934b2903266ccfaa394a6c66856ef7d70ef37a"} Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.543167 4835 scope.go:117] "RemoveContainer" containerID="6ff9762632f4fd510b6ef15acf4ab11685aa5d46c48d4951d1c8c4ee529d551a" Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.543509 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.554057 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d04558a6-e35c-4376-9893-1cf4865a711b","Type":"ContainerStarted","Data":"eff0a74b5299b41327f88fc6ddfe37607eef3668cbfac8265e8dd168f4c4af2c"} Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.593510 4835 scope.go:117] "RemoveContainer" containerID="94261c92f99d04084fcc1707edb079459e5eb7fe3076788c702f51ce2d1838e3" Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.611466 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2209a56f-9c2a-45bd-b045-176197bf3bd1-server-conf\") pod \"2209a56f-9c2a-45bd-b045-176197bf3bd1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.611833 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2209a56f-9c2a-45bd-b045-176197bf3bd1-rabbitmq-erlang-cookie\") pod \"2209a56f-9c2a-45bd-b045-176197bf3bd1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.611881 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2209a56f-9c2a-45bd-b045-176197bf3bd1-rabbitmq-tls\") pod \"2209a56f-9c2a-45bd-b045-176197bf3bd1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.611934 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dg9x\" (UniqueName: \"kubernetes.io/projected/2209a56f-9c2a-45bd-b045-176197bf3bd1-kube-api-access-5dg9x\") pod \"2209a56f-9c2a-45bd-b045-176197bf3bd1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " Mar 19 09:53:54 crc 
kubenswrapper[4835]: I0319 09:53:54.612027 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2209a56f-9c2a-45bd-b045-176197bf3bd1-plugins-conf\") pod \"2209a56f-9c2a-45bd-b045-176197bf3bd1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.612366 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2209a56f-9c2a-45bd-b045-176197bf3bd1-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "2209a56f-9c2a-45bd-b045-176197bf3bd1" (UID: "2209a56f-9c2a-45bd-b045-176197bf3bd1"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.612660 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3e634ae5-df02-44ea-a231-6b508dce1a89\") pod \"2209a56f-9c2a-45bd-b045-176197bf3bd1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.612798 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2209a56f-9c2a-45bd-b045-176197bf3bd1-config-data\") pod \"2209a56f-9c2a-45bd-b045-176197bf3bd1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.612863 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2209a56f-9c2a-45bd-b045-176197bf3bd1-erlang-cookie-secret\") pod \"2209a56f-9c2a-45bd-b045-176197bf3bd1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.612934 4835 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2209a56f-9c2a-45bd-b045-176197bf3bd1-pod-info\") pod \"2209a56f-9c2a-45bd-b045-176197bf3bd1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.613004 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2209a56f-9c2a-45bd-b045-176197bf3bd1-rabbitmq-confd\") pod \"2209a56f-9c2a-45bd-b045-176197bf3bd1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.613138 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2209a56f-9c2a-45bd-b045-176197bf3bd1-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "2209a56f-9c2a-45bd-b045-176197bf3bd1" (UID: "2209a56f-9c2a-45bd-b045-176197bf3bd1"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.613196 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2209a56f-9c2a-45bd-b045-176197bf3bd1-rabbitmq-plugins\") pod \"2209a56f-9c2a-45bd-b045-176197bf3bd1\" (UID: \"2209a56f-9c2a-45bd-b045-176197bf3bd1\") " Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.615419 4835 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2209a56f-9c2a-45bd-b045-176197bf3bd1-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.615540 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2209a56f-9c2a-45bd-b045-176197bf3bd1-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "2209a56f-9c2a-45bd-b045-176197bf3bd1" (UID: 
"2209a56f-9c2a-45bd-b045-176197bf3bd1"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.615584 4835 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2209a56f-9c2a-45bd-b045-176197bf3bd1-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.616973 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2209a56f-9c2a-45bd-b045-176197bf3bd1-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "2209a56f-9c2a-45bd-b045-176197bf3bd1" (UID: "2209a56f-9c2a-45bd-b045-176197bf3bd1"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.618699 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/2209a56f-9c2a-45bd-b045-176197bf3bd1-pod-info" (OuterVolumeSpecName: "pod-info") pod "2209a56f-9c2a-45bd-b045-176197bf3bd1" (UID: "2209a56f-9c2a-45bd-b045-176197bf3bd1"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.618764 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2209a56f-9c2a-45bd-b045-176197bf3bd1-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "2209a56f-9c2a-45bd-b045-176197bf3bd1" (UID: "2209a56f-9c2a-45bd-b045-176197bf3bd1"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.620331 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2209a56f-9c2a-45bd-b045-176197bf3bd1-kube-api-access-5dg9x" (OuterVolumeSpecName: "kube-api-access-5dg9x") pod "2209a56f-9c2a-45bd-b045-176197bf3bd1" (UID: "2209a56f-9c2a-45bd-b045-176197bf3bd1"). InnerVolumeSpecName "kube-api-access-5dg9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.662869 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2209a56f-9c2a-45bd-b045-176197bf3bd1-config-data" (OuterVolumeSpecName: "config-data") pod "2209a56f-9c2a-45bd-b045-176197bf3bd1" (UID: "2209a56f-9c2a-45bd-b045-176197bf3bd1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.673380 4835 scope.go:117] "RemoveContainer" containerID="6ff9762632f4fd510b6ef15acf4ab11685aa5d46c48d4951d1c8c4ee529d551a" Mar 19 09:53:54 crc kubenswrapper[4835]: E0319 09:53:54.675300 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ff9762632f4fd510b6ef15acf4ab11685aa5d46c48d4951d1c8c4ee529d551a\": container with ID starting with 6ff9762632f4fd510b6ef15acf4ab11685aa5d46c48d4951d1c8c4ee529d551a not found: ID does not exist" containerID="6ff9762632f4fd510b6ef15acf4ab11685aa5d46c48d4951d1c8c4ee529d551a" Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.675962 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ff9762632f4fd510b6ef15acf4ab11685aa5d46c48d4951d1c8c4ee529d551a"} err="failed to get container status \"6ff9762632f4fd510b6ef15acf4ab11685aa5d46c48d4951d1c8c4ee529d551a\": rpc error: code = NotFound desc = could not find container 
\"6ff9762632f4fd510b6ef15acf4ab11685aa5d46c48d4951d1c8c4ee529d551a\": container with ID starting with 6ff9762632f4fd510b6ef15acf4ab11685aa5d46c48d4951d1c8c4ee529d551a not found: ID does not exist" Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.675999 4835 scope.go:117] "RemoveContainer" containerID="94261c92f99d04084fcc1707edb079459e5eb7fe3076788c702f51ce2d1838e3" Mar 19 09:53:54 crc kubenswrapper[4835]: E0319 09:53:54.676236 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94261c92f99d04084fcc1707edb079459e5eb7fe3076788c702f51ce2d1838e3\": container with ID starting with 94261c92f99d04084fcc1707edb079459e5eb7fe3076788c702f51ce2d1838e3 not found: ID does not exist" containerID="94261c92f99d04084fcc1707edb079459e5eb7fe3076788c702f51ce2d1838e3" Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.676259 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94261c92f99d04084fcc1707edb079459e5eb7fe3076788c702f51ce2d1838e3"} err="failed to get container status \"94261c92f99d04084fcc1707edb079459e5eb7fe3076788c702f51ce2d1838e3\": rpc error: code = NotFound desc = could not find container \"94261c92f99d04084fcc1707edb079459e5eb7fe3076788c702f51ce2d1838e3\": container with ID starting with 94261c92f99d04084fcc1707edb079459e5eb7fe3076788c702f51ce2d1838e3 not found: ID does not exist" Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.695042 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3e634ae5-df02-44ea-a231-6b508dce1a89" (OuterVolumeSpecName: "persistence") pod "2209a56f-9c2a-45bd-b045-176197bf3bd1" (UID: "2209a56f-9c2a-45bd-b045-176197bf3bd1"). InnerVolumeSpecName "pvc-3e634ae5-df02-44ea-a231-6b508dce1a89". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.718356 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2209a56f-9c2a-45bd-b045-176197bf3bd1-server-conf" (OuterVolumeSpecName: "server-conf") pod "2209a56f-9c2a-45bd-b045-176197bf3bd1" (UID: "2209a56f-9c2a-45bd-b045-176197bf3bd1"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.719873 4835 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2209a56f-9c2a-45bd-b045-176197bf3bd1-pod-info\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.719906 4835 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2209a56f-9c2a-45bd-b045-176197bf3bd1-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.719917 4835 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2209a56f-9c2a-45bd-b045-176197bf3bd1-server-conf\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.719929 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dg9x\" (UniqueName: \"kubernetes.io/projected/2209a56f-9c2a-45bd-b045-176197bf3bd1-kube-api-access-5dg9x\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.719940 4835 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2209a56f-9c2a-45bd-b045-176197bf3bd1-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.719971 4835 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume 
\"pvc-3e634ae5-df02-44ea-a231-6b508dce1a89\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3e634ae5-df02-44ea-a231-6b508dce1a89\") on node \"crc\" " Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.719985 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2209a56f-9c2a-45bd-b045-176197bf3bd1-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.719997 4835 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2209a56f-9c2a-45bd-b045-176197bf3bd1-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.771256 4835 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.771541 4835 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-3e634ae5-df02-44ea-a231-6b508dce1a89" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3e634ae5-df02-44ea-a231-6b508dce1a89") on node "crc" Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.789853 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2209a56f-9c2a-45bd-b045-176197bf3bd1-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "2209a56f-9c2a-45bd-b045-176197bf3bd1" (UID: "2209a56f-9c2a-45bd-b045-176197bf3bd1"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.824188 4835 reconciler_common.go:293] "Volume detached for volume \"pvc-3e634ae5-df02-44ea-a231-6b508dce1a89\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3e634ae5-df02-44ea-a231-6b508dce1a89\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.824536 4835 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2209a56f-9c2a-45bd-b045-176197bf3bd1-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.905813 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.924803 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.952469 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-1"] Mar 19 09:53:54 crc kubenswrapper[4835]: E0319 09:53:54.953132 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2209a56f-9c2a-45bd-b045-176197bf3bd1" containerName="setup-container" Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.953156 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2209a56f-9c2a-45bd-b045-176197bf3bd1" containerName="setup-container" Mar 19 09:53:54 crc kubenswrapper[4835]: E0319 09:53:54.953190 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2209a56f-9c2a-45bd-b045-176197bf3bd1" containerName="rabbitmq" Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.953205 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2209a56f-9c2a-45bd-b045-176197bf3bd1" containerName="rabbitmq" Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.953416 4835 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2209a56f-9c2a-45bd-b045-176197bf3bd1" containerName="rabbitmq" Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.954774 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 19 09:53:54 crc kubenswrapper[4835]: I0319 09:53:54.990505 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 19 09:53:55 crc kubenswrapper[4835]: I0319 09:53:55.133590 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3a7e887a-3e51-467e-af3b-f26e6126c9e2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"3a7e887a-3e51-467e-af3b-f26e6126c9e2\") " pod="openstack/rabbitmq-server-1" Mar 19 09:53:55 crc kubenswrapper[4835]: I0319 09:53:55.133797 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3a7e887a-3e51-467e-af3b-f26e6126c9e2-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"3a7e887a-3e51-467e-af3b-f26e6126c9e2\") " pod="openstack/rabbitmq-server-1" Mar 19 09:53:55 crc kubenswrapper[4835]: I0319 09:53:55.133945 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3a7e887a-3e51-467e-af3b-f26e6126c9e2-pod-info\") pod \"rabbitmq-server-1\" (UID: \"3a7e887a-3e51-467e-af3b-f26e6126c9e2\") " pod="openstack/rabbitmq-server-1" Mar 19 09:53:55 crc kubenswrapper[4835]: I0319 09:53:55.134179 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3a7e887a-3e51-467e-af3b-f26e6126c9e2-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"3a7e887a-3e51-467e-af3b-f26e6126c9e2\") " pod="openstack/rabbitmq-server-1" Mar 19 09:53:55 crc kubenswrapper[4835]: I0319 
09:53:55.134264 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3a7e887a-3e51-467e-af3b-f26e6126c9e2-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"3a7e887a-3e51-467e-af3b-f26e6126c9e2\") " pod="openstack/rabbitmq-server-1" Mar 19 09:53:55 crc kubenswrapper[4835]: I0319 09:53:55.134342 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9hlr\" (UniqueName: \"kubernetes.io/projected/3a7e887a-3e51-467e-af3b-f26e6126c9e2-kube-api-access-x9hlr\") pod \"rabbitmq-server-1\" (UID: \"3a7e887a-3e51-467e-af3b-f26e6126c9e2\") " pod="openstack/rabbitmq-server-1" Mar 19 09:53:55 crc kubenswrapper[4835]: I0319 09:53:55.134399 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3a7e887a-3e51-467e-af3b-f26e6126c9e2-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"3a7e887a-3e51-467e-af3b-f26e6126c9e2\") " pod="openstack/rabbitmq-server-1" Mar 19 09:53:55 crc kubenswrapper[4835]: I0319 09:53:55.134486 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3e634ae5-df02-44ea-a231-6b508dce1a89\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3e634ae5-df02-44ea-a231-6b508dce1a89\") pod \"rabbitmq-server-1\" (UID: \"3a7e887a-3e51-467e-af3b-f26e6126c9e2\") " pod="openstack/rabbitmq-server-1" Mar 19 09:53:55 crc kubenswrapper[4835]: I0319 09:53:55.134553 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a7e887a-3e51-467e-af3b-f26e6126c9e2-config-data\") pod \"rabbitmq-server-1\" (UID: \"3a7e887a-3e51-467e-af3b-f26e6126c9e2\") " pod="openstack/rabbitmq-server-1" Mar 19 09:53:55 crc kubenswrapper[4835]: I0319 
09:53:55.134593 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3a7e887a-3e51-467e-af3b-f26e6126c9e2-server-conf\") pod \"rabbitmq-server-1\" (UID: \"3a7e887a-3e51-467e-af3b-f26e6126c9e2\") " pod="openstack/rabbitmq-server-1" Mar 19 09:53:55 crc kubenswrapper[4835]: I0319 09:53:55.134641 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3a7e887a-3e51-467e-af3b-f26e6126c9e2-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"3a7e887a-3e51-467e-af3b-f26e6126c9e2\") " pod="openstack/rabbitmq-server-1" Mar 19 09:53:55 crc kubenswrapper[4835]: I0319 09:53:55.236788 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3a7e887a-3e51-467e-af3b-f26e6126c9e2-pod-info\") pod \"rabbitmq-server-1\" (UID: \"3a7e887a-3e51-467e-af3b-f26e6126c9e2\") " pod="openstack/rabbitmq-server-1" Mar 19 09:53:55 crc kubenswrapper[4835]: I0319 09:53:55.236841 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3a7e887a-3e51-467e-af3b-f26e6126c9e2-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"3a7e887a-3e51-467e-af3b-f26e6126c9e2\") " pod="openstack/rabbitmq-server-1" Mar 19 09:53:55 crc kubenswrapper[4835]: I0319 09:53:55.236900 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3a7e887a-3e51-467e-af3b-f26e6126c9e2-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"3a7e887a-3e51-467e-af3b-f26e6126c9e2\") " pod="openstack/rabbitmq-server-1" Mar 19 09:53:55 crc kubenswrapper[4835]: I0319 09:53:55.236955 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9hlr\" 
(UniqueName: \"kubernetes.io/projected/3a7e887a-3e51-467e-af3b-f26e6126c9e2-kube-api-access-x9hlr\") pod \"rabbitmq-server-1\" (UID: \"3a7e887a-3e51-467e-af3b-f26e6126c9e2\") " pod="openstack/rabbitmq-server-1" Mar 19 09:53:55 crc kubenswrapper[4835]: I0319 09:53:55.236988 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3a7e887a-3e51-467e-af3b-f26e6126c9e2-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"3a7e887a-3e51-467e-af3b-f26e6126c9e2\") " pod="openstack/rabbitmq-server-1" Mar 19 09:53:55 crc kubenswrapper[4835]: I0319 09:53:55.237048 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3e634ae5-df02-44ea-a231-6b508dce1a89\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3e634ae5-df02-44ea-a231-6b508dce1a89\") pod \"rabbitmq-server-1\" (UID: \"3a7e887a-3e51-467e-af3b-f26e6126c9e2\") " pod="openstack/rabbitmq-server-1" Mar 19 09:53:55 crc kubenswrapper[4835]: I0319 09:53:55.237069 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a7e887a-3e51-467e-af3b-f26e6126c9e2-config-data\") pod \"rabbitmq-server-1\" (UID: \"3a7e887a-3e51-467e-af3b-f26e6126c9e2\") " pod="openstack/rabbitmq-server-1" Mar 19 09:53:55 crc kubenswrapper[4835]: I0319 09:53:55.237093 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3a7e887a-3e51-467e-af3b-f26e6126c9e2-server-conf\") pod \"rabbitmq-server-1\" (UID: \"3a7e887a-3e51-467e-af3b-f26e6126c9e2\") " pod="openstack/rabbitmq-server-1" Mar 19 09:53:55 crc kubenswrapper[4835]: I0319 09:53:55.237128 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3a7e887a-3e51-467e-af3b-f26e6126c9e2-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: 
\"3a7e887a-3e51-467e-af3b-f26e6126c9e2\") " pod="openstack/rabbitmq-server-1" Mar 19 09:53:55 crc kubenswrapper[4835]: I0319 09:53:55.237180 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3a7e887a-3e51-467e-af3b-f26e6126c9e2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"3a7e887a-3e51-467e-af3b-f26e6126c9e2\") " pod="openstack/rabbitmq-server-1" Mar 19 09:53:55 crc kubenswrapper[4835]: I0319 09:53:55.237293 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3a7e887a-3e51-467e-af3b-f26e6126c9e2-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"3a7e887a-3e51-467e-af3b-f26e6126c9e2\") " pod="openstack/rabbitmq-server-1" Mar 19 09:53:55 crc kubenswrapper[4835]: I0319 09:53:55.239053 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3a7e887a-3e51-467e-af3b-f26e6126c9e2-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"3a7e887a-3e51-467e-af3b-f26e6126c9e2\") " pod="openstack/rabbitmq-server-1" Mar 19 09:53:55 crc kubenswrapper[4835]: I0319 09:53:55.240611 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3a7e887a-3e51-467e-af3b-f26e6126c9e2-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"3a7e887a-3e51-467e-af3b-f26e6126c9e2\") " pod="openstack/rabbitmq-server-1" Mar 19 09:53:55 crc kubenswrapper[4835]: I0319 09:53:55.240990 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3a7e887a-3e51-467e-af3b-f26e6126c9e2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"3a7e887a-3e51-467e-af3b-f26e6126c9e2\") " pod="openstack/rabbitmq-server-1" Mar 19 09:53:55 crc kubenswrapper[4835]: I0319 09:53:55.241502 4835 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a7e887a-3e51-467e-af3b-f26e6126c9e2-config-data\") pod \"rabbitmq-server-1\" (UID: \"3a7e887a-3e51-467e-af3b-f26e6126c9e2\") " pod="openstack/rabbitmq-server-1" Mar 19 09:53:55 crc kubenswrapper[4835]: I0319 09:53:55.243916 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3a7e887a-3e51-467e-af3b-f26e6126c9e2-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"3a7e887a-3e51-467e-af3b-f26e6126c9e2\") " pod="openstack/rabbitmq-server-1" Mar 19 09:53:55 crc kubenswrapper[4835]: I0319 09:53:55.245254 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3a7e887a-3e51-467e-af3b-f26e6126c9e2-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"3a7e887a-3e51-467e-af3b-f26e6126c9e2\") " pod="openstack/rabbitmq-server-1" Mar 19 09:53:55 crc kubenswrapper[4835]: I0319 09:53:55.247600 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3a7e887a-3e51-467e-af3b-f26e6126c9e2-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"3a7e887a-3e51-467e-af3b-f26e6126c9e2\") " pod="openstack/rabbitmq-server-1" Mar 19 09:53:55 crc kubenswrapper[4835]: I0319 09:53:55.251638 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3a7e887a-3e51-467e-af3b-f26e6126c9e2-server-conf\") pod \"rabbitmq-server-1\" (UID: \"3a7e887a-3e51-467e-af3b-f26e6126c9e2\") " pod="openstack/rabbitmq-server-1" Mar 19 09:53:55 crc kubenswrapper[4835]: I0319 09:53:55.255653 4835 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 09:53:55 crc kubenswrapper[4835]: I0319 09:53:55.255691 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3e634ae5-df02-44ea-a231-6b508dce1a89\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3e634ae5-df02-44ea-a231-6b508dce1a89\") pod \"rabbitmq-server-1\" (UID: \"3a7e887a-3e51-467e-af3b-f26e6126c9e2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/360c44f69d944d51c872b2f6f2ec804148f8978798dde3b91a4a5d8d6e32ad50/globalmount\"" pod="openstack/rabbitmq-server-1" Mar 19 09:53:55 crc kubenswrapper[4835]: I0319 09:53:55.255988 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3a7e887a-3e51-467e-af3b-f26e6126c9e2-pod-info\") pod \"rabbitmq-server-1\" (UID: \"3a7e887a-3e51-467e-af3b-f26e6126c9e2\") " pod="openstack/rabbitmq-server-1" Mar 19 09:53:55 crc kubenswrapper[4835]: I0319 09:53:55.259100 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9hlr\" (UniqueName: \"kubernetes.io/projected/3a7e887a-3e51-467e-af3b-f26e6126c9e2-kube-api-access-x9hlr\") pod \"rabbitmq-server-1\" (UID: \"3a7e887a-3e51-467e-af3b-f26e6126c9e2\") " pod="openstack/rabbitmq-server-1" Mar 19 09:53:55 crc kubenswrapper[4835]: I0319 09:53:55.332233 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3e634ae5-df02-44ea-a231-6b508dce1a89\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3e634ae5-df02-44ea-a231-6b508dce1a89\") pod \"rabbitmq-server-1\" (UID: \"3a7e887a-3e51-467e-af3b-f26e6126c9e2\") " pod="openstack/rabbitmq-server-1" Mar 19 09:53:55 crc kubenswrapper[4835]: I0319 09:53:55.588300 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 19 09:53:56 crc kubenswrapper[4835]: I0319 09:53:56.229439 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 19 09:53:56 crc kubenswrapper[4835]: I0319 09:53:56.428134 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2209a56f-9c2a-45bd-b045-176197bf3bd1" path="/var/lib/kubelet/pods/2209a56f-9c2a-45bd-b045-176197bf3bd1/volumes" Mar 19 09:53:56 crc kubenswrapper[4835]: I0319 09:53:56.580520 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"d04558a6-e35c-4376-9893-1cf4865a711b","Type":"ContainerStarted","Data":"8fefecf60fb7f90d558db294a1087759d39c80cf73facaf2e02b876b2eddb8f3"} Mar 19 09:53:56 crc kubenswrapper[4835]: I0319 09:53:56.588042 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"3a7e887a-3e51-467e-af3b-f26e6126c9e2","Type":"ContainerStarted","Data":"4d4b5cbb48936572f4e52fcab78f45b383a493259d268cdc26b80a71a0cd0bef"} Mar 19 09:53:56 crc kubenswrapper[4835]: I0319 09:53:56.624575 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.343953336 podStartE2EDuration="6.624551271s" podCreationTimestamp="2026-03-19 09:53:50 +0000 UTC" firstStartedPulling="2026-03-19 09:53:51.381007187 +0000 UTC m=+1886.229605774" lastFinishedPulling="2026-03-19 09:53:55.661605122 +0000 UTC m=+1890.510203709" observedRunningTime="2026-03-19 09:53:56.606472835 +0000 UTC m=+1891.455071422" watchObservedRunningTime="2026-03-19 09:53:56.624551271 +0000 UTC m=+1891.473149868" Mar 19 09:53:58 crc kubenswrapper[4835]: I0319 09:53:58.616454 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"3a7e887a-3e51-467e-af3b-f26e6126c9e2","Type":"ContainerStarted","Data":"d2e80da509510c27e24060c6aa3e8ddbea7d92aa4b9ebe29af565ef494603f42"} Mar 19 09:54:00 crc 
kubenswrapper[4835]: I0319 09:54:00.133435 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565234-jlc9r"] Mar 19 09:54:00 crc kubenswrapper[4835]: I0319 09:54:00.135537 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565234-jlc9r" Mar 19 09:54:00 crc kubenswrapper[4835]: I0319 09:54:00.138941 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 09:54:00 crc kubenswrapper[4835]: I0319 09:54:00.139181 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 09:54:00 crc kubenswrapper[4835]: I0319 09:54:00.147594 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 09:54:00 crc kubenswrapper[4835]: I0319 09:54:00.148732 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565234-jlc9r"] Mar 19 09:54:00 crc kubenswrapper[4835]: I0319 09:54:00.264349 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl6xp\" (UniqueName: \"kubernetes.io/projected/7a19ce6a-83f1-4add-be81-3d5328c2c8fd-kube-api-access-gl6xp\") pod \"auto-csr-approver-29565234-jlc9r\" (UID: \"7a19ce6a-83f1-4add-be81-3d5328c2c8fd\") " pod="openshift-infra/auto-csr-approver-29565234-jlc9r" Mar 19 09:54:00 crc kubenswrapper[4835]: I0319 09:54:00.367167 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl6xp\" (UniqueName: \"kubernetes.io/projected/7a19ce6a-83f1-4add-be81-3d5328c2c8fd-kube-api-access-gl6xp\") pod \"auto-csr-approver-29565234-jlc9r\" (UID: \"7a19ce6a-83f1-4add-be81-3d5328c2c8fd\") " pod="openshift-infra/auto-csr-approver-29565234-jlc9r" Mar 19 09:54:00 crc kubenswrapper[4835]: I0319 09:54:00.393019 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gl6xp\" (UniqueName: \"kubernetes.io/projected/7a19ce6a-83f1-4add-be81-3d5328c2c8fd-kube-api-access-gl6xp\") pod \"auto-csr-approver-29565234-jlc9r\" (UID: \"7a19ce6a-83f1-4add-be81-3d5328c2c8fd\") " pod="openshift-infra/auto-csr-approver-29565234-jlc9r" Mar 19 09:54:00 crc kubenswrapper[4835]: I0319 09:54:00.461907 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565234-jlc9r" Mar 19 09:54:00 crc kubenswrapper[4835]: I0319 09:54:00.966908 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565234-jlc9r"] Mar 19 09:54:00 crc kubenswrapper[4835]: W0319 09:54:00.968663 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a19ce6a_83f1_4add_be81_3d5328c2c8fd.slice/crio-e3e8c3672df55d3f4baa9f6c669e4e65797a85e0b820abd9c177f21b136f7783 WatchSource:0}: Error finding container e3e8c3672df55d3f4baa9f6c669e4e65797a85e0b820abd9c177f21b136f7783: Status 404 returned error can't find the container with id e3e8c3672df55d3f4baa9f6c669e4e65797a85e0b820abd9c177f21b136f7783 Mar 19 09:54:01 crc kubenswrapper[4835]: I0319 09:54:01.666540 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565234-jlc9r" event={"ID":"7a19ce6a-83f1-4add-be81-3d5328c2c8fd","Type":"ContainerStarted","Data":"e3e8c3672df55d3f4baa9f6c669e4e65797a85e0b820abd9c177f21b136f7783"} Mar 19 09:54:02 crc kubenswrapper[4835]: I0319 09:54:02.686751 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565234-jlc9r" event={"ID":"7a19ce6a-83f1-4add-be81-3d5328c2c8fd","Type":"ContainerStarted","Data":"366383eb1a06aca62f7cd3ea3b67cd8644a5c6f7c74898e63827b9037e1e119a"} Mar 19 09:54:02 crc kubenswrapper[4835]: I0319 09:54:02.715144 4835 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-infra/auto-csr-approver-29565234-jlc9r" podStartSLOduration=1.818838121 podStartE2EDuration="2.715123648s" podCreationTimestamp="2026-03-19 09:54:00 +0000 UTC" firstStartedPulling="2026-03-19 09:54:00.971339275 +0000 UTC m=+1895.819937862" lastFinishedPulling="2026-03-19 09:54:01.867624802 +0000 UTC m=+1896.716223389" observedRunningTime="2026-03-19 09:54:02.700246537 +0000 UTC m=+1897.548845124" watchObservedRunningTime="2026-03-19 09:54:02.715123648 +0000 UTC m=+1897.563722225" Mar 19 09:54:04 crc kubenswrapper[4835]: I0319 09:54:04.714685 4835 generic.go:334] "Generic (PLEG): container finished" podID="7a19ce6a-83f1-4add-be81-3d5328c2c8fd" containerID="366383eb1a06aca62f7cd3ea3b67cd8644a5c6f7c74898e63827b9037e1e119a" exitCode=0 Mar 19 09:54:04 crc kubenswrapper[4835]: I0319 09:54:04.714788 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565234-jlc9r" event={"ID":"7a19ce6a-83f1-4add-be81-3d5328c2c8fd","Type":"ContainerDied","Data":"366383eb1a06aca62f7cd3ea3b67cd8644a5c6f7c74898e63827b9037e1e119a"} Mar 19 09:54:06 crc kubenswrapper[4835]: I0319 09:54:06.183722 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565234-jlc9r" Mar 19 09:54:06 crc kubenswrapper[4835]: I0319 09:54:06.294784 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gl6xp\" (UniqueName: \"kubernetes.io/projected/7a19ce6a-83f1-4add-be81-3d5328c2c8fd-kube-api-access-gl6xp\") pod \"7a19ce6a-83f1-4add-be81-3d5328c2c8fd\" (UID: \"7a19ce6a-83f1-4add-be81-3d5328c2c8fd\") " Mar 19 09:54:06 crc kubenswrapper[4835]: I0319 09:54:06.301063 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a19ce6a-83f1-4add-be81-3d5328c2c8fd-kube-api-access-gl6xp" (OuterVolumeSpecName: "kube-api-access-gl6xp") pod "7a19ce6a-83f1-4add-be81-3d5328c2c8fd" (UID: "7a19ce6a-83f1-4add-be81-3d5328c2c8fd"). InnerVolumeSpecName "kube-api-access-gl6xp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:54:06 crc kubenswrapper[4835]: I0319 09:54:06.399481 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gl6xp\" (UniqueName: \"kubernetes.io/projected/7a19ce6a-83f1-4add-be81-3d5328c2c8fd-kube-api-access-gl6xp\") on node \"crc\" DevicePath \"\"" Mar 19 09:54:06 crc kubenswrapper[4835]: I0319 09:54:06.742110 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565234-jlc9r" event={"ID":"7a19ce6a-83f1-4add-be81-3d5328c2c8fd","Type":"ContainerDied","Data":"e3e8c3672df55d3f4baa9f6c669e4e65797a85e0b820abd9c177f21b136f7783"} Mar 19 09:54:06 crc kubenswrapper[4835]: I0319 09:54:06.742155 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3e8c3672df55d3f4baa9f6c669e4e65797a85e0b820abd9c177f21b136f7783" Mar 19 09:54:06 crc kubenswrapper[4835]: I0319 09:54:06.742246 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565234-jlc9r" Mar 19 09:54:06 crc kubenswrapper[4835]: I0319 09:54:06.804937 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565228-zvnlw"] Mar 19 09:54:06 crc kubenswrapper[4835]: I0319 09:54:06.821608 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565228-zvnlw"] Mar 19 09:54:07 crc kubenswrapper[4835]: I0319 09:54:07.402462 4835 scope.go:117] "RemoveContainer" containerID="d93f2f0fef5a3fe52d6e4aab02e5290ac85405643bc520caaef82b7b23fd8ee3" Mar 19 09:54:08 crc kubenswrapper[4835]: I0319 09:54:08.567674 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5787acc-561d-4cca-b822-bc7ae5cde5ea" path="/var/lib/kubelet/pods/c5787acc-561d-4cca-b822-bc7ae5cde5ea/volumes" Mar 19 09:54:08 crc kubenswrapper[4835]: I0319 09:54:08.803271 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerStarted","Data":"15f1a1995047c13b72b04c289a369693dabb319e01617eb74f1315b023c381ad"} Mar 19 09:54:31 crc kubenswrapper[4835]: I0319 09:54:31.040085 4835 generic.go:334] "Generic (PLEG): container finished" podID="3a7e887a-3e51-467e-af3b-f26e6126c9e2" containerID="d2e80da509510c27e24060c6aa3e8ddbea7d92aa4b9ebe29af565ef494603f42" exitCode=0 Mar 19 09:54:31 crc kubenswrapper[4835]: I0319 09:54:31.040624 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"3a7e887a-3e51-467e-af3b-f26e6126c9e2","Type":"ContainerDied","Data":"d2e80da509510c27e24060c6aa3e8ddbea7d92aa4b9ebe29af565ef494603f42"} Mar 19 09:54:32 crc kubenswrapper[4835]: I0319 09:54:32.055304 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" 
event={"ID":"3a7e887a-3e51-467e-af3b-f26e6126c9e2","Type":"ContainerStarted","Data":"9752b935897d21b333490d6b4b98de9397a95af9fbf21e39c21f09d4c16164eb"} Mar 19 09:54:32 crc kubenswrapper[4835]: I0319 09:54:32.055977 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-1" Mar 19 09:54:32 crc kubenswrapper[4835]: I0319 09:54:32.079564 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-1" podStartSLOduration=38.07952283 podStartE2EDuration="38.07952283s" podCreationTimestamp="2026-03-19 09:53:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:54:32.075684937 +0000 UTC m=+1926.924283544" watchObservedRunningTime="2026-03-19 09:54:32.07952283 +0000 UTC m=+1926.928121437" Mar 19 09:54:45 crc kubenswrapper[4835]: I0319 09:54:45.594032 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-1" Mar 19 09:54:45 crc kubenswrapper[4835]: I0319 09:54:45.676138 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 09:54:49 crc kubenswrapper[4835]: I0319 09:54:49.849407 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="66c76655-cf6d-45e6-904c-147e07a28639" containerName="rabbitmq" containerID="cri-o://79939fe5133102acc25008da72ca46d83f8fa4a45ab7e54544c16a1887d6998d" gracePeriod=604796 Mar 19 09:54:50 crc kubenswrapper[4835]: I0319 09:54:50.036434 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="66c76655-cf6d-45e6-904c-147e07a28639" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.134:5671: connect: connection refused" Mar 19 09:54:51 crc kubenswrapper[4835]: I0319 09:54:51.772551 4835 scope.go:117] "RemoveContainer" 
containerID="2e5bea20da4fe0d37d396c723aae58e858ade156b04de961ee99399f03ea7842" Mar 19 09:54:51 crc kubenswrapper[4835]: I0319 09:54:51.806982 4835 scope.go:117] "RemoveContainer" containerID="542bfdbad4e6ae6ac640534950a273115c1e601a5733933772a0f6544b09a1b8" Mar 19 09:54:51 crc kubenswrapper[4835]: I0319 09:54:51.851818 4835 scope.go:117] "RemoveContainer" containerID="a5a235769b9aae535cffdbfea40679b6c14b38ffc77ae5005851256d1d79503e" Mar 19 09:54:51 crc kubenswrapper[4835]: I0319 09:54:51.880589 4835 scope.go:117] "RemoveContainer" containerID="b609a8f7c4da2efdabb4bbef3d49b0da202614154807da7d91dc0e8e09b11725" Mar 19 09:54:51 crc kubenswrapper[4835]: I0319 09:54:51.906946 4835 scope.go:117] "RemoveContainer" containerID="a9e9091d411b21811d51abd7cb26a4f3593bc327ae9bf9f264c76ea0b5104af7" Mar 19 09:54:51 crc kubenswrapper[4835]: I0319 09:54:51.941715 4835 scope.go:117] "RemoveContainer" containerID="40ef1bdb4eda883d80badcdbbccd9dd1fa6754f0f6149ea68c7af0947900f14a" Mar 19 09:54:52 crc kubenswrapper[4835]: I0319 09:54:52.030965 4835 scope.go:117] "RemoveContainer" containerID="c0ea60bf11bf096f46163378aa34625e8cc0835312390359536ccc50967e573c" Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.345797 4835 generic.go:334] "Generic (PLEG): container finished" podID="66c76655-cf6d-45e6-904c-147e07a28639" containerID="79939fe5133102acc25008da72ca46d83f8fa4a45ab7e54544c16a1887d6998d" exitCode=0 Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.345875 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"66c76655-cf6d-45e6-904c-147e07a28639","Type":"ContainerDied","Data":"79939fe5133102acc25008da72ca46d83f8fa4a45ab7e54544c16a1887d6998d"} Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.570131 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.719036 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qt75\" (UniqueName: \"kubernetes.io/projected/66c76655-cf6d-45e6-904c-147e07a28639-kube-api-access-6qt75\") pod \"66c76655-cf6d-45e6-904c-147e07a28639\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.719316 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/66c76655-cf6d-45e6-904c-147e07a28639-server-conf\") pod \"66c76655-cf6d-45e6-904c-147e07a28639\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.719476 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/66c76655-cf6d-45e6-904c-147e07a28639-pod-info\") pod \"66c76655-cf6d-45e6-904c-147e07a28639\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.719651 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/66c76655-cf6d-45e6-904c-147e07a28639-plugins-conf\") pod \"66c76655-cf6d-45e6-904c-147e07a28639\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.719790 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/66c76655-cf6d-45e6-904c-147e07a28639-rabbitmq-tls\") pod \"66c76655-cf6d-45e6-904c-147e07a28639\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.719902 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/66c76655-cf6d-45e6-904c-147e07a28639-rabbitmq-confd\") pod \"66c76655-cf6d-45e6-904c-147e07a28639\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.720096 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/66c76655-cf6d-45e6-904c-147e07a28639-erlang-cookie-secret\") pod \"66c76655-cf6d-45e6-904c-147e07a28639\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.720266 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/66c76655-cf6d-45e6-904c-147e07a28639-rabbitmq-erlang-cookie\") pod \"66c76655-cf6d-45e6-904c-147e07a28639\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.720491 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66c76655-cf6d-45e6-904c-147e07a28639-config-data\") pod \"66c76655-cf6d-45e6-904c-147e07a28639\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.721642 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73f7e027-4ad8-4bb1-9cf5-5475f0263c60\") pod \"66c76655-cf6d-45e6-904c-147e07a28639\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.721846 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/66c76655-cf6d-45e6-904c-147e07a28639-rabbitmq-plugins\") pod \"66c76655-cf6d-45e6-904c-147e07a28639\" (UID: \"66c76655-cf6d-45e6-904c-147e07a28639\") " Mar 19 09:54:56 
crc kubenswrapper[4835]: I0319 09:54:56.721930 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66c76655-cf6d-45e6-904c-147e07a28639-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "66c76655-cf6d-45e6-904c-147e07a28639" (UID: "66c76655-cf6d-45e6-904c-147e07a28639"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.722192 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66c76655-cf6d-45e6-904c-147e07a28639-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "66c76655-cf6d-45e6-904c-147e07a28639" (UID: "66c76655-cf6d-45e6-904c-147e07a28639"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.722217 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66c76655-cf6d-45e6-904c-147e07a28639-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "66c76655-cf6d-45e6-904c-147e07a28639" (UID: "66c76655-cf6d-45e6-904c-147e07a28639"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.723320 4835 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/66c76655-cf6d-45e6-904c-147e07a28639-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.723416 4835 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/66c76655-cf6d-45e6-904c-147e07a28639-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.723488 4835 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/66c76655-cf6d-45e6-904c-147e07a28639-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.726107 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66c76655-cf6d-45e6-904c-147e07a28639-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "66c76655-cf6d-45e6-904c-147e07a28639" (UID: "66c76655-cf6d-45e6-904c-147e07a28639"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.726464 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/66c76655-cf6d-45e6-904c-147e07a28639-pod-info" (OuterVolumeSpecName: "pod-info") pod "66c76655-cf6d-45e6-904c-147e07a28639" (UID: "66c76655-cf6d-45e6-904c-147e07a28639"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.726566 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66c76655-cf6d-45e6-904c-147e07a28639-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "66c76655-cf6d-45e6-904c-147e07a28639" (UID: "66c76655-cf6d-45e6-904c-147e07a28639"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.741561 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66c76655-cf6d-45e6-904c-147e07a28639-kube-api-access-6qt75" (OuterVolumeSpecName: "kube-api-access-6qt75") pod "66c76655-cf6d-45e6-904c-147e07a28639" (UID: "66c76655-cf6d-45e6-904c-147e07a28639"). InnerVolumeSpecName "kube-api-access-6qt75". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.747350 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73f7e027-4ad8-4bb1-9cf5-5475f0263c60" (OuterVolumeSpecName: "persistence") pod "66c76655-cf6d-45e6-904c-147e07a28639" (UID: "66c76655-cf6d-45e6-904c-147e07a28639"). InnerVolumeSpecName "pvc-73f7e027-4ad8-4bb1-9cf5-5475f0263c60". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.793878 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66c76655-cf6d-45e6-904c-147e07a28639-config-data" (OuterVolumeSpecName: "config-data") pod "66c76655-cf6d-45e6-904c-147e07a28639" (UID: "66c76655-cf6d-45e6-904c-147e07a28639"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.800787 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66c76655-cf6d-45e6-904c-147e07a28639-server-conf" (OuterVolumeSpecName: "server-conf") pod "66c76655-cf6d-45e6-904c-147e07a28639" (UID: "66c76655-cf6d-45e6-904c-147e07a28639"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.825197 4835 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/66c76655-cf6d-45e6-904c-147e07a28639-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.825233 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66c76655-cf6d-45e6-904c-147e07a28639-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.825264 4835 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-73f7e027-4ad8-4bb1-9cf5-5475f0263c60\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73f7e027-4ad8-4bb1-9cf5-5475f0263c60\") on node \"crc\" " Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.825277 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qt75\" (UniqueName: \"kubernetes.io/projected/66c76655-cf6d-45e6-904c-147e07a28639-kube-api-access-6qt75\") on node \"crc\" DevicePath \"\"" Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.825295 4835 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/66c76655-cf6d-45e6-904c-147e07a28639-server-conf\") on node \"crc\" DevicePath \"\"" Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.825304 4835 reconciler_common.go:293] "Volume 
detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/66c76655-cf6d-45e6-904c-147e07a28639-pod-info\") on node \"crc\" DevicePath \"\"" Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.825312 4835 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/66c76655-cf6d-45e6-904c-147e07a28639-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.876680 4835 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.877153 4835 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-73f7e027-4ad8-4bb1-9cf5-5475f0263c60" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73f7e027-4ad8-4bb1-9cf5-5475f0263c60") on node "crc" Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.892165 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66c76655-cf6d-45e6-904c-147e07a28639-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "66c76655-cf6d-45e6-904c-147e07a28639" (UID: "66c76655-cf6d-45e6-904c-147e07a28639"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.928299 4835 reconciler_common.go:293] "Volume detached for volume \"pvc-73f7e027-4ad8-4bb1-9cf5-5475f0263c60\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73f7e027-4ad8-4bb1-9cf5-5475f0263c60\") on node \"crc\" DevicePath \"\"" Mar 19 09:54:56 crc kubenswrapper[4835]: I0319 09:54:56.928331 4835 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/66c76655-cf6d-45e6-904c-147e07a28639-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.361037 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"66c76655-cf6d-45e6-904c-147e07a28639","Type":"ContainerDied","Data":"7173915a29578e8dd3adca834b704f5ae380234913e1dc8c4ed78c6dd1c393b9"} Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.361120 4835 scope.go:117] "RemoveContainer" containerID="79939fe5133102acc25008da72ca46d83f8fa4a45ab7e54544c16a1887d6998d" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.361153 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.397567 4835 scope.go:117] "RemoveContainer" containerID="20a39ba6498a2092f5c30caf21771baa8e854db4767a37e5233443d564935991" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.416006 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.437070 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.473751 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 09:54:57 crc kubenswrapper[4835]: E0319 09:54:57.474262 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66c76655-cf6d-45e6-904c-147e07a28639" containerName="rabbitmq" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.474288 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="66c76655-cf6d-45e6-904c-147e07a28639" containerName="rabbitmq" Mar 19 09:54:57 crc kubenswrapper[4835]: E0319 09:54:57.474340 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66c76655-cf6d-45e6-904c-147e07a28639" containerName="setup-container" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.474348 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="66c76655-cf6d-45e6-904c-147e07a28639" containerName="setup-container" Mar 19 09:54:57 crc kubenswrapper[4835]: E0319 09:54:57.474374 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a19ce6a-83f1-4add-be81-3d5328c2c8fd" containerName="oc" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.474382 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a19ce6a-83f1-4add-be81-3d5328c2c8fd" containerName="oc" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.474606 4835 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7a19ce6a-83f1-4add-be81-3d5328c2c8fd" containerName="oc" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.474622 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="66c76655-cf6d-45e6-904c-147e07a28639" containerName="rabbitmq" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.475895 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.491651 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.647626 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/010a4cf7-6b62-4951-9ee4-1588c57d5c28-config-data\") pod \"rabbitmq-server-0\" (UID: \"010a4cf7-6b62-4951-9ee4-1588c57d5c28\") " pod="openstack/rabbitmq-server-0" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.647717 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/010a4cf7-6b62-4951-9ee4-1588c57d5c28-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"010a4cf7-6b62-4951-9ee4-1588c57d5c28\") " pod="openstack/rabbitmq-server-0" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.647821 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/010a4cf7-6b62-4951-9ee4-1588c57d5c28-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"010a4cf7-6b62-4951-9ee4-1588c57d5c28\") " pod="openstack/rabbitmq-server-0" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.647842 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/010a4cf7-6b62-4951-9ee4-1588c57d5c28-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"010a4cf7-6b62-4951-9ee4-1588c57d5c28\") " pod="openstack/rabbitmq-server-0" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.647887 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/010a4cf7-6b62-4951-9ee4-1588c57d5c28-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"010a4cf7-6b62-4951-9ee4-1588c57d5c28\") " pod="openstack/rabbitmq-server-0" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.647902 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sglww\" (UniqueName: \"kubernetes.io/projected/010a4cf7-6b62-4951-9ee4-1588c57d5c28-kube-api-access-sglww\") pod \"rabbitmq-server-0\" (UID: \"010a4cf7-6b62-4951-9ee4-1588c57d5c28\") " pod="openstack/rabbitmq-server-0" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.647927 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/010a4cf7-6b62-4951-9ee4-1588c57d5c28-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"010a4cf7-6b62-4951-9ee4-1588c57d5c28\") " pod="openstack/rabbitmq-server-0" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.647955 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-73f7e027-4ad8-4bb1-9cf5-5475f0263c60\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73f7e027-4ad8-4bb1-9cf5-5475f0263c60\") pod \"rabbitmq-server-0\" (UID: \"010a4cf7-6b62-4951-9ee4-1588c57d5c28\") " pod="openstack/rabbitmq-server-0" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.647973 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/010a4cf7-6b62-4951-9ee4-1588c57d5c28-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"010a4cf7-6b62-4951-9ee4-1588c57d5c28\") " pod="openstack/rabbitmq-server-0" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.648003 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/010a4cf7-6b62-4951-9ee4-1588c57d5c28-pod-info\") pod \"rabbitmq-server-0\" (UID: \"010a4cf7-6b62-4951-9ee4-1588c57d5c28\") " pod="openstack/rabbitmq-server-0" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.648040 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/010a4cf7-6b62-4951-9ee4-1588c57d5c28-server-conf\") pod \"rabbitmq-server-0\" (UID: \"010a4cf7-6b62-4951-9ee4-1588c57d5c28\") " pod="openstack/rabbitmq-server-0" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.749532 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/010a4cf7-6b62-4951-9ee4-1588c57d5c28-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"010a4cf7-6b62-4951-9ee4-1588c57d5c28\") " pod="openstack/rabbitmq-server-0" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.749573 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/010a4cf7-6b62-4951-9ee4-1588c57d5c28-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"010a4cf7-6b62-4951-9ee4-1588c57d5c28\") " pod="openstack/rabbitmq-server-0" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.749643 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/010a4cf7-6b62-4951-9ee4-1588c57d5c28-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"010a4cf7-6b62-4951-9ee4-1588c57d5c28\") " pod="openstack/rabbitmq-server-0" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.749665 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sglww\" (UniqueName: \"kubernetes.io/projected/010a4cf7-6b62-4951-9ee4-1588c57d5c28-kube-api-access-sglww\") pod \"rabbitmq-server-0\" (UID: \"010a4cf7-6b62-4951-9ee4-1588c57d5c28\") " pod="openstack/rabbitmq-server-0" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.749702 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/010a4cf7-6b62-4951-9ee4-1588c57d5c28-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"010a4cf7-6b62-4951-9ee4-1588c57d5c28\") " pod="openstack/rabbitmq-server-0" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.749763 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-73f7e027-4ad8-4bb1-9cf5-5475f0263c60\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73f7e027-4ad8-4bb1-9cf5-5475f0263c60\") pod \"rabbitmq-server-0\" (UID: \"010a4cf7-6b62-4951-9ee4-1588c57d5c28\") " pod="openstack/rabbitmq-server-0" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.749794 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/010a4cf7-6b62-4951-9ee4-1588c57d5c28-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"010a4cf7-6b62-4951-9ee4-1588c57d5c28\") " pod="openstack/rabbitmq-server-0" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.749839 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/010a4cf7-6b62-4951-9ee4-1588c57d5c28-pod-info\") pod \"rabbitmq-server-0\" (UID: \"010a4cf7-6b62-4951-9ee4-1588c57d5c28\") " pod="openstack/rabbitmq-server-0" Mar 19 09:54:57 crc 
kubenswrapper[4835]: I0319 09:54:57.749889 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/010a4cf7-6b62-4951-9ee4-1588c57d5c28-server-conf\") pod \"rabbitmq-server-0\" (UID: \"010a4cf7-6b62-4951-9ee4-1588c57d5c28\") " pod="openstack/rabbitmq-server-0" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.749922 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/010a4cf7-6b62-4951-9ee4-1588c57d5c28-config-data\") pod \"rabbitmq-server-0\" (UID: \"010a4cf7-6b62-4951-9ee4-1588c57d5c28\") " pod="openstack/rabbitmq-server-0" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.749979 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/010a4cf7-6b62-4951-9ee4-1588c57d5c28-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"010a4cf7-6b62-4951-9ee4-1588c57d5c28\") " pod="openstack/rabbitmq-server-0" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.750579 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/010a4cf7-6b62-4951-9ee4-1588c57d5c28-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"010a4cf7-6b62-4951-9ee4-1588c57d5c28\") " pod="openstack/rabbitmq-server-0" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.750884 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/010a4cf7-6b62-4951-9ee4-1588c57d5c28-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"010a4cf7-6b62-4951-9ee4-1588c57d5c28\") " pod="openstack/rabbitmq-server-0" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.751960 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/010a4cf7-6b62-4951-9ee4-1588c57d5c28-server-conf\") pod \"rabbitmq-server-0\" (UID: \"010a4cf7-6b62-4951-9ee4-1588c57d5c28\") " pod="openstack/rabbitmq-server-0" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.753247 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/010a4cf7-6b62-4951-9ee4-1588c57d5c28-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"010a4cf7-6b62-4951-9ee4-1588c57d5c28\") " pod="openstack/rabbitmq-server-0" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.753504 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/010a4cf7-6b62-4951-9ee4-1588c57d5c28-config-data\") pod \"rabbitmq-server-0\" (UID: \"010a4cf7-6b62-4951-9ee4-1588c57d5c28\") " pod="openstack/rabbitmq-server-0" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.754350 4835 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.754378 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-73f7e027-4ad8-4bb1-9cf5-5475f0263c60\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73f7e027-4ad8-4bb1-9cf5-5475f0263c60\") pod \"rabbitmq-server-0\" (UID: \"010a4cf7-6b62-4951-9ee4-1588c57d5c28\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/93d6c24dfb86f6a55f7cf65a0ab47acdafa6a0270fb18fc81a7453795079c631/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.759446 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/010a4cf7-6b62-4951-9ee4-1588c57d5c28-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"010a4cf7-6b62-4951-9ee4-1588c57d5c28\") " pod="openstack/rabbitmq-server-0" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.759452 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/010a4cf7-6b62-4951-9ee4-1588c57d5c28-pod-info\") pod \"rabbitmq-server-0\" (UID: \"010a4cf7-6b62-4951-9ee4-1588c57d5c28\") " pod="openstack/rabbitmq-server-0" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.772238 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/010a4cf7-6b62-4951-9ee4-1588c57d5c28-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"010a4cf7-6b62-4951-9ee4-1588c57d5c28\") " pod="openstack/rabbitmq-server-0" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.774941 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/010a4cf7-6b62-4951-9ee4-1588c57d5c28-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"010a4cf7-6b62-4951-9ee4-1588c57d5c28\") " 
pod="openstack/rabbitmq-server-0" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.776457 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sglww\" (UniqueName: \"kubernetes.io/projected/010a4cf7-6b62-4951-9ee4-1588c57d5c28-kube-api-access-sglww\") pod \"rabbitmq-server-0\" (UID: \"010a4cf7-6b62-4951-9ee4-1588c57d5c28\") " pod="openstack/rabbitmq-server-0" Mar 19 09:54:57 crc kubenswrapper[4835]: I0319 09:54:57.826947 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-73f7e027-4ad8-4bb1-9cf5-5475f0263c60\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-73f7e027-4ad8-4bb1-9cf5-5475f0263c60\") pod \"rabbitmq-server-0\" (UID: \"010a4cf7-6b62-4951-9ee4-1588c57d5c28\") " pod="openstack/rabbitmq-server-0" Mar 19 09:54:58 crc kubenswrapper[4835]: I0319 09:54:58.100287 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 19 09:54:58 crc kubenswrapper[4835]: I0319 09:54:58.415718 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66c76655-cf6d-45e6-904c-147e07a28639" path="/var/lib/kubelet/pods/66c76655-cf6d-45e6-904c-147e07a28639/volumes" Mar 19 09:54:58 crc kubenswrapper[4835]: I0319 09:54:58.628772 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 09:54:59 crc kubenswrapper[4835]: I0319 09:54:59.405673 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"010a4cf7-6b62-4951-9ee4-1588c57d5c28","Type":"ContainerStarted","Data":"31c5deef5075b90ce2aa63c35223b97628d3b3f2d5bb515ccccd117e08736893"} Mar 19 09:55:01 crc kubenswrapper[4835]: I0319 09:55:01.436974 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"010a4cf7-6b62-4951-9ee4-1588c57d5c28","Type":"ContainerStarted","Data":"ea7ca4490d5f9fd9104c5a966d6c7d775345700421e17ea73b13b98cdbeecef3"} Mar 19 
09:55:33 crc kubenswrapper[4835]: I0319 09:55:33.852791 4835 generic.go:334] "Generic (PLEG): container finished" podID="010a4cf7-6b62-4951-9ee4-1588c57d5c28" containerID="ea7ca4490d5f9fd9104c5a966d6c7d775345700421e17ea73b13b98cdbeecef3" exitCode=0 Mar 19 09:55:33 crc kubenswrapper[4835]: I0319 09:55:33.852849 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"010a4cf7-6b62-4951-9ee4-1588c57d5c28","Type":"ContainerDied","Data":"ea7ca4490d5f9fd9104c5a966d6c7d775345700421e17ea73b13b98cdbeecef3"} Mar 19 09:55:34 crc kubenswrapper[4835]: I0319 09:55:34.866182 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"010a4cf7-6b62-4951-9ee4-1588c57d5c28","Type":"ContainerStarted","Data":"38ad0d3b8aa6e0d95e24a2feee6170ef8b669127f7f50213438a4dac501a8ef1"} Mar 19 09:55:34 crc kubenswrapper[4835]: I0319 09:55:34.866868 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 19 09:55:34 crc kubenswrapper[4835]: I0319 09:55:34.909993 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.909968705 podStartE2EDuration="37.909968705s" podCreationTimestamp="2026-03-19 09:54:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 09:55:34.895734673 +0000 UTC m=+1989.744333280" watchObservedRunningTime="2026-03-19 09:55:34.909968705 +0000 UTC m=+1989.758567292" Mar 19 09:55:48 crc kubenswrapper[4835]: I0319 09:55:48.106116 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 19 09:55:52 crc kubenswrapper[4835]: I0319 09:55:52.188618 4835 scope.go:117] "RemoveContainer" containerID="4495f034075000d07357758b454e1b548d9f1a0f69e1108d1500ac6041aee844" Mar 19 09:56:00 crc kubenswrapper[4835]: I0319 09:56:00.151867 4835 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565236-44rxp"] Mar 19 09:56:00 crc kubenswrapper[4835]: I0319 09:56:00.153898 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565236-44rxp" Mar 19 09:56:00 crc kubenswrapper[4835]: I0319 09:56:00.156432 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 09:56:00 crc kubenswrapper[4835]: I0319 09:56:00.156445 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 09:56:00 crc kubenswrapper[4835]: I0319 09:56:00.159359 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 09:56:00 crc kubenswrapper[4835]: I0319 09:56:00.200316 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565236-44rxp"] Mar 19 09:56:00 crc kubenswrapper[4835]: I0319 09:56:00.319375 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc8cl\" (UniqueName: \"kubernetes.io/projected/e43d35a6-eced-44d9-839a-f4d15941e4c2-kube-api-access-jc8cl\") pod \"auto-csr-approver-29565236-44rxp\" (UID: \"e43d35a6-eced-44d9-839a-f4d15941e4c2\") " pod="openshift-infra/auto-csr-approver-29565236-44rxp" Mar 19 09:56:00 crc kubenswrapper[4835]: I0319 09:56:00.422177 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc8cl\" (UniqueName: \"kubernetes.io/projected/e43d35a6-eced-44d9-839a-f4d15941e4c2-kube-api-access-jc8cl\") pod \"auto-csr-approver-29565236-44rxp\" (UID: \"e43d35a6-eced-44d9-839a-f4d15941e4c2\") " pod="openshift-infra/auto-csr-approver-29565236-44rxp" Mar 19 09:56:00 crc kubenswrapper[4835]: I0319 09:56:00.445451 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jc8cl\" (UniqueName: \"kubernetes.io/projected/e43d35a6-eced-44d9-839a-f4d15941e4c2-kube-api-access-jc8cl\") pod \"auto-csr-approver-29565236-44rxp\" (UID: \"e43d35a6-eced-44d9-839a-f4d15941e4c2\") " pod="openshift-infra/auto-csr-approver-29565236-44rxp" Mar 19 09:56:00 crc kubenswrapper[4835]: I0319 09:56:00.487225 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565236-44rxp" Mar 19 09:56:00 crc kubenswrapper[4835]: I0319 09:56:00.982231 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565236-44rxp"] Mar 19 09:56:01 crc kubenswrapper[4835]: I0319 09:56:01.157541 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565236-44rxp" event={"ID":"e43d35a6-eced-44d9-839a-f4d15941e4c2","Type":"ContainerStarted","Data":"4438015c8d7ff87d93c41da12ef5eb47c38016f93e6443493231cfdcbdcda8bc"} Mar 19 09:56:03 crc kubenswrapper[4835]: I0319 09:56:03.182180 4835 generic.go:334] "Generic (PLEG): container finished" podID="e43d35a6-eced-44d9-839a-f4d15941e4c2" containerID="0d7344c68f4154e07b8048c217b49a5a7ca84e057bfba57273d2af7b62d4928d" exitCode=0 Mar 19 09:56:03 crc kubenswrapper[4835]: I0319 09:56:03.182463 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565236-44rxp" event={"ID":"e43d35a6-eced-44d9-839a-f4d15941e4c2","Type":"ContainerDied","Data":"0d7344c68f4154e07b8048c217b49a5a7ca84e057bfba57273d2af7b62d4928d"} Mar 19 09:56:04 crc kubenswrapper[4835]: I0319 09:56:04.715099 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565236-44rxp" Mar 19 09:56:04 crc kubenswrapper[4835]: I0319 09:56:04.754620 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jc8cl\" (UniqueName: \"kubernetes.io/projected/e43d35a6-eced-44d9-839a-f4d15941e4c2-kube-api-access-jc8cl\") pod \"e43d35a6-eced-44d9-839a-f4d15941e4c2\" (UID: \"e43d35a6-eced-44d9-839a-f4d15941e4c2\") " Mar 19 09:56:04 crc kubenswrapper[4835]: I0319 09:56:04.763163 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e43d35a6-eced-44d9-839a-f4d15941e4c2-kube-api-access-jc8cl" (OuterVolumeSpecName: "kube-api-access-jc8cl") pod "e43d35a6-eced-44d9-839a-f4d15941e4c2" (UID: "e43d35a6-eced-44d9-839a-f4d15941e4c2"). InnerVolumeSpecName "kube-api-access-jc8cl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:56:04 crc kubenswrapper[4835]: I0319 09:56:04.858905 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jc8cl\" (UniqueName: \"kubernetes.io/projected/e43d35a6-eced-44d9-839a-f4d15941e4c2-kube-api-access-jc8cl\") on node \"crc\" DevicePath \"\"" Mar 19 09:56:05 crc kubenswrapper[4835]: I0319 09:56:05.205360 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565236-44rxp" event={"ID":"e43d35a6-eced-44d9-839a-f4d15941e4c2","Type":"ContainerDied","Data":"4438015c8d7ff87d93c41da12ef5eb47c38016f93e6443493231cfdcbdcda8bc"} Mar 19 09:56:05 crc kubenswrapper[4835]: I0319 09:56:05.205629 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4438015c8d7ff87d93c41da12ef5eb47c38016f93e6443493231cfdcbdcda8bc" Mar 19 09:56:05 crc kubenswrapper[4835]: I0319 09:56:05.205682 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565236-44rxp" Mar 19 09:56:05 crc kubenswrapper[4835]: I0319 09:56:05.789805 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565230-k576b"] Mar 19 09:56:05 crc kubenswrapper[4835]: I0319 09:56:05.804733 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565230-k576b"] Mar 19 09:56:06 crc kubenswrapper[4835]: I0319 09:56:06.415148 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="716e3a02-418e-4d74-a63c-43b2e51ef67e" path="/var/lib/kubelet/pods/716e3a02-418e-4d74-a63c-43b2e51ef67e/volumes" Mar 19 09:56:21 crc kubenswrapper[4835]: I0319 09:56:21.029227 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7f31-account-create-update-wdjbq"] Mar 19 09:56:21 crc kubenswrapper[4835]: I0319 09:56:21.043278 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-7f31-account-create-update-wdjbq"] Mar 19 09:56:22 crc kubenswrapper[4835]: I0319 09:56:22.045104 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-5vwdz"] Mar 19 09:56:22 crc kubenswrapper[4835]: I0319 09:56:22.059191 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-rv6q7"] Mar 19 09:56:22 crc kubenswrapper[4835]: I0319 09:56:22.069706 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-2eab-account-create-update-9n5z8"] Mar 19 09:56:22 crc kubenswrapper[4835]: I0319 09:56:22.080950 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-6vl25"] Mar 19 09:56:22 crc kubenswrapper[4835]: I0319 09:56:22.092157 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-5vwdz"] Mar 19 09:56:22 crc kubenswrapper[4835]: I0319 09:56:22.103185 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/keystone-2eab-account-create-update-9n5z8"] Mar 19 09:56:22 crc kubenswrapper[4835]: I0319 09:56:22.114187 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-rv6q7"] Mar 19 09:56:22 crc kubenswrapper[4835]: I0319 09:56:22.124845 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-6vl25"] Mar 19 09:56:22 crc kubenswrapper[4835]: I0319 09:56:22.429915 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="395a12a5-d5f4-4917-8d8e-dd9f06fa1780" path="/var/lib/kubelet/pods/395a12a5-d5f4-4917-8d8e-dd9f06fa1780/volumes" Mar 19 09:56:22 crc kubenswrapper[4835]: I0319 09:56:22.432339 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fe49504-b4bb-41e3-99c5-ce1e8d36296a" path="/var/lib/kubelet/pods/6fe49504-b4bb-41e3-99c5-ce1e8d36296a/volumes" Mar 19 09:56:22 crc kubenswrapper[4835]: I0319 09:56:22.433268 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88297ff9-78bd-4491-9acd-e0a1f0660b0f" path="/var/lib/kubelet/pods/88297ff9-78bd-4491-9acd-e0a1f0660b0f/volumes" Mar 19 09:56:22 crc kubenswrapper[4835]: I0319 09:56:22.434217 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a305b14-a9cf-45cf-9ff2-8aa068a9ceeb" path="/var/lib/kubelet/pods/9a305b14-a9cf-45cf-9ff2-8aa068a9ceeb/volumes" Mar 19 09:56:22 crc kubenswrapper[4835]: I0319 09:56:22.436836 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cca9abbb-a41a-4886-9262-f7bd98c0ce48" path="/var/lib/kubelet/pods/cca9abbb-a41a-4886-9262-f7bd98c0ce48/volumes" Mar 19 09:56:24 crc kubenswrapper[4835]: I0319 09:56:24.064370 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-2hdqc"] Mar 19 09:56:24 crc kubenswrapper[4835]: I0319 09:56:24.076239 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3383-account-create-update-cv6wv"] Mar 19 09:56:24 
crc kubenswrapper[4835]: I0319 09:56:24.088668 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-2hdqc"] Mar 19 09:56:24 crc kubenswrapper[4835]: I0319 09:56:24.098875 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-3383-account-create-update-cv6wv"] Mar 19 09:56:24 crc kubenswrapper[4835]: I0319 09:56:24.421076 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab86bcc8-164f-4f89-9d47-e52d3520ea21" path="/var/lib/kubelet/pods/ab86bcc8-164f-4f89-9d47-e52d3520ea21/volumes" Mar 19 09:56:24 crc kubenswrapper[4835]: I0319 09:56:24.425226 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6615456-cf93-49d5-b69c-a83dcbab99da" path="/var/lib/kubelet/pods/f6615456-cf93-49d5-b69c-a83dcbab99da/volumes" Mar 19 09:56:25 crc kubenswrapper[4835]: I0319 09:56:25.033132 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-fac5-account-create-update-hhdpx"] Mar 19 09:56:25 crc kubenswrapper[4835]: I0319 09:56:25.044302 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-fac5-account-create-update-hhdpx"] Mar 19 09:56:26 crc kubenswrapper[4835]: I0319 09:56:26.428648 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f689499-b2f9-47e8-811a-54bbac418778" path="/var/lib/kubelet/pods/5f689499-b2f9-47e8-811a-54bbac418778/volumes" Mar 19 09:56:29 crc kubenswrapper[4835]: I0319 09:56:29.057376 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-c348-account-create-update-qmvxx"] Mar 19 09:56:29 crc kubenswrapper[4835]: I0319 09:56:29.072084 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-c348-account-create-update-qmvxx"] Mar 19 09:56:30 crc kubenswrapper[4835]: I0319 09:56:30.036134 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-rh62c"] Mar 19 
09:56:30 crc kubenswrapper[4835]: I0319 09:56:30.051478 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-rh62c"] Mar 19 09:56:30 crc kubenswrapper[4835]: I0319 09:56:30.415469 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0647dcc1-ea1b-4628-92cd-50e53602f0b7" path="/var/lib/kubelet/pods/0647dcc1-ea1b-4628-92cd-50e53602f0b7/volumes" Mar 19 09:56:30 crc kubenswrapper[4835]: I0319 09:56:30.416938 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ca5984a-072e-4b36-840c-4a986fcd553e" path="/var/lib/kubelet/pods/8ca5984a-072e-4b36-840c-4a986fcd553e/volumes" Mar 19 09:56:36 crc kubenswrapper[4835]: I0319 09:56:36.423072 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 09:56:36 crc kubenswrapper[4835]: I0319 09:56:36.423634 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 09:56:42 crc kubenswrapper[4835]: I0319 09:56:42.621809 4835 generic.go:334] "Generic (PLEG): container finished" podID="63daeb0b-2ace-4747-b331-44ed485faec8" containerID="d722db717ab1ce16e754e423653db8a57b5d52eea72a46a4e35c5ad9fb4f63cd" exitCode=0 Mar 19 09:56:42 crc kubenswrapper[4835]: I0319 09:56:42.621880 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8" 
event={"ID":"63daeb0b-2ace-4747-b331-44ed485faec8","Type":"ContainerDied","Data":"d722db717ab1ce16e754e423653db8a57b5d52eea72a46a4e35c5ad9fb4f63cd"} Mar 19 09:56:44 crc kubenswrapper[4835]: I0319 09:56:44.217608 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8" Mar 19 09:56:44 crc kubenswrapper[4835]: I0319 09:56:44.250905 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63daeb0b-2ace-4747-b331-44ed485faec8-inventory\") pod \"63daeb0b-2ace-4747-b331-44ed485faec8\" (UID: \"63daeb0b-2ace-4747-b331-44ed485faec8\") " Mar 19 09:56:44 crc kubenswrapper[4835]: I0319 09:56:44.250981 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63daeb0b-2ace-4747-b331-44ed485faec8-bootstrap-combined-ca-bundle\") pod \"63daeb0b-2ace-4747-b331-44ed485faec8\" (UID: \"63daeb0b-2ace-4747-b331-44ed485faec8\") " Mar 19 09:56:44 crc kubenswrapper[4835]: I0319 09:56:44.251037 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/63daeb0b-2ace-4747-b331-44ed485faec8-ssh-key-openstack-edpm-ipam\") pod \"63daeb0b-2ace-4747-b331-44ed485faec8\" (UID: \"63daeb0b-2ace-4747-b331-44ed485faec8\") " Mar 19 09:56:44 crc kubenswrapper[4835]: I0319 09:56:44.251071 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6b5s\" (UniqueName: \"kubernetes.io/projected/63daeb0b-2ace-4747-b331-44ed485faec8-kube-api-access-p6b5s\") pod \"63daeb0b-2ace-4747-b331-44ed485faec8\" (UID: \"63daeb0b-2ace-4747-b331-44ed485faec8\") " Mar 19 09:56:44 crc kubenswrapper[4835]: I0319 09:56:44.262637 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/63daeb0b-2ace-4747-b331-44ed485faec8-kube-api-access-p6b5s" (OuterVolumeSpecName: "kube-api-access-p6b5s") pod "63daeb0b-2ace-4747-b331-44ed485faec8" (UID: "63daeb0b-2ace-4747-b331-44ed485faec8"). InnerVolumeSpecName "kube-api-access-p6b5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:56:44 crc kubenswrapper[4835]: I0319 09:56:44.275346 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63daeb0b-2ace-4747-b331-44ed485faec8-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "63daeb0b-2ace-4747-b331-44ed485faec8" (UID: "63daeb0b-2ace-4747-b331-44ed485faec8"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:56:44 crc kubenswrapper[4835]: I0319 09:56:44.293805 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63daeb0b-2ace-4747-b331-44ed485faec8-inventory" (OuterVolumeSpecName: "inventory") pod "63daeb0b-2ace-4747-b331-44ed485faec8" (UID: "63daeb0b-2ace-4747-b331-44ed485faec8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:56:44 crc kubenswrapper[4835]: I0319 09:56:44.299567 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63daeb0b-2ace-4747-b331-44ed485faec8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "63daeb0b-2ace-4747-b331-44ed485faec8" (UID: "63daeb0b-2ace-4747-b331-44ed485faec8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:56:44 crc kubenswrapper[4835]: I0319 09:56:44.353459 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63daeb0b-2ace-4747-b331-44ed485faec8-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 09:56:44 crc kubenswrapper[4835]: I0319 09:56:44.353500 4835 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63daeb0b-2ace-4747-b331-44ed485faec8-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 09:56:44 crc kubenswrapper[4835]: I0319 09:56:44.353626 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/63daeb0b-2ace-4747-b331-44ed485faec8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 09:56:44 crc kubenswrapper[4835]: I0319 09:56:44.353646 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6b5s\" (UniqueName: \"kubernetes.io/projected/63daeb0b-2ace-4747-b331-44ed485faec8-kube-api-access-p6b5s\") on node \"crc\" DevicePath \"\"" Mar 19 09:56:44 crc kubenswrapper[4835]: I0319 09:56:44.646160 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8" event={"ID":"63daeb0b-2ace-4747-b331-44ed485faec8","Type":"ContainerDied","Data":"02152f61c5771cc0459b8c13514521b1ae156f810f08a5b42ba4248c77f17cad"} Mar 19 09:56:44 crc kubenswrapper[4835]: I0319 09:56:44.646207 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02152f61c5771cc0459b8c13514521b1ae156f810f08a5b42ba4248c77f17cad" Mar 19 09:56:44 crc kubenswrapper[4835]: I0319 09:56:44.646296 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8" Mar 19 09:56:44 crc kubenswrapper[4835]: I0319 09:56:44.751073 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwjd2"] Mar 19 09:56:44 crc kubenswrapper[4835]: E0319 09:56:44.751655 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63daeb0b-2ace-4747-b331-44ed485faec8" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 19 09:56:44 crc kubenswrapper[4835]: I0319 09:56:44.751673 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="63daeb0b-2ace-4747-b331-44ed485faec8" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 19 09:56:44 crc kubenswrapper[4835]: E0319 09:56:44.751689 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e43d35a6-eced-44d9-839a-f4d15941e4c2" containerName="oc" Mar 19 09:56:44 crc kubenswrapper[4835]: I0319 09:56:44.751696 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e43d35a6-eced-44d9-839a-f4d15941e4c2" containerName="oc" Mar 19 09:56:44 crc kubenswrapper[4835]: I0319 09:56:44.751898 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e43d35a6-eced-44d9-839a-f4d15941e4c2" containerName="oc" Mar 19 09:56:44 crc kubenswrapper[4835]: I0319 09:56:44.751919 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="63daeb0b-2ace-4747-b331-44ed485faec8" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 19 09:56:44 crc kubenswrapper[4835]: I0319 09:56:44.752848 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwjd2" Mar 19 09:56:44 crc kubenswrapper[4835]: I0319 09:56:44.755526 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 09:56:44 crc kubenswrapper[4835]: I0319 09:56:44.756424 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 09:56:44 crc kubenswrapper[4835]: I0319 09:56:44.756668 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 09:56:44 crc kubenswrapper[4835]: I0319 09:56:44.756821 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ldz2g" Mar 19 09:56:44 crc kubenswrapper[4835]: I0319 09:56:44.767942 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1073365c-6688-438d-ad77-2c32ee7e6947-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-jwjd2\" (UID: \"1073365c-6688-438d-ad77-2c32ee7e6947\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwjd2" Mar 19 09:56:44 crc kubenswrapper[4835]: I0319 09:56:44.768167 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1073365c-6688-438d-ad77-2c32ee7e6947-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-jwjd2\" (UID: \"1073365c-6688-438d-ad77-2c32ee7e6947\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwjd2" Mar 19 09:56:44 crc kubenswrapper[4835]: I0319 09:56:44.768217 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6snhs\" (UniqueName: 
\"kubernetes.io/projected/1073365c-6688-438d-ad77-2c32ee7e6947-kube-api-access-6snhs\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-jwjd2\" (UID: \"1073365c-6688-438d-ad77-2c32ee7e6947\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwjd2" Mar 19 09:56:44 crc kubenswrapper[4835]: I0319 09:56:44.814336 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwjd2"] Mar 19 09:56:44 crc kubenswrapper[4835]: I0319 09:56:44.872048 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1073365c-6688-438d-ad77-2c32ee7e6947-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-jwjd2\" (UID: \"1073365c-6688-438d-ad77-2c32ee7e6947\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwjd2" Mar 19 09:56:44 crc kubenswrapper[4835]: I0319 09:56:44.872251 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1073365c-6688-438d-ad77-2c32ee7e6947-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-jwjd2\" (UID: \"1073365c-6688-438d-ad77-2c32ee7e6947\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwjd2" Mar 19 09:56:44 crc kubenswrapper[4835]: I0319 09:56:44.872289 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6snhs\" (UniqueName: \"kubernetes.io/projected/1073365c-6688-438d-ad77-2c32ee7e6947-kube-api-access-6snhs\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-jwjd2\" (UID: \"1073365c-6688-438d-ad77-2c32ee7e6947\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwjd2" Mar 19 09:56:44 crc kubenswrapper[4835]: I0319 09:56:44.879474 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/secret/1073365c-6688-438d-ad77-2c32ee7e6947-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-jwjd2\" (UID: \"1073365c-6688-438d-ad77-2c32ee7e6947\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwjd2" Mar 19 09:56:44 crc kubenswrapper[4835]: I0319 09:56:44.883681 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1073365c-6688-438d-ad77-2c32ee7e6947-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-jwjd2\" (UID: \"1073365c-6688-438d-ad77-2c32ee7e6947\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwjd2" Mar 19 09:56:44 crc kubenswrapper[4835]: I0319 09:56:44.891979 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6snhs\" (UniqueName: \"kubernetes.io/projected/1073365c-6688-438d-ad77-2c32ee7e6947-kube-api-access-6snhs\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-jwjd2\" (UID: \"1073365c-6688-438d-ad77-2c32ee7e6947\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwjd2" Mar 19 09:56:45 crc kubenswrapper[4835]: I0319 09:56:45.085590 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwjd2" Mar 19 09:56:45 crc kubenswrapper[4835]: I0319 09:56:45.665366 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwjd2"] Mar 19 09:56:46 crc kubenswrapper[4835]: I0319 09:56:46.677591 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwjd2" event={"ID":"1073365c-6688-438d-ad77-2c32ee7e6947","Type":"ContainerStarted","Data":"f0b9de09adc8610d146e59e2eac460f698a3fd0f4ac040ee8e453863dd443dd6"} Mar 19 09:56:46 crc kubenswrapper[4835]: I0319 09:56:46.678143 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwjd2" event={"ID":"1073365c-6688-438d-ad77-2c32ee7e6947","Type":"ContainerStarted","Data":"976eb4500ccadba5aadcffd6964c1d4827f4478594963ac3d5c66c5cd2344e71"} Mar 19 09:56:46 crc kubenswrapper[4835]: I0319 09:56:46.704593 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwjd2" podStartSLOduration=2.17122481 podStartE2EDuration="2.704569919s" podCreationTimestamp="2026-03-19 09:56:44 +0000 UTC" firstStartedPulling="2026-03-19 09:56:45.653606944 +0000 UTC m=+2060.502205531" lastFinishedPulling="2026-03-19 09:56:46.186952053 +0000 UTC m=+2061.035550640" observedRunningTime="2026-03-19 09:56:46.695234618 +0000 UTC m=+2061.543833205" watchObservedRunningTime="2026-03-19 09:56:46.704569919 +0000 UTC m=+2061.553168506" Mar 19 09:56:48 crc kubenswrapper[4835]: I0319 09:56:48.054286 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-lf6bb"] Mar 19 09:56:48 crc kubenswrapper[4835]: I0319 09:56:48.071213 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-lf6bb"] Mar 19 09:56:48 crc 
kubenswrapper[4835]: I0319 09:56:48.418910 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="822505cc-9aba-45f0-8e65-3e4393db08f4" path="/var/lib/kubelet/pods/822505cc-9aba-45f0-8e65-3e4393db08f4/volumes" Mar 19 09:56:52 crc kubenswrapper[4835]: I0319 09:56:52.295891 4835 scope.go:117] "RemoveContainer" containerID="2d435cae20d88b390282bfa5cb87a0cf2f09ac071e9ea94af334dd7710f35551" Mar 19 09:56:52 crc kubenswrapper[4835]: I0319 09:56:52.329367 4835 scope.go:117] "RemoveContainer" containerID="ce75e54be095e2271d27bd56cf7f1e1a9313c5f6b27fff71249621b9f0a18a88" Mar 19 09:56:52 crc kubenswrapper[4835]: I0319 09:56:52.388704 4835 scope.go:117] "RemoveContainer" containerID="68fbf8e477c2c0d0fa051fb2488b6fe60ab3f1e2975d56f66359b4e75d4ccc4f" Mar 19 09:56:52 crc kubenswrapper[4835]: I0319 09:56:52.470412 4835 scope.go:117] "RemoveContainer" containerID="358d9b1bf5b265895614261070f151bf40471da3cf9b7739a531314637546af1" Mar 19 09:56:52 crc kubenswrapper[4835]: I0319 09:56:52.523940 4835 scope.go:117] "RemoveContainer" containerID="12783958acd0d7509484ec7be8b19d8522f5e847cfb690f3ce9451da93445e93" Mar 19 09:56:52 crc kubenswrapper[4835]: I0319 09:56:52.586775 4835 scope.go:117] "RemoveContainer" containerID="9b6da810c636e1b3400449e71f5105147de8f0243b58b0a004864d1e4be6e552" Mar 19 09:56:52 crc kubenswrapper[4835]: I0319 09:56:52.626205 4835 scope.go:117] "RemoveContainer" containerID="52d7709cd02621ee10aa0637fdf0d5e358afa94bfa5930c5222b5cb8ddc7ff41" Mar 19 09:56:52 crc kubenswrapper[4835]: I0319 09:56:52.646267 4835 scope.go:117] "RemoveContainer" containerID="47f0dc4adf7815a272f96cd54d5674b6e3cc4b22f40bee83110cdea6a4301556" Mar 19 09:56:52 crc kubenswrapper[4835]: I0319 09:56:52.674849 4835 scope.go:117] "RemoveContainer" containerID="6040eb1bad378f6d6b138fdb9e2b08cd9b49b0dae2584eb73ba09bf7889f2f54" Mar 19 09:56:52 crc kubenswrapper[4835]: I0319 09:56:52.708778 4835 scope.go:117] "RemoveContainer" 
containerID="e04602a50138388c4cd67388f10ac76559e131f9d5e67fa5eb8258e1b9270047" Mar 19 09:56:52 crc kubenswrapper[4835]: I0319 09:56:52.731008 4835 scope.go:117] "RemoveContainer" containerID="77f5e5e525995019516ce9f427feb4956988015516b64086ba33d8b5e0506ffc" Mar 19 09:56:52 crc kubenswrapper[4835]: I0319 09:56:52.753851 4835 scope.go:117] "RemoveContainer" containerID="d34528e835b8627d814da5f37f7910078cedc1a4fa0c16371e914e4a61818f39" Mar 19 09:56:52 crc kubenswrapper[4835]: I0319 09:56:52.774860 4835 scope.go:117] "RemoveContainer" containerID="0bc1ee11ba5c200dec1401debb71c733b19a73833695f48a85abba2718e65391" Mar 19 09:56:55 crc kubenswrapper[4835]: I0319 09:56:55.035557 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-bp5cn"] Mar 19 09:56:55 crc kubenswrapper[4835]: I0319 09:56:55.047683 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-bp5cn"] Mar 19 09:56:56 crc kubenswrapper[4835]: I0319 09:56:56.419703 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="441782eb-a245-4484-bf4c-de0c77ca19c2" path="/var/lib/kubelet/pods/441782eb-a245-4484-bf4c-de0c77ca19c2/volumes" Mar 19 09:57:06 crc kubenswrapper[4835]: I0319 09:57:06.421965 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 09:57:06 crc kubenswrapper[4835]: I0319 09:57:06.423167 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 09:57:10 crc kubenswrapper[4835]: I0319 09:57:10.054791 4835 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-2hr8p"] Mar 19 09:57:10 crc kubenswrapper[4835]: I0319 09:57:10.069426 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-8a40-account-create-update-ql6kp"] Mar 19 09:57:10 crc kubenswrapper[4835]: I0319 09:57:10.086397 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-72fa-account-create-update-dj55b"] Mar 19 09:57:10 crc kubenswrapper[4835]: I0319 09:57:10.096553 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-4f53-account-create-update-77jbh"] Mar 19 09:57:10 crc kubenswrapper[4835]: I0319 09:57:10.106857 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-v6pb2"] Mar 19 09:57:10 crc kubenswrapper[4835]: I0319 09:57:10.117574 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-ed21-account-create-update-5pvxc"] Mar 19 09:57:10 crc kubenswrapper[4835]: I0319 09:57:10.128302 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-2hr8p"] Mar 19 09:57:10 crc kubenswrapper[4835]: I0319 09:57:10.138168 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-jggwj"] Mar 19 09:57:10 crc kubenswrapper[4835]: I0319 09:57:10.161494 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-bd2xq"] Mar 19 09:57:10 crc kubenswrapper[4835]: I0319 09:57:10.171559 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-ed21-account-create-update-5pvxc"] Mar 19 09:57:10 crc kubenswrapper[4835]: I0319 09:57:10.182080 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-jggwj"] Mar 19 09:57:10 crc kubenswrapper[4835]: I0319 09:57:10.193758 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-72fa-account-create-update-dj55b"] Mar 19 09:57:10 crc kubenswrapper[4835]: I0319 09:57:10.204622 
4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-bd2xq"] Mar 19 09:57:10 crc kubenswrapper[4835]: I0319 09:57:10.216692 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-4f53-account-create-update-77jbh"] Mar 19 09:57:10 crc kubenswrapper[4835]: I0319 09:57:10.225890 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-8a40-account-create-update-ql6kp"] Mar 19 09:57:10 crc kubenswrapper[4835]: I0319 09:57:10.240247 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-v6pb2"] Mar 19 09:57:10 crc kubenswrapper[4835]: I0319 09:57:10.429068 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09fbcd28-9104-4ddf-9e9f-8d7029c330bf" path="/var/lib/kubelet/pods/09fbcd28-9104-4ddf-9e9f-8d7029c330bf/volumes" Mar 19 09:57:10 crc kubenswrapper[4835]: I0319 09:57:10.430231 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0af06506-57b0-4b38-b57a-701b74ab2866" path="/var/lib/kubelet/pods/0af06506-57b0-4b38-b57a-701b74ab2866/volumes" Mar 19 09:57:10 crc kubenswrapper[4835]: I0319 09:57:10.434456 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="378b5128-0ef4-44b3-bfdd-0956f404c583" path="/var/lib/kubelet/pods/378b5128-0ef4-44b3-bfdd-0956f404c583/volumes" Mar 19 09:57:10 crc kubenswrapper[4835]: I0319 09:57:10.437300 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e4d6aef-f764-455c-a076-cf225b70b33c" path="/var/lib/kubelet/pods/3e4d6aef-f764-455c-a076-cf225b70b33c/volumes" Mar 19 09:57:10 crc kubenswrapper[4835]: I0319 09:57:10.440715 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5678f837-9cc0-4a1c-a32b-70b4e303c350" path="/var/lib/kubelet/pods/5678f837-9cc0-4a1c-a32b-70b4e303c350/volumes" Mar 19 09:57:10 crc kubenswrapper[4835]: I0319 09:57:10.443964 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="8bd1dfd6-8c2d-45cc-9bea-2b30b7e52f64" path="/var/lib/kubelet/pods/8bd1dfd6-8c2d-45cc-9bea-2b30b7e52f64/volumes" Mar 19 09:57:10 crc kubenswrapper[4835]: I0319 09:57:10.446433 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c9a0780-2d49-40d7-83f9-2f6fcec39523" path="/var/lib/kubelet/pods/8c9a0780-2d49-40d7-83f9-2f6fcec39523/volumes" Mar 19 09:57:10 crc kubenswrapper[4835]: I0319 09:57:10.448320 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d47e492d-e555-4219-92f5-ef1f36ad41c5" path="/var/lib/kubelet/pods/d47e492d-e555-4219-92f5-ef1f36ad41c5/volumes" Mar 19 09:57:15 crc kubenswrapper[4835]: I0319 09:57:15.035944 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-5qg94"] Mar 19 09:57:15 crc kubenswrapper[4835]: I0319 09:57:15.049763 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-5qg94"] Mar 19 09:57:16 crc kubenswrapper[4835]: I0319 09:57:16.417462 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57dd95b0-04b3-4903-ac51-05ad58cefcf5" path="/var/lib/kubelet/pods/57dd95b0-04b3-4903-ac51-05ad58cefcf5/volumes" Mar 19 09:57:36 crc kubenswrapper[4835]: I0319 09:57:36.421733 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 09:57:36 crc kubenswrapper[4835]: I0319 09:57:36.422406 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 09:57:36 crc kubenswrapper[4835]: I0319 09:57:36.438233 
4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" Mar 19 09:57:36 crc kubenswrapper[4835]: I0319 09:57:36.440092 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"15f1a1995047c13b72b04c289a369693dabb319e01617eb74f1315b023c381ad"} pod="openshift-machine-config-operator/machine-config-daemon-bk84k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 09:57:36 crc kubenswrapper[4835]: I0319 09:57:36.440207 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" containerID="cri-o://15f1a1995047c13b72b04c289a369693dabb319e01617eb74f1315b023c381ad" gracePeriod=600 Mar 19 09:57:36 crc kubenswrapper[4835]: E0319 09:57:36.545924 4835 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadf367e5_fedd_4d9e_a7af_345df1f08353.slice/crio-15f1a1995047c13b72b04c289a369693dabb319e01617eb74f1315b023c381ad.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadf367e5_fedd_4d9e_a7af_345df1f08353.slice/crio-conmon-15f1a1995047c13b72b04c289a369693dabb319e01617eb74f1315b023c381ad.scope\": RecentStats: unable to find data in memory cache]" Mar 19 09:57:37 crc kubenswrapper[4835]: I0319 09:57:37.340393 4835 generic.go:334] "Generic (PLEG): container finished" podID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerID="15f1a1995047c13b72b04c289a369693dabb319e01617eb74f1315b023c381ad" exitCode=0 Mar 19 09:57:37 crc kubenswrapper[4835]: I0319 09:57:37.340442 4835 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerDied","Data":"15f1a1995047c13b72b04c289a369693dabb319e01617eb74f1315b023c381ad"} Mar 19 09:57:37 crc kubenswrapper[4835]: I0319 09:57:37.340920 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerStarted","Data":"3c5b8661f3f050fbf34728c9e88ed710ff7c292af499338ff01295afa288217f"} Mar 19 09:57:37 crc kubenswrapper[4835]: I0319 09:57:37.340948 4835 scope.go:117] "RemoveContainer" containerID="d93f2f0fef5a3fe52d6e4aab02e5290ac85405643bc520caaef82b7b23fd8ee3" Mar 19 09:57:47 crc kubenswrapper[4835]: I0319 09:57:47.066343 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-c5lzw"] Mar 19 09:57:47 crc kubenswrapper[4835]: I0319 09:57:47.077698 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-c5lzw"] Mar 19 09:57:48 crc kubenswrapper[4835]: I0319 09:57:48.427700 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdf7c073-aa50-4b3b-aedb-7e77be63f85a" path="/var/lib/kubelet/pods/cdf7c073-aa50-4b3b-aedb-7e77be63f85a/volumes" Mar 19 09:57:52 crc kubenswrapper[4835]: I0319 09:57:52.695383 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b7lxk"] Mar 19 09:57:52 crc kubenswrapper[4835]: I0319 09:57:52.699918 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b7lxk" Mar 19 09:57:52 crc kubenswrapper[4835]: I0319 09:57:52.706621 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b7lxk"] Mar 19 09:57:52 crc kubenswrapper[4835]: I0319 09:57:52.812664 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db7fb3de-fa8a-4783-ac0d-746939c9e8d4-utilities\") pod \"community-operators-b7lxk\" (UID: \"db7fb3de-fa8a-4783-ac0d-746939c9e8d4\") " pod="openshift-marketplace/community-operators-b7lxk" Mar 19 09:57:52 crc kubenswrapper[4835]: I0319 09:57:52.812975 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjfmr\" (UniqueName: \"kubernetes.io/projected/db7fb3de-fa8a-4783-ac0d-746939c9e8d4-kube-api-access-kjfmr\") pod \"community-operators-b7lxk\" (UID: \"db7fb3de-fa8a-4783-ac0d-746939c9e8d4\") " pod="openshift-marketplace/community-operators-b7lxk" Mar 19 09:57:52 crc kubenswrapper[4835]: I0319 09:57:52.813483 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db7fb3de-fa8a-4783-ac0d-746939c9e8d4-catalog-content\") pod \"community-operators-b7lxk\" (UID: \"db7fb3de-fa8a-4783-ac0d-746939c9e8d4\") " pod="openshift-marketplace/community-operators-b7lxk" Mar 19 09:57:52 crc kubenswrapper[4835]: I0319 09:57:52.888302 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mbhpt"] Mar 19 09:57:52 crc kubenswrapper[4835]: I0319 09:57:52.890837 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mbhpt" Mar 19 09:57:52 crc kubenswrapper[4835]: I0319 09:57:52.908624 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mbhpt"] Mar 19 09:57:52 crc kubenswrapper[4835]: I0319 09:57:52.917461 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db7fb3de-fa8a-4783-ac0d-746939c9e8d4-catalog-content\") pod \"community-operators-b7lxk\" (UID: \"db7fb3de-fa8a-4783-ac0d-746939c9e8d4\") " pod="openshift-marketplace/community-operators-b7lxk" Mar 19 09:57:52 crc kubenswrapper[4835]: I0319 09:57:52.917578 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db7fb3de-fa8a-4783-ac0d-746939c9e8d4-utilities\") pod \"community-operators-b7lxk\" (UID: \"db7fb3de-fa8a-4783-ac0d-746939c9e8d4\") " pod="openshift-marketplace/community-operators-b7lxk" Mar 19 09:57:52 crc kubenswrapper[4835]: I0319 09:57:52.917689 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjfmr\" (UniqueName: \"kubernetes.io/projected/db7fb3de-fa8a-4783-ac0d-746939c9e8d4-kube-api-access-kjfmr\") pod \"community-operators-b7lxk\" (UID: \"db7fb3de-fa8a-4783-ac0d-746939c9e8d4\") " pod="openshift-marketplace/community-operators-b7lxk" Mar 19 09:57:52 crc kubenswrapper[4835]: I0319 09:57:52.917983 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db7fb3de-fa8a-4783-ac0d-746939c9e8d4-catalog-content\") pod \"community-operators-b7lxk\" (UID: \"db7fb3de-fa8a-4783-ac0d-746939c9e8d4\") " pod="openshift-marketplace/community-operators-b7lxk" Mar 19 09:57:52 crc kubenswrapper[4835]: I0319 09:57:52.918530 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/db7fb3de-fa8a-4783-ac0d-746939c9e8d4-utilities\") pod \"community-operators-b7lxk\" (UID: \"db7fb3de-fa8a-4783-ac0d-746939c9e8d4\") " pod="openshift-marketplace/community-operators-b7lxk" Mar 19 09:57:52 crc kubenswrapper[4835]: I0319 09:57:52.948729 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjfmr\" (UniqueName: \"kubernetes.io/projected/db7fb3de-fa8a-4783-ac0d-746939c9e8d4-kube-api-access-kjfmr\") pod \"community-operators-b7lxk\" (UID: \"db7fb3de-fa8a-4783-ac0d-746939c9e8d4\") " pod="openshift-marketplace/community-operators-b7lxk" Mar 19 09:57:53 crc kubenswrapper[4835]: I0319 09:57:53.019227 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gztl9\" (UniqueName: \"kubernetes.io/projected/44533f42-724e-441c-bebf-902979aab424-kube-api-access-gztl9\") pod \"certified-operators-mbhpt\" (UID: \"44533f42-724e-441c-bebf-902979aab424\") " pod="openshift-marketplace/certified-operators-mbhpt" Mar 19 09:57:53 crc kubenswrapper[4835]: I0319 09:57:53.019810 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44533f42-724e-441c-bebf-902979aab424-utilities\") pod \"certified-operators-mbhpt\" (UID: \"44533f42-724e-441c-bebf-902979aab424\") " pod="openshift-marketplace/certified-operators-mbhpt" Mar 19 09:57:53 crc kubenswrapper[4835]: I0319 09:57:53.019848 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44533f42-724e-441c-bebf-902979aab424-catalog-content\") pod \"certified-operators-mbhpt\" (UID: \"44533f42-724e-441c-bebf-902979aab424\") " pod="openshift-marketplace/certified-operators-mbhpt" Mar 19 09:57:53 crc kubenswrapper[4835]: I0319 09:57:53.062640 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b7lxk" Mar 19 09:57:53 crc kubenswrapper[4835]: I0319 09:57:53.076768 4835 scope.go:117] "RemoveContainer" containerID="6661211b794df72a5da51ecbae09e55717a95b7f0751f47836827a97095b4745" Mar 19 09:57:53 crc kubenswrapper[4835]: I0319 09:57:53.123533 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44533f42-724e-441c-bebf-902979aab424-utilities\") pod \"certified-operators-mbhpt\" (UID: \"44533f42-724e-441c-bebf-902979aab424\") " pod="openshift-marketplace/certified-operators-mbhpt" Mar 19 09:57:53 crc kubenswrapper[4835]: I0319 09:57:53.123590 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44533f42-724e-441c-bebf-902979aab424-catalog-content\") pod \"certified-operators-mbhpt\" (UID: \"44533f42-724e-441c-bebf-902979aab424\") " pod="openshift-marketplace/certified-operators-mbhpt" Mar 19 09:57:53 crc kubenswrapper[4835]: I0319 09:57:53.123703 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gztl9\" (UniqueName: \"kubernetes.io/projected/44533f42-724e-441c-bebf-902979aab424-kube-api-access-gztl9\") pod \"certified-operators-mbhpt\" (UID: \"44533f42-724e-441c-bebf-902979aab424\") " pod="openshift-marketplace/certified-operators-mbhpt" Mar 19 09:57:53 crc kubenswrapper[4835]: I0319 09:57:53.124262 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44533f42-724e-441c-bebf-902979aab424-utilities\") pod \"certified-operators-mbhpt\" (UID: \"44533f42-724e-441c-bebf-902979aab424\") " pod="openshift-marketplace/certified-operators-mbhpt" Mar 19 09:57:53 crc kubenswrapper[4835]: I0319 09:57:53.124497 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/44533f42-724e-441c-bebf-902979aab424-catalog-content\") pod \"certified-operators-mbhpt\" (UID: \"44533f42-724e-441c-bebf-902979aab424\") " pod="openshift-marketplace/certified-operators-mbhpt" Mar 19 09:57:53 crc kubenswrapper[4835]: I0319 09:57:53.143251 4835 scope.go:117] "RemoveContainer" containerID="bfdb3d9a3f546d0f8f972c5609f39ce5804f8773c23188edc67a7d9d445a9a34" Mar 19 09:57:53 crc kubenswrapper[4835]: I0319 09:57:53.149585 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gztl9\" (UniqueName: \"kubernetes.io/projected/44533f42-724e-441c-bebf-902979aab424-kube-api-access-gztl9\") pod \"certified-operators-mbhpt\" (UID: \"44533f42-724e-441c-bebf-902979aab424\") " pod="openshift-marketplace/certified-operators-mbhpt" Mar 19 09:57:53 crc kubenswrapper[4835]: I0319 09:57:53.211576 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mbhpt" Mar 19 09:57:53 crc kubenswrapper[4835]: I0319 09:57:53.295894 4835 scope.go:117] "RemoveContainer" containerID="0755e58c2018ed18c4e26fc17ed86fd63a6214a090aa486a80de7e59c3d22bad" Mar 19 09:57:53 crc kubenswrapper[4835]: I0319 09:57:53.469871 4835 scope.go:117] "RemoveContainer" containerID="5892f1f672103d6dcb8484f3156fe6192a7117db7deab0d80d609b3dba5c4dfb" Mar 19 09:57:53 crc kubenswrapper[4835]: I0319 09:57:53.511028 4835 scope.go:117] "RemoveContainer" containerID="8839e0e09f529dd580390bd8bf319bf7cf31905425247c60454359b2a79530ae" Mar 19 09:57:53 crc kubenswrapper[4835]: I0319 09:57:53.588498 4835 scope.go:117] "RemoveContainer" containerID="f5132bd050fd77a6f296b03946881f806e7287dad44e9e5c62ff57a965fa930d" Mar 19 09:57:53 crc kubenswrapper[4835]: I0319 09:57:53.624940 4835 scope.go:117] "RemoveContainer" containerID="137d6de81bc70fdd5e2a8bc3c1cd57619f849b134667c0709d7b87541570ceae" Mar 19 09:57:53 crc kubenswrapper[4835]: I0319 09:57:53.662093 4835 scope.go:117] "RemoveContainer" 
containerID="183e270f0a4b7455c63e8606035068f346d5b14cc376addd9580d0a57304a81b" Mar 19 09:57:53 crc kubenswrapper[4835]: I0319 09:57:53.723059 4835 scope.go:117] "RemoveContainer" containerID="8a4c39f9b4e73c41b323d40cb52f3e574c298f35d66a42754a7fde2dc612feff" Mar 19 09:57:53 crc kubenswrapper[4835]: I0319 09:57:53.787456 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b7lxk"] Mar 19 09:57:53 crc kubenswrapper[4835]: I0319 09:57:53.788927 4835 scope.go:117] "RemoveContainer" containerID="79974ad04ee5703d2e03e6ccdf3b1d17e7c856e0344187a13e1f55ab40a2b9ac" Mar 19 09:57:53 crc kubenswrapper[4835]: I0319 09:57:53.850892 4835 scope.go:117] "RemoveContainer" containerID="e9dfbcd95c4d6eb4fc24553b5b14b0d8caa895d497ddeddfdea725fecb9e3d03" Mar 19 09:57:53 crc kubenswrapper[4835]: I0319 09:57:53.910486 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mbhpt"] Mar 19 09:57:53 crc kubenswrapper[4835]: I0319 09:57:53.931864 4835 scope.go:117] "RemoveContainer" containerID="3f20636aa9d4ccf58558d41609cb16e89c7d7be9c3d5449f9069346ed04d8059" Mar 19 09:57:53 crc kubenswrapper[4835]: W0319 09:57:53.943159 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44533f42_724e_441c_bebf_902979aab424.slice/crio-5e3306a1737e0e1d9ee981624ca80a2137791fbbe38883b385f25f3a7b63f57a WatchSource:0}: Error finding container 5e3306a1737e0e1d9ee981624ca80a2137791fbbe38883b385f25f3a7b63f57a: Status 404 returned error can't find the container with id 5e3306a1737e0e1d9ee981624ca80a2137791fbbe38883b385f25f3a7b63f57a Mar 19 09:57:53 crc kubenswrapper[4835]: I0319 09:57:53.975956 4835 scope.go:117] "RemoveContainer" containerID="a5971122082f5207f6845a0c0f278ea5f990d404320e26b84dfb5aa85ce6859c" Mar 19 09:57:54 crc kubenswrapper[4835]: I0319 09:57:54.569933 4835 generic.go:334] "Generic (PLEG): container finished" 
podID="44533f42-724e-441c-bebf-902979aab424" containerID="0acebb6f70993709e443925291f00b78077776bcb10406e00cc9aff68db68223" exitCode=0 Mar 19 09:57:54 crc kubenswrapper[4835]: I0319 09:57:54.570280 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mbhpt" event={"ID":"44533f42-724e-441c-bebf-902979aab424","Type":"ContainerDied","Data":"0acebb6f70993709e443925291f00b78077776bcb10406e00cc9aff68db68223"} Mar 19 09:57:54 crc kubenswrapper[4835]: I0319 09:57:54.570325 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mbhpt" event={"ID":"44533f42-724e-441c-bebf-902979aab424","Type":"ContainerStarted","Data":"5e3306a1737e0e1d9ee981624ca80a2137791fbbe38883b385f25f3a7b63f57a"} Mar 19 09:57:54 crc kubenswrapper[4835]: I0319 09:57:54.571986 4835 generic.go:334] "Generic (PLEG): container finished" podID="db7fb3de-fa8a-4783-ac0d-746939c9e8d4" containerID="18904870aad2fb4533f0b66d1495a77d69511621ae6eeb5c9c7c5ee07bd7b64c" exitCode=0 Mar 19 09:57:54 crc kubenswrapper[4835]: I0319 09:57:54.572023 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7lxk" event={"ID":"db7fb3de-fa8a-4783-ac0d-746939c9e8d4","Type":"ContainerDied","Data":"18904870aad2fb4533f0b66d1495a77d69511621ae6eeb5c9c7c5ee07bd7b64c"} Mar 19 09:57:54 crc kubenswrapper[4835]: I0319 09:57:54.572057 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7lxk" event={"ID":"db7fb3de-fa8a-4783-ac0d-746939c9e8d4","Type":"ContainerStarted","Data":"4e3dee66f7eb9ccf4871ae7c60f0be8b6fb66c69aa268f4254f0f3d0ae8a08e5"} Mar 19 09:57:54 crc kubenswrapper[4835]: I0319 09:57:54.572647 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 09:57:55 crc kubenswrapper[4835]: I0319 09:57:55.588858 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-mbhpt" event={"ID":"44533f42-724e-441c-bebf-902979aab424","Type":"ContainerStarted","Data":"1c09c685e86960adcaa73b136f3daeb6dd5259b6c699eaaa5a3900876e0a1d38"} Mar 19 09:57:55 crc kubenswrapper[4835]: I0319 09:57:55.593433 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7lxk" event={"ID":"db7fb3de-fa8a-4783-ac0d-746939c9e8d4","Type":"ContainerStarted","Data":"27cbf31e7280dbe1b01edc4caab9699e604519524867ae200b2dd5e50954bee0"} Mar 19 09:57:57 crc kubenswrapper[4835]: I0319 09:57:57.615267 4835 generic.go:334] "Generic (PLEG): container finished" podID="44533f42-724e-441c-bebf-902979aab424" containerID="1c09c685e86960adcaa73b136f3daeb6dd5259b6c699eaaa5a3900876e0a1d38" exitCode=0 Mar 19 09:57:57 crc kubenswrapper[4835]: I0319 09:57:57.615486 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mbhpt" event={"ID":"44533f42-724e-441c-bebf-902979aab424","Type":"ContainerDied","Data":"1c09c685e86960adcaa73b136f3daeb6dd5259b6c699eaaa5a3900876e0a1d38"} Mar 19 09:57:57 crc kubenswrapper[4835]: I0319 09:57:57.621807 4835 generic.go:334] "Generic (PLEG): container finished" podID="db7fb3de-fa8a-4783-ac0d-746939c9e8d4" containerID="27cbf31e7280dbe1b01edc4caab9699e604519524867ae200b2dd5e50954bee0" exitCode=0 Mar 19 09:57:57 crc kubenswrapper[4835]: I0319 09:57:57.621850 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7lxk" event={"ID":"db7fb3de-fa8a-4783-ac0d-746939c9e8d4","Type":"ContainerDied","Data":"27cbf31e7280dbe1b01edc4caab9699e604519524867ae200b2dd5e50954bee0"} Mar 19 09:57:58 crc kubenswrapper[4835]: I0319 09:57:58.636357 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7lxk" 
event={"ID":"db7fb3de-fa8a-4783-ac0d-746939c9e8d4","Type":"ContainerStarted","Data":"16c90a1f26a730d3d5e7df77337a98c6369b087ecc2ef748f35cbf3b5a1d4b02"} Mar 19 09:57:58 crc kubenswrapper[4835]: I0319 09:57:58.641185 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mbhpt" event={"ID":"44533f42-724e-441c-bebf-902979aab424","Type":"ContainerStarted","Data":"79af1f1be6eeb426538743c201c747b3d8dfb874987596920eafcd5976fcfc39"} Mar 19 09:57:58 crc kubenswrapper[4835]: I0319 09:57:58.664030 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b7lxk" podStartSLOduration=3.198418355 podStartE2EDuration="6.664012392s" podCreationTimestamp="2026-03-19 09:57:52 +0000 UTC" firstStartedPulling="2026-03-19 09:57:54.574491822 +0000 UTC m=+2129.423090449" lastFinishedPulling="2026-03-19 09:57:58.040085899 +0000 UTC m=+2132.888684486" observedRunningTime="2026-03-19 09:57:58.659199669 +0000 UTC m=+2133.507798256" watchObservedRunningTime="2026-03-19 09:57:58.664012392 +0000 UTC m=+2133.512610969" Mar 19 09:57:58 crc kubenswrapper[4835]: I0319 09:57:58.684324 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mbhpt" podStartSLOduration=3.150876463 podStartE2EDuration="6.684300908s" podCreationTimestamp="2026-03-19 09:57:52 +0000 UTC" firstStartedPulling="2026-03-19 09:57:54.572402925 +0000 UTC m=+2129.421001512" lastFinishedPulling="2026-03-19 09:57:58.10582737 +0000 UTC m=+2132.954425957" observedRunningTime="2026-03-19 09:57:58.677066069 +0000 UTC m=+2133.525664656" watchObservedRunningTime="2026-03-19 09:57:58.684300908 +0000 UTC m=+2133.532899495" Mar 19 09:57:59 crc kubenswrapper[4835]: I0319 09:57:59.061169 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-2rdm5"] Mar 19 09:57:59 crc kubenswrapper[4835]: I0319 09:57:59.081649 4835 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/keystone-bootstrap-xrdjb"] Mar 19 09:57:59 crc kubenswrapper[4835]: I0319 09:57:59.097642 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-2rdm5"] Mar 19 09:57:59 crc kubenswrapper[4835]: I0319 09:57:59.111757 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-xrdjb"] Mar 19 09:58:00 crc kubenswrapper[4835]: I0319 09:58:00.149057 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565238-5w6jr"] Mar 19 09:58:00 crc kubenswrapper[4835]: I0319 09:58:00.151071 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565238-5w6jr" Mar 19 09:58:00 crc kubenswrapper[4835]: I0319 09:58:00.153370 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 09:58:00 crc kubenswrapper[4835]: I0319 09:58:00.154847 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 09:58:00 crc kubenswrapper[4835]: I0319 09:58:00.155262 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 09:58:00 crc kubenswrapper[4835]: I0319 09:58:00.161900 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565238-5w6jr"] Mar 19 09:58:00 crc kubenswrapper[4835]: I0319 09:58:00.312767 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxnpq\" (UniqueName: \"kubernetes.io/projected/6ebb674c-a705-418a-92ea-771ab5d01a39-kube-api-access-zxnpq\") pod \"auto-csr-approver-29565238-5w6jr\" (UID: \"6ebb674c-a705-418a-92ea-771ab5d01a39\") " pod="openshift-infra/auto-csr-approver-29565238-5w6jr" Mar 19 09:58:00 crc kubenswrapper[4835]: I0319 09:58:00.417592 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zxnpq\" (UniqueName: \"kubernetes.io/projected/6ebb674c-a705-418a-92ea-771ab5d01a39-kube-api-access-zxnpq\") pod \"auto-csr-approver-29565238-5w6jr\" (UID: \"6ebb674c-a705-418a-92ea-771ab5d01a39\") " pod="openshift-infra/auto-csr-approver-29565238-5w6jr" Mar 19 09:58:00 crc kubenswrapper[4835]: I0319 09:58:00.428303 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08dbbf0e-f67b-448a-b9b6-cbd738b6bf67" path="/var/lib/kubelet/pods/08dbbf0e-f67b-448a-b9b6-cbd738b6bf67/volumes" Mar 19 09:58:00 crc kubenswrapper[4835]: I0319 09:58:00.433355 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="777ab445-6d41-49a9-b87c-67ed7503cc5b" path="/var/lib/kubelet/pods/777ab445-6d41-49a9-b87c-67ed7503cc5b/volumes" Mar 19 09:58:00 crc kubenswrapper[4835]: I0319 09:58:00.448109 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxnpq\" (UniqueName: \"kubernetes.io/projected/6ebb674c-a705-418a-92ea-771ab5d01a39-kube-api-access-zxnpq\") pod \"auto-csr-approver-29565238-5w6jr\" (UID: \"6ebb674c-a705-418a-92ea-771ab5d01a39\") " pod="openshift-infra/auto-csr-approver-29565238-5w6jr" Mar 19 09:58:00 crc kubenswrapper[4835]: I0319 09:58:00.516919 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565238-5w6jr" Mar 19 09:58:00 crc kubenswrapper[4835]: I0319 09:58:00.976177 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565238-5w6jr"] Mar 19 09:58:00 crc kubenswrapper[4835]: W0319 09:58:00.978273 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ebb674c_a705_418a_92ea_771ab5d01a39.slice/crio-df8ba0b968e247cb61edc4b265a64f9fd8a64784b47590df0ac2f1b63bc534f0 WatchSource:0}: Error finding container df8ba0b968e247cb61edc4b265a64f9fd8a64784b47590df0ac2f1b63bc534f0: Status 404 returned error can't find the container with id df8ba0b968e247cb61edc4b265a64f9fd8a64784b47590df0ac2f1b63bc534f0 Mar 19 09:58:01 crc kubenswrapper[4835]: I0319 09:58:01.687787 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565238-5w6jr" event={"ID":"6ebb674c-a705-418a-92ea-771ab5d01a39","Type":"ContainerStarted","Data":"df8ba0b968e247cb61edc4b265a64f9fd8a64784b47590df0ac2f1b63bc534f0"} Mar 19 09:58:02 crc kubenswrapper[4835]: I0319 09:58:02.701656 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565238-5w6jr" event={"ID":"6ebb674c-a705-418a-92ea-771ab5d01a39","Type":"ContainerStarted","Data":"f6b75343ca481c064668c749bb81ab1be942e8b7b81f8ead4f5c6b377818aba7"} Mar 19 09:58:02 crc kubenswrapper[4835]: I0319 09:58:02.728194 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565238-5w6jr" podStartSLOduration=1.9093642100000001 podStartE2EDuration="2.728175536s" podCreationTimestamp="2026-03-19 09:58:00 +0000 UTC" firstStartedPulling="2026-03-19 09:58:00.980459988 +0000 UTC m=+2135.829058575" lastFinishedPulling="2026-03-19 09:58:01.799271274 +0000 UTC m=+2136.647869901" observedRunningTime="2026-03-19 09:58:02.717888974 +0000 UTC 
m=+2137.566487581" watchObservedRunningTime="2026-03-19 09:58:02.728175536 +0000 UTC m=+2137.576774123" Mar 19 09:58:03 crc kubenswrapper[4835]: I0319 09:58:03.063660 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b7lxk" Mar 19 09:58:03 crc kubenswrapper[4835]: I0319 09:58:03.063759 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b7lxk" Mar 19 09:58:03 crc kubenswrapper[4835]: I0319 09:58:03.116211 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b7lxk" Mar 19 09:58:03 crc kubenswrapper[4835]: I0319 09:58:03.211898 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mbhpt" Mar 19 09:58:03 crc kubenswrapper[4835]: I0319 09:58:03.212320 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mbhpt" Mar 19 09:58:03 crc kubenswrapper[4835]: I0319 09:58:03.714457 4835 generic.go:334] "Generic (PLEG): container finished" podID="6ebb674c-a705-418a-92ea-771ab5d01a39" containerID="f6b75343ca481c064668c749bb81ab1be942e8b7b81f8ead4f5c6b377818aba7" exitCode=0 Mar 19 09:58:03 crc kubenswrapper[4835]: I0319 09:58:03.715913 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565238-5w6jr" event={"ID":"6ebb674c-a705-418a-92ea-771ab5d01a39","Type":"ContainerDied","Data":"f6b75343ca481c064668c749bb81ab1be942e8b7b81f8ead4f5c6b377818aba7"} Mar 19 09:58:03 crc kubenswrapper[4835]: I0319 09:58:03.801789 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b7lxk" Mar 19 09:58:04 crc kubenswrapper[4835]: I0319 09:58:04.284450 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-mbhpt" 
podUID="44533f42-724e-441c-bebf-902979aab424" containerName="registry-server" probeResult="failure" output=< Mar 19 09:58:04 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 09:58:04 crc kubenswrapper[4835]: > Mar 19 09:58:04 crc kubenswrapper[4835]: I0319 09:58:04.292363 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b7lxk"] Mar 19 09:58:05 crc kubenswrapper[4835]: I0319 09:58:05.151958 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565238-5w6jr" Mar 19 09:58:05 crc kubenswrapper[4835]: I0319 09:58:05.266406 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxnpq\" (UniqueName: \"kubernetes.io/projected/6ebb674c-a705-418a-92ea-771ab5d01a39-kube-api-access-zxnpq\") pod \"6ebb674c-a705-418a-92ea-771ab5d01a39\" (UID: \"6ebb674c-a705-418a-92ea-771ab5d01a39\") " Mar 19 09:58:05 crc kubenswrapper[4835]: I0319 09:58:05.272638 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ebb674c-a705-418a-92ea-771ab5d01a39-kube-api-access-zxnpq" (OuterVolumeSpecName: "kube-api-access-zxnpq") pod "6ebb674c-a705-418a-92ea-771ab5d01a39" (UID: "6ebb674c-a705-418a-92ea-771ab5d01a39"). InnerVolumeSpecName "kube-api-access-zxnpq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:58:05 crc kubenswrapper[4835]: I0319 09:58:05.369451 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxnpq\" (UniqueName: \"kubernetes.io/projected/6ebb674c-a705-418a-92ea-771ab5d01a39-kube-api-access-zxnpq\") on node \"crc\" DevicePath \"\"" Mar 19 09:58:05 crc kubenswrapper[4835]: I0319 09:58:05.744995 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565238-5w6jr" event={"ID":"6ebb674c-a705-418a-92ea-771ab5d01a39","Type":"ContainerDied","Data":"df8ba0b968e247cb61edc4b265a64f9fd8a64784b47590df0ac2f1b63bc534f0"} Mar 19 09:58:05 crc kubenswrapper[4835]: I0319 09:58:05.745562 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df8ba0b968e247cb61edc4b265a64f9fd8a64784b47590df0ac2f1b63bc534f0" Mar 19 09:58:05 crc kubenswrapper[4835]: I0319 09:58:05.745360 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b7lxk" podUID="db7fb3de-fa8a-4783-ac0d-746939c9e8d4" containerName="registry-server" containerID="cri-o://16c90a1f26a730d3d5e7df77337a98c6369b087ecc2ef748f35cbf3b5a1d4b02" gracePeriod=2 Mar 19 09:58:05 crc kubenswrapper[4835]: I0319 09:58:05.745051 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565238-5w6jr" Mar 19 09:58:05 crc kubenswrapper[4835]: I0319 09:58:05.791985 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565232-7xdjv"] Mar 19 09:58:05 crc kubenswrapper[4835]: I0319 09:58:05.807809 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565232-7xdjv"] Mar 19 09:58:06 crc kubenswrapper[4835]: I0319 09:58:06.240258 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b7lxk" Mar 19 09:58:06 crc kubenswrapper[4835]: I0319 09:58:06.296594 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db7fb3de-fa8a-4783-ac0d-746939c9e8d4-utilities\") pod \"db7fb3de-fa8a-4783-ac0d-746939c9e8d4\" (UID: \"db7fb3de-fa8a-4783-ac0d-746939c9e8d4\") " Mar 19 09:58:06 crc kubenswrapper[4835]: I0319 09:58:06.296816 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjfmr\" (UniqueName: \"kubernetes.io/projected/db7fb3de-fa8a-4783-ac0d-746939c9e8d4-kube-api-access-kjfmr\") pod \"db7fb3de-fa8a-4783-ac0d-746939c9e8d4\" (UID: \"db7fb3de-fa8a-4783-ac0d-746939c9e8d4\") " Mar 19 09:58:06 crc kubenswrapper[4835]: I0319 09:58:06.296993 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db7fb3de-fa8a-4783-ac0d-746939c9e8d4-catalog-content\") pod \"db7fb3de-fa8a-4783-ac0d-746939c9e8d4\" (UID: \"db7fb3de-fa8a-4783-ac0d-746939c9e8d4\") " Mar 19 09:58:06 crc kubenswrapper[4835]: I0319 09:58:06.297566 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db7fb3de-fa8a-4783-ac0d-746939c9e8d4-utilities" (OuterVolumeSpecName: "utilities") pod "db7fb3de-fa8a-4783-ac0d-746939c9e8d4" (UID: "db7fb3de-fa8a-4783-ac0d-746939c9e8d4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:58:06 crc kubenswrapper[4835]: I0319 09:58:06.298099 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db7fb3de-fa8a-4783-ac0d-746939c9e8d4-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 09:58:06 crc kubenswrapper[4835]: I0319 09:58:06.303586 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db7fb3de-fa8a-4783-ac0d-746939c9e8d4-kube-api-access-kjfmr" (OuterVolumeSpecName: "kube-api-access-kjfmr") pod "db7fb3de-fa8a-4783-ac0d-746939c9e8d4" (UID: "db7fb3de-fa8a-4783-ac0d-746939c9e8d4"). InnerVolumeSpecName "kube-api-access-kjfmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:58:06 crc kubenswrapper[4835]: I0319 09:58:06.357562 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db7fb3de-fa8a-4783-ac0d-746939c9e8d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "db7fb3de-fa8a-4783-ac0d-746939c9e8d4" (UID: "db7fb3de-fa8a-4783-ac0d-746939c9e8d4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:58:06 crc kubenswrapper[4835]: I0319 09:58:06.400059 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjfmr\" (UniqueName: \"kubernetes.io/projected/db7fb3de-fa8a-4783-ac0d-746939c9e8d4-kube-api-access-kjfmr\") on node \"crc\" DevicePath \"\"" Mar 19 09:58:06 crc kubenswrapper[4835]: I0319 09:58:06.400096 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db7fb3de-fa8a-4783-ac0d-746939c9e8d4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 09:58:06 crc kubenswrapper[4835]: I0319 09:58:06.416020 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a88ab97-e853-45f0-9c23-61182401b914" path="/var/lib/kubelet/pods/5a88ab97-e853-45f0-9c23-61182401b914/volumes" Mar 19 09:58:06 crc kubenswrapper[4835]: I0319 09:58:06.757600 4835 generic.go:334] "Generic (PLEG): container finished" podID="db7fb3de-fa8a-4783-ac0d-746939c9e8d4" containerID="16c90a1f26a730d3d5e7df77337a98c6369b087ecc2ef748f35cbf3b5a1d4b02" exitCode=0 Mar 19 09:58:06 crc kubenswrapper[4835]: I0319 09:58:06.757660 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7lxk" event={"ID":"db7fb3de-fa8a-4783-ac0d-746939c9e8d4","Type":"ContainerDied","Data":"16c90a1f26a730d3d5e7df77337a98c6369b087ecc2ef748f35cbf3b5a1d4b02"} Mar 19 09:58:06 crc kubenswrapper[4835]: I0319 09:58:06.757995 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b7lxk" event={"ID":"db7fb3de-fa8a-4783-ac0d-746939c9e8d4","Type":"ContainerDied","Data":"4e3dee66f7eb9ccf4871ae7c60f0be8b6fb66c69aa268f4254f0f3d0ae8a08e5"} Mar 19 09:58:06 crc kubenswrapper[4835]: I0319 09:58:06.757702 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b7lxk" Mar 19 09:58:06 crc kubenswrapper[4835]: I0319 09:58:06.758053 4835 scope.go:117] "RemoveContainer" containerID="16c90a1f26a730d3d5e7df77337a98c6369b087ecc2ef748f35cbf3b5a1d4b02" Mar 19 09:58:06 crc kubenswrapper[4835]: I0319 09:58:06.797484 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b7lxk"] Mar 19 09:58:06 crc kubenswrapper[4835]: I0319 09:58:06.800436 4835 scope.go:117] "RemoveContainer" containerID="27cbf31e7280dbe1b01edc4caab9699e604519524867ae200b2dd5e50954bee0" Mar 19 09:58:06 crc kubenswrapper[4835]: I0319 09:58:06.821969 4835 scope.go:117] "RemoveContainer" containerID="18904870aad2fb4533f0b66d1495a77d69511621ae6eeb5c9c7c5ee07bd7b64c" Mar 19 09:58:06 crc kubenswrapper[4835]: I0319 09:58:06.826152 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b7lxk"] Mar 19 09:58:06 crc kubenswrapper[4835]: I0319 09:58:06.883789 4835 scope.go:117] "RemoveContainer" containerID="16c90a1f26a730d3d5e7df77337a98c6369b087ecc2ef748f35cbf3b5a1d4b02" Mar 19 09:58:06 crc kubenswrapper[4835]: E0319 09:58:06.884401 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16c90a1f26a730d3d5e7df77337a98c6369b087ecc2ef748f35cbf3b5a1d4b02\": container with ID starting with 16c90a1f26a730d3d5e7df77337a98c6369b087ecc2ef748f35cbf3b5a1d4b02 not found: ID does not exist" containerID="16c90a1f26a730d3d5e7df77337a98c6369b087ecc2ef748f35cbf3b5a1d4b02" Mar 19 09:58:06 crc kubenswrapper[4835]: I0319 09:58:06.884503 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16c90a1f26a730d3d5e7df77337a98c6369b087ecc2ef748f35cbf3b5a1d4b02"} err="failed to get container status \"16c90a1f26a730d3d5e7df77337a98c6369b087ecc2ef748f35cbf3b5a1d4b02\": rpc error: code = NotFound desc = could not find 
container \"16c90a1f26a730d3d5e7df77337a98c6369b087ecc2ef748f35cbf3b5a1d4b02\": container with ID starting with 16c90a1f26a730d3d5e7df77337a98c6369b087ecc2ef748f35cbf3b5a1d4b02 not found: ID does not exist" Mar 19 09:58:06 crc kubenswrapper[4835]: I0319 09:58:06.884596 4835 scope.go:117] "RemoveContainer" containerID="27cbf31e7280dbe1b01edc4caab9699e604519524867ae200b2dd5e50954bee0" Mar 19 09:58:06 crc kubenswrapper[4835]: E0319 09:58:06.885173 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27cbf31e7280dbe1b01edc4caab9699e604519524867ae200b2dd5e50954bee0\": container with ID starting with 27cbf31e7280dbe1b01edc4caab9699e604519524867ae200b2dd5e50954bee0 not found: ID does not exist" containerID="27cbf31e7280dbe1b01edc4caab9699e604519524867ae200b2dd5e50954bee0" Mar 19 09:58:06 crc kubenswrapper[4835]: I0319 09:58:06.885283 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27cbf31e7280dbe1b01edc4caab9699e604519524867ae200b2dd5e50954bee0"} err="failed to get container status \"27cbf31e7280dbe1b01edc4caab9699e604519524867ae200b2dd5e50954bee0\": rpc error: code = NotFound desc = could not find container \"27cbf31e7280dbe1b01edc4caab9699e604519524867ae200b2dd5e50954bee0\": container with ID starting with 27cbf31e7280dbe1b01edc4caab9699e604519524867ae200b2dd5e50954bee0 not found: ID does not exist" Mar 19 09:58:06 crc kubenswrapper[4835]: I0319 09:58:06.885366 4835 scope.go:117] "RemoveContainer" containerID="18904870aad2fb4533f0b66d1495a77d69511621ae6eeb5c9c7c5ee07bd7b64c" Mar 19 09:58:06 crc kubenswrapper[4835]: E0319 09:58:06.885942 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18904870aad2fb4533f0b66d1495a77d69511621ae6eeb5c9c7c5ee07bd7b64c\": container with ID starting with 18904870aad2fb4533f0b66d1495a77d69511621ae6eeb5c9c7c5ee07bd7b64c not found: ID does 
not exist" containerID="18904870aad2fb4533f0b66d1495a77d69511621ae6eeb5c9c7c5ee07bd7b64c" Mar 19 09:58:06 crc kubenswrapper[4835]: I0319 09:58:06.886034 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18904870aad2fb4533f0b66d1495a77d69511621ae6eeb5c9c7c5ee07bd7b64c"} err="failed to get container status \"18904870aad2fb4533f0b66d1495a77d69511621ae6eeb5c9c7c5ee07bd7b64c\": rpc error: code = NotFound desc = could not find container \"18904870aad2fb4533f0b66d1495a77d69511621ae6eeb5c9c7c5ee07bd7b64c\": container with ID starting with 18904870aad2fb4533f0b66d1495a77d69511621ae6eeb5c9c7c5ee07bd7b64c not found: ID does not exist" Mar 19 09:58:08 crc kubenswrapper[4835]: I0319 09:58:08.416637 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db7fb3de-fa8a-4783-ac0d-746939c9e8d4" path="/var/lib/kubelet/pods/db7fb3de-fa8a-4783-ac0d-746939c9e8d4/volumes" Mar 19 09:58:13 crc kubenswrapper[4835]: I0319 09:58:13.347880 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mbhpt" Mar 19 09:58:13 crc kubenswrapper[4835]: I0319 09:58:13.401806 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mbhpt" Mar 19 09:58:13 crc kubenswrapper[4835]: I0319 09:58:13.591369 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mbhpt"] Mar 19 09:58:14 crc kubenswrapper[4835]: I0319 09:58:14.039766 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-mw6s2"] Mar 19 09:58:14 crc kubenswrapper[4835]: I0319 09:58:14.053188 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-mw6s2"] Mar 19 09:58:14 crc kubenswrapper[4835]: I0319 09:58:14.423099 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="410b03bf-d2bc-4992-b467-db7947c52078" 
path="/var/lib/kubelet/pods/410b03bf-d2bc-4992-b467-db7947c52078/volumes" Mar 19 09:58:14 crc kubenswrapper[4835]: I0319 09:58:14.887426 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mbhpt" podUID="44533f42-724e-441c-bebf-902979aab424" containerName="registry-server" containerID="cri-o://79af1f1be6eeb426538743c201c747b3d8dfb874987596920eafcd5976fcfc39" gracePeriod=2 Mar 19 09:58:15 crc kubenswrapper[4835]: I0319 09:58:15.821904 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mbhpt" Mar 19 09:58:15 crc kubenswrapper[4835]: I0319 09:58:15.900952 4835 generic.go:334] "Generic (PLEG): container finished" podID="44533f42-724e-441c-bebf-902979aab424" containerID="79af1f1be6eeb426538743c201c747b3d8dfb874987596920eafcd5976fcfc39" exitCode=0 Mar 19 09:58:15 crc kubenswrapper[4835]: I0319 09:58:15.901156 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mbhpt" Mar 19 09:58:15 crc kubenswrapper[4835]: I0319 09:58:15.901183 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mbhpt" event={"ID":"44533f42-724e-441c-bebf-902979aab424","Type":"ContainerDied","Data":"79af1f1be6eeb426538743c201c747b3d8dfb874987596920eafcd5976fcfc39"} Mar 19 09:58:15 crc kubenswrapper[4835]: I0319 09:58:15.901537 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mbhpt" event={"ID":"44533f42-724e-441c-bebf-902979aab424","Type":"ContainerDied","Data":"5e3306a1737e0e1d9ee981624ca80a2137791fbbe38883b385f25f3a7b63f57a"} Mar 19 09:58:15 crc kubenswrapper[4835]: I0319 09:58:15.901558 4835 scope.go:117] "RemoveContainer" containerID="79af1f1be6eeb426538743c201c747b3d8dfb874987596920eafcd5976fcfc39" Mar 19 09:58:15 crc kubenswrapper[4835]: I0319 09:58:15.926386 4835 scope.go:117] "RemoveContainer" containerID="1c09c685e86960adcaa73b136f3daeb6dd5259b6c699eaaa5a3900876e0a1d38" Mar 19 09:58:15 crc kubenswrapper[4835]: I0319 09:58:15.959987 4835 scope.go:117] "RemoveContainer" containerID="0acebb6f70993709e443925291f00b78077776bcb10406e00cc9aff68db68223" Mar 19 09:58:15 crc kubenswrapper[4835]: I0319 09:58:15.969376 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44533f42-724e-441c-bebf-902979aab424-catalog-content\") pod \"44533f42-724e-441c-bebf-902979aab424\" (UID: \"44533f42-724e-441c-bebf-902979aab424\") " Mar 19 09:58:15 crc kubenswrapper[4835]: I0319 09:58:15.969595 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gztl9\" (UniqueName: \"kubernetes.io/projected/44533f42-724e-441c-bebf-902979aab424-kube-api-access-gztl9\") pod \"44533f42-724e-441c-bebf-902979aab424\" (UID: \"44533f42-724e-441c-bebf-902979aab424\") " Mar 19 09:58:15 
crc kubenswrapper[4835]: I0319 09:58:15.969932 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44533f42-724e-441c-bebf-902979aab424-utilities\") pod \"44533f42-724e-441c-bebf-902979aab424\" (UID: \"44533f42-724e-441c-bebf-902979aab424\") " Mar 19 09:58:15 crc kubenswrapper[4835]: I0319 09:58:15.971292 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44533f42-724e-441c-bebf-902979aab424-utilities" (OuterVolumeSpecName: "utilities") pod "44533f42-724e-441c-bebf-902979aab424" (UID: "44533f42-724e-441c-bebf-902979aab424"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:58:15 crc kubenswrapper[4835]: I0319 09:58:15.978326 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44533f42-724e-441c-bebf-902979aab424-kube-api-access-gztl9" (OuterVolumeSpecName: "kube-api-access-gztl9") pod "44533f42-724e-441c-bebf-902979aab424" (UID: "44533f42-724e-441c-bebf-902979aab424"). InnerVolumeSpecName "kube-api-access-gztl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:58:16 crc kubenswrapper[4835]: I0319 09:58:16.033760 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44533f42-724e-441c-bebf-902979aab424-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44533f42-724e-441c-bebf-902979aab424" (UID: "44533f42-724e-441c-bebf-902979aab424"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:58:16 crc kubenswrapper[4835]: I0319 09:58:16.074040 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44533f42-724e-441c-bebf-902979aab424-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 09:58:16 crc kubenswrapper[4835]: I0319 09:58:16.074112 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44533f42-724e-441c-bebf-902979aab424-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 09:58:16 crc kubenswrapper[4835]: I0319 09:58:16.074129 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gztl9\" (UniqueName: \"kubernetes.io/projected/44533f42-724e-441c-bebf-902979aab424-kube-api-access-gztl9\") on node \"crc\" DevicePath \"\"" Mar 19 09:58:16 crc kubenswrapper[4835]: I0319 09:58:16.081943 4835 scope.go:117] "RemoveContainer" containerID="79af1f1be6eeb426538743c201c747b3d8dfb874987596920eafcd5976fcfc39" Mar 19 09:58:16 crc kubenswrapper[4835]: E0319 09:58:16.082511 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79af1f1be6eeb426538743c201c747b3d8dfb874987596920eafcd5976fcfc39\": container with ID starting with 79af1f1be6eeb426538743c201c747b3d8dfb874987596920eafcd5976fcfc39 not found: ID does not exist" containerID="79af1f1be6eeb426538743c201c747b3d8dfb874987596920eafcd5976fcfc39" Mar 19 09:58:16 crc kubenswrapper[4835]: I0319 09:58:16.082561 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79af1f1be6eeb426538743c201c747b3d8dfb874987596920eafcd5976fcfc39"} err="failed to get container status \"79af1f1be6eeb426538743c201c747b3d8dfb874987596920eafcd5976fcfc39\": rpc error: code = NotFound desc = could not find container \"79af1f1be6eeb426538743c201c747b3d8dfb874987596920eafcd5976fcfc39\": container with ID 
starting with 79af1f1be6eeb426538743c201c747b3d8dfb874987596920eafcd5976fcfc39 not found: ID does not exist" Mar 19 09:58:16 crc kubenswrapper[4835]: I0319 09:58:16.082587 4835 scope.go:117] "RemoveContainer" containerID="1c09c685e86960adcaa73b136f3daeb6dd5259b6c699eaaa5a3900876e0a1d38" Mar 19 09:58:16 crc kubenswrapper[4835]: E0319 09:58:16.082942 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c09c685e86960adcaa73b136f3daeb6dd5259b6c699eaaa5a3900876e0a1d38\": container with ID starting with 1c09c685e86960adcaa73b136f3daeb6dd5259b6c699eaaa5a3900876e0a1d38 not found: ID does not exist" containerID="1c09c685e86960adcaa73b136f3daeb6dd5259b6c699eaaa5a3900876e0a1d38" Mar 19 09:58:16 crc kubenswrapper[4835]: I0319 09:58:16.082986 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c09c685e86960adcaa73b136f3daeb6dd5259b6c699eaaa5a3900876e0a1d38"} err="failed to get container status \"1c09c685e86960adcaa73b136f3daeb6dd5259b6c699eaaa5a3900876e0a1d38\": rpc error: code = NotFound desc = could not find container \"1c09c685e86960adcaa73b136f3daeb6dd5259b6c699eaaa5a3900876e0a1d38\": container with ID starting with 1c09c685e86960adcaa73b136f3daeb6dd5259b6c699eaaa5a3900876e0a1d38 not found: ID does not exist" Mar 19 09:58:16 crc kubenswrapper[4835]: I0319 09:58:16.083023 4835 scope.go:117] "RemoveContainer" containerID="0acebb6f70993709e443925291f00b78077776bcb10406e00cc9aff68db68223" Mar 19 09:58:16 crc kubenswrapper[4835]: E0319 09:58:16.083341 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0acebb6f70993709e443925291f00b78077776bcb10406e00cc9aff68db68223\": container with ID starting with 0acebb6f70993709e443925291f00b78077776bcb10406e00cc9aff68db68223 not found: ID does not exist" containerID="0acebb6f70993709e443925291f00b78077776bcb10406e00cc9aff68db68223" Mar 19 
09:58:16 crc kubenswrapper[4835]: I0319 09:58:16.083373 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0acebb6f70993709e443925291f00b78077776bcb10406e00cc9aff68db68223"} err="failed to get container status \"0acebb6f70993709e443925291f00b78077776bcb10406e00cc9aff68db68223\": rpc error: code = NotFound desc = could not find container \"0acebb6f70993709e443925291f00b78077776bcb10406e00cc9aff68db68223\": container with ID starting with 0acebb6f70993709e443925291f00b78077776bcb10406e00cc9aff68db68223 not found: ID does not exist" Mar 19 09:58:16 crc kubenswrapper[4835]: I0319 09:58:16.241130 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mbhpt"] Mar 19 09:58:16 crc kubenswrapper[4835]: I0319 09:58:16.251341 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mbhpt"] Mar 19 09:58:16 crc kubenswrapper[4835]: I0319 09:58:16.437393 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44533f42-724e-441c-bebf-902979aab424" path="/var/lib/kubelet/pods/44533f42-724e-441c-bebf-902979aab424/volumes" Mar 19 09:58:20 crc kubenswrapper[4835]: I0319 09:58:20.032938 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-mkr8b"] Mar 19 09:58:20 crc kubenswrapper[4835]: I0319 09:58:20.043203 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-mkr8b"] Mar 19 09:58:20 crc kubenswrapper[4835]: I0319 09:58:20.426954 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2161ebbe-ed84-49a9-b0b1-b74304fd8b28" path="/var/lib/kubelet/pods/2161ebbe-ed84-49a9-b0b1-b74304fd8b28/volumes" Mar 19 09:58:36 crc kubenswrapper[4835]: I0319 09:58:36.153331 4835 generic.go:334] "Generic (PLEG): container finished" podID="1073365c-6688-438d-ad77-2c32ee7e6947" containerID="f0b9de09adc8610d146e59e2eac460f698a3fd0f4ac040ee8e453863dd443dd6" exitCode=0 
Mar 19 09:58:36 crc kubenswrapper[4835]: I0319 09:58:36.153523 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwjd2" event={"ID":"1073365c-6688-438d-ad77-2c32ee7e6947","Type":"ContainerDied","Data":"f0b9de09adc8610d146e59e2eac460f698a3fd0f4ac040ee8e453863dd443dd6"} Mar 19 09:58:37 crc kubenswrapper[4835]: I0319 09:58:37.665821 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwjd2" Mar 19 09:58:37 crc kubenswrapper[4835]: I0319 09:58:37.783445 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1073365c-6688-438d-ad77-2c32ee7e6947-ssh-key-openstack-edpm-ipam\") pod \"1073365c-6688-438d-ad77-2c32ee7e6947\" (UID: \"1073365c-6688-438d-ad77-2c32ee7e6947\") " Mar 19 09:58:37 crc kubenswrapper[4835]: I0319 09:58:37.783591 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6snhs\" (UniqueName: \"kubernetes.io/projected/1073365c-6688-438d-ad77-2c32ee7e6947-kube-api-access-6snhs\") pod \"1073365c-6688-438d-ad77-2c32ee7e6947\" (UID: \"1073365c-6688-438d-ad77-2c32ee7e6947\") " Mar 19 09:58:37 crc kubenswrapper[4835]: I0319 09:58:37.783621 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1073365c-6688-438d-ad77-2c32ee7e6947-inventory\") pod \"1073365c-6688-438d-ad77-2c32ee7e6947\" (UID: \"1073365c-6688-438d-ad77-2c32ee7e6947\") " Mar 19 09:58:37 crc kubenswrapper[4835]: I0319 09:58:37.789136 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1073365c-6688-438d-ad77-2c32ee7e6947-kube-api-access-6snhs" (OuterVolumeSpecName: "kube-api-access-6snhs") pod "1073365c-6688-438d-ad77-2c32ee7e6947" (UID: 
"1073365c-6688-438d-ad77-2c32ee7e6947"). InnerVolumeSpecName "kube-api-access-6snhs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:58:37 crc kubenswrapper[4835]: I0319 09:58:37.820483 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1073365c-6688-438d-ad77-2c32ee7e6947-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1073365c-6688-438d-ad77-2c32ee7e6947" (UID: "1073365c-6688-438d-ad77-2c32ee7e6947"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:58:37 crc kubenswrapper[4835]: I0319 09:58:37.821499 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1073365c-6688-438d-ad77-2c32ee7e6947-inventory" (OuterVolumeSpecName: "inventory") pod "1073365c-6688-438d-ad77-2c32ee7e6947" (UID: "1073365c-6688-438d-ad77-2c32ee7e6947"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:58:37 crc kubenswrapper[4835]: I0319 09:58:37.885926 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1073365c-6688-438d-ad77-2c32ee7e6947-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 09:58:37 crc kubenswrapper[4835]: I0319 09:58:37.885957 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6snhs\" (UniqueName: \"kubernetes.io/projected/1073365c-6688-438d-ad77-2c32ee7e6947-kube-api-access-6snhs\") on node \"crc\" DevicePath \"\"" Mar 19 09:58:37 crc kubenswrapper[4835]: I0319 09:58:37.885967 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1073365c-6688-438d-ad77-2c32ee7e6947-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 09:58:38 crc kubenswrapper[4835]: I0319 09:58:38.184755 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwjd2" event={"ID":"1073365c-6688-438d-ad77-2c32ee7e6947","Type":"ContainerDied","Data":"976eb4500ccadba5aadcffd6964c1d4827f4478594963ac3d5c66c5cd2344e71"} Mar 19 09:58:38 crc kubenswrapper[4835]: I0319 09:58:38.185139 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="976eb4500ccadba5aadcffd6964c1d4827f4478594963ac3d5c66c5cd2344e71" Mar 19 09:58:38 crc kubenswrapper[4835]: I0319 09:58:38.185205 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jwjd2" Mar 19 09:58:38 crc kubenswrapper[4835]: I0319 09:58:38.295923 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p6865"] Mar 19 09:58:38 crc kubenswrapper[4835]: E0319 09:58:38.296702 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ebb674c-a705-418a-92ea-771ab5d01a39" containerName="oc" Mar 19 09:58:38 crc kubenswrapper[4835]: I0319 09:58:38.296730 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ebb674c-a705-418a-92ea-771ab5d01a39" containerName="oc" Mar 19 09:58:38 crc kubenswrapper[4835]: E0319 09:58:38.296805 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44533f42-724e-441c-bebf-902979aab424" containerName="extract-content" Mar 19 09:58:38 crc kubenswrapper[4835]: I0319 09:58:38.296818 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="44533f42-724e-441c-bebf-902979aab424" containerName="extract-content" Mar 19 09:58:38 crc kubenswrapper[4835]: E0319 09:58:38.296837 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44533f42-724e-441c-bebf-902979aab424" containerName="extract-utilities" Mar 19 09:58:38 crc kubenswrapper[4835]: I0319 09:58:38.296849 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="44533f42-724e-441c-bebf-902979aab424" 
containerName="extract-utilities" Mar 19 09:58:38 crc kubenswrapper[4835]: E0319 09:58:38.296870 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db7fb3de-fa8a-4783-ac0d-746939c9e8d4" containerName="registry-server" Mar 19 09:58:38 crc kubenswrapper[4835]: I0319 09:58:38.296881 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="db7fb3de-fa8a-4783-ac0d-746939c9e8d4" containerName="registry-server" Mar 19 09:58:38 crc kubenswrapper[4835]: E0319 09:58:38.296895 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db7fb3de-fa8a-4783-ac0d-746939c9e8d4" containerName="extract-utilities" Mar 19 09:58:38 crc kubenswrapper[4835]: I0319 09:58:38.296905 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="db7fb3de-fa8a-4783-ac0d-746939c9e8d4" containerName="extract-utilities" Mar 19 09:58:38 crc kubenswrapper[4835]: E0319 09:58:38.296920 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44533f42-724e-441c-bebf-902979aab424" containerName="registry-server" Mar 19 09:58:38 crc kubenswrapper[4835]: I0319 09:58:38.296930 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="44533f42-724e-441c-bebf-902979aab424" containerName="registry-server" Mar 19 09:58:38 crc kubenswrapper[4835]: E0319 09:58:38.296953 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1073365c-6688-438d-ad77-2c32ee7e6947" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 19 09:58:38 crc kubenswrapper[4835]: I0319 09:58:38.296965 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1073365c-6688-438d-ad77-2c32ee7e6947" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 19 09:58:38 crc kubenswrapper[4835]: E0319 09:58:38.296984 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db7fb3de-fa8a-4783-ac0d-746939c9e8d4" containerName="extract-content" Mar 19 09:58:38 crc kubenswrapper[4835]: I0319 09:58:38.296996 4835 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="db7fb3de-fa8a-4783-ac0d-746939c9e8d4" containerName="extract-content" Mar 19 09:58:38 crc kubenswrapper[4835]: I0319 09:58:38.297322 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ebb674c-a705-418a-92ea-771ab5d01a39" containerName="oc" Mar 19 09:58:38 crc kubenswrapper[4835]: I0319 09:58:38.297364 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="db7fb3de-fa8a-4783-ac0d-746939c9e8d4" containerName="registry-server" Mar 19 09:58:38 crc kubenswrapper[4835]: I0319 09:58:38.297401 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="1073365c-6688-438d-ad77-2c32ee7e6947" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 19 09:58:38 crc kubenswrapper[4835]: I0319 09:58:38.297426 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="44533f42-724e-441c-bebf-902979aab424" containerName="registry-server" Mar 19 09:58:38 crc kubenswrapper[4835]: I0319 09:58:38.298705 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p6865" Mar 19 09:58:38 crc kubenswrapper[4835]: I0319 09:58:38.301399 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ldz2g" Mar 19 09:58:38 crc kubenswrapper[4835]: I0319 09:58:38.301613 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 09:58:38 crc kubenswrapper[4835]: I0319 09:58:38.301789 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 09:58:38 crc kubenswrapper[4835]: I0319 09:58:38.301794 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 09:58:38 crc kubenswrapper[4835]: I0319 09:58:38.309582 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p6865"] Mar 19 09:58:38 crc kubenswrapper[4835]: I0319 09:58:38.396558 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/027e795d-3d9b-4c8f-b6ae-9b96cac6316e-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p6865\" (UID: \"027e795d-3d9b-4c8f-b6ae-9b96cac6316e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p6865" Mar 19 09:58:38 crc kubenswrapper[4835]: I0319 09:58:38.396700 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t6pd\" (UniqueName: \"kubernetes.io/projected/027e795d-3d9b-4c8f-b6ae-9b96cac6316e-kube-api-access-2t6pd\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p6865\" (UID: \"027e795d-3d9b-4c8f-b6ae-9b96cac6316e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p6865" Mar 19 
09:58:38 crc kubenswrapper[4835]: I0319 09:58:38.396780 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/027e795d-3d9b-4c8f-b6ae-9b96cac6316e-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p6865\" (UID: \"027e795d-3d9b-4c8f-b6ae-9b96cac6316e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p6865" Mar 19 09:58:38 crc kubenswrapper[4835]: I0319 09:58:38.499957 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/027e795d-3d9b-4c8f-b6ae-9b96cac6316e-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p6865\" (UID: \"027e795d-3d9b-4c8f-b6ae-9b96cac6316e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p6865" Mar 19 09:58:38 crc kubenswrapper[4835]: I0319 09:58:38.500112 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t6pd\" (UniqueName: \"kubernetes.io/projected/027e795d-3d9b-4c8f-b6ae-9b96cac6316e-kube-api-access-2t6pd\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p6865\" (UID: \"027e795d-3d9b-4c8f-b6ae-9b96cac6316e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p6865" Mar 19 09:58:38 crc kubenswrapper[4835]: I0319 09:58:38.500167 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/027e795d-3d9b-4c8f-b6ae-9b96cac6316e-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p6865\" (UID: \"027e795d-3d9b-4c8f-b6ae-9b96cac6316e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p6865" Mar 19 09:58:38 crc kubenswrapper[4835]: I0319 09:58:38.504234 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/secret/027e795d-3d9b-4c8f-b6ae-9b96cac6316e-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p6865\" (UID: \"027e795d-3d9b-4c8f-b6ae-9b96cac6316e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p6865" Mar 19 09:58:38 crc kubenswrapper[4835]: I0319 09:58:38.504537 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/027e795d-3d9b-4c8f-b6ae-9b96cac6316e-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p6865\" (UID: \"027e795d-3d9b-4c8f-b6ae-9b96cac6316e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p6865" Mar 19 09:58:38 crc kubenswrapper[4835]: I0319 09:58:38.526496 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t6pd\" (UniqueName: \"kubernetes.io/projected/027e795d-3d9b-4c8f-b6ae-9b96cac6316e-kube-api-access-2t6pd\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-p6865\" (UID: \"027e795d-3d9b-4c8f-b6ae-9b96cac6316e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p6865" Mar 19 09:58:38 crc kubenswrapper[4835]: I0319 09:58:38.629836 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p6865" Mar 19 09:58:39 crc kubenswrapper[4835]: I0319 09:58:39.176471 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p6865"] Mar 19 09:58:39 crc kubenswrapper[4835]: W0319 09:58:39.178131 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod027e795d_3d9b_4c8f_b6ae_9b96cac6316e.slice/crio-97a42474851118106dc45fb5e87c774e3e46a89e1f644f75b75796c782445aa4 WatchSource:0}: Error finding container 97a42474851118106dc45fb5e87c774e3e46a89e1f644f75b75796c782445aa4: Status 404 returned error can't find the container with id 97a42474851118106dc45fb5e87c774e3e46a89e1f644f75b75796c782445aa4 Mar 19 09:58:39 crc kubenswrapper[4835]: I0319 09:58:39.197976 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p6865" event={"ID":"027e795d-3d9b-4c8f-b6ae-9b96cac6316e","Type":"ContainerStarted","Data":"97a42474851118106dc45fb5e87c774e3e46a89e1f644f75b75796c782445aa4"} Mar 19 09:58:41 crc kubenswrapper[4835]: I0319 09:58:41.220508 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p6865" event={"ID":"027e795d-3d9b-4c8f-b6ae-9b96cac6316e","Type":"ContainerStarted","Data":"c28b9557d1785c3c9251898c51bcfd04d8cbae0b3bb19e8e342392dd0a0c8c40"} Mar 19 09:58:41 crc kubenswrapper[4835]: I0319 09:58:41.240583 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p6865" podStartSLOduration=2.347073494 podStartE2EDuration="3.240559935s" podCreationTimestamp="2026-03-19 09:58:38 +0000 UTC" firstStartedPulling="2026-03-19 09:58:39.180553168 +0000 UTC m=+2174.029151755" lastFinishedPulling="2026-03-19 09:58:40.074039609 +0000 
UTC m=+2174.922638196" observedRunningTime="2026-03-19 09:58:41.235193278 +0000 UTC m=+2176.083791885" watchObservedRunningTime="2026-03-19 09:58:41.240559935 +0000 UTC m=+2176.089158542" Mar 19 09:58:54 crc kubenswrapper[4835]: I0319 09:58:54.481294 4835 scope.go:117] "RemoveContainer" containerID="b7b8402d3c5a091035650dff3c5aa76455642f20a9ee98016450c1f7bbf0fa07" Mar 19 09:58:54 crc kubenswrapper[4835]: I0319 09:58:54.510307 4835 scope.go:117] "RemoveContainer" containerID="dfcd6ca68762255b373479b75d2123ff0712b7d734ddffde0d6af98361a5b1b8" Mar 19 09:58:54 crc kubenswrapper[4835]: I0319 09:58:54.582024 4835 scope.go:117] "RemoveContainer" containerID="c00c6a6b8c36f4b1827d9a58180745faadbfa9291e59d4f2dd61dd77df2365b1" Mar 19 09:58:54 crc kubenswrapper[4835]: I0319 09:58:54.650704 4835 scope.go:117] "RemoveContainer" containerID="055506f95cd38c467b5deff11a4e9a1b6c84166480d1511368ff316282f5ce74" Mar 19 09:58:54 crc kubenswrapper[4835]: I0319 09:58:54.716907 4835 scope.go:117] "RemoveContainer" containerID="2b1fa8fa12a3487a89626ace1ecaf3bb551ec78dcfe5eb5de0a4c5f7ab6c9714" Mar 19 09:58:56 crc kubenswrapper[4835]: I0319 09:58:56.974152 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rzjl8"] Mar 19 09:58:57 crc kubenswrapper[4835]: I0319 09:58:56.984806 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rzjl8" Mar 19 09:58:57 crc kubenswrapper[4835]: I0319 09:58:57.001525 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rzjl8"] Mar 19 09:58:57 crc kubenswrapper[4835]: I0319 09:58:57.101680 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lqt9\" (UniqueName: \"kubernetes.io/projected/98bb8349-9742-4836-abeb-3b81408e210f-kube-api-access-6lqt9\") pod \"redhat-operators-rzjl8\" (UID: \"98bb8349-9742-4836-abeb-3b81408e210f\") " pod="openshift-marketplace/redhat-operators-rzjl8" Mar 19 09:58:57 crc kubenswrapper[4835]: I0319 09:58:57.101887 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98bb8349-9742-4836-abeb-3b81408e210f-utilities\") pod \"redhat-operators-rzjl8\" (UID: \"98bb8349-9742-4836-abeb-3b81408e210f\") " pod="openshift-marketplace/redhat-operators-rzjl8" Mar 19 09:58:57 crc kubenswrapper[4835]: I0319 09:58:57.102090 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98bb8349-9742-4836-abeb-3b81408e210f-catalog-content\") pod \"redhat-operators-rzjl8\" (UID: \"98bb8349-9742-4836-abeb-3b81408e210f\") " pod="openshift-marketplace/redhat-operators-rzjl8" Mar 19 09:58:57 crc kubenswrapper[4835]: I0319 09:58:57.204257 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98bb8349-9742-4836-abeb-3b81408e210f-catalog-content\") pod \"redhat-operators-rzjl8\" (UID: \"98bb8349-9742-4836-abeb-3b81408e210f\") " pod="openshift-marketplace/redhat-operators-rzjl8" Mar 19 09:58:57 crc kubenswrapper[4835]: I0319 09:58:57.204468 4835 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-6lqt9\" (UniqueName: \"kubernetes.io/projected/98bb8349-9742-4836-abeb-3b81408e210f-kube-api-access-6lqt9\") pod \"redhat-operators-rzjl8\" (UID: \"98bb8349-9742-4836-abeb-3b81408e210f\") " pod="openshift-marketplace/redhat-operators-rzjl8" Mar 19 09:58:57 crc kubenswrapper[4835]: I0319 09:58:57.204573 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98bb8349-9742-4836-abeb-3b81408e210f-utilities\") pod \"redhat-operators-rzjl8\" (UID: \"98bb8349-9742-4836-abeb-3b81408e210f\") " pod="openshift-marketplace/redhat-operators-rzjl8" Mar 19 09:58:57 crc kubenswrapper[4835]: I0319 09:58:57.204822 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98bb8349-9742-4836-abeb-3b81408e210f-catalog-content\") pod \"redhat-operators-rzjl8\" (UID: \"98bb8349-9742-4836-abeb-3b81408e210f\") " pod="openshift-marketplace/redhat-operators-rzjl8" Mar 19 09:58:57 crc kubenswrapper[4835]: I0319 09:58:57.204914 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98bb8349-9742-4836-abeb-3b81408e210f-utilities\") pod \"redhat-operators-rzjl8\" (UID: \"98bb8349-9742-4836-abeb-3b81408e210f\") " pod="openshift-marketplace/redhat-operators-rzjl8" Mar 19 09:58:57 crc kubenswrapper[4835]: I0319 09:58:57.226624 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lqt9\" (UniqueName: \"kubernetes.io/projected/98bb8349-9742-4836-abeb-3b81408e210f-kube-api-access-6lqt9\") pod \"redhat-operators-rzjl8\" (UID: \"98bb8349-9742-4836-abeb-3b81408e210f\") " pod="openshift-marketplace/redhat-operators-rzjl8" Mar 19 09:58:57 crc kubenswrapper[4835]: I0319 09:58:57.352887 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rzjl8" Mar 19 09:58:57 crc kubenswrapper[4835]: I0319 09:58:57.873909 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rzjl8"] Mar 19 09:58:58 crc kubenswrapper[4835]: I0319 09:58:58.440981 4835 generic.go:334] "Generic (PLEG): container finished" podID="98bb8349-9742-4836-abeb-3b81408e210f" containerID="56312db7aebded1e16b6c303e35c5cc4b3317b0dda548aa43b4e1d5286ac6b2e" exitCode=0 Mar 19 09:58:58 crc kubenswrapper[4835]: I0319 09:58:58.442071 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzjl8" event={"ID":"98bb8349-9742-4836-abeb-3b81408e210f","Type":"ContainerDied","Data":"56312db7aebded1e16b6c303e35c5cc4b3317b0dda548aa43b4e1d5286ac6b2e"} Mar 19 09:58:58 crc kubenswrapper[4835]: I0319 09:58:58.442927 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzjl8" event={"ID":"98bb8349-9742-4836-abeb-3b81408e210f","Type":"ContainerStarted","Data":"57849417d831d40aa751ff795ec3141002a103016e8683e09c0c8402144b2098"} Mar 19 09:59:00 crc kubenswrapper[4835]: I0319 09:59:00.463565 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzjl8" event={"ID":"98bb8349-9742-4836-abeb-3b81408e210f","Type":"ContainerStarted","Data":"33971f7d9f1964a535e920cbef5befa012bf606b051ace6226b735dc48d5c3b9"} Mar 19 09:59:05 crc kubenswrapper[4835]: I0319 09:59:05.546292 4835 generic.go:334] "Generic (PLEG): container finished" podID="98bb8349-9742-4836-abeb-3b81408e210f" containerID="33971f7d9f1964a535e920cbef5befa012bf606b051ace6226b735dc48d5c3b9" exitCode=0 Mar 19 09:59:05 crc kubenswrapper[4835]: I0319 09:59:05.546368 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzjl8" 
event={"ID":"98bb8349-9742-4836-abeb-3b81408e210f","Type":"ContainerDied","Data":"33971f7d9f1964a535e920cbef5befa012bf606b051ace6226b735dc48d5c3b9"} Mar 19 09:59:06 crc kubenswrapper[4835]: I0319 09:59:06.562246 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzjl8" event={"ID":"98bb8349-9742-4836-abeb-3b81408e210f","Type":"ContainerStarted","Data":"d4691d1b8ed2d93142b35bf596dcce9965052b9726faab3c984d0ff89997695f"} Mar 19 09:59:06 crc kubenswrapper[4835]: I0319 09:59:06.590287 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rzjl8" podStartSLOduration=3.091565308 podStartE2EDuration="10.590268717s" podCreationTimestamp="2026-03-19 09:58:56 +0000 UTC" firstStartedPulling="2026-03-19 09:58:58.445908849 +0000 UTC m=+2193.294507456" lastFinishedPulling="2026-03-19 09:59:05.944612268 +0000 UTC m=+2200.793210865" observedRunningTime="2026-03-19 09:59:06.584937681 +0000 UTC m=+2201.433536288" watchObservedRunningTime="2026-03-19 09:59:06.590268717 +0000 UTC m=+2201.438867304" Mar 19 09:59:07 crc kubenswrapper[4835]: I0319 09:59:07.353667 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rzjl8" Mar 19 09:59:07 crc kubenswrapper[4835]: I0319 09:59:07.353977 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rzjl8" Mar 19 09:59:08 crc kubenswrapper[4835]: I0319 09:59:08.415923 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rzjl8" podUID="98bb8349-9742-4836-abeb-3b81408e210f" containerName="registry-server" probeResult="failure" output=< Mar 19 09:59:08 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 09:59:08 crc kubenswrapper[4835]: > Mar 19 09:59:18 crc kubenswrapper[4835]: I0319 09:59:18.418105 4835 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/redhat-operators-rzjl8" podUID="98bb8349-9742-4836-abeb-3b81408e210f" containerName="registry-server" probeResult="failure" output=< Mar 19 09:59:18 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 09:59:18 crc kubenswrapper[4835]: > Mar 19 09:59:28 crc kubenswrapper[4835]: I0319 09:59:28.406998 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rzjl8" podUID="98bb8349-9742-4836-abeb-3b81408e210f" containerName="registry-server" probeResult="failure" output=< Mar 19 09:59:28 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 09:59:28 crc kubenswrapper[4835]: > Mar 19 09:59:29 crc kubenswrapper[4835]: I0319 09:59:29.066373 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-gdrnz"] Mar 19 09:59:29 crc kubenswrapper[4835]: I0319 09:59:29.080025 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-rvdwp"] Mar 19 09:59:29 crc kubenswrapper[4835]: I0319 09:59:29.091835 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-62ed-account-create-update-8hlr5"] Mar 19 09:59:29 crc kubenswrapper[4835]: I0319 09:59:29.104317 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-d261-account-create-update-tskmn"] Mar 19 09:59:29 crc kubenswrapper[4835]: I0319 09:59:29.116349 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-gdrnz"] Mar 19 09:59:29 crc kubenswrapper[4835]: I0319 09:59:29.127618 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-62ed-account-create-update-8hlr5"] Mar 19 09:59:29 crc kubenswrapper[4835]: I0319 09:59:29.138236 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-rvdwp"] Mar 19 09:59:29 crc kubenswrapper[4835]: I0319 09:59:29.151312 4835 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-d261-account-create-update-tskmn"] Mar 19 09:59:30 crc kubenswrapper[4835]: I0319 09:59:30.040386 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-38a1-account-create-update-csb2n"] Mar 19 09:59:30 crc kubenswrapper[4835]: I0319 09:59:30.060463 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-38a1-account-create-update-csb2n"] Mar 19 09:59:30 crc kubenswrapper[4835]: I0319 09:59:30.450463 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25d11e7d-9d08-4f99-b447-0a80feec9c95" path="/var/lib/kubelet/pods/25d11e7d-9d08-4f99-b447-0a80feec9c95/volumes" Mar 19 09:59:30 crc kubenswrapper[4835]: I0319 09:59:30.451687 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c3ca948-8ce8-402a-995a-ffdfe23ceaa5" path="/var/lib/kubelet/pods/6c3ca948-8ce8-402a-995a-ffdfe23ceaa5/volumes" Mar 19 09:59:30 crc kubenswrapper[4835]: I0319 09:59:30.453131 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="755c8e66-e5e7-491b-9d77-a4b702ef6b92" path="/var/lib/kubelet/pods/755c8e66-e5e7-491b-9d77-a4b702ef6b92/volumes" Mar 19 09:59:30 crc kubenswrapper[4835]: I0319 09:59:30.454056 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="905929b3-4296-4ec7-9532-d7ad6e80ad84" path="/var/lib/kubelet/pods/905929b3-4296-4ec7-9532-d7ad6e80ad84/volumes" Mar 19 09:59:30 crc kubenswrapper[4835]: I0319 09:59:30.455450 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f917b1a5-922f-444e-b621-b7a89e8df050" path="/var/lib/kubelet/pods/f917b1a5-922f-444e-b621-b7a89e8df050/volumes" Mar 19 09:59:31 crc kubenswrapper[4835]: I0319 09:59:31.028722 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-kxp5r"] Mar 19 09:59:31 crc kubenswrapper[4835]: I0319 09:59:31.039515 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell1-db-create-kxp5r"] Mar 19 09:59:32 crc kubenswrapper[4835]: I0319 09:59:32.415293 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="097405cc-a2f0-4f58-875e-b9433a53bde9" path="/var/lib/kubelet/pods/097405cc-a2f0-4f58-875e-b9433a53bde9/volumes" Mar 19 09:59:36 crc kubenswrapper[4835]: I0319 09:59:36.422254 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 09:59:36 crc kubenswrapper[4835]: I0319 09:59:36.424715 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 09:59:37 crc kubenswrapper[4835]: I0319 09:59:37.417406 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rzjl8" Mar 19 09:59:37 crc kubenswrapper[4835]: I0319 09:59:37.469039 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rzjl8" Mar 19 09:59:37 crc kubenswrapper[4835]: I0319 09:59:37.658478 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rzjl8"] Mar 19 09:59:38 crc kubenswrapper[4835]: I0319 09:59:38.920845 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rzjl8" podUID="98bb8349-9742-4836-abeb-3b81408e210f" containerName="registry-server" containerID="cri-o://d4691d1b8ed2d93142b35bf596dcce9965052b9726faab3c984d0ff89997695f" gracePeriod=2 Mar 19 09:59:39 crc kubenswrapper[4835]: 
I0319 09:59:39.550128 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rzjl8" Mar 19 09:59:39 crc kubenswrapper[4835]: I0319 09:59:39.680624 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lqt9\" (UniqueName: \"kubernetes.io/projected/98bb8349-9742-4836-abeb-3b81408e210f-kube-api-access-6lqt9\") pod \"98bb8349-9742-4836-abeb-3b81408e210f\" (UID: \"98bb8349-9742-4836-abeb-3b81408e210f\") " Mar 19 09:59:39 crc kubenswrapper[4835]: I0319 09:59:39.680691 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98bb8349-9742-4836-abeb-3b81408e210f-catalog-content\") pod \"98bb8349-9742-4836-abeb-3b81408e210f\" (UID: \"98bb8349-9742-4836-abeb-3b81408e210f\") " Mar 19 09:59:39 crc kubenswrapper[4835]: I0319 09:59:39.680855 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98bb8349-9742-4836-abeb-3b81408e210f-utilities\") pod \"98bb8349-9742-4836-abeb-3b81408e210f\" (UID: \"98bb8349-9742-4836-abeb-3b81408e210f\") " Mar 19 09:59:39 crc kubenswrapper[4835]: I0319 09:59:39.681481 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98bb8349-9742-4836-abeb-3b81408e210f-utilities" (OuterVolumeSpecName: "utilities") pod "98bb8349-9742-4836-abeb-3b81408e210f" (UID: "98bb8349-9742-4836-abeb-3b81408e210f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:59:39 crc kubenswrapper[4835]: I0319 09:59:39.692454 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98bb8349-9742-4836-abeb-3b81408e210f-kube-api-access-6lqt9" (OuterVolumeSpecName: "kube-api-access-6lqt9") pod "98bb8349-9742-4836-abeb-3b81408e210f" (UID: "98bb8349-9742-4836-abeb-3b81408e210f"). InnerVolumeSpecName "kube-api-access-6lqt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:59:39 crc kubenswrapper[4835]: I0319 09:59:39.784108 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98bb8349-9742-4836-abeb-3b81408e210f-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 09:59:39 crc kubenswrapper[4835]: I0319 09:59:39.784143 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lqt9\" (UniqueName: \"kubernetes.io/projected/98bb8349-9742-4836-abeb-3b81408e210f-kube-api-access-6lqt9\") on node \"crc\" DevicePath \"\"" Mar 19 09:59:39 crc kubenswrapper[4835]: I0319 09:59:39.801411 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98bb8349-9742-4836-abeb-3b81408e210f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "98bb8349-9742-4836-abeb-3b81408e210f" (UID: "98bb8349-9742-4836-abeb-3b81408e210f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 09:59:39 crc kubenswrapper[4835]: I0319 09:59:39.886363 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98bb8349-9742-4836-abeb-3b81408e210f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 09:59:39 crc kubenswrapper[4835]: I0319 09:59:39.935159 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rzjl8" Mar 19 09:59:39 crc kubenswrapper[4835]: I0319 09:59:39.935238 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzjl8" event={"ID":"98bb8349-9742-4836-abeb-3b81408e210f","Type":"ContainerDied","Data":"d4691d1b8ed2d93142b35bf596dcce9965052b9726faab3c984d0ff89997695f"} Mar 19 09:59:39 crc kubenswrapper[4835]: I0319 09:59:39.935944 4835 scope.go:117] "RemoveContainer" containerID="d4691d1b8ed2d93142b35bf596dcce9965052b9726faab3c984d0ff89997695f" Mar 19 09:59:39 crc kubenswrapper[4835]: I0319 09:59:39.935255 4835 generic.go:334] "Generic (PLEG): container finished" podID="98bb8349-9742-4836-abeb-3b81408e210f" containerID="d4691d1b8ed2d93142b35bf596dcce9965052b9726faab3c984d0ff89997695f" exitCode=0 Mar 19 09:59:39 crc kubenswrapper[4835]: I0319 09:59:39.936124 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rzjl8" event={"ID":"98bb8349-9742-4836-abeb-3b81408e210f","Type":"ContainerDied","Data":"57849417d831d40aa751ff795ec3141002a103016e8683e09c0c8402144b2098"} Mar 19 09:59:39 crc kubenswrapper[4835]: I0319 09:59:39.966789 4835 scope.go:117] "RemoveContainer" containerID="33971f7d9f1964a535e920cbef5befa012bf606b051ace6226b735dc48d5c3b9" Mar 19 09:59:39 crc kubenswrapper[4835]: I0319 09:59:39.975459 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rzjl8"] Mar 19 09:59:39 crc kubenswrapper[4835]: I0319 09:59:39.987854 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rzjl8"] Mar 19 09:59:40 crc kubenswrapper[4835]: I0319 09:59:40.009305 4835 scope.go:117] "RemoveContainer" containerID="56312db7aebded1e16b6c303e35c5cc4b3317b0dda548aa43b4e1d5286ac6b2e" Mar 19 09:59:40 crc kubenswrapper[4835]: I0319 09:59:40.046226 4835 scope.go:117] "RemoveContainer" 
containerID="d4691d1b8ed2d93142b35bf596dcce9965052b9726faab3c984d0ff89997695f" Mar 19 09:59:40 crc kubenswrapper[4835]: E0319 09:59:40.046705 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4691d1b8ed2d93142b35bf596dcce9965052b9726faab3c984d0ff89997695f\": container with ID starting with d4691d1b8ed2d93142b35bf596dcce9965052b9726faab3c984d0ff89997695f not found: ID does not exist" containerID="d4691d1b8ed2d93142b35bf596dcce9965052b9726faab3c984d0ff89997695f" Mar 19 09:59:40 crc kubenswrapper[4835]: I0319 09:59:40.046778 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4691d1b8ed2d93142b35bf596dcce9965052b9726faab3c984d0ff89997695f"} err="failed to get container status \"d4691d1b8ed2d93142b35bf596dcce9965052b9726faab3c984d0ff89997695f\": rpc error: code = NotFound desc = could not find container \"d4691d1b8ed2d93142b35bf596dcce9965052b9726faab3c984d0ff89997695f\": container with ID starting with d4691d1b8ed2d93142b35bf596dcce9965052b9726faab3c984d0ff89997695f not found: ID does not exist" Mar 19 09:59:40 crc kubenswrapper[4835]: I0319 09:59:40.046806 4835 scope.go:117] "RemoveContainer" containerID="33971f7d9f1964a535e920cbef5befa012bf606b051ace6226b735dc48d5c3b9" Mar 19 09:59:40 crc kubenswrapper[4835]: E0319 09:59:40.047183 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33971f7d9f1964a535e920cbef5befa012bf606b051ace6226b735dc48d5c3b9\": container with ID starting with 33971f7d9f1964a535e920cbef5befa012bf606b051ace6226b735dc48d5c3b9 not found: ID does not exist" containerID="33971f7d9f1964a535e920cbef5befa012bf606b051ace6226b735dc48d5c3b9" Mar 19 09:59:40 crc kubenswrapper[4835]: I0319 09:59:40.047217 4835 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"33971f7d9f1964a535e920cbef5befa012bf606b051ace6226b735dc48d5c3b9"} err="failed to get container status \"33971f7d9f1964a535e920cbef5befa012bf606b051ace6226b735dc48d5c3b9\": rpc error: code = NotFound desc = could not find container \"33971f7d9f1964a535e920cbef5befa012bf606b051ace6226b735dc48d5c3b9\": container with ID starting with 33971f7d9f1964a535e920cbef5befa012bf606b051ace6226b735dc48d5c3b9 not found: ID does not exist" Mar 19 09:59:40 crc kubenswrapper[4835]: I0319 09:59:40.047233 4835 scope.go:117] "RemoveContainer" containerID="56312db7aebded1e16b6c303e35c5cc4b3317b0dda548aa43b4e1d5286ac6b2e" Mar 19 09:59:40 crc kubenswrapper[4835]: E0319 09:59:40.047586 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56312db7aebded1e16b6c303e35c5cc4b3317b0dda548aa43b4e1d5286ac6b2e\": container with ID starting with 56312db7aebded1e16b6c303e35c5cc4b3317b0dda548aa43b4e1d5286ac6b2e not found: ID does not exist" containerID="56312db7aebded1e16b6c303e35c5cc4b3317b0dda548aa43b4e1d5286ac6b2e" Mar 19 09:59:40 crc kubenswrapper[4835]: I0319 09:59:40.047637 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56312db7aebded1e16b6c303e35c5cc4b3317b0dda548aa43b4e1d5286ac6b2e"} err="failed to get container status \"56312db7aebded1e16b6c303e35c5cc4b3317b0dda548aa43b4e1d5286ac6b2e\": rpc error: code = NotFound desc = could not find container \"56312db7aebded1e16b6c303e35c5cc4b3317b0dda548aa43b4e1d5286ac6b2e\": container with ID starting with 56312db7aebded1e16b6c303e35c5cc4b3317b0dda548aa43b4e1d5286ac6b2e not found: ID does not exist" Mar 19 09:59:40 crc kubenswrapper[4835]: I0319 09:59:40.413928 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98bb8349-9742-4836-abeb-3b81408e210f" path="/var/lib/kubelet/pods/98bb8349-9742-4836-abeb-3b81408e210f/volumes" Mar 19 09:59:44 crc kubenswrapper[4835]: I0319 
09:59:44.988958 4835 generic.go:334] "Generic (PLEG): container finished" podID="027e795d-3d9b-4c8f-b6ae-9b96cac6316e" containerID="c28b9557d1785c3c9251898c51bcfd04d8cbae0b3bb19e8e342392dd0a0c8c40" exitCode=0 Mar 19 09:59:44 crc kubenswrapper[4835]: I0319 09:59:44.989029 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p6865" event={"ID":"027e795d-3d9b-4c8f-b6ae-9b96cac6316e","Type":"ContainerDied","Data":"c28b9557d1785c3c9251898c51bcfd04d8cbae0b3bb19e8e342392dd0a0c8c40"} Mar 19 09:59:46 crc kubenswrapper[4835]: I0319 09:59:46.550978 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p6865" Mar 19 09:59:46 crc kubenswrapper[4835]: I0319 09:59:46.651043 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/027e795d-3d9b-4c8f-b6ae-9b96cac6316e-inventory\") pod \"027e795d-3d9b-4c8f-b6ae-9b96cac6316e\" (UID: \"027e795d-3d9b-4c8f-b6ae-9b96cac6316e\") " Mar 19 09:59:46 crc kubenswrapper[4835]: I0319 09:59:46.651301 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2t6pd\" (UniqueName: \"kubernetes.io/projected/027e795d-3d9b-4c8f-b6ae-9b96cac6316e-kube-api-access-2t6pd\") pod \"027e795d-3d9b-4c8f-b6ae-9b96cac6316e\" (UID: \"027e795d-3d9b-4c8f-b6ae-9b96cac6316e\") " Mar 19 09:59:46 crc kubenswrapper[4835]: I0319 09:59:46.651391 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/027e795d-3d9b-4c8f-b6ae-9b96cac6316e-ssh-key-openstack-edpm-ipam\") pod \"027e795d-3d9b-4c8f-b6ae-9b96cac6316e\" (UID: \"027e795d-3d9b-4c8f-b6ae-9b96cac6316e\") " Mar 19 09:59:46 crc kubenswrapper[4835]: I0319 09:59:46.659774 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/027e795d-3d9b-4c8f-b6ae-9b96cac6316e-kube-api-access-2t6pd" (OuterVolumeSpecName: "kube-api-access-2t6pd") pod "027e795d-3d9b-4c8f-b6ae-9b96cac6316e" (UID: "027e795d-3d9b-4c8f-b6ae-9b96cac6316e"). InnerVolumeSpecName "kube-api-access-2t6pd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:59:46 crc kubenswrapper[4835]: I0319 09:59:46.686845 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/027e795d-3d9b-4c8f-b6ae-9b96cac6316e-inventory" (OuterVolumeSpecName: "inventory") pod "027e795d-3d9b-4c8f-b6ae-9b96cac6316e" (UID: "027e795d-3d9b-4c8f-b6ae-9b96cac6316e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:59:46 crc kubenswrapper[4835]: I0319 09:59:46.698373 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/027e795d-3d9b-4c8f-b6ae-9b96cac6316e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "027e795d-3d9b-4c8f-b6ae-9b96cac6316e" (UID: "027e795d-3d9b-4c8f-b6ae-9b96cac6316e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:59:46 crc kubenswrapper[4835]: I0319 09:59:46.754538 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2t6pd\" (UniqueName: \"kubernetes.io/projected/027e795d-3d9b-4c8f-b6ae-9b96cac6316e-kube-api-access-2t6pd\") on node \"crc\" DevicePath \"\"" Mar 19 09:59:46 crc kubenswrapper[4835]: I0319 09:59:46.754579 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/027e795d-3d9b-4c8f-b6ae-9b96cac6316e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 09:59:46 crc kubenswrapper[4835]: I0319 09:59:46.754595 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/027e795d-3d9b-4c8f-b6ae-9b96cac6316e-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 09:59:47 crc kubenswrapper[4835]: I0319 09:59:47.015786 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p6865" event={"ID":"027e795d-3d9b-4c8f-b6ae-9b96cac6316e","Type":"ContainerDied","Data":"97a42474851118106dc45fb5e87c774e3e46a89e1f644f75b75796c782445aa4"} Mar 19 09:59:47 crc kubenswrapper[4835]: I0319 09:59:47.015824 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97a42474851118106dc45fb5e87c774e3e46a89e1f644f75b75796c782445aa4" Mar 19 09:59:47 crc kubenswrapper[4835]: I0319 09:59:47.015874 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-p6865" Mar 19 09:59:47 crc kubenswrapper[4835]: I0319 09:59:47.121098 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lpj7c"] Mar 19 09:59:47 crc kubenswrapper[4835]: E0319 09:59:47.121640 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98bb8349-9742-4836-abeb-3b81408e210f" containerName="extract-content" Mar 19 09:59:47 crc kubenswrapper[4835]: I0319 09:59:47.121668 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="98bb8349-9742-4836-abeb-3b81408e210f" containerName="extract-content" Mar 19 09:59:47 crc kubenswrapper[4835]: E0319 09:59:47.121714 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="027e795d-3d9b-4c8f-b6ae-9b96cac6316e" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 19 09:59:47 crc kubenswrapper[4835]: I0319 09:59:47.121728 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="027e795d-3d9b-4c8f-b6ae-9b96cac6316e" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 19 09:59:47 crc kubenswrapper[4835]: E0319 09:59:47.123677 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98bb8349-9742-4836-abeb-3b81408e210f" containerName="registry-server" Mar 19 09:59:47 crc kubenswrapper[4835]: I0319 09:59:47.123727 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="98bb8349-9742-4836-abeb-3b81408e210f" containerName="registry-server" Mar 19 09:59:47 crc kubenswrapper[4835]: E0319 09:59:47.123823 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98bb8349-9742-4836-abeb-3b81408e210f" containerName="extract-utilities" Mar 19 09:59:47 crc kubenswrapper[4835]: I0319 09:59:47.123840 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="98bb8349-9742-4836-abeb-3b81408e210f" containerName="extract-utilities" Mar 19 09:59:47 crc 
kubenswrapper[4835]: I0319 09:59:47.124333 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="027e795d-3d9b-4c8f-b6ae-9b96cac6316e" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 19 09:59:47 crc kubenswrapper[4835]: I0319 09:59:47.124407 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="98bb8349-9742-4836-abeb-3b81408e210f" containerName="registry-server" Mar 19 09:59:47 crc kubenswrapper[4835]: I0319 09:59:47.125553 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lpj7c" Mar 19 09:59:47 crc kubenswrapper[4835]: I0319 09:59:47.130272 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 09:59:47 crc kubenswrapper[4835]: I0319 09:59:47.135550 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ldz2g" Mar 19 09:59:47 crc kubenswrapper[4835]: I0319 09:59:47.135609 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 09:59:47 crc kubenswrapper[4835]: I0319 09:59:47.135809 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 09:59:47 crc kubenswrapper[4835]: I0319 09:59:47.140850 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lpj7c"] Mar 19 09:59:47 crc kubenswrapper[4835]: I0319 09:59:47.163287 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs4f2\" (UniqueName: \"kubernetes.io/projected/1530e76d-2520-4d3a-9aff-ef734194267d-kube-api-access-xs4f2\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lpj7c\" (UID: \"1530e76d-2520-4d3a-9aff-ef734194267d\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lpj7c" Mar 19 09:59:47 crc kubenswrapper[4835]: I0319 09:59:47.163424 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1530e76d-2520-4d3a-9aff-ef734194267d-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lpj7c\" (UID: \"1530e76d-2520-4d3a-9aff-ef734194267d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lpj7c" Mar 19 09:59:47 crc kubenswrapper[4835]: I0319 09:59:47.163482 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1530e76d-2520-4d3a-9aff-ef734194267d-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lpj7c\" (UID: \"1530e76d-2520-4d3a-9aff-ef734194267d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lpj7c" Mar 19 09:59:47 crc kubenswrapper[4835]: I0319 09:59:47.265525 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1530e76d-2520-4d3a-9aff-ef734194267d-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lpj7c\" (UID: \"1530e76d-2520-4d3a-9aff-ef734194267d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lpj7c" Mar 19 09:59:47 crc kubenswrapper[4835]: I0319 09:59:47.265730 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs4f2\" (UniqueName: \"kubernetes.io/projected/1530e76d-2520-4d3a-9aff-ef734194267d-kube-api-access-xs4f2\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lpj7c\" (UID: \"1530e76d-2520-4d3a-9aff-ef734194267d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lpj7c" Mar 19 09:59:47 crc kubenswrapper[4835]: 
I0319 09:59:47.265832 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1530e76d-2520-4d3a-9aff-ef734194267d-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lpj7c\" (UID: \"1530e76d-2520-4d3a-9aff-ef734194267d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lpj7c" Mar 19 09:59:47 crc kubenswrapper[4835]: I0319 09:59:47.270780 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1530e76d-2520-4d3a-9aff-ef734194267d-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lpj7c\" (UID: \"1530e76d-2520-4d3a-9aff-ef734194267d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lpj7c" Mar 19 09:59:47 crc kubenswrapper[4835]: I0319 09:59:47.271464 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1530e76d-2520-4d3a-9aff-ef734194267d-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lpj7c\" (UID: \"1530e76d-2520-4d3a-9aff-ef734194267d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lpj7c" Mar 19 09:59:47 crc kubenswrapper[4835]: I0319 09:59:47.283001 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs4f2\" (UniqueName: \"kubernetes.io/projected/1530e76d-2520-4d3a-9aff-ef734194267d-kube-api-access-xs4f2\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lpj7c\" (UID: \"1530e76d-2520-4d3a-9aff-ef734194267d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lpj7c" Mar 19 09:59:47 crc kubenswrapper[4835]: I0319 09:59:47.464290 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lpj7c" Mar 19 09:59:48 crc kubenswrapper[4835]: I0319 09:59:48.013658 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lpj7c"] Mar 19 09:59:48 crc kubenswrapper[4835]: I0319 09:59:48.031997 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lpj7c" event={"ID":"1530e76d-2520-4d3a-9aff-ef734194267d","Type":"ContainerStarted","Data":"02af9bddfd3372190836bcea28396809b85b28f62209e5e544ebe7ff1b2f079d"} Mar 19 09:59:49 crc kubenswrapper[4835]: I0319 09:59:49.073379 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lpj7c" event={"ID":"1530e76d-2520-4d3a-9aff-ef734194267d","Type":"ContainerStarted","Data":"b0ce92eb0e8d210f34f46025b2e93facfd66b56046ece8e5893f6087db0fe876"} Mar 19 09:59:49 crc kubenswrapper[4835]: I0319 09:59:49.108281 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lpj7c" podStartSLOduration=1.577354634 podStartE2EDuration="2.108261206s" podCreationTimestamp="2026-03-19 09:59:47 +0000 UTC" firstStartedPulling="2026-03-19 09:59:48.016271104 +0000 UTC m=+2242.864869691" lastFinishedPulling="2026-03-19 09:59:48.547177676 +0000 UTC m=+2243.395776263" observedRunningTime="2026-03-19 09:59:49.099560018 +0000 UTC m=+2243.948158605" watchObservedRunningTime="2026-03-19 09:59:49.108261206 +0000 UTC m=+2243.956859793" Mar 19 09:59:54 crc kubenswrapper[4835]: I0319 09:59:54.134391 4835 generic.go:334] "Generic (PLEG): container finished" podID="1530e76d-2520-4d3a-9aff-ef734194267d" containerID="b0ce92eb0e8d210f34f46025b2e93facfd66b56046ece8e5893f6087db0fe876" exitCode=0 Mar 19 09:59:54 crc kubenswrapper[4835]: I0319 09:59:54.134464 4835 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lpj7c" event={"ID":"1530e76d-2520-4d3a-9aff-ef734194267d","Type":"ContainerDied","Data":"b0ce92eb0e8d210f34f46025b2e93facfd66b56046ece8e5893f6087db0fe876"} Mar 19 09:59:54 crc kubenswrapper[4835]: I0319 09:59:54.938634 4835 scope.go:117] "RemoveContainer" containerID="62013a64e403873a78d0a883c182c65f16986079912450b96c7878082449b42c" Mar 19 09:59:54 crc kubenswrapper[4835]: I0319 09:59:54.972607 4835 scope.go:117] "RemoveContainer" containerID="e38fc7e1c8eceb13aaeca3ac1b819419c6c411847893dd642c51685b1de2119c" Mar 19 09:59:55 crc kubenswrapper[4835]: I0319 09:59:55.030676 4835 scope.go:117] "RemoveContainer" containerID="b3f18845241f81076d63e42367c4a9c57e07b759e305dfdddecbe577b512e3a5" Mar 19 09:59:55 crc kubenswrapper[4835]: I0319 09:59:55.090554 4835 scope.go:117] "RemoveContainer" containerID="776d6925a324fb465f8d28244bd55ae19a439e475e5f0aba8f8306aa99d0f8c8" Mar 19 09:59:55 crc kubenswrapper[4835]: I0319 09:59:55.145422 4835 scope.go:117] "RemoveContainer" containerID="39eca1cbcc12413b3409d31da7b6a1fa5e14b10da642233afc209e77be3fdea7" Mar 19 09:59:55 crc kubenswrapper[4835]: I0319 09:59:55.195308 4835 scope.go:117] "RemoveContainer" containerID="6d0fa28b880c3a0a7ce0ed4caec5d24109a7dfc5741ff7f60c5b3b2c46f62de6" Mar 19 09:59:55 crc kubenswrapper[4835]: I0319 09:59:55.656689 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lpj7c" Mar 19 09:59:55 crc kubenswrapper[4835]: I0319 09:59:55.699914 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1530e76d-2520-4d3a-9aff-ef734194267d-inventory\") pod \"1530e76d-2520-4d3a-9aff-ef734194267d\" (UID: \"1530e76d-2520-4d3a-9aff-ef734194267d\") " Mar 19 09:59:55 crc kubenswrapper[4835]: I0319 09:59:55.700112 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1530e76d-2520-4d3a-9aff-ef734194267d-ssh-key-openstack-edpm-ipam\") pod \"1530e76d-2520-4d3a-9aff-ef734194267d\" (UID: \"1530e76d-2520-4d3a-9aff-ef734194267d\") " Mar 19 09:59:55 crc kubenswrapper[4835]: I0319 09:59:55.700303 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs4f2\" (UniqueName: \"kubernetes.io/projected/1530e76d-2520-4d3a-9aff-ef734194267d-kube-api-access-xs4f2\") pod \"1530e76d-2520-4d3a-9aff-ef734194267d\" (UID: \"1530e76d-2520-4d3a-9aff-ef734194267d\") " Mar 19 09:59:55 crc kubenswrapper[4835]: I0319 09:59:55.705987 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1530e76d-2520-4d3a-9aff-ef734194267d-kube-api-access-xs4f2" (OuterVolumeSpecName: "kube-api-access-xs4f2") pod "1530e76d-2520-4d3a-9aff-ef734194267d" (UID: "1530e76d-2520-4d3a-9aff-ef734194267d"). InnerVolumeSpecName "kube-api-access-xs4f2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 09:59:55 crc kubenswrapper[4835]: I0319 09:59:55.738296 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1530e76d-2520-4d3a-9aff-ef734194267d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1530e76d-2520-4d3a-9aff-ef734194267d" (UID: "1530e76d-2520-4d3a-9aff-ef734194267d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:59:55 crc kubenswrapper[4835]: I0319 09:59:55.740402 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1530e76d-2520-4d3a-9aff-ef734194267d-inventory" (OuterVolumeSpecName: "inventory") pod "1530e76d-2520-4d3a-9aff-ef734194267d" (UID: "1530e76d-2520-4d3a-9aff-ef734194267d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 09:59:55 crc kubenswrapper[4835]: I0319 09:59:55.804658 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1530e76d-2520-4d3a-9aff-ef734194267d-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 09:59:55 crc kubenswrapper[4835]: I0319 09:59:55.804701 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1530e76d-2520-4d3a-9aff-ef734194267d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 09:59:55 crc kubenswrapper[4835]: I0319 09:59:55.804714 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs4f2\" (UniqueName: \"kubernetes.io/projected/1530e76d-2520-4d3a-9aff-ef734194267d-kube-api-access-xs4f2\") on node \"crc\" DevicePath \"\"" Mar 19 09:59:56 crc kubenswrapper[4835]: I0319 09:59:56.171930 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lpj7c" 
event={"ID":"1530e76d-2520-4d3a-9aff-ef734194267d","Type":"ContainerDied","Data":"02af9bddfd3372190836bcea28396809b85b28f62209e5e544ebe7ff1b2f079d"} Mar 19 09:59:56 crc kubenswrapper[4835]: I0319 09:59:56.171967 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02af9bddfd3372190836bcea28396809b85b28f62209e5e544ebe7ff1b2f079d" Mar 19 09:59:56 crc kubenswrapper[4835]: I0319 09:59:56.172015 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lpj7c" Mar 19 09:59:56 crc kubenswrapper[4835]: I0319 09:59:56.259558 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-qvj9t"] Mar 19 09:59:56 crc kubenswrapper[4835]: E0319 09:59:56.260238 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1530e76d-2520-4d3a-9aff-ef734194267d" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 19 09:59:56 crc kubenswrapper[4835]: I0319 09:59:56.260267 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1530e76d-2520-4d3a-9aff-ef734194267d" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 19 09:59:56 crc kubenswrapper[4835]: I0319 09:59:56.260579 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="1530e76d-2520-4d3a-9aff-ef734194267d" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 19 09:59:56 crc kubenswrapper[4835]: I0319 09:59:56.261662 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qvj9t" Mar 19 09:59:56 crc kubenswrapper[4835]: I0319 09:59:56.263925 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 09:59:56 crc kubenswrapper[4835]: I0319 09:59:56.264202 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ldz2g" Mar 19 09:59:56 crc kubenswrapper[4835]: I0319 09:59:56.264917 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 09:59:56 crc kubenswrapper[4835]: I0319 09:59:56.265233 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 09:59:56 crc kubenswrapper[4835]: I0319 09:59:56.288823 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-qvj9t"] Mar 19 09:59:56 crc kubenswrapper[4835]: I0319 09:59:56.315975 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d16f3bd5-80dd-4532-a60e-0a6d214ef185-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qvj9t\" (UID: \"d16f3bd5-80dd-4532-a60e-0a6d214ef185\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qvj9t" Mar 19 09:59:56 crc kubenswrapper[4835]: I0319 09:59:56.316036 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d16f3bd5-80dd-4532-a60e-0a6d214ef185-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qvj9t\" (UID: \"d16f3bd5-80dd-4532-a60e-0a6d214ef185\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qvj9t" Mar 19 09:59:56 crc kubenswrapper[4835]: I0319 09:59:56.316205 4835 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpvhz\" (UniqueName: \"kubernetes.io/projected/d16f3bd5-80dd-4532-a60e-0a6d214ef185-kube-api-access-mpvhz\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qvj9t\" (UID: \"d16f3bd5-80dd-4532-a60e-0a6d214ef185\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qvj9t" Mar 19 09:59:56 crc kubenswrapper[4835]: I0319 09:59:56.418279 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpvhz\" (UniqueName: \"kubernetes.io/projected/d16f3bd5-80dd-4532-a60e-0a6d214ef185-kube-api-access-mpvhz\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qvj9t\" (UID: \"d16f3bd5-80dd-4532-a60e-0a6d214ef185\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qvj9t" Mar 19 09:59:56 crc kubenswrapper[4835]: I0319 09:59:56.418833 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d16f3bd5-80dd-4532-a60e-0a6d214ef185-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qvj9t\" (UID: \"d16f3bd5-80dd-4532-a60e-0a6d214ef185\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qvj9t" Mar 19 09:59:56 crc kubenswrapper[4835]: I0319 09:59:56.418924 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d16f3bd5-80dd-4532-a60e-0a6d214ef185-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qvj9t\" (UID: \"d16f3bd5-80dd-4532-a60e-0a6d214ef185\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qvj9t" Mar 19 09:59:56 crc kubenswrapper[4835]: I0319 09:59:56.425895 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d16f3bd5-80dd-4532-a60e-0a6d214ef185-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-qvj9t\" (UID: \"d16f3bd5-80dd-4532-a60e-0a6d214ef185\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qvj9t" Mar 19 09:59:56 crc kubenswrapper[4835]: I0319 09:59:56.425907 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d16f3bd5-80dd-4532-a60e-0a6d214ef185-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qvj9t\" (UID: \"d16f3bd5-80dd-4532-a60e-0a6d214ef185\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qvj9t" Mar 19 09:59:56 crc kubenswrapper[4835]: I0319 09:59:56.434325 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpvhz\" (UniqueName: \"kubernetes.io/projected/d16f3bd5-80dd-4532-a60e-0a6d214ef185-kube-api-access-mpvhz\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qvj9t\" (UID: \"d16f3bd5-80dd-4532-a60e-0a6d214ef185\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qvj9t" Mar 19 09:59:56 crc kubenswrapper[4835]: I0319 09:59:56.583113 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qvj9t" Mar 19 09:59:57 crc kubenswrapper[4835]: I0319 09:59:57.205949 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-qvj9t"] Mar 19 09:59:58 crc kubenswrapper[4835]: I0319 09:59:58.209012 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qvj9t" event={"ID":"d16f3bd5-80dd-4532-a60e-0a6d214ef185","Type":"ContainerStarted","Data":"1b580a87dc51bc21e8783b7284787d35bb63ffc4de20f473b8b8f0a25c20a3a1"} Mar 19 09:59:58 crc kubenswrapper[4835]: I0319 09:59:58.209510 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qvj9t" event={"ID":"d16f3bd5-80dd-4532-a60e-0a6d214ef185","Type":"ContainerStarted","Data":"926eab125ff020acc3c7e771139d16c206135602c4b12ad4f89d050622fae6e6"} Mar 19 09:59:58 crc kubenswrapper[4835]: I0319 09:59:58.236255 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qvj9t" podStartSLOduration=1.796880815 podStartE2EDuration="2.236234328s" podCreationTimestamp="2026-03-19 09:59:56 +0000 UTC" firstStartedPulling="2026-03-19 09:59:57.213867713 +0000 UTC m=+2252.062466300" lastFinishedPulling="2026-03-19 09:59:57.653221236 +0000 UTC m=+2252.501819813" observedRunningTime="2026-03-19 09:59:58.225865943 +0000 UTC m=+2253.074464530" watchObservedRunningTime="2026-03-19 09:59:58.236234328 +0000 UTC m=+2253.084832915" Mar 19 09:59:59 crc kubenswrapper[4835]: I0319 09:59:59.044969 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-vx7t2"] Mar 19 09:59:59 crc kubenswrapper[4835]: I0319 09:59:59.058287 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-vx7t2"] Mar 19 10:00:00 crc kubenswrapper[4835]: I0319 10:00:00.035890 4835 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-c643-account-create-update-rvjfw"] Mar 19 10:00:00 crc kubenswrapper[4835]: I0319 10:00:00.052923 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-c643-account-create-update-rvjfw"] Mar 19 10:00:00 crc kubenswrapper[4835]: I0319 10:00:00.132921 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565240-rvtrz"] Mar 19 10:00:00 crc kubenswrapper[4835]: I0319 10:00:00.134583 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565240-rvtrz" Mar 19 10:00:00 crc kubenswrapper[4835]: I0319 10:00:00.138532 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:00:00 crc kubenswrapper[4835]: I0319 10:00:00.138537 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:00:00 crc kubenswrapper[4835]: I0319 10:00:00.139086 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 10:00:00 crc kubenswrapper[4835]: I0319 10:00:00.142755 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565240-88tjf"] Mar 19 10:00:00 crc kubenswrapper[4835]: I0319 10:00:00.144663 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565240-88tjf" Mar 19 10:00:00 crc kubenswrapper[4835]: I0319 10:00:00.149846 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 10:00:00 crc kubenswrapper[4835]: I0319 10:00:00.150371 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 10:00:00 crc kubenswrapper[4835]: I0319 10:00:00.153085 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565240-rvtrz"] Mar 19 10:00:00 crc kubenswrapper[4835]: I0319 10:00:00.164252 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565240-88tjf"] Mar 19 10:00:00 crc kubenswrapper[4835]: I0319 10:00:00.229935 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87cz9\" (UniqueName: \"kubernetes.io/projected/49cc6c72-e73e-47a0-9f4d-f66f7a47bf48-kube-api-access-87cz9\") pod \"auto-csr-approver-29565240-rvtrz\" (UID: \"49cc6c72-e73e-47a0-9f4d-f66f7a47bf48\") " pod="openshift-infra/auto-csr-approver-29565240-rvtrz" Mar 19 10:00:00 crc kubenswrapper[4835]: I0319 10:00:00.229995 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9cb21a9-c134-49a3-b8b1-c9289e9a52a2-secret-volume\") pod \"collect-profiles-29565240-88tjf\" (UID: \"c9cb21a9-c134-49a3-b8b1-c9289e9a52a2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565240-88tjf" Mar 19 10:00:00 crc kubenswrapper[4835]: I0319 10:00:00.230103 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/c9cb21a9-c134-49a3-b8b1-c9289e9a52a2-config-volume\") pod \"collect-profiles-29565240-88tjf\" (UID: \"c9cb21a9-c134-49a3-b8b1-c9289e9a52a2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565240-88tjf" Mar 19 10:00:00 crc kubenswrapper[4835]: I0319 10:00:00.230144 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqk5c\" (UniqueName: \"kubernetes.io/projected/c9cb21a9-c134-49a3-b8b1-c9289e9a52a2-kube-api-access-xqk5c\") pod \"collect-profiles-29565240-88tjf\" (UID: \"c9cb21a9-c134-49a3-b8b1-c9289e9a52a2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565240-88tjf" Mar 19 10:00:00 crc kubenswrapper[4835]: I0319 10:00:00.333148 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87cz9\" (UniqueName: \"kubernetes.io/projected/49cc6c72-e73e-47a0-9f4d-f66f7a47bf48-kube-api-access-87cz9\") pod \"auto-csr-approver-29565240-rvtrz\" (UID: \"49cc6c72-e73e-47a0-9f4d-f66f7a47bf48\") " pod="openshift-infra/auto-csr-approver-29565240-rvtrz" Mar 19 10:00:00 crc kubenswrapper[4835]: I0319 10:00:00.333197 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9cb21a9-c134-49a3-b8b1-c9289e9a52a2-secret-volume\") pod \"collect-profiles-29565240-88tjf\" (UID: \"c9cb21a9-c134-49a3-b8b1-c9289e9a52a2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565240-88tjf" Mar 19 10:00:00 crc kubenswrapper[4835]: I0319 10:00:00.333272 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9cb21a9-c134-49a3-b8b1-c9289e9a52a2-config-volume\") pod \"collect-profiles-29565240-88tjf\" (UID: \"c9cb21a9-c134-49a3-b8b1-c9289e9a52a2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565240-88tjf" Mar 19 10:00:00 crc kubenswrapper[4835]: 
I0319 10:00:00.333297 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqk5c\" (UniqueName: \"kubernetes.io/projected/c9cb21a9-c134-49a3-b8b1-c9289e9a52a2-kube-api-access-xqk5c\") pod \"collect-profiles-29565240-88tjf\" (UID: \"c9cb21a9-c134-49a3-b8b1-c9289e9a52a2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565240-88tjf" Mar 19 10:00:00 crc kubenswrapper[4835]: I0319 10:00:00.335500 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9cb21a9-c134-49a3-b8b1-c9289e9a52a2-config-volume\") pod \"collect-profiles-29565240-88tjf\" (UID: \"c9cb21a9-c134-49a3-b8b1-c9289e9a52a2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565240-88tjf" Mar 19 10:00:00 crc kubenswrapper[4835]: I0319 10:00:00.338875 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9cb21a9-c134-49a3-b8b1-c9289e9a52a2-secret-volume\") pod \"collect-profiles-29565240-88tjf\" (UID: \"c9cb21a9-c134-49a3-b8b1-c9289e9a52a2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565240-88tjf" Mar 19 10:00:00 crc kubenswrapper[4835]: I0319 10:00:00.348655 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqk5c\" (UniqueName: \"kubernetes.io/projected/c9cb21a9-c134-49a3-b8b1-c9289e9a52a2-kube-api-access-xqk5c\") pod \"collect-profiles-29565240-88tjf\" (UID: \"c9cb21a9-c134-49a3-b8b1-c9289e9a52a2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565240-88tjf" Mar 19 10:00:00 crc kubenswrapper[4835]: I0319 10:00:00.352955 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87cz9\" (UniqueName: \"kubernetes.io/projected/49cc6c72-e73e-47a0-9f4d-f66f7a47bf48-kube-api-access-87cz9\") pod \"auto-csr-approver-29565240-rvtrz\" (UID: \"49cc6c72-e73e-47a0-9f4d-f66f7a47bf48\") " 
pod="openshift-infra/auto-csr-approver-29565240-rvtrz" Mar 19 10:00:00 crc kubenswrapper[4835]: I0319 10:00:00.417597 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2740571b-8f0c-4e05-a01a-659337fd2d6e" path="/var/lib/kubelet/pods/2740571b-8f0c-4e05-a01a-659337fd2d6e/volumes" Mar 19 10:00:00 crc kubenswrapper[4835]: I0319 10:00:00.419507 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2715f44-163f-4954-aa37-86a59c886a56" path="/var/lib/kubelet/pods/c2715f44-163f-4954-aa37-86a59c886a56/volumes" Mar 19 10:00:00 crc kubenswrapper[4835]: I0319 10:00:00.458311 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565240-rvtrz" Mar 19 10:00:00 crc kubenswrapper[4835]: I0319 10:00:00.474258 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565240-88tjf" Mar 19 10:00:00 crc kubenswrapper[4835]: I0319 10:00:00.950391 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565240-rvtrz"] Mar 19 10:00:01 crc kubenswrapper[4835]: I0319 10:00:01.084048 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565240-88tjf"] Mar 19 10:00:01 crc kubenswrapper[4835]: W0319 10:00:01.086302 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9cb21a9_c134_49a3_b8b1_c9289e9a52a2.slice/crio-d42e44cd2f0a0f0c35f35e933e3f02cb88143b0e3b3566efaac3a008c5cfbdfb WatchSource:0}: Error finding container d42e44cd2f0a0f0c35f35e933e3f02cb88143b0e3b3566efaac3a008c5cfbdfb: Status 404 returned error can't find the container with id d42e44cd2f0a0f0c35f35e933e3f02cb88143b0e3b3566efaac3a008c5cfbdfb Mar 19 10:00:01 crc kubenswrapper[4835]: I0319 10:00:01.241224 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29565240-88tjf" event={"ID":"c9cb21a9-c134-49a3-b8b1-c9289e9a52a2","Type":"ContainerStarted","Data":"d42e44cd2f0a0f0c35f35e933e3f02cb88143b0e3b3566efaac3a008c5cfbdfb"} Mar 19 10:00:01 crc kubenswrapper[4835]: I0319 10:00:01.242726 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565240-rvtrz" event={"ID":"49cc6c72-e73e-47a0-9f4d-f66f7a47bf48","Type":"ContainerStarted","Data":"1ca94fadb73f90adaf8d5cf5c5aab4b39df608fa5b7d4a5e7e6f42d9666f5a7c"} Mar 19 10:00:02 crc kubenswrapper[4835]: I0319 10:00:02.033219 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qfttg"] Mar 19 10:00:02 crc kubenswrapper[4835]: I0319 10:00:02.045580 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qfttg"] Mar 19 10:00:02 crc kubenswrapper[4835]: I0319 10:00:02.257277 4835 generic.go:334] "Generic (PLEG): container finished" podID="c9cb21a9-c134-49a3-b8b1-c9289e9a52a2" containerID="a1c8561e2c3cc7d94620750aac17b05cfdad1945bd793b6ea0f0dbe435d81f05" exitCode=0 Mar 19 10:00:02 crc kubenswrapper[4835]: I0319 10:00:02.257356 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565240-88tjf" event={"ID":"c9cb21a9-c134-49a3-b8b1-c9289e9a52a2","Type":"ContainerDied","Data":"a1c8561e2c3cc7d94620750aac17b05cfdad1945bd793b6ea0f0dbe435d81f05"} Mar 19 10:00:02 crc kubenswrapper[4835]: I0319 10:00:02.421403 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda" path="/var/lib/kubelet/pods/2c7b7fbe-fa2c-43bd-b0a4-2027fff59bda/volumes" Mar 19 10:00:03 crc kubenswrapper[4835]: I0319 10:00:03.799733 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565240-88tjf" Mar 19 10:00:03 crc kubenswrapper[4835]: I0319 10:00:03.831244 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9cb21a9-c134-49a3-b8b1-c9289e9a52a2-secret-volume\") pod \"c9cb21a9-c134-49a3-b8b1-c9289e9a52a2\" (UID: \"c9cb21a9-c134-49a3-b8b1-c9289e9a52a2\") " Mar 19 10:00:03 crc kubenswrapper[4835]: I0319 10:00:03.831522 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9cb21a9-c134-49a3-b8b1-c9289e9a52a2-config-volume\") pod \"c9cb21a9-c134-49a3-b8b1-c9289e9a52a2\" (UID: \"c9cb21a9-c134-49a3-b8b1-c9289e9a52a2\") " Mar 19 10:00:03 crc kubenswrapper[4835]: I0319 10:00:03.831575 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqk5c\" (UniqueName: \"kubernetes.io/projected/c9cb21a9-c134-49a3-b8b1-c9289e9a52a2-kube-api-access-xqk5c\") pod \"c9cb21a9-c134-49a3-b8b1-c9289e9a52a2\" (UID: \"c9cb21a9-c134-49a3-b8b1-c9289e9a52a2\") " Mar 19 10:00:03 crc kubenswrapper[4835]: I0319 10:00:03.832413 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9cb21a9-c134-49a3-b8b1-c9289e9a52a2-config-volume" (OuterVolumeSpecName: "config-volume") pod "c9cb21a9-c134-49a3-b8b1-c9289e9a52a2" (UID: "c9cb21a9-c134-49a3-b8b1-c9289e9a52a2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:00:03 crc kubenswrapper[4835]: I0319 10:00:03.838273 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9cb21a9-c134-49a3-b8b1-c9289e9a52a2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c9cb21a9-c134-49a3-b8b1-c9289e9a52a2" (UID: "c9cb21a9-c134-49a3-b8b1-c9289e9a52a2"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:00:03 crc kubenswrapper[4835]: I0319 10:00:03.838677 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9cb21a9-c134-49a3-b8b1-c9289e9a52a2-kube-api-access-xqk5c" (OuterVolumeSpecName: "kube-api-access-xqk5c") pod "c9cb21a9-c134-49a3-b8b1-c9289e9a52a2" (UID: "c9cb21a9-c134-49a3-b8b1-c9289e9a52a2"). InnerVolumeSpecName "kube-api-access-xqk5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:00:03 crc kubenswrapper[4835]: I0319 10:00:03.934706 4835 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9cb21a9-c134-49a3-b8b1-c9289e9a52a2-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 10:00:03 crc kubenswrapper[4835]: I0319 10:00:03.934749 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqk5c\" (UniqueName: \"kubernetes.io/projected/c9cb21a9-c134-49a3-b8b1-c9289e9a52a2-kube-api-access-xqk5c\") on node \"crc\" DevicePath \"\"" Mar 19 10:00:03 crc kubenswrapper[4835]: I0319 10:00:03.934778 4835 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9cb21a9-c134-49a3-b8b1-c9289e9a52a2-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 10:00:04 crc kubenswrapper[4835]: I0319 10:00:04.279865 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565240-88tjf" event={"ID":"c9cb21a9-c134-49a3-b8b1-c9289e9a52a2","Type":"ContainerDied","Data":"d42e44cd2f0a0f0c35f35e933e3f02cb88143b0e3b3566efaac3a008c5cfbdfb"} Mar 19 10:00:04 crc kubenswrapper[4835]: I0319 10:00:04.280190 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d42e44cd2f0a0f0c35f35e933e3f02cb88143b0e3b3566efaac3a008c5cfbdfb" Mar 19 10:00:04 crc kubenswrapper[4835]: I0319 10:00:04.279924 4835 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565240-88tjf" Mar 19 10:00:04 crc kubenswrapper[4835]: I0319 10:00:04.877530 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565195-xtph2"] Mar 19 10:00:04 crc kubenswrapper[4835]: I0319 10:00:04.891032 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565195-xtph2"] Mar 19 10:00:06 crc kubenswrapper[4835]: I0319 10:00:06.420538 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23206b12-5e7f-4ce8-aa17-7eac028bd9fe" path="/var/lib/kubelet/pods/23206b12-5e7f-4ce8-aa17-7eac028bd9fe/volumes" Mar 19 10:00:06 crc kubenswrapper[4835]: I0319 10:00:06.422084 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:00:06 crc kubenswrapper[4835]: I0319 10:00:06.422147 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:00:17 crc kubenswrapper[4835]: I0319 10:00:17.414460 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565240-rvtrz" event={"ID":"49cc6c72-e73e-47a0-9f4d-f66f7a47bf48","Type":"ContainerStarted","Data":"f5ddcb8280af6869e8aab956ca92dccdfbe1e9ecd1410e663126b231e0bd8dcd"} Mar 19 10:00:17 crc kubenswrapper[4835]: I0319 10:00:17.438375 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-infra/auto-csr-approver-29565240-rvtrz" podStartSLOduration=1.332967724 podStartE2EDuration="17.438357395s" podCreationTimestamp="2026-03-19 10:00:00 +0000 UTC" firstStartedPulling="2026-03-19 10:00:00.951573869 +0000 UTC m=+2255.800172446" lastFinishedPulling="2026-03-19 10:00:17.05696351 +0000 UTC m=+2271.905562117" observedRunningTime="2026-03-19 10:00:17.429284347 +0000 UTC m=+2272.277882934" watchObservedRunningTime="2026-03-19 10:00:17.438357395 +0000 UTC m=+2272.286955972" Mar 19 10:00:18 crc kubenswrapper[4835]: I0319 10:00:18.436019 4835 generic.go:334] "Generic (PLEG): container finished" podID="49cc6c72-e73e-47a0-9f4d-f66f7a47bf48" containerID="f5ddcb8280af6869e8aab956ca92dccdfbe1e9ecd1410e663126b231e0bd8dcd" exitCode=0 Mar 19 10:00:18 crc kubenswrapper[4835]: I0319 10:00:18.436181 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565240-rvtrz" event={"ID":"49cc6c72-e73e-47a0-9f4d-f66f7a47bf48","Type":"ContainerDied","Data":"f5ddcb8280af6869e8aab956ca92dccdfbe1e9ecd1410e663126b231e0bd8dcd"} Mar 19 10:00:20 crc kubenswrapper[4835]: I0319 10:00:20.149735 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565240-rvtrz" Mar 19 10:00:20 crc kubenswrapper[4835]: I0319 10:00:20.280569 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87cz9\" (UniqueName: \"kubernetes.io/projected/49cc6c72-e73e-47a0-9f4d-f66f7a47bf48-kube-api-access-87cz9\") pod \"49cc6c72-e73e-47a0-9f4d-f66f7a47bf48\" (UID: \"49cc6c72-e73e-47a0-9f4d-f66f7a47bf48\") " Mar 19 10:00:20 crc kubenswrapper[4835]: I0319 10:00:20.287519 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49cc6c72-e73e-47a0-9f4d-f66f7a47bf48-kube-api-access-87cz9" (OuterVolumeSpecName: "kube-api-access-87cz9") pod "49cc6c72-e73e-47a0-9f4d-f66f7a47bf48" (UID: "49cc6c72-e73e-47a0-9f4d-f66f7a47bf48"). InnerVolumeSpecName "kube-api-access-87cz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:00:20 crc kubenswrapper[4835]: I0319 10:00:20.384173 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87cz9\" (UniqueName: \"kubernetes.io/projected/49cc6c72-e73e-47a0-9f4d-f66f7a47bf48-kube-api-access-87cz9\") on node \"crc\" DevicePath \"\"" Mar 19 10:00:20 crc kubenswrapper[4835]: I0319 10:00:20.458902 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565240-rvtrz" event={"ID":"49cc6c72-e73e-47a0-9f4d-f66f7a47bf48","Type":"ContainerDied","Data":"1ca94fadb73f90adaf8d5cf5c5aab4b39df608fa5b7d4a5e7e6f42d9666f5a7c"} Mar 19 10:00:20 crc kubenswrapper[4835]: I0319 10:00:20.458966 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ca94fadb73f90adaf8d5cf5c5aab4b39df608fa5b7d4a5e7e6f42d9666f5a7c" Mar 19 10:00:20 crc kubenswrapper[4835]: I0319 10:00:20.459050 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565240-rvtrz" Mar 19 10:00:20 crc kubenswrapper[4835]: I0319 10:00:20.523108 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565234-jlc9r"] Mar 19 10:00:20 crc kubenswrapper[4835]: I0319 10:00:20.537675 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565234-jlc9r"] Mar 19 10:00:22 crc kubenswrapper[4835]: I0319 10:00:22.054386 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-q8bz6"] Mar 19 10:00:22 crc kubenswrapper[4835]: I0319 10:00:22.063247 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-q8bz6"] Mar 19 10:00:22 crc kubenswrapper[4835]: I0319 10:00:22.417620 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="385f4a9d-4a91-4e88-a364-9f54b2a1a1cb" path="/var/lib/kubelet/pods/385f4a9d-4a91-4e88-a364-9f54b2a1a1cb/volumes" Mar 19 10:00:22 crc kubenswrapper[4835]: I0319 10:00:22.418788 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a19ce6a-83f1-4add-be81-3d5328c2c8fd" path="/var/lib/kubelet/pods/7a19ce6a-83f1-4add-be81-3d5328c2c8fd/volumes" Mar 19 10:00:22 crc kubenswrapper[4835]: I0319 10:00:22.659068 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8ttf4"] Mar 19 10:00:22 crc kubenswrapper[4835]: E0319 10:00:22.659570 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9cb21a9-c134-49a3-b8b1-c9289e9a52a2" containerName="collect-profiles" Mar 19 10:00:22 crc kubenswrapper[4835]: I0319 10:00:22.659590 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9cb21a9-c134-49a3-b8b1-c9289e9a52a2" containerName="collect-profiles" Mar 19 10:00:22 crc kubenswrapper[4835]: E0319 10:00:22.659634 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49cc6c72-e73e-47a0-9f4d-f66f7a47bf48" 
containerName="oc" Mar 19 10:00:22 crc kubenswrapper[4835]: I0319 10:00:22.659641 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="49cc6c72-e73e-47a0-9f4d-f66f7a47bf48" containerName="oc" Mar 19 10:00:22 crc kubenswrapper[4835]: I0319 10:00:22.659927 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9cb21a9-c134-49a3-b8b1-c9289e9a52a2" containerName="collect-profiles" Mar 19 10:00:22 crc kubenswrapper[4835]: I0319 10:00:22.659951 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="49cc6c72-e73e-47a0-9f4d-f66f7a47bf48" containerName="oc" Mar 19 10:00:22 crc kubenswrapper[4835]: I0319 10:00:22.661901 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8ttf4" Mar 19 10:00:22 crc kubenswrapper[4835]: I0319 10:00:22.687032 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8ttf4"] Mar 19 10:00:22 crc kubenswrapper[4835]: I0319 10:00:22.743859 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vrxf\" (UniqueName: \"kubernetes.io/projected/7eb941ce-1892-446a-b619-7d04b2e962a4-kube-api-access-6vrxf\") pod \"redhat-marketplace-8ttf4\" (UID: \"7eb941ce-1892-446a-b619-7d04b2e962a4\") " pod="openshift-marketplace/redhat-marketplace-8ttf4" Mar 19 10:00:22 crc kubenswrapper[4835]: I0319 10:00:22.744065 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7eb941ce-1892-446a-b619-7d04b2e962a4-utilities\") pod \"redhat-marketplace-8ttf4\" (UID: \"7eb941ce-1892-446a-b619-7d04b2e962a4\") " pod="openshift-marketplace/redhat-marketplace-8ttf4" Mar 19 10:00:22 crc kubenswrapper[4835]: I0319 10:00:22.744097 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/7eb941ce-1892-446a-b619-7d04b2e962a4-catalog-content\") pod \"redhat-marketplace-8ttf4\" (UID: \"7eb941ce-1892-446a-b619-7d04b2e962a4\") " pod="openshift-marketplace/redhat-marketplace-8ttf4" Mar 19 10:00:22 crc kubenswrapper[4835]: I0319 10:00:22.846776 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7eb941ce-1892-446a-b619-7d04b2e962a4-utilities\") pod \"redhat-marketplace-8ttf4\" (UID: \"7eb941ce-1892-446a-b619-7d04b2e962a4\") " pod="openshift-marketplace/redhat-marketplace-8ttf4" Mar 19 10:00:22 crc kubenswrapper[4835]: I0319 10:00:22.846836 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7eb941ce-1892-446a-b619-7d04b2e962a4-catalog-content\") pod \"redhat-marketplace-8ttf4\" (UID: \"7eb941ce-1892-446a-b619-7d04b2e962a4\") " pod="openshift-marketplace/redhat-marketplace-8ttf4" Mar 19 10:00:22 crc kubenswrapper[4835]: I0319 10:00:22.847063 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vrxf\" (UniqueName: \"kubernetes.io/projected/7eb941ce-1892-446a-b619-7d04b2e962a4-kube-api-access-6vrxf\") pod \"redhat-marketplace-8ttf4\" (UID: \"7eb941ce-1892-446a-b619-7d04b2e962a4\") " pod="openshift-marketplace/redhat-marketplace-8ttf4" Mar 19 10:00:22 crc kubenswrapper[4835]: I0319 10:00:22.847294 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7eb941ce-1892-446a-b619-7d04b2e962a4-utilities\") pod \"redhat-marketplace-8ttf4\" (UID: \"7eb941ce-1892-446a-b619-7d04b2e962a4\") " pod="openshift-marketplace/redhat-marketplace-8ttf4" Mar 19 10:00:22 crc kubenswrapper[4835]: I0319 10:00:22.847589 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/7eb941ce-1892-446a-b619-7d04b2e962a4-catalog-content\") pod \"redhat-marketplace-8ttf4\" (UID: \"7eb941ce-1892-446a-b619-7d04b2e962a4\") " pod="openshift-marketplace/redhat-marketplace-8ttf4" Mar 19 10:00:22 crc kubenswrapper[4835]: I0319 10:00:22.867469 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vrxf\" (UniqueName: \"kubernetes.io/projected/7eb941ce-1892-446a-b619-7d04b2e962a4-kube-api-access-6vrxf\") pod \"redhat-marketplace-8ttf4\" (UID: \"7eb941ce-1892-446a-b619-7d04b2e962a4\") " pod="openshift-marketplace/redhat-marketplace-8ttf4" Mar 19 10:00:22 crc kubenswrapper[4835]: I0319 10:00:22.996180 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8ttf4" Mar 19 10:00:23 crc kubenswrapper[4835]: W0319 10:00:23.553329 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7eb941ce_1892_446a_b619_7d04b2e962a4.slice/crio-9df4009e69a0c087bbfab73be765a705f6afa7a57f555111f3d31fb927f43db9 WatchSource:0}: Error finding container 9df4009e69a0c087bbfab73be765a705f6afa7a57f555111f3d31fb927f43db9: Status 404 returned error can't find the container with id 9df4009e69a0c087bbfab73be765a705f6afa7a57f555111f3d31fb927f43db9 Mar 19 10:00:23 crc kubenswrapper[4835]: I0319 10:00:23.553653 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8ttf4"] Mar 19 10:00:24 crc kubenswrapper[4835]: I0319 10:00:24.521262 4835 generic.go:334] "Generic (PLEG): container finished" podID="7eb941ce-1892-446a-b619-7d04b2e962a4" containerID="9220fa03ce972ed3fb6b755df5df6e9eeeda7d43deb08d6ac304f9fb7ecc410d" exitCode=0 Mar 19 10:00:24 crc kubenswrapper[4835]: I0319 10:00:24.521447 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8ttf4" 
event={"ID":"7eb941ce-1892-446a-b619-7d04b2e962a4","Type":"ContainerDied","Data":"9220fa03ce972ed3fb6b755df5df6e9eeeda7d43deb08d6ac304f9fb7ecc410d"} Mar 19 10:00:24 crc kubenswrapper[4835]: I0319 10:00:24.521830 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8ttf4" event={"ID":"7eb941ce-1892-446a-b619-7d04b2e962a4","Type":"ContainerStarted","Data":"9df4009e69a0c087bbfab73be765a705f6afa7a57f555111f3d31fb927f43db9"} Mar 19 10:00:26 crc kubenswrapper[4835]: I0319 10:00:26.045221 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-d85wk"] Mar 19 10:00:26 crc kubenswrapper[4835]: I0319 10:00:26.060005 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-d85wk"] Mar 19 10:00:26 crc kubenswrapper[4835]: I0319 10:00:26.431019 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bad8f4bf-8deb-49f9-9bbc-a72db22918cd" path="/var/lib/kubelet/pods/bad8f4bf-8deb-49f9-9bbc-a72db22918cd/volumes" Mar 19 10:00:26 crc kubenswrapper[4835]: I0319 10:00:26.548232 4835 generic.go:334] "Generic (PLEG): container finished" podID="7eb941ce-1892-446a-b619-7d04b2e962a4" containerID="452f47f00e8cf2132d6cc350845f84eb00761fedf0baf54a14d602e5d68870c1" exitCode=0 Mar 19 10:00:26 crc kubenswrapper[4835]: I0319 10:00:26.548288 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8ttf4" event={"ID":"7eb941ce-1892-446a-b619-7d04b2e962a4","Type":"ContainerDied","Data":"452f47f00e8cf2132d6cc350845f84eb00761fedf0baf54a14d602e5d68870c1"} Mar 19 10:00:28 crc kubenswrapper[4835]: I0319 10:00:28.571779 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8ttf4" event={"ID":"7eb941ce-1892-446a-b619-7d04b2e962a4","Type":"ContainerStarted","Data":"40624b2553b9d5626386a3a216b73b3a5e7bcbcbbf91e63d9deb1da67b774371"} Mar 19 10:00:28 crc 
kubenswrapper[4835]: I0319 10:00:28.593246 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8ttf4" podStartSLOduration=3.525483094 podStartE2EDuration="6.593220715s" podCreationTimestamp="2026-03-19 10:00:22 +0000 UTC" firstStartedPulling="2026-03-19 10:00:24.523154529 +0000 UTC m=+2279.371753116" lastFinishedPulling="2026-03-19 10:00:27.59089215 +0000 UTC m=+2282.439490737" observedRunningTime="2026-03-19 10:00:28.590329236 +0000 UTC m=+2283.438927813" watchObservedRunningTime="2026-03-19 10:00:28.593220715 +0000 UTC m=+2283.441819302" Mar 19 10:00:32 crc kubenswrapper[4835]: I0319 10:00:32.997314 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8ttf4" Mar 19 10:00:32 crc kubenswrapper[4835]: I0319 10:00:32.998282 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8ttf4" Mar 19 10:00:33 crc kubenswrapper[4835]: I0319 10:00:33.062499 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8ttf4" Mar 19 10:00:33 crc kubenswrapper[4835]: I0319 10:00:33.692013 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8ttf4" Mar 19 10:00:33 crc kubenswrapper[4835]: I0319 10:00:33.745981 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8ttf4"] Mar 19 10:00:34 crc kubenswrapper[4835]: I0319 10:00:34.642661 4835 generic.go:334] "Generic (PLEG): container finished" podID="d16f3bd5-80dd-4532-a60e-0a6d214ef185" containerID="1b580a87dc51bc21e8783b7284787d35bb63ffc4de20f473b8b8f0a25c20a3a1" exitCode=0 Mar 19 10:00:34 crc kubenswrapper[4835]: I0319 10:00:34.642793 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qvj9t" 
event={"ID":"d16f3bd5-80dd-4532-a60e-0a6d214ef185","Type":"ContainerDied","Data":"1b580a87dc51bc21e8783b7284787d35bb63ffc4de20f473b8b8f0a25c20a3a1"} Mar 19 10:00:35 crc kubenswrapper[4835]: I0319 10:00:35.659570 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8ttf4" podUID="7eb941ce-1892-446a-b619-7d04b2e962a4" containerName="registry-server" containerID="cri-o://40624b2553b9d5626386a3a216b73b3a5e7bcbcbbf91e63d9deb1da67b774371" gracePeriod=2 Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.341770 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qvj9t" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.349038 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8ttf4" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.406230 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7eb941ce-1892-446a-b619-7d04b2e962a4-catalog-content\") pod \"7eb941ce-1892-446a-b619-7d04b2e962a4\" (UID: \"7eb941ce-1892-446a-b619-7d04b2e962a4\") " Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.406308 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d16f3bd5-80dd-4532-a60e-0a6d214ef185-inventory\") pod \"d16f3bd5-80dd-4532-a60e-0a6d214ef185\" (UID: \"d16f3bd5-80dd-4532-a60e-0a6d214ef185\") " Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.406457 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vrxf\" (UniqueName: \"kubernetes.io/projected/7eb941ce-1892-446a-b619-7d04b2e962a4-kube-api-access-6vrxf\") pod \"7eb941ce-1892-446a-b619-7d04b2e962a4\" (UID: \"7eb941ce-1892-446a-b619-7d04b2e962a4\") " Mar 
19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.406478 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7eb941ce-1892-446a-b619-7d04b2e962a4-utilities\") pod \"7eb941ce-1892-446a-b619-7d04b2e962a4\" (UID: \"7eb941ce-1892-446a-b619-7d04b2e962a4\") " Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.406516 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d16f3bd5-80dd-4532-a60e-0a6d214ef185-ssh-key-openstack-edpm-ipam\") pod \"d16f3bd5-80dd-4532-a60e-0a6d214ef185\" (UID: \"d16f3bd5-80dd-4532-a60e-0a6d214ef185\") " Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.406600 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpvhz\" (UniqueName: \"kubernetes.io/projected/d16f3bd5-80dd-4532-a60e-0a6d214ef185-kube-api-access-mpvhz\") pod \"d16f3bd5-80dd-4532-a60e-0a6d214ef185\" (UID: \"d16f3bd5-80dd-4532-a60e-0a6d214ef185\") " Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.407971 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7eb941ce-1892-446a-b619-7d04b2e962a4-utilities" (OuterVolumeSpecName: "utilities") pod "7eb941ce-1892-446a-b619-7d04b2e962a4" (UID: "7eb941ce-1892-446a-b619-7d04b2e962a4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.415695 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d16f3bd5-80dd-4532-a60e-0a6d214ef185-kube-api-access-mpvhz" (OuterVolumeSpecName: "kube-api-access-mpvhz") pod "d16f3bd5-80dd-4532-a60e-0a6d214ef185" (UID: "d16f3bd5-80dd-4532-a60e-0a6d214ef185"). InnerVolumeSpecName "kube-api-access-mpvhz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.415799 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7eb941ce-1892-446a-b619-7d04b2e962a4-kube-api-access-6vrxf" (OuterVolumeSpecName: "kube-api-access-6vrxf") pod "7eb941ce-1892-446a-b619-7d04b2e962a4" (UID: "7eb941ce-1892-446a-b619-7d04b2e962a4"). InnerVolumeSpecName "kube-api-access-6vrxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.423779 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.423890 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.464097 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7eb941ce-1892-446a-b619-7d04b2e962a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7eb941ce-1892-446a-b619-7d04b2e962a4" (UID: "7eb941ce-1892-446a-b619-7d04b2e962a4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.488919 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d16f3bd5-80dd-4532-a60e-0a6d214ef185-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d16f3bd5-80dd-4532-a60e-0a6d214ef185" (UID: "d16f3bd5-80dd-4532-a60e-0a6d214ef185"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.517124 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7eb941ce-1892-446a-b619-7d04b2e962a4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.517164 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vrxf\" (UniqueName: \"kubernetes.io/projected/7eb941ce-1892-446a-b619-7d04b2e962a4-kube-api-access-6vrxf\") on node \"crc\" DevicePath \"\"" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.517178 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7eb941ce-1892-446a-b619-7d04b2e962a4-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.517189 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d16f3bd5-80dd-4532-a60e-0a6d214ef185-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.517200 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpvhz\" (UniqueName: \"kubernetes.io/projected/d16f3bd5-80dd-4532-a60e-0a6d214ef185-kube-api-access-mpvhz\") on node \"crc\" DevicePath \"\"" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.531172 4835 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d16f3bd5-80dd-4532-a60e-0a6d214ef185-inventory" (OuterVolumeSpecName: "inventory") pod "d16f3bd5-80dd-4532-a60e-0a6d214ef185" (UID: "d16f3bd5-80dd-4532-a60e-0a6d214ef185"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.560025 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.560601 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3c5b8661f3f050fbf34728c9e88ed710ff7c292af499338ff01295afa288217f"} pod="openshift-machine-config-operator/machine-config-daemon-bk84k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.560669 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" containerID="cri-o://3c5b8661f3f050fbf34728c9e88ed710ff7c292af499338ff01295afa288217f" gracePeriod=600 Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.621669 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d16f3bd5-80dd-4532-a60e-0a6d214ef185-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.672808 4835 generic.go:334] "Generic (PLEG): container finished" podID="7eb941ce-1892-446a-b619-7d04b2e962a4" containerID="40624b2553b9d5626386a3a216b73b3a5e7bcbcbbf91e63d9deb1da67b774371" exitCode=0 Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.672899 4835 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8ttf4" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.672944 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8ttf4" event={"ID":"7eb941ce-1892-446a-b619-7d04b2e962a4","Type":"ContainerDied","Data":"40624b2553b9d5626386a3a216b73b3a5e7bcbcbbf91e63d9deb1da67b774371"} Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.672973 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8ttf4" event={"ID":"7eb941ce-1892-446a-b619-7d04b2e962a4","Type":"ContainerDied","Data":"9df4009e69a0c087bbfab73be765a705f6afa7a57f555111f3d31fb927f43db9"} Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.673008 4835 scope.go:117] "RemoveContainer" containerID="40624b2553b9d5626386a3a216b73b3a5e7bcbcbbf91e63d9deb1da67b774371" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.684860 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qvj9t" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.684971 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qvj9t" event={"ID":"d16f3bd5-80dd-4532-a60e-0a6d214ef185","Type":"ContainerDied","Data":"926eab125ff020acc3c7e771139d16c206135602c4b12ad4f89d050622fae6e6"} Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.685157 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="926eab125ff020acc3c7e771139d16c206135602c4b12ad4f89d050622fae6e6" Mar 19 10:00:36 crc kubenswrapper[4835]: E0319 10:00:36.691319 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.709567 4835 scope.go:117] "RemoveContainer" containerID="452f47f00e8cf2132d6cc350845f84eb00761fedf0baf54a14d602e5d68870c1" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.734984 4835 scope.go:117] "RemoveContainer" containerID="9220fa03ce972ed3fb6b755df5df6e9eeeda7d43deb08d6ac304f9fb7ecc410d" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.742145 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8ttf4"] Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.760371 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8ttf4"] Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.763481 4835 scope.go:117] "RemoveContainer" 
containerID="40624b2553b9d5626386a3a216b73b3a5e7bcbcbbf91e63d9deb1da67b774371" Mar 19 10:00:36 crc kubenswrapper[4835]: E0319 10:00:36.764093 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40624b2553b9d5626386a3a216b73b3a5e7bcbcbbf91e63d9deb1da67b774371\": container with ID starting with 40624b2553b9d5626386a3a216b73b3a5e7bcbcbbf91e63d9deb1da67b774371 not found: ID does not exist" containerID="40624b2553b9d5626386a3a216b73b3a5e7bcbcbbf91e63d9deb1da67b774371" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.764120 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40624b2553b9d5626386a3a216b73b3a5e7bcbcbbf91e63d9deb1da67b774371"} err="failed to get container status \"40624b2553b9d5626386a3a216b73b3a5e7bcbcbbf91e63d9deb1da67b774371\": rpc error: code = NotFound desc = could not find container \"40624b2553b9d5626386a3a216b73b3a5e7bcbcbbf91e63d9deb1da67b774371\": container with ID starting with 40624b2553b9d5626386a3a216b73b3a5e7bcbcbbf91e63d9deb1da67b774371 not found: ID does not exist" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.764142 4835 scope.go:117] "RemoveContainer" containerID="452f47f00e8cf2132d6cc350845f84eb00761fedf0baf54a14d602e5d68870c1" Mar 19 10:00:36 crc kubenswrapper[4835]: E0319 10:00:36.764577 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"452f47f00e8cf2132d6cc350845f84eb00761fedf0baf54a14d602e5d68870c1\": container with ID starting with 452f47f00e8cf2132d6cc350845f84eb00761fedf0baf54a14d602e5d68870c1 not found: ID does not exist" containerID="452f47f00e8cf2132d6cc350845f84eb00761fedf0baf54a14d602e5d68870c1" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.764596 4835 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"452f47f00e8cf2132d6cc350845f84eb00761fedf0baf54a14d602e5d68870c1"} err="failed to get container status \"452f47f00e8cf2132d6cc350845f84eb00761fedf0baf54a14d602e5d68870c1\": rpc error: code = NotFound desc = could not find container \"452f47f00e8cf2132d6cc350845f84eb00761fedf0baf54a14d602e5d68870c1\": container with ID starting with 452f47f00e8cf2132d6cc350845f84eb00761fedf0baf54a14d602e5d68870c1 not found: ID does not exist" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.764611 4835 scope.go:117] "RemoveContainer" containerID="9220fa03ce972ed3fb6b755df5df6e9eeeda7d43deb08d6ac304f9fb7ecc410d" Mar 19 10:00:36 crc kubenswrapper[4835]: E0319 10:00:36.764924 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9220fa03ce972ed3fb6b755df5df6e9eeeda7d43deb08d6ac304f9fb7ecc410d\": container with ID starting with 9220fa03ce972ed3fb6b755df5df6e9eeeda7d43deb08d6ac304f9fb7ecc410d not found: ID does not exist" containerID="9220fa03ce972ed3fb6b755df5df6e9eeeda7d43deb08d6ac304f9fb7ecc410d" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.764963 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9220fa03ce972ed3fb6b755df5df6e9eeeda7d43deb08d6ac304f9fb7ecc410d"} err="failed to get container status \"9220fa03ce972ed3fb6b755df5df6e9eeeda7d43deb08d6ac304f9fb7ecc410d\": rpc error: code = NotFound desc = could not find container \"9220fa03ce972ed3fb6b755df5df6e9eeeda7d43deb08d6ac304f9fb7ecc410d\": container with ID starting with 9220fa03ce972ed3fb6b755df5df6e9eeeda7d43deb08d6ac304f9fb7ecc410d not found: ID does not exist" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.794850 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9mf4t"] Mar 19 10:00:36 crc kubenswrapper[4835]: E0319 10:00:36.795406 4835 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="7eb941ce-1892-446a-b619-7d04b2e962a4" containerName="registry-server" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.795425 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eb941ce-1892-446a-b619-7d04b2e962a4" containerName="registry-server" Mar 19 10:00:36 crc kubenswrapper[4835]: E0319 10:00:36.795455 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eb941ce-1892-446a-b619-7d04b2e962a4" containerName="extract-content" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.795463 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eb941ce-1892-446a-b619-7d04b2e962a4" containerName="extract-content" Mar 19 10:00:36 crc kubenswrapper[4835]: E0319 10:00:36.795500 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d16f3bd5-80dd-4532-a60e-0a6d214ef185" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.795509 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="d16f3bd5-80dd-4532-a60e-0a6d214ef185" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 19 10:00:36 crc kubenswrapper[4835]: E0319 10:00:36.795555 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eb941ce-1892-446a-b619-7d04b2e962a4" containerName="extract-utilities" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.795564 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eb941ce-1892-446a-b619-7d04b2e962a4" containerName="extract-utilities" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.795844 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="7eb941ce-1892-446a-b619-7d04b2e962a4" containerName="registry-server" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.795879 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="d16f3bd5-80dd-4532-a60e-0a6d214ef185" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 19 10:00:36 
crc kubenswrapper[4835]: I0319 10:00:36.797184 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9mf4t" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.800214 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.800246 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.800375 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ldz2g" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.800446 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.807993 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9mf4t"] Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.930887 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d377e09-921d-4439-ad81-cd9b8230cf5e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9mf4t\" (UID: \"3d377e09-921d-4439-ad81-cd9b8230cf5e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9mf4t" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.931038 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w7q2\" (UniqueName: \"kubernetes.io/projected/3d377e09-921d-4439-ad81-cd9b8230cf5e-kube-api-access-2w7q2\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9mf4t\" (UID: \"3d377e09-921d-4439-ad81-cd9b8230cf5e\") " 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9mf4t" Mar 19 10:00:36 crc kubenswrapper[4835]: I0319 10:00:36.931255 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d377e09-921d-4439-ad81-cd9b8230cf5e-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9mf4t\" (UID: \"3d377e09-921d-4439-ad81-cd9b8230cf5e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9mf4t" Mar 19 10:00:37 crc kubenswrapper[4835]: I0319 10:00:37.033625 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d377e09-921d-4439-ad81-cd9b8230cf5e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9mf4t\" (UID: \"3d377e09-921d-4439-ad81-cd9b8230cf5e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9mf4t" Mar 19 10:00:37 crc kubenswrapper[4835]: I0319 10:00:37.034042 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w7q2\" (UniqueName: \"kubernetes.io/projected/3d377e09-921d-4439-ad81-cd9b8230cf5e-kube-api-access-2w7q2\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9mf4t\" (UID: \"3d377e09-921d-4439-ad81-cd9b8230cf5e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9mf4t" Mar 19 10:00:37 crc kubenswrapper[4835]: I0319 10:00:37.034093 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d377e09-921d-4439-ad81-cd9b8230cf5e-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9mf4t\" (UID: \"3d377e09-921d-4439-ad81-cd9b8230cf5e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9mf4t" Mar 19 10:00:37 crc kubenswrapper[4835]: I0319 10:00:37.040027 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d377e09-921d-4439-ad81-cd9b8230cf5e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9mf4t\" (UID: \"3d377e09-921d-4439-ad81-cd9b8230cf5e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9mf4t" Mar 19 10:00:37 crc kubenswrapper[4835]: I0319 10:00:37.043078 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d377e09-921d-4439-ad81-cd9b8230cf5e-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9mf4t\" (UID: \"3d377e09-921d-4439-ad81-cd9b8230cf5e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9mf4t" Mar 19 10:00:37 crc kubenswrapper[4835]: I0319 10:00:37.051361 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w7q2\" (UniqueName: \"kubernetes.io/projected/3d377e09-921d-4439-ad81-cd9b8230cf5e-kube-api-access-2w7q2\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-9mf4t\" (UID: \"3d377e09-921d-4439-ad81-cd9b8230cf5e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9mf4t" Mar 19 10:00:37 crc kubenswrapper[4835]: I0319 10:00:37.142927 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9mf4t" Mar 19 10:00:37 crc kubenswrapper[4835]: I0319 10:00:37.699685 4835 generic.go:334] "Generic (PLEG): container finished" podID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerID="3c5b8661f3f050fbf34728c9e88ed710ff7c292af499338ff01295afa288217f" exitCode=0 Mar 19 10:00:37 crc kubenswrapper[4835]: I0319 10:00:37.699797 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerDied","Data":"3c5b8661f3f050fbf34728c9e88ed710ff7c292af499338ff01295afa288217f"} Mar 19 10:00:37 crc kubenswrapper[4835]: I0319 10:00:37.699895 4835 scope.go:117] "RemoveContainer" containerID="15f1a1995047c13b72b04c289a369693dabb319e01617eb74f1315b023c381ad" Mar 19 10:00:37 crc kubenswrapper[4835]: I0319 10:00:37.700825 4835 scope.go:117] "RemoveContainer" containerID="3c5b8661f3f050fbf34728c9e88ed710ff7c292af499338ff01295afa288217f" Mar 19 10:00:37 crc kubenswrapper[4835]: E0319 10:00:37.701275 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:00:37 crc kubenswrapper[4835]: I0319 10:00:37.798282 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9mf4t"] Mar 19 10:00:38 crc kubenswrapper[4835]: I0319 10:00:38.428945 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7eb941ce-1892-446a-b619-7d04b2e962a4" path="/var/lib/kubelet/pods/7eb941ce-1892-446a-b619-7d04b2e962a4/volumes" Mar 19 10:00:38 crc 
kubenswrapper[4835]: I0319 10:00:38.717107 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9mf4t" event={"ID":"3d377e09-921d-4439-ad81-cd9b8230cf5e","Type":"ContainerStarted","Data":"a3b534cd165f5cd10f971ee844ee576d5618cbb0b13e08652f42651affdc72f7"} Mar 19 10:00:38 crc kubenswrapper[4835]: I0319 10:00:38.717176 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9mf4t" event={"ID":"3d377e09-921d-4439-ad81-cd9b8230cf5e","Type":"ContainerStarted","Data":"4a6d1047226c0fa00db078097e31ec598edb4ee13853755a67263253a2bc99b9"} Mar 19 10:00:38 crc kubenswrapper[4835]: I0319 10:00:38.736856 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9mf4t" podStartSLOduration=2.2920445640000002 podStartE2EDuration="2.736836976s" podCreationTimestamp="2026-03-19 10:00:36 +0000 UTC" firstStartedPulling="2026-03-19 10:00:37.807497042 +0000 UTC m=+2292.656095629" lastFinishedPulling="2026-03-19 10:00:38.252289464 +0000 UTC m=+2293.100888041" observedRunningTime="2026-03-19 10:00:38.733813543 +0000 UTC m=+2293.582412130" watchObservedRunningTime="2026-03-19 10:00:38.736836976 +0000 UTC m=+2293.585435563" Mar 19 10:00:51 crc kubenswrapper[4835]: I0319 10:00:51.401596 4835 scope.go:117] "RemoveContainer" containerID="3c5b8661f3f050fbf34728c9e88ed710ff7c292af499338ff01295afa288217f" Mar 19 10:00:51 crc kubenswrapper[4835]: E0319 10:00:51.402410 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 
10:00:55 crc kubenswrapper[4835]: I0319 10:00:55.471290 4835 scope.go:117] "RemoveContainer" containerID="9e9893f0a835674377b4ba88abc28010f6657ae93eb955efbc6dc9dcaed5d4a1" Mar 19 10:00:55 crc kubenswrapper[4835]: I0319 10:00:55.499462 4835 scope.go:117] "RemoveContainer" containerID="b273eb566783dbb8ddac7dc0d86f30d341fc5479f6e0d007c37b2fe663cf1f0c" Mar 19 10:00:55 crc kubenswrapper[4835]: I0319 10:00:55.611479 4835 scope.go:117] "RemoveContainer" containerID="1a4e6f4481934c5abbe4f3763f236ac4089c1257cdc66bfb59cce1c8019e5592" Mar 19 10:00:55 crc kubenswrapper[4835]: I0319 10:00:55.652715 4835 scope.go:117] "RemoveContainer" containerID="91455d98d539f0429600b1823978388be42ffdc40431d43259b6e10024ca66be" Mar 19 10:00:55 crc kubenswrapper[4835]: I0319 10:00:55.730676 4835 scope.go:117] "RemoveContainer" containerID="858dcd9b670da439092c4843aa4aeb2a939b0a8b6bb3610e20716707f4f54aa4" Mar 19 10:00:55 crc kubenswrapper[4835]: I0319 10:00:55.774578 4835 scope.go:117] "RemoveContainer" containerID="1dea12df495f57d14942b518f9ebba5090492ec871638a1d3188f7382ae3ed8b" Mar 19 10:00:55 crc kubenswrapper[4835]: I0319 10:00:55.841277 4835 scope.go:117] "RemoveContainer" containerID="366383eb1a06aca62f7cd3ea3b67cd8644a5c6f7c74898e63827b9037e1e119a" Mar 19 10:01:00 crc kubenswrapper[4835]: I0319 10:01:00.150491 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29565241-dx4db"] Mar 19 10:01:00 crc kubenswrapper[4835]: I0319 10:01:00.152939 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29565241-dx4db" Mar 19 10:01:00 crc kubenswrapper[4835]: I0319 10:01:00.171654 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29565241-dx4db"] Mar 19 10:01:00 crc kubenswrapper[4835]: I0319 10:01:00.339249 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77cc85ba-0a70-4b9e-a5b5-23dc9681723a-config-data\") pod \"keystone-cron-29565241-dx4db\" (UID: \"77cc85ba-0a70-4b9e-a5b5-23dc9681723a\") " pod="openstack/keystone-cron-29565241-dx4db" Mar 19 10:01:00 crc kubenswrapper[4835]: I0319 10:01:00.339322 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77cc85ba-0a70-4b9e-a5b5-23dc9681723a-combined-ca-bundle\") pod \"keystone-cron-29565241-dx4db\" (UID: \"77cc85ba-0a70-4b9e-a5b5-23dc9681723a\") " pod="openstack/keystone-cron-29565241-dx4db" Mar 19 10:01:00 crc kubenswrapper[4835]: I0319 10:01:00.339456 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/77cc85ba-0a70-4b9e-a5b5-23dc9681723a-fernet-keys\") pod \"keystone-cron-29565241-dx4db\" (UID: \"77cc85ba-0a70-4b9e-a5b5-23dc9681723a\") " pod="openstack/keystone-cron-29565241-dx4db" Mar 19 10:01:00 crc kubenswrapper[4835]: I0319 10:01:00.339488 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqsfh\" (UniqueName: \"kubernetes.io/projected/77cc85ba-0a70-4b9e-a5b5-23dc9681723a-kube-api-access-rqsfh\") pod \"keystone-cron-29565241-dx4db\" (UID: \"77cc85ba-0a70-4b9e-a5b5-23dc9681723a\") " pod="openstack/keystone-cron-29565241-dx4db" Mar 19 10:01:00 crc kubenswrapper[4835]: I0319 10:01:00.441785 4835 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77cc85ba-0a70-4b9e-a5b5-23dc9681723a-config-data\") pod \"keystone-cron-29565241-dx4db\" (UID: \"77cc85ba-0a70-4b9e-a5b5-23dc9681723a\") " pod="openstack/keystone-cron-29565241-dx4db" Mar 19 10:01:00 crc kubenswrapper[4835]: I0319 10:01:00.442152 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77cc85ba-0a70-4b9e-a5b5-23dc9681723a-combined-ca-bundle\") pod \"keystone-cron-29565241-dx4db\" (UID: \"77cc85ba-0a70-4b9e-a5b5-23dc9681723a\") " pod="openstack/keystone-cron-29565241-dx4db" Mar 19 10:01:00 crc kubenswrapper[4835]: I0319 10:01:00.442367 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/77cc85ba-0a70-4b9e-a5b5-23dc9681723a-fernet-keys\") pod \"keystone-cron-29565241-dx4db\" (UID: \"77cc85ba-0a70-4b9e-a5b5-23dc9681723a\") " pod="openstack/keystone-cron-29565241-dx4db" Mar 19 10:01:00 crc kubenswrapper[4835]: I0319 10:01:00.442476 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqsfh\" (UniqueName: \"kubernetes.io/projected/77cc85ba-0a70-4b9e-a5b5-23dc9681723a-kube-api-access-rqsfh\") pod \"keystone-cron-29565241-dx4db\" (UID: \"77cc85ba-0a70-4b9e-a5b5-23dc9681723a\") " pod="openstack/keystone-cron-29565241-dx4db" Mar 19 10:01:00 crc kubenswrapper[4835]: I0319 10:01:00.449109 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/77cc85ba-0a70-4b9e-a5b5-23dc9681723a-fernet-keys\") pod \"keystone-cron-29565241-dx4db\" (UID: \"77cc85ba-0a70-4b9e-a5b5-23dc9681723a\") " pod="openstack/keystone-cron-29565241-dx4db" Mar 19 10:01:00 crc kubenswrapper[4835]: I0319 10:01:00.450177 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/77cc85ba-0a70-4b9e-a5b5-23dc9681723a-combined-ca-bundle\") pod \"keystone-cron-29565241-dx4db\" (UID: \"77cc85ba-0a70-4b9e-a5b5-23dc9681723a\") " pod="openstack/keystone-cron-29565241-dx4db" Mar 19 10:01:00 crc kubenswrapper[4835]: I0319 10:01:00.450564 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77cc85ba-0a70-4b9e-a5b5-23dc9681723a-config-data\") pod \"keystone-cron-29565241-dx4db\" (UID: \"77cc85ba-0a70-4b9e-a5b5-23dc9681723a\") " pod="openstack/keystone-cron-29565241-dx4db" Mar 19 10:01:00 crc kubenswrapper[4835]: I0319 10:01:00.460430 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqsfh\" (UniqueName: \"kubernetes.io/projected/77cc85ba-0a70-4b9e-a5b5-23dc9681723a-kube-api-access-rqsfh\") pod \"keystone-cron-29565241-dx4db\" (UID: \"77cc85ba-0a70-4b9e-a5b5-23dc9681723a\") " pod="openstack/keystone-cron-29565241-dx4db" Mar 19 10:01:00 crc kubenswrapper[4835]: I0319 10:01:00.476864 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29565241-dx4db" Mar 19 10:01:00 crc kubenswrapper[4835]: I0319 10:01:00.851941 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29565241-dx4db"] Mar 19 10:01:00 crc kubenswrapper[4835]: I0319 10:01:00.986169 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565241-dx4db" event={"ID":"77cc85ba-0a70-4b9e-a5b5-23dc9681723a","Type":"ContainerStarted","Data":"b0140c311c156dd91166ae4aebace70a8b21957c674998c73fbc387ff85abe79"} Mar 19 10:01:01 crc kubenswrapper[4835]: I0319 10:01:01.998886 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565241-dx4db" event={"ID":"77cc85ba-0a70-4b9e-a5b5-23dc9681723a","Type":"ContainerStarted","Data":"1b030d395556a176fc9b04d0afd40a8650e438ea35d688a57764148c7b20f5e8"} Mar 19 10:01:02 crc kubenswrapper[4835]: I0319 10:01:02.024933 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29565241-dx4db" podStartSLOduration=2.024911535 podStartE2EDuration="2.024911535s" podCreationTimestamp="2026-03-19 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:01:02.01813167 +0000 UTC m=+2316.866730277" watchObservedRunningTime="2026-03-19 10:01:02.024911535 +0000 UTC m=+2316.873510152" Mar 19 10:01:03 crc kubenswrapper[4835]: I0319 10:01:03.402041 4835 scope.go:117] "RemoveContainer" containerID="3c5b8661f3f050fbf34728c9e88ed710ff7c292af499338ff01295afa288217f" Mar 19 10:01:03 crc kubenswrapper[4835]: E0319 10:01:03.402637 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:01:06 crc kubenswrapper[4835]: I0319 10:01:06.039213 4835 generic.go:334] "Generic (PLEG): container finished" podID="77cc85ba-0a70-4b9e-a5b5-23dc9681723a" containerID="1b030d395556a176fc9b04d0afd40a8650e438ea35d688a57764148c7b20f5e8" exitCode=0 Mar 19 10:01:06 crc kubenswrapper[4835]: I0319 10:01:06.039316 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565241-dx4db" event={"ID":"77cc85ba-0a70-4b9e-a5b5-23dc9681723a","Type":"ContainerDied","Data":"1b030d395556a176fc9b04d0afd40a8650e438ea35d688a57764148c7b20f5e8"} Mar 19 10:01:07 crc kubenswrapper[4835]: I0319 10:01:07.430656 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29565241-dx4db" Mar 19 10:01:07 crc kubenswrapper[4835]: I0319 10:01:07.538817 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77cc85ba-0a70-4b9e-a5b5-23dc9681723a-combined-ca-bundle\") pod \"77cc85ba-0a70-4b9e-a5b5-23dc9681723a\" (UID: \"77cc85ba-0a70-4b9e-a5b5-23dc9681723a\") " Mar 19 10:01:07 crc kubenswrapper[4835]: I0319 10:01:07.539097 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77cc85ba-0a70-4b9e-a5b5-23dc9681723a-config-data\") pod \"77cc85ba-0a70-4b9e-a5b5-23dc9681723a\" (UID: \"77cc85ba-0a70-4b9e-a5b5-23dc9681723a\") " Mar 19 10:01:07 crc kubenswrapper[4835]: I0319 10:01:07.539196 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/77cc85ba-0a70-4b9e-a5b5-23dc9681723a-fernet-keys\") pod \"77cc85ba-0a70-4b9e-a5b5-23dc9681723a\" (UID: \"77cc85ba-0a70-4b9e-a5b5-23dc9681723a\") " Mar 19 10:01:07 crc kubenswrapper[4835]: I0319 10:01:07.539248 4835 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqsfh\" (UniqueName: \"kubernetes.io/projected/77cc85ba-0a70-4b9e-a5b5-23dc9681723a-kube-api-access-rqsfh\") pod \"77cc85ba-0a70-4b9e-a5b5-23dc9681723a\" (UID: \"77cc85ba-0a70-4b9e-a5b5-23dc9681723a\") " Mar 19 10:01:07 crc kubenswrapper[4835]: I0319 10:01:07.545324 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77cc85ba-0a70-4b9e-a5b5-23dc9681723a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "77cc85ba-0a70-4b9e-a5b5-23dc9681723a" (UID: "77cc85ba-0a70-4b9e-a5b5-23dc9681723a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:01:07 crc kubenswrapper[4835]: I0319 10:01:07.545350 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77cc85ba-0a70-4b9e-a5b5-23dc9681723a-kube-api-access-rqsfh" (OuterVolumeSpecName: "kube-api-access-rqsfh") pod "77cc85ba-0a70-4b9e-a5b5-23dc9681723a" (UID: "77cc85ba-0a70-4b9e-a5b5-23dc9681723a"). InnerVolumeSpecName "kube-api-access-rqsfh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:01:07 crc kubenswrapper[4835]: I0319 10:01:07.577397 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77cc85ba-0a70-4b9e-a5b5-23dc9681723a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "77cc85ba-0a70-4b9e-a5b5-23dc9681723a" (UID: "77cc85ba-0a70-4b9e-a5b5-23dc9681723a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:01:07 crc kubenswrapper[4835]: I0319 10:01:07.606996 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77cc85ba-0a70-4b9e-a5b5-23dc9681723a-config-data" (OuterVolumeSpecName: "config-data") pod "77cc85ba-0a70-4b9e-a5b5-23dc9681723a" (UID: "77cc85ba-0a70-4b9e-a5b5-23dc9681723a"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:01:07 crc kubenswrapper[4835]: I0319 10:01:07.643384 4835 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/77cc85ba-0a70-4b9e-a5b5-23dc9681723a-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 19 10:01:07 crc kubenswrapper[4835]: I0319 10:01:07.643478 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqsfh\" (UniqueName: \"kubernetes.io/projected/77cc85ba-0a70-4b9e-a5b5-23dc9681723a-kube-api-access-rqsfh\") on node \"crc\" DevicePath \"\"" Mar 19 10:01:07 crc kubenswrapper[4835]: I0319 10:01:07.643495 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77cc85ba-0a70-4b9e-a5b5-23dc9681723a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:01:07 crc kubenswrapper[4835]: I0319 10:01:07.643510 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77cc85ba-0a70-4b9e-a5b5-23dc9681723a-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:01:08 crc kubenswrapper[4835]: I0319 10:01:08.053700 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-bv9wk"] Mar 19 10:01:08 crc kubenswrapper[4835]: I0319 10:01:08.072393 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565241-dx4db" event={"ID":"77cc85ba-0a70-4b9e-a5b5-23dc9681723a","Type":"ContainerDied","Data":"b0140c311c156dd91166ae4aebace70a8b21957c674998c73fbc387ff85abe79"} Mar 19 10:01:08 crc kubenswrapper[4835]: I0319 10:01:08.072438 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0140c311c156dd91166ae4aebace70a8b21957c674998c73fbc387ff85abe79" Mar 19 10:01:08 crc kubenswrapper[4835]: I0319 10:01:08.072504 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29565241-dx4db" Mar 19 10:01:08 crc kubenswrapper[4835]: I0319 10:01:08.078948 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-bv9wk"] Mar 19 10:01:08 crc kubenswrapper[4835]: I0319 10:01:08.416839 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0149d19-3745-411a-94de-c66590906efb" path="/var/lib/kubelet/pods/d0149d19-3745-411a-94de-c66590906efb/volumes" Mar 19 10:01:16 crc kubenswrapper[4835]: I0319 10:01:16.413544 4835 scope.go:117] "RemoveContainer" containerID="3c5b8661f3f050fbf34728c9e88ed710ff7c292af499338ff01295afa288217f" Mar 19 10:01:16 crc kubenswrapper[4835]: E0319 10:01:16.414562 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:01:23 crc kubenswrapper[4835]: I0319 10:01:23.247394 4835 generic.go:334] "Generic (PLEG): container finished" podID="3d377e09-921d-4439-ad81-cd9b8230cf5e" containerID="a3b534cd165f5cd10f971ee844ee576d5618cbb0b13e08652f42651affdc72f7" exitCode=0 Mar 19 10:01:23 crc kubenswrapper[4835]: I0319 10:01:23.247567 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9mf4t" event={"ID":"3d377e09-921d-4439-ad81-cd9b8230cf5e","Type":"ContainerDied","Data":"a3b534cd165f5cd10f971ee844ee576d5618cbb0b13e08652f42651affdc72f7"} Mar 19 10:01:24 crc kubenswrapper[4835]: I0319 10:01:24.724105 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9mf4t" Mar 19 10:01:24 crc kubenswrapper[4835]: I0319 10:01:24.895136 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w7q2\" (UniqueName: \"kubernetes.io/projected/3d377e09-921d-4439-ad81-cd9b8230cf5e-kube-api-access-2w7q2\") pod \"3d377e09-921d-4439-ad81-cd9b8230cf5e\" (UID: \"3d377e09-921d-4439-ad81-cd9b8230cf5e\") " Mar 19 10:01:24 crc kubenswrapper[4835]: I0319 10:01:24.895386 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d377e09-921d-4439-ad81-cd9b8230cf5e-inventory\") pod \"3d377e09-921d-4439-ad81-cd9b8230cf5e\" (UID: \"3d377e09-921d-4439-ad81-cd9b8230cf5e\") " Mar 19 10:01:24 crc kubenswrapper[4835]: I0319 10:01:24.895441 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d377e09-921d-4439-ad81-cd9b8230cf5e-ssh-key-openstack-edpm-ipam\") pod \"3d377e09-921d-4439-ad81-cd9b8230cf5e\" (UID: \"3d377e09-921d-4439-ad81-cd9b8230cf5e\") " Mar 19 10:01:24 crc kubenswrapper[4835]: I0319 10:01:24.900760 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d377e09-921d-4439-ad81-cd9b8230cf5e-kube-api-access-2w7q2" (OuterVolumeSpecName: "kube-api-access-2w7q2") pod "3d377e09-921d-4439-ad81-cd9b8230cf5e" (UID: "3d377e09-921d-4439-ad81-cd9b8230cf5e"). InnerVolumeSpecName "kube-api-access-2w7q2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:01:24 crc kubenswrapper[4835]: I0319 10:01:24.933341 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d377e09-921d-4439-ad81-cd9b8230cf5e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3d377e09-921d-4439-ad81-cd9b8230cf5e" (UID: "3d377e09-921d-4439-ad81-cd9b8230cf5e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:01:24 crc kubenswrapper[4835]: I0319 10:01:24.939846 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d377e09-921d-4439-ad81-cd9b8230cf5e-inventory" (OuterVolumeSpecName: "inventory") pod "3d377e09-921d-4439-ad81-cd9b8230cf5e" (UID: "3d377e09-921d-4439-ad81-cd9b8230cf5e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:01:24 crc kubenswrapper[4835]: I0319 10:01:24.998067 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d377e09-921d-4439-ad81-cd9b8230cf5e-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 10:01:24 crc kubenswrapper[4835]: I0319 10:01:24.998104 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d377e09-921d-4439-ad81-cd9b8230cf5e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 10:01:24 crc kubenswrapper[4835]: I0319 10:01:24.998114 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w7q2\" (UniqueName: \"kubernetes.io/projected/3d377e09-921d-4439-ad81-cd9b8230cf5e-kube-api-access-2w7q2\") on node \"crc\" DevicePath \"\"" Mar 19 10:01:25 crc kubenswrapper[4835]: I0319 10:01:25.268693 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9mf4t" 
event={"ID":"3d377e09-921d-4439-ad81-cd9b8230cf5e","Type":"ContainerDied","Data":"4a6d1047226c0fa00db078097e31ec598edb4ee13853755a67263253a2bc99b9"} Mar 19 10:01:25 crc kubenswrapper[4835]: I0319 10:01:25.268728 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a6d1047226c0fa00db078097e31ec598edb4ee13853755a67263253a2bc99b9" Mar 19 10:01:25 crc kubenswrapper[4835]: I0319 10:01:25.268731 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-9mf4t" Mar 19 10:01:25 crc kubenswrapper[4835]: I0319 10:01:25.364189 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-p4j28"] Mar 19 10:01:25 crc kubenswrapper[4835]: E0319 10:01:25.365185 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77cc85ba-0a70-4b9e-a5b5-23dc9681723a" containerName="keystone-cron" Mar 19 10:01:25 crc kubenswrapper[4835]: I0319 10:01:25.365234 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="77cc85ba-0a70-4b9e-a5b5-23dc9681723a" containerName="keystone-cron" Mar 19 10:01:25 crc kubenswrapper[4835]: E0319 10:01:25.365286 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d377e09-921d-4439-ad81-cd9b8230cf5e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 19 10:01:25 crc kubenswrapper[4835]: I0319 10:01:25.365302 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d377e09-921d-4439-ad81-cd9b8230cf5e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 19 10:01:25 crc kubenswrapper[4835]: I0319 10:01:25.365819 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="77cc85ba-0a70-4b9e-a5b5-23dc9681723a" containerName="keystone-cron" Mar 19 10:01:25 crc kubenswrapper[4835]: I0319 10:01:25.365851 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d377e09-921d-4439-ad81-cd9b8230cf5e" 
containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 19 10:01:25 crc kubenswrapper[4835]: I0319 10:01:25.367506 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-p4j28" Mar 19 10:01:25 crc kubenswrapper[4835]: I0319 10:01:25.370045 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 10:01:25 crc kubenswrapper[4835]: I0319 10:01:25.370045 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ldz2g" Mar 19 10:01:25 crc kubenswrapper[4835]: I0319 10:01:25.370495 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 10:01:25 crc kubenswrapper[4835]: I0319 10:01:25.375332 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 10:01:25 crc kubenswrapper[4835]: I0319 10:01:25.378415 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-p4j28"] Mar 19 10:01:25 crc kubenswrapper[4835]: I0319 10:01:25.509288 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgh7n\" (UniqueName: \"kubernetes.io/projected/7481c32a-7096-4407-9b2e-f6c715ef33f6-kube-api-access-pgh7n\") pod \"ssh-known-hosts-edpm-deployment-p4j28\" (UID: \"7481c32a-7096-4407-9b2e-f6c715ef33f6\") " pod="openstack/ssh-known-hosts-edpm-deployment-p4j28" Mar 19 10:01:25 crc kubenswrapper[4835]: I0319 10:01:25.509422 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7481c32a-7096-4407-9b2e-f6c715ef33f6-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-p4j28\" (UID: \"7481c32a-7096-4407-9b2e-f6c715ef33f6\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-p4j28" Mar 19 10:01:25 crc kubenswrapper[4835]: I0319 10:01:25.509942 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/7481c32a-7096-4407-9b2e-f6c715ef33f6-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-p4j28\" (UID: \"7481c32a-7096-4407-9b2e-f6c715ef33f6\") " pod="openstack/ssh-known-hosts-edpm-deployment-p4j28" Mar 19 10:01:25 crc kubenswrapper[4835]: I0319 10:01:25.612576 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgh7n\" (UniqueName: \"kubernetes.io/projected/7481c32a-7096-4407-9b2e-f6c715ef33f6-kube-api-access-pgh7n\") pod \"ssh-known-hosts-edpm-deployment-p4j28\" (UID: \"7481c32a-7096-4407-9b2e-f6c715ef33f6\") " pod="openstack/ssh-known-hosts-edpm-deployment-p4j28" Mar 19 10:01:25 crc kubenswrapper[4835]: I0319 10:01:25.612690 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7481c32a-7096-4407-9b2e-f6c715ef33f6-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-p4j28\" (UID: \"7481c32a-7096-4407-9b2e-f6c715ef33f6\") " pod="openstack/ssh-known-hosts-edpm-deployment-p4j28" Mar 19 10:01:25 crc kubenswrapper[4835]: I0319 10:01:25.612842 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/7481c32a-7096-4407-9b2e-f6c715ef33f6-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-p4j28\" (UID: \"7481c32a-7096-4407-9b2e-f6c715ef33f6\") " pod="openstack/ssh-known-hosts-edpm-deployment-p4j28" Mar 19 10:01:25 crc kubenswrapper[4835]: I0319 10:01:25.617794 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/7481c32a-7096-4407-9b2e-f6c715ef33f6-inventory-0\") pod 
\"ssh-known-hosts-edpm-deployment-p4j28\" (UID: \"7481c32a-7096-4407-9b2e-f6c715ef33f6\") " pod="openstack/ssh-known-hosts-edpm-deployment-p4j28" Mar 19 10:01:25 crc kubenswrapper[4835]: I0319 10:01:25.617979 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7481c32a-7096-4407-9b2e-f6c715ef33f6-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-p4j28\" (UID: \"7481c32a-7096-4407-9b2e-f6c715ef33f6\") " pod="openstack/ssh-known-hosts-edpm-deployment-p4j28" Mar 19 10:01:25 crc kubenswrapper[4835]: I0319 10:01:25.630320 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgh7n\" (UniqueName: \"kubernetes.io/projected/7481c32a-7096-4407-9b2e-f6c715ef33f6-kube-api-access-pgh7n\") pod \"ssh-known-hosts-edpm-deployment-p4j28\" (UID: \"7481c32a-7096-4407-9b2e-f6c715ef33f6\") " pod="openstack/ssh-known-hosts-edpm-deployment-p4j28" Mar 19 10:01:25 crc kubenswrapper[4835]: I0319 10:01:25.686570 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-p4j28" Mar 19 10:01:26 crc kubenswrapper[4835]: I0319 10:01:26.231708 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-p4j28"] Mar 19 10:01:26 crc kubenswrapper[4835]: I0319 10:01:26.280502 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-p4j28" event={"ID":"7481c32a-7096-4407-9b2e-f6c715ef33f6","Type":"ContainerStarted","Data":"93d3c826e7f3cec6101d281bb1b74f3580c131d45e20998f3263754d045f8f83"} Mar 19 10:01:26 crc kubenswrapper[4835]: I0319 10:01:26.808801 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 10:01:27 crc kubenswrapper[4835]: I0319 10:01:27.295597 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-p4j28" event={"ID":"7481c32a-7096-4407-9b2e-f6c715ef33f6","Type":"ContainerStarted","Data":"d6d4f04b89de68933f56044e53b26cd29a664b7efb767bd60561b298de6c037d"} Mar 19 10:01:27 crc kubenswrapper[4835]: I0319 10:01:27.334106 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-p4j28" podStartSLOduration=1.7683294470000002 podStartE2EDuration="2.334083556s" podCreationTimestamp="2026-03-19 10:01:25 +0000 UTC" firstStartedPulling="2026-03-19 10:01:26.240340764 +0000 UTC m=+2341.088939351" lastFinishedPulling="2026-03-19 10:01:26.806094873 +0000 UTC m=+2341.654693460" observedRunningTime="2026-03-19 10:01:27.319875947 +0000 UTC m=+2342.168474544" watchObservedRunningTime="2026-03-19 10:01:27.334083556 +0000 UTC m=+2342.182682143" Mar 19 10:01:28 crc kubenswrapper[4835]: I0319 10:01:28.402159 4835 scope.go:117] "RemoveContainer" containerID="3c5b8661f3f050fbf34728c9e88ed710ff7c292af499338ff01295afa288217f" Mar 19 10:01:28 crc kubenswrapper[4835]: E0319 10:01:28.402480 4835 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:01:34 crc kubenswrapper[4835]: I0319 10:01:34.393580 4835 generic.go:334] "Generic (PLEG): container finished" podID="7481c32a-7096-4407-9b2e-f6c715ef33f6" containerID="d6d4f04b89de68933f56044e53b26cd29a664b7efb767bd60561b298de6c037d" exitCode=0 Mar 19 10:01:34 crc kubenswrapper[4835]: I0319 10:01:34.393607 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-p4j28" event={"ID":"7481c32a-7096-4407-9b2e-f6c715ef33f6","Type":"ContainerDied","Data":"d6d4f04b89de68933f56044e53b26cd29a664b7efb767bd60561b298de6c037d"} Mar 19 10:01:35 crc kubenswrapper[4835]: I0319 10:01:35.875643 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-p4j28" Mar 19 10:01:36 crc kubenswrapper[4835]: I0319 10:01:36.016112 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgh7n\" (UniqueName: \"kubernetes.io/projected/7481c32a-7096-4407-9b2e-f6c715ef33f6-kube-api-access-pgh7n\") pod \"7481c32a-7096-4407-9b2e-f6c715ef33f6\" (UID: \"7481c32a-7096-4407-9b2e-f6c715ef33f6\") " Mar 19 10:01:36 crc kubenswrapper[4835]: I0319 10:01:36.016211 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/7481c32a-7096-4407-9b2e-f6c715ef33f6-inventory-0\") pod \"7481c32a-7096-4407-9b2e-f6c715ef33f6\" (UID: \"7481c32a-7096-4407-9b2e-f6c715ef33f6\") " Mar 19 10:01:36 crc kubenswrapper[4835]: I0319 10:01:36.016485 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7481c32a-7096-4407-9b2e-f6c715ef33f6-ssh-key-openstack-edpm-ipam\") pod \"7481c32a-7096-4407-9b2e-f6c715ef33f6\" (UID: \"7481c32a-7096-4407-9b2e-f6c715ef33f6\") " Mar 19 10:01:36 crc kubenswrapper[4835]: I0319 10:01:36.025038 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7481c32a-7096-4407-9b2e-f6c715ef33f6-kube-api-access-pgh7n" (OuterVolumeSpecName: "kube-api-access-pgh7n") pod "7481c32a-7096-4407-9b2e-f6c715ef33f6" (UID: "7481c32a-7096-4407-9b2e-f6c715ef33f6"). InnerVolumeSpecName "kube-api-access-pgh7n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:01:36 crc kubenswrapper[4835]: I0319 10:01:36.048955 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7481c32a-7096-4407-9b2e-f6c715ef33f6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7481c32a-7096-4407-9b2e-f6c715ef33f6" (UID: "7481c32a-7096-4407-9b2e-f6c715ef33f6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:01:36 crc kubenswrapper[4835]: I0319 10:01:36.055324 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7481c32a-7096-4407-9b2e-f6c715ef33f6-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "7481c32a-7096-4407-9b2e-f6c715ef33f6" (UID: "7481c32a-7096-4407-9b2e-f6c715ef33f6"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:01:36 crc kubenswrapper[4835]: I0319 10:01:36.119396 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7481c32a-7096-4407-9b2e-f6c715ef33f6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 10:01:36 crc kubenswrapper[4835]: I0319 10:01:36.119438 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgh7n\" (UniqueName: \"kubernetes.io/projected/7481c32a-7096-4407-9b2e-f6c715ef33f6-kube-api-access-pgh7n\") on node \"crc\" DevicePath \"\"" Mar 19 10:01:36 crc kubenswrapper[4835]: I0319 10:01:36.119454 4835 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/7481c32a-7096-4407-9b2e-f6c715ef33f6-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 19 10:01:36 crc kubenswrapper[4835]: I0319 10:01:36.419821 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-p4j28" 
event={"ID":"7481c32a-7096-4407-9b2e-f6c715ef33f6","Type":"ContainerDied","Data":"93d3c826e7f3cec6101d281bb1b74f3580c131d45e20998f3263754d045f8f83"} Mar 19 10:01:36 crc kubenswrapper[4835]: I0319 10:01:36.419860 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93d3c826e7f3cec6101d281bb1b74f3580c131d45e20998f3263754d045f8f83" Mar 19 10:01:36 crc kubenswrapper[4835]: I0319 10:01:36.419921 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-p4j28" Mar 19 10:01:36 crc kubenswrapper[4835]: I0319 10:01:36.485434 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-jpgpp"] Mar 19 10:01:36 crc kubenswrapper[4835]: E0319 10:01:36.486290 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7481c32a-7096-4407-9b2e-f6c715ef33f6" containerName="ssh-known-hosts-edpm-deployment" Mar 19 10:01:36 crc kubenswrapper[4835]: I0319 10:01:36.486317 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7481c32a-7096-4407-9b2e-f6c715ef33f6" containerName="ssh-known-hosts-edpm-deployment" Mar 19 10:01:36 crc kubenswrapper[4835]: I0319 10:01:36.486575 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="7481c32a-7096-4407-9b2e-f6c715ef33f6" containerName="ssh-known-hosts-edpm-deployment" Mar 19 10:01:36 crc kubenswrapper[4835]: I0319 10:01:36.487834 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jpgpp" Mar 19 10:01:36 crc kubenswrapper[4835]: I0319 10:01:36.499756 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-jpgpp"] Mar 19 10:01:36 crc kubenswrapper[4835]: I0319 10:01:36.512462 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 10:01:36 crc kubenswrapper[4835]: I0319 10:01:36.512697 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 10:01:36 crc kubenswrapper[4835]: I0319 10:01:36.513128 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 10:01:36 crc kubenswrapper[4835]: I0319 10:01:36.513149 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ldz2g" Mar 19 10:01:36 crc kubenswrapper[4835]: I0319 10:01:36.629963 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29e7446f-a78e-434c-a1e3-3046f98fad69-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jpgpp\" (UID: \"29e7446f-a78e-434c-a1e3-3046f98fad69\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jpgpp" Mar 19 10:01:36 crc kubenswrapper[4835]: I0319 10:01:36.630008 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29e7446f-a78e-434c-a1e3-3046f98fad69-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jpgpp\" (UID: \"29e7446f-a78e-434c-a1e3-3046f98fad69\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jpgpp" Mar 19 10:01:36 crc kubenswrapper[4835]: I0319 10:01:36.630181 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sbv8\" (UniqueName: \"kubernetes.io/projected/29e7446f-a78e-434c-a1e3-3046f98fad69-kube-api-access-8sbv8\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jpgpp\" (UID: \"29e7446f-a78e-434c-a1e3-3046f98fad69\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jpgpp" Mar 19 10:01:36 crc kubenswrapper[4835]: I0319 10:01:36.732150 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sbv8\" (UniqueName: \"kubernetes.io/projected/29e7446f-a78e-434c-a1e3-3046f98fad69-kube-api-access-8sbv8\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jpgpp\" (UID: \"29e7446f-a78e-434c-a1e3-3046f98fad69\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jpgpp" Mar 19 10:01:36 crc kubenswrapper[4835]: I0319 10:01:36.732284 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29e7446f-a78e-434c-a1e3-3046f98fad69-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jpgpp\" (UID: \"29e7446f-a78e-434c-a1e3-3046f98fad69\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jpgpp" Mar 19 10:01:36 crc kubenswrapper[4835]: I0319 10:01:36.732309 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29e7446f-a78e-434c-a1e3-3046f98fad69-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jpgpp\" (UID: \"29e7446f-a78e-434c-a1e3-3046f98fad69\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jpgpp" Mar 19 10:01:36 crc kubenswrapper[4835]: I0319 10:01:36.736866 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29e7446f-a78e-434c-a1e3-3046f98fad69-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jpgpp\" (UID: 
\"29e7446f-a78e-434c-a1e3-3046f98fad69\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jpgpp" Mar 19 10:01:36 crc kubenswrapper[4835]: I0319 10:01:36.739285 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29e7446f-a78e-434c-a1e3-3046f98fad69-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jpgpp\" (UID: \"29e7446f-a78e-434c-a1e3-3046f98fad69\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jpgpp" Mar 19 10:01:36 crc kubenswrapper[4835]: I0319 10:01:36.749812 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sbv8\" (UniqueName: \"kubernetes.io/projected/29e7446f-a78e-434c-a1e3-3046f98fad69-kube-api-access-8sbv8\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jpgpp\" (UID: \"29e7446f-a78e-434c-a1e3-3046f98fad69\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jpgpp" Mar 19 10:01:36 crc kubenswrapper[4835]: I0319 10:01:36.832540 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jpgpp" Mar 19 10:01:37 crc kubenswrapper[4835]: I0319 10:01:37.364394 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-jpgpp"] Mar 19 10:01:37 crc kubenswrapper[4835]: I0319 10:01:37.431585 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jpgpp" event={"ID":"29e7446f-a78e-434c-a1e3-3046f98fad69","Type":"ContainerStarted","Data":"43d25281fa11a82c80f7a2578736c8f5e4540be209b653c508d8f90ef2854b90"} Mar 19 10:01:38 crc kubenswrapper[4835]: I0319 10:01:38.443932 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jpgpp" event={"ID":"29e7446f-a78e-434c-a1e3-3046f98fad69","Type":"ContainerStarted","Data":"4c6baea7360a87a6d1bb33d2283815acd00e22033565867bc3992d74372d3658"} Mar 19 10:01:38 crc kubenswrapper[4835]: I0319 10:01:38.485256 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jpgpp" podStartSLOduration=1.919044205 podStartE2EDuration="2.485233165s" podCreationTimestamp="2026-03-19 10:01:36 +0000 UTC" firstStartedPulling="2026-03-19 10:01:37.373513921 +0000 UTC m=+2352.222112508" lastFinishedPulling="2026-03-19 10:01:37.939702881 +0000 UTC m=+2352.788301468" observedRunningTime="2026-03-19 10:01:38.476391212 +0000 UTC m=+2353.324989829" watchObservedRunningTime="2026-03-19 10:01:38.485233165 +0000 UTC m=+2353.333831752" Mar 19 10:01:39 crc kubenswrapper[4835]: I0319 10:01:39.402631 4835 scope.go:117] "RemoveContainer" containerID="3c5b8661f3f050fbf34728c9e88ed710ff7c292af499338ff01295afa288217f" Mar 19 10:01:39 crc kubenswrapper[4835]: E0319 10:01:39.403394 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:01:46 crc kubenswrapper[4835]: I0319 10:01:46.526699 4835 generic.go:334] "Generic (PLEG): container finished" podID="29e7446f-a78e-434c-a1e3-3046f98fad69" containerID="4c6baea7360a87a6d1bb33d2283815acd00e22033565867bc3992d74372d3658" exitCode=0 Mar 19 10:01:46 crc kubenswrapper[4835]: I0319 10:01:46.526784 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jpgpp" event={"ID":"29e7446f-a78e-434c-a1e3-3046f98fad69","Type":"ContainerDied","Data":"4c6baea7360a87a6d1bb33d2283815acd00e22033565867bc3992d74372d3658"} Mar 19 10:01:48 crc kubenswrapper[4835]: I0319 10:01:48.012363 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jpgpp" Mar 19 10:01:48 crc kubenswrapper[4835]: I0319 10:01:48.060532 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29e7446f-a78e-434c-a1e3-3046f98fad69-inventory\") pod \"29e7446f-a78e-434c-a1e3-3046f98fad69\" (UID: \"29e7446f-a78e-434c-a1e3-3046f98fad69\") " Mar 19 10:01:48 crc kubenswrapper[4835]: I0319 10:01:48.060697 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sbv8\" (UniqueName: \"kubernetes.io/projected/29e7446f-a78e-434c-a1e3-3046f98fad69-kube-api-access-8sbv8\") pod \"29e7446f-a78e-434c-a1e3-3046f98fad69\" (UID: \"29e7446f-a78e-434c-a1e3-3046f98fad69\") " Mar 19 10:01:48 crc kubenswrapper[4835]: I0319 10:01:48.060835 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/29e7446f-a78e-434c-a1e3-3046f98fad69-ssh-key-openstack-edpm-ipam\") pod \"29e7446f-a78e-434c-a1e3-3046f98fad69\" (UID: \"29e7446f-a78e-434c-a1e3-3046f98fad69\") " Mar 19 10:01:48 crc kubenswrapper[4835]: I0319 10:01:48.068091 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29e7446f-a78e-434c-a1e3-3046f98fad69-kube-api-access-8sbv8" (OuterVolumeSpecName: "kube-api-access-8sbv8") pod "29e7446f-a78e-434c-a1e3-3046f98fad69" (UID: "29e7446f-a78e-434c-a1e3-3046f98fad69"). InnerVolumeSpecName "kube-api-access-8sbv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:01:48 crc kubenswrapper[4835]: I0319 10:01:48.092409 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e7446f-a78e-434c-a1e3-3046f98fad69-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "29e7446f-a78e-434c-a1e3-3046f98fad69" (UID: "29e7446f-a78e-434c-a1e3-3046f98fad69"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:01:48 crc kubenswrapper[4835]: I0319 10:01:48.094185 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e7446f-a78e-434c-a1e3-3046f98fad69-inventory" (OuterVolumeSpecName: "inventory") pod "29e7446f-a78e-434c-a1e3-3046f98fad69" (UID: "29e7446f-a78e-434c-a1e3-3046f98fad69"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:01:48 crc kubenswrapper[4835]: I0319 10:01:48.165224 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29e7446f-a78e-434c-a1e3-3046f98fad69-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 10:01:48 crc kubenswrapper[4835]: I0319 10:01:48.165267 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sbv8\" (UniqueName: \"kubernetes.io/projected/29e7446f-a78e-434c-a1e3-3046f98fad69-kube-api-access-8sbv8\") on node \"crc\" DevicePath \"\"" Mar 19 10:01:48 crc kubenswrapper[4835]: I0319 10:01:48.165281 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/29e7446f-a78e-434c-a1e3-3046f98fad69-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 10:01:48 crc kubenswrapper[4835]: I0319 10:01:48.547169 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jpgpp" event={"ID":"29e7446f-a78e-434c-a1e3-3046f98fad69","Type":"ContainerDied","Data":"43d25281fa11a82c80f7a2578736c8f5e4540be209b653c508d8f90ef2854b90"} Mar 19 10:01:48 crc kubenswrapper[4835]: I0319 10:01:48.547501 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43d25281fa11a82c80f7a2578736c8f5e4540be209b653c508d8f90ef2854b90" Mar 19 10:01:48 crc kubenswrapper[4835]: I0319 10:01:48.547221 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jpgpp" Mar 19 10:01:48 crc kubenswrapper[4835]: I0319 10:01:48.641453 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxgs5"] Mar 19 10:01:48 crc kubenswrapper[4835]: E0319 10:01:48.642066 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29e7446f-a78e-434c-a1e3-3046f98fad69" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 19 10:01:48 crc kubenswrapper[4835]: I0319 10:01:48.642086 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="29e7446f-a78e-434c-a1e3-3046f98fad69" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 19 10:01:48 crc kubenswrapper[4835]: I0319 10:01:48.642346 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="29e7446f-a78e-434c-a1e3-3046f98fad69" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 19 10:01:48 crc kubenswrapper[4835]: I0319 10:01:48.643261 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxgs5" Mar 19 10:01:48 crc kubenswrapper[4835]: I0319 10:01:48.645249 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 10:01:48 crc kubenswrapper[4835]: I0319 10:01:48.645365 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ldz2g" Mar 19 10:01:48 crc kubenswrapper[4835]: I0319 10:01:48.645440 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 10:01:48 crc kubenswrapper[4835]: I0319 10:01:48.645627 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 10:01:48 crc kubenswrapper[4835]: I0319 10:01:48.663570 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxgs5"] Mar 19 10:01:48 crc kubenswrapper[4835]: I0319 10:01:48.677364 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj7ss\" (UniqueName: \"kubernetes.io/projected/0e7e2b75-9097-496a-b7ba-be0340826564-kube-api-access-tj7ss\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gxgs5\" (UID: \"0e7e2b75-9097-496a-b7ba-be0340826564\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxgs5" Mar 19 10:01:48 crc kubenswrapper[4835]: I0319 10:01:48.677422 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e7e2b75-9097-496a-b7ba-be0340826564-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gxgs5\" (UID: \"0e7e2b75-9097-496a-b7ba-be0340826564\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxgs5" Mar 19 10:01:48 crc kubenswrapper[4835]: I0319 10:01:48.677971 4835 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e7e2b75-9097-496a-b7ba-be0340826564-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gxgs5\" (UID: \"0e7e2b75-9097-496a-b7ba-be0340826564\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxgs5" Mar 19 10:01:48 crc kubenswrapper[4835]: I0319 10:01:48.780308 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tj7ss\" (UniqueName: \"kubernetes.io/projected/0e7e2b75-9097-496a-b7ba-be0340826564-kube-api-access-tj7ss\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gxgs5\" (UID: \"0e7e2b75-9097-496a-b7ba-be0340826564\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxgs5" Mar 19 10:01:48 crc kubenswrapper[4835]: I0319 10:01:48.780408 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e7e2b75-9097-496a-b7ba-be0340826564-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gxgs5\" (UID: \"0e7e2b75-9097-496a-b7ba-be0340826564\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxgs5" Mar 19 10:01:48 crc kubenswrapper[4835]: I0319 10:01:48.781507 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e7e2b75-9097-496a-b7ba-be0340826564-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gxgs5\" (UID: \"0e7e2b75-9097-496a-b7ba-be0340826564\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxgs5" Mar 19 10:01:48 crc kubenswrapper[4835]: I0319 10:01:48.786218 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e7e2b75-9097-496a-b7ba-be0340826564-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-gxgs5\" (UID: \"0e7e2b75-9097-496a-b7ba-be0340826564\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxgs5" Mar 19 10:01:48 crc kubenswrapper[4835]: I0319 10:01:48.796394 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj7ss\" (UniqueName: \"kubernetes.io/projected/0e7e2b75-9097-496a-b7ba-be0340826564-kube-api-access-tj7ss\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gxgs5\" (UID: \"0e7e2b75-9097-496a-b7ba-be0340826564\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxgs5" Mar 19 10:01:48 crc kubenswrapper[4835]: I0319 10:01:48.807385 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e7e2b75-9097-496a-b7ba-be0340826564-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gxgs5\" (UID: \"0e7e2b75-9097-496a-b7ba-be0340826564\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxgs5" Mar 19 10:01:48 crc kubenswrapper[4835]: I0319 10:01:48.964858 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxgs5" Mar 19 10:01:49 crc kubenswrapper[4835]: I0319 10:01:49.518075 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxgs5"] Mar 19 10:01:49 crc kubenswrapper[4835]: I0319 10:01:49.559336 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxgs5" event={"ID":"0e7e2b75-9097-496a-b7ba-be0340826564","Type":"ContainerStarted","Data":"b5f6ae629cb77aa86cf20f1a71a12c0faf9ac6430a66c54909428ff179958e33"} Mar 19 10:01:50 crc kubenswrapper[4835]: I0319 10:01:50.572124 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxgs5" event={"ID":"0e7e2b75-9097-496a-b7ba-be0340826564","Type":"ContainerStarted","Data":"1b74bc558a26c1e996d5c8169643ff0ae4571eab7b2e761196f052fbd8ede8dd"} Mar 19 10:01:52 crc kubenswrapper[4835]: I0319 10:01:52.402248 4835 scope.go:117] "RemoveContainer" containerID="3c5b8661f3f050fbf34728c9e88ed710ff7c292af499338ff01295afa288217f" Mar 19 10:01:52 crc kubenswrapper[4835]: E0319 10:01:52.403410 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:01:56 crc kubenswrapper[4835]: I0319 10:01:56.054940 4835 scope.go:117] "RemoveContainer" containerID="f3005d3fa14e778be60ab66e97afec5229741a9a4db633769ac228a4a9b32318" Mar 19 10:01:59 crc kubenswrapper[4835]: I0319 10:01:59.670043 4835 generic.go:334] "Generic (PLEG): container finished" podID="0e7e2b75-9097-496a-b7ba-be0340826564" 
containerID="1b74bc558a26c1e996d5c8169643ff0ae4571eab7b2e761196f052fbd8ede8dd" exitCode=0 Mar 19 10:01:59 crc kubenswrapper[4835]: I0319 10:01:59.670138 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxgs5" event={"ID":"0e7e2b75-9097-496a-b7ba-be0340826564","Type":"ContainerDied","Data":"1b74bc558a26c1e996d5c8169643ff0ae4571eab7b2e761196f052fbd8ede8dd"} Mar 19 10:02:00 crc kubenswrapper[4835]: I0319 10:02:00.132726 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565242-58qhn"] Mar 19 10:02:00 crc kubenswrapper[4835]: I0319 10:02:00.134307 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565242-58qhn" Mar 19 10:02:00 crc kubenswrapper[4835]: I0319 10:02:00.136552 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 10:02:00 crc kubenswrapper[4835]: I0319 10:02:00.136803 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:02:00 crc kubenswrapper[4835]: I0319 10:02:00.141155 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:02:00 crc kubenswrapper[4835]: I0319 10:02:00.146099 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565242-58qhn"] Mar 19 10:02:00 crc kubenswrapper[4835]: I0319 10:02:00.268160 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcwsd\" (UniqueName: \"kubernetes.io/projected/0c6df688-8714-4569-90ee-ef7e8c1d2a8c-kube-api-access-lcwsd\") pod \"auto-csr-approver-29565242-58qhn\" (UID: \"0c6df688-8714-4569-90ee-ef7e8c1d2a8c\") " pod="openshift-infra/auto-csr-approver-29565242-58qhn" Mar 19 10:02:00 crc kubenswrapper[4835]: I0319 10:02:00.371049 4835 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcwsd\" (UniqueName: \"kubernetes.io/projected/0c6df688-8714-4569-90ee-ef7e8c1d2a8c-kube-api-access-lcwsd\") pod \"auto-csr-approver-29565242-58qhn\" (UID: \"0c6df688-8714-4569-90ee-ef7e8c1d2a8c\") " pod="openshift-infra/auto-csr-approver-29565242-58qhn" Mar 19 10:02:00 crc kubenswrapper[4835]: I0319 10:02:00.390702 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcwsd\" (UniqueName: \"kubernetes.io/projected/0c6df688-8714-4569-90ee-ef7e8c1d2a8c-kube-api-access-lcwsd\") pod \"auto-csr-approver-29565242-58qhn\" (UID: \"0c6df688-8714-4569-90ee-ef7e8c1d2a8c\") " pod="openshift-infra/auto-csr-approver-29565242-58qhn" Mar 19 10:02:00 crc kubenswrapper[4835]: I0319 10:02:00.455396 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565242-58qhn" Mar 19 10:02:00 crc kubenswrapper[4835]: I0319 10:02:00.887810 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565242-58qhn"] Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.086276 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxgs5" Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.193133 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e7e2b75-9097-496a-b7ba-be0340826564-inventory\") pod \"0e7e2b75-9097-496a-b7ba-be0340826564\" (UID: \"0e7e2b75-9097-496a-b7ba-be0340826564\") " Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.193324 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tj7ss\" (UniqueName: \"kubernetes.io/projected/0e7e2b75-9097-496a-b7ba-be0340826564-kube-api-access-tj7ss\") pod \"0e7e2b75-9097-496a-b7ba-be0340826564\" (UID: \"0e7e2b75-9097-496a-b7ba-be0340826564\") " Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.193790 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e7e2b75-9097-496a-b7ba-be0340826564-ssh-key-openstack-edpm-ipam\") pod \"0e7e2b75-9097-496a-b7ba-be0340826564\" (UID: \"0e7e2b75-9097-496a-b7ba-be0340826564\") " Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.199576 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e7e2b75-9097-496a-b7ba-be0340826564-kube-api-access-tj7ss" (OuterVolumeSpecName: "kube-api-access-tj7ss") pod "0e7e2b75-9097-496a-b7ba-be0340826564" (UID: "0e7e2b75-9097-496a-b7ba-be0340826564"). InnerVolumeSpecName "kube-api-access-tj7ss". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.236497 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e7e2b75-9097-496a-b7ba-be0340826564-inventory" (OuterVolumeSpecName: "inventory") pod "0e7e2b75-9097-496a-b7ba-be0340826564" (UID: "0e7e2b75-9097-496a-b7ba-be0340826564"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.254262 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e7e2b75-9097-496a-b7ba-be0340826564-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0e7e2b75-9097-496a-b7ba-be0340826564" (UID: "0e7e2b75-9097-496a-b7ba-be0340826564"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.297811 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tj7ss\" (UniqueName: \"kubernetes.io/projected/0e7e2b75-9097-496a-b7ba-be0340826564-kube-api-access-tj7ss\") on node \"crc\" DevicePath \"\"" Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.297864 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0e7e2b75-9097-496a-b7ba-be0340826564-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.297882 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0e7e2b75-9097-496a-b7ba-be0340826564-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.693232 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565242-58qhn" event={"ID":"0c6df688-8714-4569-90ee-ef7e8c1d2a8c","Type":"ContainerStarted","Data":"4bd36e47af01cb43de409a19c6153ce487a350ba75f97d35a9d4468ab233a533"} Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.696300 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxgs5" 
event={"ID":"0e7e2b75-9097-496a-b7ba-be0340826564","Type":"ContainerDied","Data":"b5f6ae629cb77aa86cf20f1a71a12c0faf9ac6430a66c54909428ff179958e33"} Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.696347 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxgs5" Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.696354 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5f6ae629cb77aa86cf20f1a71a12c0faf9ac6430a66c54909428ff179958e33" Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.780731 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5"] Mar 19 10:02:01 crc kubenswrapper[4835]: E0319 10:02:01.787850 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e7e2b75-9097-496a-b7ba-be0340826564" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.788001 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e7e2b75-9097-496a-b7ba-be0340826564" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.788472 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e7e2b75-9097-496a-b7ba-be0340826564" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.789800 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.796359 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.796719 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ldz2g" Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.796889 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.797092 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.797157 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.797197 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.797417 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.797426 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.797454 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.812521 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5"] Mar 19 10:02:01 crc 
kubenswrapper[4835]: I0319 10:02:01.913461 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9k4f\" (UniqueName: \"kubernetes.io/projected/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-kube-api-access-p9k4f\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.913779 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.913856 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.913884 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:01 
crc kubenswrapper[4835]: I0319 10:02:01.913905 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.913966 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.913996 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.914023 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.914044 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.914090 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.914119 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.914140 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.914160 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.914193 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.914254 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:01 crc kubenswrapper[4835]: I0319 10:02:01.914286 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: 
\"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:02 crc kubenswrapper[4835]: I0319 10:02:02.016113 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:02 crc kubenswrapper[4835]: I0319 10:02:02.016163 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:02 crc kubenswrapper[4835]: I0319 10:02:02.016236 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:02 crc kubenswrapper[4835]: I0319 10:02:02.016271 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:02 crc kubenswrapper[4835]: I0319 10:02:02.016321 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9k4f\" (UniqueName: \"kubernetes.io/projected/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-kube-api-access-p9k4f\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:02 crc kubenswrapper[4835]: I0319 10:02:02.016338 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:02 crc kubenswrapper[4835]: I0319 10:02:02.016396 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:02 crc kubenswrapper[4835]: I0319 10:02:02.016419 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" 
Mar 19 10:02:02 crc kubenswrapper[4835]: I0319 10:02:02.016439 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:02 crc kubenswrapper[4835]: I0319 10:02:02.016467 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:02 crc kubenswrapper[4835]: I0319 10:02:02.016491 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:02 crc kubenswrapper[4835]: I0319 10:02:02.016512 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:02 crc kubenswrapper[4835]: 
I0319 10:02:02.016533 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:02 crc kubenswrapper[4835]: I0319 10:02:02.016572 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:02 crc kubenswrapper[4835]: I0319 10:02:02.016598 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:02 crc kubenswrapper[4835]: I0319 10:02:02.016615 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:02 crc kubenswrapper[4835]: I0319 10:02:02.023207 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:02 crc kubenswrapper[4835]: I0319 10:02:02.023403 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:02 crc kubenswrapper[4835]: I0319 10:02:02.025563 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:02 crc kubenswrapper[4835]: I0319 10:02:02.026356 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:02 crc kubenswrapper[4835]: I0319 10:02:02.027079 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:02 crc kubenswrapper[4835]: I0319 10:02:02.027502 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:02 crc kubenswrapper[4835]: I0319 10:02:02.031192 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:02 crc kubenswrapper[4835]: I0319 10:02:02.032798 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:02 crc kubenswrapper[4835]: I0319 10:02:02.032897 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:02 crc kubenswrapper[4835]: I0319 10:02:02.033071 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:02 crc kubenswrapper[4835]: I0319 10:02:02.033737 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9k4f\" (UniqueName: \"kubernetes.io/projected/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-kube-api-access-p9k4f\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:02 crc kubenswrapper[4835]: I0319 10:02:02.035647 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:02 crc kubenswrapper[4835]: I0319 10:02:02.037703 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" 
(UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:02 crc kubenswrapper[4835]: I0319 10:02:02.038407 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:02 crc kubenswrapper[4835]: I0319 10:02:02.038982 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:02 crc kubenswrapper[4835]: I0319 10:02:02.048418 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:02 crc kubenswrapper[4835]: I0319 10:02:02.111601 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:02 crc kubenswrapper[4835]: I0319 10:02:02.699592 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5"] Mar 19 10:02:02 crc kubenswrapper[4835]: I0319 10:02:02.711939 4835 generic.go:334] "Generic (PLEG): container finished" podID="0c6df688-8714-4569-90ee-ef7e8c1d2a8c" containerID="2fc30a46820a5b741aa124b55e9f2c6fedecf7ea5538addeecdffb0cb1a92882" exitCode=0 Mar 19 10:02:02 crc kubenswrapper[4835]: I0319 10:02:02.711987 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565242-58qhn" event={"ID":"0c6df688-8714-4569-90ee-ef7e8c1d2a8c","Type":"ContainerDied","Data":"2fc30a46820a5b741aa124b55e9f2c6fedecf7ea5538addeecdffb0cb1a92882"} Mar 19 10:02:03 crc kubenswrapper[4835]: I0319 10:02:03.726823 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" event={"ID":"efe0b48c-18f7-47aa-b470-5bd2fa11fb84","Type":"ContainerStarted","Data":"51fc9c24d5b4d4769e0d9afe0663b139f31bb6db7df299abb7dccb4327b196be"} Mar 19 10:02:03 crc kubenswrapper[4835]: I0319 10:02:03.728195 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" event={"ID":"efe0b48c-18f7-47aa-b470-5bd2fa11fb84","Type":"ContainerStarted","Data":"d6c5a4e15b0c897c4de24a2b1a8d8cdee3224159e82fccbd67cdfd79af6a9175"} Mar 19 10:02:03 crc kubenswrapper[4835]: I0319 10:02:03.749213 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" podStartSLOduration=2.185186137 podStartE2EDuration="2.749195237s" podCreationTimestamp="2026-03-19 10:02:01 +0000 UTC" firstStartedPulling="2026-03-19 10:02:02.701333683 +0000 UTC m=+2377.549932270" lastFinishedPulling="2026-03-19 
10:02:03.265342793 +0000 UTC m=+2378.113941370" observedRunningTime="2026-03-19 10:02:03.747540451 +0000 UTC m=+2378.596139048" watchObservedRunningTime="2026-03-19 10:02:03.749195237 +0000 UTC m=+2378.597793824" Mar 19 10:02:04 crc kubenswrapper[4835]: I0319 10:02:04.173306 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565242-58qhn" Mar 19 10:02:04 crc kubenswrapper[4835]: I0319 10:02:04.275857 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcwsd\" (UniqueName: \"kubernetes.io/projected/0c6df688-8714-4569-90ee-ef7e8c1d2a8c-kube-api-access-lcwsd\") pod \"0c6df688-8714-4569-90ee-ef7e8c1d2a8c\" (UID: \"0c6df688-8714-4569-90ee-ef7e8c1d2a8c\") " Mar 19 10:02:04 crc kubenswrapper[4835]: I0319 10:02:04.284017 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c6df688-8714-4569-90ee-ef7e8c1d2a8c-kube-api-access-lcwsd" (OuterVolumeSpecName: "kube-api-access-lcwsd") pod "0c6df688-8714-4569-90ee-ef7e8c1d2a8c" (UID: "0c6df688-8714-4569-90ee-ef7e8c1d2a8c"). InnerVolumeSpecName "kube-api-access-lcwsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:02:04 crc kubenswrapper[4835]: I0319 10:02:04.379267 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcwsd\" (UniqueName: \"kubernetes.io/projected/0c6df688-8714-4569-90ee-ef7e8c1d2a8c-kube-api-access-lcwsd\") on node \"crc\" DevicePath \"\"" Mar 19 10:02:04 crc kubenswrapper[4835]: I0319 10:02:04.737893 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565242-58qhn" Mar 19 10:02:04 crc kubenswrapper[4835]: I0319 10:02:04.737909 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565242-58qhn" event={"ID":"0c6df688-8714-4569-90ee-ef7e8c1d2a8c","Type":"ContainerDied","Data":"4bd36e47af01cb43de409a19c6153ce487a350ba75f97d35a9d4468ab233a533"} Mar 19 10:02:04 crc kubenswrapper[4835]: I0319 10:02:04.738955 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bd36e47af01cb43de409a19c6153ce487a350ba75f97d35a9d4468ab233a533" Mar 19 10:02:05 crc kubenswrapper[4835]: I0319 10:02:05.264689 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565236-44rxp"] Mar 19 10:02:05 crc kubenswrapper[4835]: I0319 10:02:05.274319 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565236-44rxp"] Mar 19 10:02:05 crc kubenswrapper[4835]: I0319 10:02:05.402902 4835 scope.go:117] "RemoveContainer" containerID="3c5b8661f3f050fbf34728c9e88ed710ff7c292af499338ff01295afa288217f" Mar 19 10:02:05 crc kubenswrapper[4835]: E0319 10:02:05.403231 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:02:06 crc kubenswrapper[4835]: I0319 10:02:06.416844 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e43d35a6-eced-44d9-839a-f4d15941e4c2" path="/var/lib/kubelet/pods/e43d35a6-eced-44d9-839a-f4d15941e4c2/volumes" Mar 19 10:02:19 crc kubenswrapper[4835]: I0319 10:02:19.406300 4835 scope.go:117] "RemoveContainer" 
containerID="3c5b8661f3f050fbf34728c9e88ed710ff7c292af499338ff01295afa288217f" Mar 19 10:02:19 crc kubenswrapper[4835]: E0319 10:02:19.407572 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:02:33 crc kubenswrapper[4835]: I0319 10:02:33.402427 4835 scope.go:117] "RemoveContainer" containerID="3c5b8661f3f050fbf34728c9e88ed710ff7c292af499338ff01295afa288217f" Mar 19 10:02:33 crc kubenswrapper[4835]: E0319 10:02:33.403169 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:02:42 crc kubenswrapper[4835]: I0319 10:02:42.169188 4835 generic.go:334] "Generic (PLEG): container finished" podID="efe0b48c-18f7-47aa-b470-5bd2fa11fb84" containerID="51fc9c24d5b4d4769e0d9afe0663b139f31bb6db7df299abb7dccb4327b196be" exitCode=0 Mar 19 10:02:42 crc kubenswrapper[4835]: I0319 10:02:42.169283 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" event={"ID":"efe0b48c-18f7-47aa-b470-5bd2fa11fb84","Type":"ContainerDied","Data":"51fc9c24d5b4d4769e0d9afe0663b139f31bb6db7df299abb7dccb4327b196be"} Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.649692 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.789834 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-openstack-edpm-ipam-ovn-default-certs-0\") pod \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.789971 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.790066 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-nova-combined-ca-bundle\") pod \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.790098 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-ovn-combined-ca-bundle\") pod \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.790147 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-ssh-key-openstack-edpm-ipam\") pod \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\" (UID: 
\"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.790230 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-neutron-metadata-combined-ca-bundle\") pod \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.790304 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-bootstrap-combined-ca-bundle\") pod \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.790333 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.790356 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-telemetry-power-monitoring-combined-ca-bundle\") pod \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.790386 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-libvirt-combined-ca-bundle\") pod 
\"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.790429 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-inventory\") pod \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.790451 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9k4f\" (UniqueName: \"kubernetes.io/projected/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-kube-api-access-p9k4f\") pod \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.790478 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-repo-setup-combined-ca-bundle\") pod \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.790541 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.790575 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-telemetry-combined-ca-bundle\") pod \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " Mar 19 10:02:43 crc 
kubenswrapper[4835]: I0319 10:02:43.790604 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\" (UID: \"efe0b48c-18f7-47aa-b470-5bd2fa11fb84\") " Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.796864 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "efe0b48c-18f7-47aa-b470-5bd2fa11fb84" (UID: "efe0b48c-18f7-47aa-b470-5bd2fa11fb84"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.798649 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "efe0b48c-18f7-47aa-b470-5bd2fa11fb84" (UID: "efe0b48c-18f7-47aa-b470-5bd2fa11fb84"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.799497 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "efe0b48c-18f7-47aa-b470-5bd2fa11fb84" (UID: "efe0b48c-18f7-47aa-b470-5bd2fa11fb84"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.800376 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0") pod "efe0b48c-18f7-47aa-b470-5bd2fa11fb84" (UID: "efe0b48c-18f7-47aa-b470-5bd2fa11fb84"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.802571 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "efe0b48c-18f7-47aa-b470-5bd2fa11fb84" (UID: "efe0b48c-18f7-47aa-b470-5bd2fa11fb84"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.803700 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "efe0b48c-18f7-47aa-b470-5bd2fa11fb84" (UID: "efe0b48c-18f7-47aa-b470-5bd2fa11fb84"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.803732 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "efe0b48c-18f7-47aa-b470-5bd2fa11fb84" (UID: "efe0b48c-18f7-47aa-b470-5bd2fa11fb84"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.803809 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "efe0b48c-18f7-47aa-b470-5bd2fa11fb84" (UID: "efe0b48c-18f7-47aa-b470-5bd2fa11fb84"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.803852 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "efe0b48c-18f7-47aa-b470-5bd2fa11fb84" (UID: "efe0b48c-18f7-47aa-b470-5bd2fa11fb84"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.804240 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-kube-api-access-p9k4f" (OuterVolumeSpecName: "kube-api-access-p9k4f") pod "efe0b48c-18f7-47aa-b470-5bd2fa11fb84" (UID: "efe0b48c-18f7-47aa-b470-5bd2fa11fb84"). InnerVolumeSpecName "kube-api-access-p9k4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.804254 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "efe0b48c-18f7-47aa-b470-5bd2fa11fb84" (UID: "efe0b48c-18f7-47aa-b470-5bd2fa11fb84"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.805311 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "efe0b48c-18f7-47aa-b470-5bd2fa11fb84" (UID: "efe0b48c-18f7-47aa-b470-5bd2fa11fb84"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.805527 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "efe0b48c-18f7-47aa-b470-5bd2fa11fb84" (UID: "efe0b48c-18f7-47aa-b470-5bd2fa11fb84"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.806131 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "efe0b48c-18f7-47aa-b470-5bd2fa11fb84" (UID: "efe0b48c-18f7-47aa-b470-5bd2fa11fb84"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.837478 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "efe0b48c-18f7-47aa-b470-5bd2fa11fb84" (UID: "efe0b48c-18f7-47aa-b470-5bd2fa11fb84"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.843886 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-inventory" (OuterVolumeSpecName: "inventory") pod "efe0b48c-18f7-47aa-b470-5bd2fa11fb84" (UID: "efe0b48c-18f7-47aa-b470-5bd2fa11fb84"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.893689 4835 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.894041 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-inventory\") on node \"crc\" DevicePath \"\""
Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.894059 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9k4f\" (UniqueName: \"kubernetes.io/projected/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-kube-api-access-p9k4f\") on node \"crc\" DevicePath \"\""
Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.894071 4835 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.894087 4835 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\""
Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.894101 4835 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.894113 4835 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\""
Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.894129 4835 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\""
Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.894143 4835 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\""
Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.894156 4835 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.894172 4835 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.894184 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.894198 4835 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.894210 4835 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.894223 4835 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") on node \"crc\" DevicePath \"\""
Mar 19 10:02:43 crc kubenswrapper[4835]: I0319 10:02:43.894237 4835 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efe0b48c-18f7-47aa-b470-5bd2fa11fb84-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 10:02:44 crc kubenswrapper[4835]: I0319 10:02:44.203099 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5" event={"ID":"efe0b48c-18f7-47aa-b470-5bd2fa11fb84","Type":"ContainerDied","Data":"d6c5a4e15b0c897c4de24a2b1a8d8cdee3224159e82fccbd67cdfd79af6a9175"}
Mar 19 10:02:44 crc kubenswrapper[4835]: I0319 10:02:44.203172 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6c5a4e15b0c897c4de24a2b1a8d8cdee3224159e82fccbd67cdfd79af6a9175"
Mar 19 10:02:44 crc kubenswrapper[4835]: I0319 10:02:44.203257 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5"
Mar 19 10:02:44 crc kubenswrapper[4835]: I0319 10:02:44.329568 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-s2d4g"]
Mar 19 10:02:44 crc kubenswrapper[4835]: E0319 10:02:44.330093 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efe0b48c-18f7-47aa-b470-5bd2fa11fb84" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Mar 19 10:02:44 crc kubenswrapper[4835]: I0319 10:02:44.330111 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="efe0b48c-18f7-47aa-b470-5bd2fa11fb84" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Mar 19 10:02:44 crc kubenswrapper[4835]: E0319 10:02:44.330150 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c6df688-8714-4569-90ee-ef7e8c1d2a8c" containerName="oc"
Mar 19 10:02:44 crc kubenswrapper[4835]: I0319 10:02:44.330156 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c6df688-8714-4569-90ee-ef7e8c1d2a8c" containerName="oc"
Mar 19 10:02:44 crc kubenswrapper[4835]: I0319 10:02:44.330391 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="efe0b48c-18f7-47aa-b470-5bd2fa11fb84" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Mar 19 10:02:44 crc kubenswrapper[4835]: I0319 10:02:44.330413 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c6df688-8714-4569-90ee-ef7e8c1d2a8c" containerName="oc"
Mar 19 10:02:44 crc kubenswrapper[4835]: I0319 10:02:44.331296 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s2d4g"
Mar 19 10:02:44 crc kubenswrapper[4835]: I0319 10:02:44.334571 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ldz2g"
Mar 19 10:02:44 crc kubenswrapper[4835]: I0319 10:02:44.334574 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 19 10:02:44 crc kubenswrapper[4835]: I0319 10:02:44.335052 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 19 10:02:44 crc kubenswrapper[4835]: I0319 10:02:44.335311 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 19 10:02:44 crc kubenswrapper[4835]: I0319 10:02:44.337323 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config"
Mar 19 10:02:44 crc kubenswrapper[4835]: I0319 10:02:44.340281 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-s2d4g"]
Mar 19 10:02:44 crc kubenswrapper[4835]: I0319 10:02:44.407323 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d96a831-67f3-468f-9c03-beeec7c529b4-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s2d4g\" (UID: \"3d96a831-67f3-468f-9c03-beeec7c529b4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s2d4g"
Mar 19 10:02:44 crc kubenswrapper[4835]: I0319 10:02:44.408097 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d96a831-67f3-468f-9c03-beeec7c529b4-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s2d4g\" (UID: \"3d96a831-67f3-468f-9c03-beeec7c529b4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s2d4g"
Mar 19 10:02:44 crc kubenswrapper[4835]: I0319 10:02:44.408209 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvfg6\" (UniqueName: \"kubernetes.io/projected/3d96a831-67f3-468f-9c03-beeec7c529b4-kube-api-access-dvfg6\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s2d4g\" (UID: \"3d96a831-67f3-468f-9c03-beeec7c529b4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s2d4g"
Mar 19 10:02:44 crc kubenswrapper[4835]: I0319 10:02:44.408257 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3d96a831-67f3-468f-9c03-beeec7c529b4-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s2d4g\" (UID: \"3d96a831-67f3-468f-9c03-beeec7c529b4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s2d4g"
Mar 19 10:02:44 crc kubenswrapper[4835]: I0319 10:02:44.408273 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d96a831-67f3-468f-9c03-beeec7c529b4-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s2d4g\" (UID: \"3d96a831-67f3-468f-9c03-beeec7c529b4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s2d4g"
Mar 19 10:02:44 crc kubenswrapper[4835]: I0319 10:02:44.510870 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d96a831-67f3-468f-9c03-beeec7c529b4-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s2d4g\" (UID: \"3d96a831-67f3-468f-9c03-beeec7c529b4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s2d4g"
Mar 19 10:02:44 crc kubenswrapper[4835]: I0319 10:02:44.510973 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvfg6\" (UniqueName: \"kubernetes.io/projected/3d96a831-67f3-468f-9c03-beeec7c529b4-kube-api-access-dvfg6\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s2d4g\" (UID: \"3d96a831-67f3-468f-9c03-beeec7c529b4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s2d4g"
Mar 19 10:02:44 crc kubenswrapper[4835]: I0319 10:02:44.511026 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3d96a831-67f3-468f-9c03-beeec7c529b4-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s2d4g\" (UID: \"3d96a831-67f3-468f-9c03-beeec7c529b4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s2d4g"
Mar 19 10:02:44 crc kubenswrapper[4835]: I0319 10:02:44.511078 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d96a831-67f3-468f-9c03-beeec7c529b4-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s2d4g\" (UID: \"3d96a831-67f3-468f-9c03-beeec7c529b4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s2d4g"
Mar 19 10:02:44 crc kubenswrapper[4835]: I0319 10:02:44.512095 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3d96a831-67f3-468f-9c03-beeec7c529b4-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s2d4g\" (UID: \"3d96a831-67f3-468f-9c03-beeec7c529b4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s2d4g"
Mar 19 10:02:44 crc kubenswrapper[4835]: I0319 10:02:44.512189 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d96a831-67f3-468f-9c03-beeec7c529b4-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s2d4g\" (UID: \"3d96a831-67f3-468f-9c03-beeec7c529b4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s2d4g"
Mar 19 10:02:44 crc kubenswrapper[4835]: I0319 10:02:44.516019 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d96a831-67f3-468f-9c03-beeec7c529b4-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s2d4g\" (UID: \"3d96a831-67f3-468f-9c03-beeec7c529b4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s2d4g"
Mar 19 10:02:44 crc kubenswrapper[4835]: I0319 10:02:44.526929 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d96a831-67f3-468f-9c03-beeec7c529b4-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s2d4g\" (UID: \"3d96a831-67f3-468f-9c03-beeec7c529b4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s2d4g"
Mar 19 10:02:44 crc kubenswrapper[4835]: I0319 10:02:44.528150 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d96a831-67f3-468f-9c03-beeec7c529b4-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s2d4g\" (UID: \"3d96a831-67f3-468f-9c03-beeec7c529b4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s2d4g"
Mar 19 10:02:44 crc kubenswrapper[4835]: I0319 10:02:44.530566 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvfg6\" (UniqueName: \"kubernetes.io/projected/3d96a831-67f3-468f-9c03-beeec7c529b4-kube-api-access-dvfg6\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-s2d4g\" (UID: \"3d96a831-67f3-468f-9c03-beeec7c529b4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s2d4g"
Mar 19 10:02:44 crc kubenswrapper[4835]: I0319 10:02:44.649169 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s2d4g"
Mar 19 10:02:45 crc kubenswrapper[4835]: I0319 10:02:45.229838 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-s2d4g"]
Mar 19 10:02:45 crc kubenswrapper[4835]: W0319 10:02:45.239194 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d96a831_67f3_468f_9c03_beeec7c529b4.slice/crio-b16c5600f6b79935f1ca50d6e54d3fe94b57350ad23cbb0c1fbac678194c374a WatchSource:0}: Error finding container b16c5600f6b79935f1ca50d6e54d3fe94b57350ad23cbb0c1fbac678194c374a: Status 404 returned error can't find the container with id b16c5600f6b79935f1ca50d6e54d3fe94b57350ad23cbb0c1fbac678194c374a
Mar 19 10:02:46 crc kubenswrapper[4835]: I0319 10:02:46.226445 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s2d4g" event={"ID":"3d96a831-67f3-468f-9c03-beeec7c529b4","Type":"ContainerStarted","Data":"60e90255c86884d8e9ad71eb4bd00b68b7ec05f1c3116d9b16163be8514e8e89"}
Mar 19 10:02:46 crc kubenswrapper[4835]: I0319 10:02:46.227028 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s2d4g" event={"ID":"3d96a831-67f3-468f-9c03-beeec7c529b4","Type":"ContainerStarted","Data":"b16c5600f6b79935f1ca50d6e54d3fe94b57350ad23cbb0c1fbac678194c374a"}
Mar 19 10:02:46 crc kubenswrapper[4835]: I0319 10:02:46.247569 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s2d4g" podStartSLOduration=1.768341033 podStartE2EDuration="2.247551068s" podCreationTimestamp="2026-03-19 10:02:44 +0000 UTC" firstStartedPulling="2026-03-19 10:02:45.24184844 +0000 UTC m=+2420.090447027" lastFinishedPulling="2026-03-19 10:02:45.721058475 +0000 UTC m=+2420.569657062" observedRunningTime="2026-03-19 10:02:46.239899247 +0000 UTC m=+2421.088497834" watchObservedRunningTime="2026-03-19 10:02:46.247551068 +0000 UTC m=+2421.096149655"
Mar 19 10:02:47 crc kubenswrapper[4835]: I0319 10:02:47.402551 4835 scope.go:117] "RemoveContainer" containerID="3c5b8661f3f050fbf34728c9e88ed710ff7c292af499338ff01295afa288217f"
Mar 19 10:02:47 crc kubenswrapper[4835]: E0319 10:02:47.403149 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353"
Mar 19 10:02:56 crc kubenswrapper[4835]: I0319 10:02:56.243354 4835 scope.go:117] "RemoveContainer" containerID="0d7344c68f4154e07b8048c217b49a5a7ca84e057bfba57273d2af7b62d4928d"
Mar 19 10:03:00 crc kubenswrapper[4835]: I0319 10:03:00.048125 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-ck565"]
Mar 19 10:03:00 crc kubenswrapper[4835]: I0319 10:03:00.060303 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-ck565"]
Mar 19 10:03:00 crc kubenswrapper[4835]: I0319 10:03:00.402502 4835 scope.go:117] "RemoveContainer" containerID="3c5b8661f3f050fbf34728c9e88ed710ff7c292af499338ff01295afa288217f"
Mar 19 10:03:00 crc kubenswrapper[4835]: E0319 10:03:00.403220 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353"
Mar 19 10:03:00 crc kubenswrapper[4835]: I0319 10:03:00.415970 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a5db77c-86cf-4661-8fe4-be1b865884a2" path="/var/lib/kubelet/pods/9a5db77c-86cf-4661-8fe4-be1b865884a2/volumes"
Mar 19 10:03:11 crc kubenswrapper[4835]: I0319 10:03:11.402558 4835 scope.go:117] "RemoveContainer" containerID="3c5b8661f3f050fbf34728c9e88ed710ff7c292af499338ff01295afa288217f"
Mar 19 10:03:11 crc kubenswrapper[4835]: E0319 10:03:11.403889 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353"
Mar 19 10:03:22 crc kubenswrapper[4835]: I0319 10:03:22.402809 4835 scope.go:117] "RemoveContainer" containerID="3c5b8661f3f050fbf34728c9e88ed710ff7c292af499338ff01295afa288217f"
Mar 19 10:03:22 crc kubenswrapper[4835]: E0319 10:03:22.403933 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353"
Mar 19 10:03:34 crc kubenswrapper[4835]: I0319 10:03:34.402811 4835 scope.go:117] "RemoveContainer" containerID="3c5b8661f3f050fbf34728c9e88ed710ff7c292af499338ff01295afa288217f"
Mar 19 10:03:34 crc kubenswrapper[4835]: E0319 10:03:34.404148 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353"
Mar 19 10:03:40 crc kubenswrapper[4835]: I0319 10:03:40.886425 4835 generic.go:334] "Generic (PLEG): container finished" podID="3d96a831-67f3-468f-9c03-beeec7c529b4" containerID="60e90255c86884d8e9ad71eb4bd00b68b7ec05f1c3116d9b16163be8514e8e89" exitCode=0
Mar 19 10:03:40 crc kubenswrapper[4835]: I0319 10:03:40.886536 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s2d4g" event={"ID":"3d96a831-67f3-468f-9c03-beeec7c529b4","Type":"ContainerDied","Data":"60e90255c86884d8e9ad71eb4bd00b68b7ec05f1c3116d9b16163be8514e8e89"}
Mar 19 10:03:42 crc kubenswrapper[4835]: I0319 10:03:42.425564 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s2d4g"
Mar 19 10:03:42 crc kubenswrapper[4835]: I0319 10:03:42.580900 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d96a831-67f3-468f-9c03-beeec7c529b4-ssh-key-openstack-edpm-ipam\") pod \"3d96a831-67f3-468f-9c03-beeec7c529b4\" (UID: \"3d96a831-67f3-468f-9c03-beeec7c529b4\") "
Mar 19 10:03:42 crc kubenswrapper[4835]: I0319 10:03:42.581024 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3d96a831-67f3-468f-9c03-beeec7c529b4-ovncontroller-config-0\") pod \"3d96a831-67f3-468f-9c03-beeec7c529b4\" (UID: \"3d96a831-67f3-468f-9c03-beeec7c529b4\") "
Mar 19 10:03:42 crc kubenswrapper[4835]: I0319 10:03:42.581481 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvfg6\" (UniqueName: \"kubernetes.io/projected/3d96a831-67f3-468f-9c03-beeec7c529b4-kube-api-access-dvfg6\") pod \"3d96a831-67f3-468f-9c03-beeec7c529b4\" (UID: \"3d96a831-67f3-468f-9c03-beeec7c529b4\") "
Mar 19 10:03:42 crc kubenswrapper[4835]: I0319 10:03:42.581540 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d96a831-67f3-468f-9c03-beeec7c529b4-ovn-combined-ca-bundle\") pod \"3d96a831-67f3-468f-9c03-beeec7c529b4\" (UID: \"3d96a831-67f3-468f-9c03-beeec7c529b4\") "
Mar 19 10:03:42 crc kubenswrapper[4835]: I0319 10:03:42.581593 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d96a831-67f3-468f-9c03-beeec7c529b4-inventory\") pod \"3d96a831-67f3-468f-9c03-beeec7c529b4\" (UID: \"3d96a831-67f3-468f-9c03-beeec7c529b4\") "
Mar 19 10:03:42 crc kubenswrapper[4835]: I0319 10:03:42.587486 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d96a831-67f3-468f-9c03-beeec7c529b4-kube-api-access-dvfg6" (OuterVolumeSpecName: "kube-api-access-dvfg6") pod "3d96a831-67f3-468f-9c03-beeec7c529b4" (UID: "3d96a831-67f3-468f-9c03-beeec7c529b4"). InnerVolumeSpecName "kube-api-access-dvfg6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 10:03:42 crc kubenswrapper[4835]: I0319 10:03:42.593110 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d96a831-67f3-468f-9c03-beeec7c529b4-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "3d96a831-67f3-468f-9c03-beeec7c529b4" (UID: "3d96a831-67f3-468f-9c03-beeec7c529b4"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 10:03:42 crc kubenswrapper[4835]: I0319 10:03:42.615595 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d96a831-67f3-468f-9c03-beeec7c529b4-inventory" (OuterVolumeSpecName: "inventory") pod "3d96a831-67f3-468f-9c03-beeec7c529b4" (UID: "3d96a831-67f3-468f-9c03-beeec7c529b4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 10:03:42 crc kubenswrapper[4835]: I0319 10:03:42.616301 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d96a831-67f3-468f-9c03-beeec7c529b4-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "3d96a831-67f3-468f-9c03-beeec7c529b4" (UID: "3d96a831-67f3-468f-9c03-beeec7c529b4"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 10:03:42 crc kubenswrapper[4835]: I0319 10:03:42.621345 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d96a831-67f3-468f-9c03-beeec7c529b4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3d96a831-67f3-468f-9c03-beeec7c529b4" (UID: "3d96a831-67f3-468f-9c03-beeec7c529b4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 10:03:42 crc kubenswrapper[4835]: I0319 10:03:42.685698 4835 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d96a831-67f3-468f-9c03-beeec7c529b4-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 10:03:42 crc kubenswrapper[4835]: I0319 10:03:42.685755 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d96a831-67f3-468f-9c03-beeec7c529b4-inventory\") on node \"crc\" DevicePath \"\""
Mar 19 10:03:42 crc kubenswrapper[4835]: I0319 10:03:42.685768 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d96a831-67f3-468f-9c03-beeec7c529b4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 19 10:03:42 crc kubenswrapper[4835]: I0319 10:03:42.685777 4835 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3d96a831-67f3-468f-9c03-beeec7c529b4-ovncontroller-config-0\") on node \"crc\" DevicePath \"\""
Mar 19 10:03:42 crc kubenswrapper[4835]: I0319 10:03:42.685785 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvfg6\" (UniqueName: \"kubernetes.io/projected/3d96a831-67f3-468f-9c03-beeec7c529b4-kube-api-access-dvfg6\") on node \"crc\" DevicePath \"\""
Mar 19 10:03:42 crc kubenswrapper[4835]: I0319 10:03:42.912200 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s2d4g" event={"ID":"3d96a831-67f3-468f-9c03-beeec7c529b4","Type":"ContainerDied","Data":"b16c5600f6b79935f1ca50d6e54d3fe94b57350ad23cbb0c1fbac678194c374a"}
Mar 19 10:03:42 crc kubenswrapper[4835]: I0319 10:03:42.912251 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b16c5600f6b79935f1ca50d6e54d3fe94b57350ad23cbb0c1fbac678194c374a"
Mar 19 10:03:42 crc kubenswrapper[4835]: I0319 10:03:42.912262 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-s2d4g"
Mar 19 10:03:43 crc kubenswrapper[4835]: I0319 10:03:43.008220 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn"]
Mar 19 10:03:43 crc kubenswrapper[4835]: E0319 10:03:43.009102 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d96a831-67f3-468f-9c03-beeec7c529b4" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Mar 19 10:03:43 crc kubenswrapper[4835]: I0319 10:03:43.009186 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d96a831-67f3-468f-9c03-beeec7c529b4" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Mar 19 10:03:43 crc kubenswrapper[4835]: I0319 10:03:43.009492 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d96a831-67f3-468f-9c03-beeec7c529b4" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Mar 19 10:03:43 crc kubenswrapper[4835]: I0319 10:03:43.011509 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn"
Mar 19 10:03:43 crc kubenswrapper[4835]: I0319 10:03:43.015375 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 19 10:03:43 crc kubenswrapper[4835]: I0319 10:03:43.015527 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 19 10:03:43 crc kubenswrapper[4835]: I0319 10:03:43.015595 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ldz2g"
Mar 19 10:03:43 crc kubenswrapper[4835]: I0319 10:03:43.015841 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 19 10:03:43 crc kubenswrapper[4835]: I0319 10:03:43.016005 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config"
Mar 19 10:03:43 crc kubenswrapper[4835]: I0319 10:03:43.016124 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config"
Mar 19 10:03:43 crc kubenswrapper[4835]: I0319 10:03:43.038215 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn"]
Mar 19 10:03:43 crc kubenswrapper[4835]: I0319 10:03:43.074025 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-tp7ss"]
Mar 19 10:03:43 crc kubenswrapper[4835]: I0319 10:03:43.094447 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-tp7ss"]
Mar 19 10:03:43 crc kubenswrapper[4835]: I0319 10:03:43.197801 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/64fbb382-1146-4c37-ad0a-596ce57ae0f5-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn\" (UID: \"64fbb382-1146-4c37-ad0a-596ce57ae0f5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn"
Mar 19 10:03:43 crc kubenswrapper[4835]: I0319 10:03:43.197965 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64fbb382-1146-4c37-ad0a-596ce57ae0f5-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn\" (UID: \"64fbb382-1146-4c37-ad0a-596ce57ae0f5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn"
Mar 19 10:03:43 crc kubenswrapper[4835]: I0319 10:03:43.198016 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/64fbb382-1146-4c37-ad0a-596ce57ae0f5-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn\" (UID: \"64fbb382-1146-4c37-ad0a-596ce57ae0f5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn"
Mar 19 10:03:43 crc kubenswrapper[4835]: I0319 10:03:43.198210 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/64fbb382-1146-4c37-ad0a-596ce57ae0f5-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn\" (UID: \"64fbb382-1146-4c37-ad0a-596ce57ae0f5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn"
Mar 19 10:03:43 crc kubenswrapper[4835]: I0319 10:03:43.198260 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgphq\" (UniqueName: \"kubernetes.io/projected/64fbb382-1146-4c37-ad0a-596ce57ae0f5-kube-api-access-rgphq\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn\" (UID: \"64fbb382-1146-4c37-ad0a-596ce57ae0f5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn"
Mar 19 10:03:43 crc kubenswrapper[4835]: I0319 10:03:43.198372 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64fbb382-1146-4c37-ad0a-596ce57ae0f5-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn\" (UID: \"64fbb382-1146-4c37-ad0a-596ce57ae0f5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn"
Mar 19 10:03:43 crc kubenswrapper[4835]: I0319 10:03:43.300007 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64fbb382-1146-4c37-ad0a-596ce57ae0f5-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn\" (UID: \"64fbb382-1146-4c37-ad0a-596ce57ae0f5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn"
Mar 19 10:03:43 crc kubenswrapper[4835]: I0319 10:03:43.300062 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/64fbb382-1146-4c37-ad0a-596ce57ae0f5-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn\" (UID: \"64fbb382-1146-4c37-ad0a-596ce57ae0f5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn"
Mar 19 10:03:43 crc kubenswrapper[4835]: I0319 10:03:43.300155 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/64fbb382-1146-4c37-ad0a-596ce57ae0f5-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn\" (UID: \"64fbb382-1146-4c37-ad0a-596ce57ae0f5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn"
Mar 19 10:03:43 crc kubenswrapper[4835]: I0319 10:03:43.300188 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgphq\" (UniqueName: \"kubernetes.io/projected/64fbb382-1146-4c37-ad0a-596ce57ae0f5-kube-api-access-rgphq\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn\" (UID: \"64fbb382-1146-4c37-ad0a-596ce57ae0f5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn"
Mar 19 10:03:43 crc kubenswrapper[4835]: I0319 10:03:43.300247 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64fbb382-1146-4c37-ad0a-596ce57ae0f5-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn\" (UID: \"64fbb382-1146-4c37-ad0a-596ce57ae0f5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn"
Mar 19 10:03:43 crc kubenswrapper[4835]: I0319 10:03:43.300324 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/64fbb382-1146-4c37-ad0a-596ce57ae0f5-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn\" (UID: \"64fbb382-1146-4c37-ad0a-596ce57ae0f5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn"
Mar 19 10:03:43 crc kubenswrapper[4835]: I0319 10:03:43.305684 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/64fbb382-1146-4c37-ad0a-596ce57ae0f5-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn\" (UID: \"64fbb382-1146-4c37-ad0a-596ce57ae0f5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn"
Mar 19 10:03:43 crc kubenswrapper[4835]: I0319 10:03:43.306153 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/64fbb382-1146-4c37-ad0a-596ce57ae0f5-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn\" (UID: \"64fbb382-1146-4c37-ad0a-596ce57ae0f5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn"
Mar 19 10:03:43 crc kubenswrapper[4835]: I0319 10:03:43.306999 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64fbb382-1146-4c37-ad0a-596ce57ae0f5-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn\" (UID: \"64fbb382-1146-4c37-ad0a-596ce57ae0f5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn"
Mar 19 10:03:43 crc kubenswrapper[4835]: I0319 10:03:43.314114 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/64fbb382-1146-4c37-ad0a-596ce57ae0f5-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn\" (UID: \"64fbb382-1146-4c37-ad0a-596ce57ae0f5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn"
Mar 19 10:03:43 crc kubenswrapper[4835]: I0319 10:03:43.316670 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64fbb382-1146-4c37-ad0a-596ce57ae0f5-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn\" (UID: \"64fbb382-1146-4c37-ad0a-596ce57ae0f5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn"
Mar 19 10:03:43 crc kubenswrapper[4835]: I0319 10:03:43.317988 4835
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgphq\" (UniqueName: \"kubernetes.io/projected/64fbb382-1146-4c37-ad0a-596ce57ae0f5-kube-api-access-rgphq\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn\" (UID: \"64fbb382-1146-4c37-ad0a-596ce57ae0f5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn" Mar 19 10:03:43 crc kubenswrapper[4835]: I0319 10:03:43.336036 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn" Mar 19 10:03:43 crc kubenswrapper[4835]: I0319 10:03:43.885309 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn"] Mar 19 10:03:43 crc kubenswrapper[4835]: I0319 10:03:43.898534 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 10:03:43 crc kubenswrapper[4835]: I0319 10:03:43.922684 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn" event={"ID":"64fbb382-1146-4c37-ad0a-596ce57ae0f5","Type":"ContainerStarted","Data":"42beaa54abdddfd143ff8645cfd4a295443f069274728ef23888eb5fca8ca1d0"} Mar 19 10:03:44 crc kubenswrapper[4835]: I0319 10:03:44.417815 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d23476b0-3b28-4a6f-84ec-4fcd3b3463cb" path="/var/lib/kubelet/pods/d23476b0-3b28-4a6f-84ec-4fcd3b3463cb/volumes" Mar 19 10:03:44 crc kubenswrapper[4835]: I0319 10:03:44.935091 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn" event={"ID":"64fbb382-1146-4c37-ad0a-596ce57ae0f5","Type":"ContainerStarted","Data":"c1d89963cf5ffebec6020ba6037028ebc961eee516e065205f9fd5364e6e52b9"} Mar 19 10:03:44 crc kubenswrapper[4835]: I0319 10:03:44.955653 4835 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn" podStartSLOduration=2.570133963 podStartE2EDuration="2.95563705s" podCreationTimestamp="2026-03-19 10:03:42 +0000 UTC" firstStartedPulling="2026-03-19 10:03:43.898300897 +0000 UTC m=+2478.746899484" lastFinishedPulling="2026-03-19 10:03:44.283803984 +0000 UTC m=+2479.132402571" observedRunningTime="2026-03-19 10:03:44.951325441 +0000 UTC m=+2479.799924028" watchObservedRunningTime="2026-03-19 10:03:44.95563705 +0000 UTC m=+2479.804235637" Mar 19 10:03:45 crc kubenswrapper[4835]: I0319 10:03:45.401648 4835 scope.go:117] "RemoveContainer" containerID="3c5b8661f3f050fbf34728c9e88ed710ff7c292af499338ff01295afa288217f" Mar 19 10:03:45 crc kubenswrapper[4835]: E0319 10:03:45.402035 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:03:56 crc kubenswrapper[4835]: I0319 10:03:56.352684 4835 scope.go:117] "RemoveContainer" containerID="fb85c8e07fa0ec0e6ba5c6374a58cef371ebadc9572cfbc79add1e2cb829d836" Mar 19 10:03:56 crc kubenswrapper[4835]: I0319 10:03:56.401336 4835 scope.go:117] "RemoveContainer" containerID="67afa5d789dfed720e9368c56b5bd57a4ebd6e51db53c397188a4b44a0ee8b5b" Mar 19 10:03:56 crc kubenswrapper[4835]: I0319 10:03:56.412985 4835 scope.go:117] "RemoveContainer" containerID="3c5b8661f3f050fbf34728c9e88ed710ff7c292af499338ff01295afa288217f" Mar 19 10:03:56 crc kubenswrapper[4835]: E0319 10:03:56.414305 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:04:00 crc kubenswrapper[4835]: I0319 10:04:00.133034 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565244-dbl8x"] Mar 19 10:04:00 crc kubenswrapper[4835]: I0319 10:04:00.135611 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565244-dbl8x" Mar 19 10:04:00 crc kubenswrapper[4835]: I0319 10:04:00.139260 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 10:04:00 crc kubenswrapper[4835]: I0319 10:04:00.139519 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:04:00 crc kubenswrapper[4835]: I0319 10:04:00.140659 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:04:00 crc kubenswrapper[4835]: I0319 10:04:00.149893 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565244-dbl8x"] Mar 19 10:04:00 crc kubenswrapper[4835]: I0319 10:04:00.225412 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tstff\" (UniqueName: \"kubernetes.io/projected/c4ef0143-6ad8-4c19-80a9-6590e5aa8943-kube-api-access-tstff\") pod \"auto-csr-approver-29565244-dbl8x\" (UID: \"c4ef0143-6ad8-4c19-80a9-6590e5aa8943\") " pod="openshift-infra/auto-csr-approver-29565244-dbl8x" Mar 19 10:04:00 crc kubenswrapper[4835]: I0319 10:04:00.328314 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tstff\" (UniqueName: 
\"kubernetes.io/projected/c4ef0143-6ad8-4c19-80a9-6590e5aa8943-kube-api-access-tstff\") pod \"auto-csr-approver-29565244-dbl8x\" (UID: \"c4ef0143-6ad8-4c19-80a9-6590e5aa8943\") " pod="openshift-infra/auto-csr-approver-29565244-dbl8x" Mar 19 10:04:00 crc kubenswrapper[4835]: I0319 10:04:00.351545 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tstff\" (UniqueName: \"kubernetes.io/projected/c4ef0143-6ad8-4c19-80a9-6590e5aa8943-kube-api-access-tstff\") pod \"auto-csr-approver-29565244-dbl8x\" (UID: \"c4ef0143-6ad8-4c19-80a9-6590e5aa8943\") " pod="openshift-infra/auto-csr-approver-29565244-dbl8x" Mar 19 10:04:00 crc kubenswrapper[4835]: I0319 10:04:00.539401 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565244-dbl8x" Mar 19 10:04:01 crc kubenswrapper[4835]: I0319 10:04:01.008579 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565244-dbl8x"] Mar 19 10:04:01 crc kubenswrapper[4835]: W0319 10:04:01.011695 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4ef0143_6ad8_4c19_80a9_6590e5aa8943.slice/crio-0f3d1a4d97ed4bbd7e3e7f3864d31f71d7852a418a5b01911da360035c3858b8 WatchSource:0}: Error finding container 0f3d1a4d97ed4bbd7e3e7f3864d31f71d7852a418a5b01911da360035c3858b8: Status 404 returned error can't find the container with id 0f3d1a4d97ed4bbd7e3e7f3864d31f71d7852a418a5b01911da360035c3858b8 Mar 19 10:04:01 crc kubenswrapper[4835]: I0319 10:04:01.148489 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565244-dbl8x" event={"ID":"c4ef0143-6ad8-4c19-80a9-6590e5aa8943","Type":"ContainerStarted","Data":"0f3d1a4d97ed4bbd7e3e7f3864d31f71d7852a418a5b01911da360035c3858b8"} Mar 19 10:04:02 crc kubenswrapper[4835]: I0319 10:04:02.159539 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29565244-dbl8x" event={"ID":"c4ef0143-6ad8-4c19-80a9-6590e5aa8943","Type":"ContainerStarted","Data":"9c62d5d999c5b3fe94e59b31a99f6fed50cf1478bbda4cd28024838af8868935"} Mar 19 10:04:02 crc kubenswrapper[4835]: I0319 10:04:02.181837 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565244-dbl8x" podStartSLOduration=1.354022653 podStartE2EDuration="2.181814364s" podCreationTimestamp="2026-03-19 10:04:00 +0000 UTC" firstStartedPulling="2026-03-19 10:04:01.014345902 +0000 UTC m=+2495.862944489" lastFinishedPulling="2026-03-19 10:04:01.842137613 +0000 UTC m=+2496.690736200" observedRunningTime="2026-03-19 10:04:02.172233022 +0000 UTC m=+2497.020831609" watchObservedRunningTime="2026-03-19 10:04:02.181814364 +0000 UTC m=+2497.030412951" Mar 19 10:04:03 crc kubenswrapper[4835]: I0319 10:04:03.177493 4835 generic.go:334] "Generic (PLEG): container finished" podID="c4ef0143-6ad8-4c19-80a9-6590e5aa8943" containerID="9c62d5d999c5b3fe94e59b31a99f6fed50cf1478bbda4cd28024838af8868935" exitCode=0 Mar 19 10:04:03 crc kubenswrapper[4835]: I0319 10:04:03.177564 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565244-dbl8x" event={"ID":"c4ef0143-6ad8-4c19-80a9-6590e5aa8943","Type":"ContainerDied","Data":"9c62d5d999c5b3fe94e59b31a99f6fed50cf1478bbda4cd28024838af8868935"} Mar 19 10:04:04 crc kubenswrapper[4835]: I0319 10:04:04.692931 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565244-dbl8x" Mar 19 10:04:04 crc kubenswrapper[4835]: I0319 10:04:04.856441 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tstff\" (UniqueName: \"kubernetes.io/projected/c4ef0143-6ad8-4c19-80a9-6590e5aa8943-kube-api-access-tstff\") pod \"c4ef0143-6ad8-4c19-80a9-6590e5aa8943\" (UID: \"c4ef0143-6ad8-4c19-80a9-6590e5aa8943\") " Mar 19 10:04:04 crc kubenswrapper[4835]: I0319 10:04:04.864025 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4ef0143-6ad8-4c19-80a9-6590e5aa8943-kube-api-access-tstff" (OuterVolumeSpecName: "kube-api-access-tstff") pod "c4ef0143-6ad8-4c19-80a9-6590e5aa8943" (UID: "c4ef0143-6ad8-4c19-80a9-6590e5aa8943"). InnerVolumeSpecName "kube-api-access-tstff". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:04:04 crc kubenswrapper[4835]: I0319 10:04:04.960505 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tstff\" (UniqueName: \"kubernetes.io/projected/c4ef0143-6ad8-4c19-80a9-6590e5aa8943-kube-api-access-tstff\") on node \"crc\" DevicePath \"\"" Mar 19 10:04:05 crc kubenswrapper[4835]: I0319 10:04:05.205322 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565244-dbl8x" event={"ID":"c4ef0143-6ad8-4c19-80a9-6590e5aa8943","Type":"ContainerDied","Data":"0f3d1a4d97ed4bbd7e3e7f3864d31f71d7852a418a5b01911da360035c3858b8"} Mar 19 10:04:05 crc kubenswrapper[4835]: I0319 10:04:05.205360 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f3d1a4d97ed4bbd7e3e7f3864d31f71d7852a418a5b01911da360035c3858b8" Mar 19 10:04:05 crc kubenswrapper[4835]: I0319 10:04:05.205411 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565244-dbl8x" Mar 19 10:04:05 crc kubenswrapper[4835]: I0319 10:04:05.267224 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565238-5w6jr"] Mar 19 10:04:05 crc kubenswrapper[4835]: I0319 10:04:05.278564 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565238-5w6jr"] Mar 19 10:04:06 crc kubenswrapper[4835]: I0319 10:04:06.416664 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ebb674c-a705-418a-92ea-771ab5d01a39" path="/var/lib/kubelet/pods/6ebb674c-a705-418a-92ea-771ab5d01a39/volumes" Mar 19 10:04:07 crc kubenswrapper[4835]: I0319 10:04:07.402399 4835 scope.go:117] "RemoveContainer" containerID="3c5b8661f3f050fbf34728c9e88ed710ff7c292af499338ff01295afa288217f" Mar 19 10:04:07 crc kubenswrapper[4835]: E0319 10:04:07.403072 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:04:20 crc kubenswrapper[4835]: I0319 10:04:20.402525 4835 scope.go:117] "RemoveContainer" containerID="3c5b8661f3f050fbf34728c9e88ed710ff7c292af499338ff01295afa288217f" Mar 19 10:04:20 crc kubenswrapper[4835]: E0319 10:04:20.403492 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" 
podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:04:28 crc kubenswrapper[4835]: I0319 10:04:28.462637 4835 generic.go:334] "Generic (PLEG): container finished" podID="64fbb382-1146-4c37-ad0a-596ce57ae0f5" containerID="c1d89963cf5ffebec6020ba6037028ebc961eee516e065205f9fd5364e6e52b9" exitCode=0 Mar 19 10:04:28 crc kubenswrapper[4835]: I0319 10:04:28.462895 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn" event={"ID":"64fbb382-1146-4c37-ad0a-596ce57ae0f5","Type":"ContainerDied","Data":"c1d89963cf5ffebec6020ba6037028ebc961eee516e065205f9fd5364e6e52b9"} Mar 19 10:04:29 crc kubenswrapper[4835]: I0319 10:04:29.958157 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn" Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.131407 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgphq\" (UniqueName: \"kubernetes.io/projected/64fbb382-1146-4c37-ad0a-596ce57ae0f5-kube-api-access-rgphq\") pod \"64fbb382-1146-4c37-ad0a-596ce57ae0f5\" (UID: \"64fbb382-1146-4c37-ad0a-596ce57ae0f5\") " Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.132938 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64fbb382-1146-4c37-ad0a-596ce57ae0f5-neutron-metadata-combined-ca-bundle\") pod \"64fbb382-1146-4c37-ad0a-596ce57ae0f5\" (UID: \"64fbb382-1146-4c37-ad0a-596ce57ae0f5\") " Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.132980 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/64fbb382-1146-4c37-ad0a-596ce57ae0f5-neutron-ovn-metadata-agent-neutron-config-0\") pod \"64fbb382-1146-4c37-ad0a-596ce57ae0f5\" (UID: 
\"64fbb382-1146-4c37-ad0a-596ce57ae0f5\") " Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.133036 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64fbb382-1146-4c37-ad0a-596ce57ae0f5-inventory\") pod \"64fbb382-1146-4c37-ad0a-596ce57ae0f5\" (UID: \"64fbb382-1146-4c37-ad0a-596ce57ae0f5\") " Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.133146 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/64fbb382-1146-4c37-ad0a-596ce57ae0f5-nova-metadata-neutron-config-0\") pod \"64fbb382-1146-4c37-ad0a-596ce57ae0f5\" (UID: \"64fbb382-1146-4c37-ad0a-596ce57ae0f5\") " Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.133189 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/64fbb382-1146-4c37-ad0a-596ce57ae0f5-ssh-key-openstack-edpm-ipam\") pod \"64fbb382-1146-4c37-ad0a-596ce57ae0f5\" (UID: \"64fbb382-1146-4c37-ad0a-596ce57ae0f5\") " Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.140358 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64fbb382-1146-4c37-ad0a-596ce57ae0f5-kube-api-access-rgphq" (OuterVolumeSpecName: "kube-api-access-rgphq") pod "64fbb382-1146-4c37-ad0a-596ce57ae0f5" (UID: "64fbb382-1146-4c37-ad0a-596ce57ae0f5"). InnerVolumeSpecName "kube-api-access-rgphq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.140473 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64fbb382-1146-4c37-ad0a-596ce57ae0f5-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "64fbb382-1146-4c37-ad0a-596ce57ae0f5" (UID: "64fbb382-1146-4c37-ad0a-596ce57ae0f5"). 
InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.172070 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64fbb382-1146-4c37-ad0a-596ce57ae0f5-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "64fbb382-1146-4c37-ad0a-596ce57ae0f5" (UID: "64fbb382-1146-4c37-ad0a-596ce57ae0f5"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.172464 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64fbb382-1146-4c37-ad0a-596ce57ae0f5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "64fbb382-1146-4c37-ad0a-596ce57ae0f5" (UID: "64fbb382-1146-4c37-ad0a-596ce57ae0f5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.174416 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64fbb382-1146-4c37-ad0a-596ce57ae0f5-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "64fbb382-1146-4c37-ad0a-596ce57ae0f5" (UID: "64fbb382-1146-4c37-ad0a-596ce57ae0f5"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.175665 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64fbb382-1146-4c37-ad0a-596ce57ae0f5-inventory" (OuterVolumeSpecName: "inventory") pod "64fbb382-1146-4c37-ad0a-596ce57ae0f5" (UID: "64fbb382-1146-4c37-ad0a-596ce57ae0f5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.237179 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgphq\" (UniqueName: \"kubernetes.io/projected/64fbb382-1146-4c37-ad0a-596ce57ae0f5-kube-api-access-rgphq\") on node \"crc\" DevicePath \"\"" Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.237234 4835 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64fbb382-1146-4c37-ad0a-596ce57ae0f5-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.237256 4835 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/64fbb382-1146-4c37-ad0a-596ce57ae0f5-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.237280 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64fbb382-1146-4c37-ad0a-596ce57ae0f5-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.237299 4835 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/64fbb382-1146-4c37-ad0a-596ce57ae0f5-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.237320 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/64fbb382-1146-4c37-ad0a-596ce57ae0f5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.487662 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn" event={"ID":"64fbb382-1146-4c37-ad0a-596ce57ae0f5","Type":"ContainerDied","Data":"42beaa54abdddfd143ff8645cfd4a295443f069274728ef23888eb5fca8ca1d0"} Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.487714 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42beaa54abdddfd143ff8645cfd4a295443f069274728ef23888eb5fca8ca1d0" Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.487754 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn" Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.638888 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx"] Mar 19 10:04:30 crc kubenswrapper[4835]: E0319 10:04:30.639570 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64fbb382-1146-4c37-ad0a-596ce57ae0f5" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.639599 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="64fbb382-1146-4c37-ad0a-596ce57ae0f5" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 19 10:04:30 crc kubenswrapper[4835]: E0319 10:04:30.639623 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ef0143-6ad8-4c19-80a9-6590e5aa8943" containerName="oc" Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.639632 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ef0143-6ad8-4c19-80a9-6590e5aa8943" containerName="oc" Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.639964 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="64fbb382-1146-4c37-ad0a-596ce57ae0f5" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.640004 4835 
memory_manager.go:354] "RemoveStaleState removing state" podUID="c4ef0143-6ad8-4c19-80a9-6590e5aa8943" containerName="oc" Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.640933 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx" Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.644769 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.644999 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ldz2g" Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.645168 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.645372 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.646534 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.648388 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/fbf6d84b-d129-481a-80a4-b38c4eb9051b-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx\" (UID: \"fbf6d84b-d129-481a-80a4-b38c4eb9051b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx" Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.648446 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fbf6d84b-d129-481a-80a4-b38c4eb9051b-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx\" (UID: 
\"fbf6d84b-d129-481a-80a4-b38c4eb9051b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx"
Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.648712 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fbf6d84b-d129-481a-80a4-b38c4eb9051b-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx\" (UID: \"fbf6d84b-d129-481a-80a4-b38c4eb9051b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx"
Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.648760 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbf6d84b-d129-481a-80a4-b38c4eb9051b-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx\" (UID: \"fbf6d84b-d129-481a-80a4-b38c4eb9051b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx"
Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.648806 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn64j\" (UniqueName: \"kubernetes.io/projected/fbf6d84b-d129-481a-80a4-b38c4eb9051b-kube-api-access-wn64j\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx\" (UID: \"fbf6d84b-d129-481a-80a4-b38c4eb9051b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx"
Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.675253 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx"]
Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.750062 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/fbf6d84b-d129-481a-80a4-b38c4eb9051b-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx\" (UID: \"fbf6d84b-d129-481a-80a4-b38c4eb9051b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx"
Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.750121 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fbf6d84b-d129-481a-80a4-b38c4eb9051b-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx\" (UID: \"fbf6d84b-d129-481a-80a4-b38c4eb9051b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx"
Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.750177 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fbf6d84b-d129-481a-80a4-b38c4eb9051b-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx\" (UID: \"fbf6d84b-d129-481a-80a4-b38c4eb9051b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx"
Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.750203 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbf6d84b-d129-481a-80a4-b38c4eb9051b-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx\" (UID: \"fbf6d84b-d129-481a-80a4-b38c4eb9051b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx"
Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.750246 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn64j\" (UniqueName: \"kubernetes.io/projected/fbf6d84b-d129-481a-80a4-b38c4eb9051b-kube-api-access-wn64j\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx\" (UID: \"fbf6d84b-d129-481a-80a4-b38c4eb9051b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx"
Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.753986 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fbf6d84b-d129-481a-80a4-b38c4eb9051b-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx\" (UID: \"fbf6d84b-d129-481a-80a4-b38c4eb9051b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx"
Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.754568 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/fbf6d84b-d129-481a-80a4-b38c4eb9051b-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx\" (UID: \"fbf6d84b-d129-481a-80a4-b38c4eb9051b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx"
Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.755659 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbf6d84b-d129-481a-80a4-b38c4eb9051b-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx\" (UID: \"fbf6d84b-d129-481a-80a4-b38c4eb9051b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx"
Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.759448 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fbf6d84b-d129-481a-80a4-b38c4eb9051b-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx\" (UID: \"fbf6d84b-d129-481a-80a4-b38c4eb9051b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx"
Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.772311 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn64j\" (UniqueName: \"kubernetes.io/projected/fbf6d84b-d129-481a-80a4-b38c4eb9051b-kube-api-access-wn64j\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx\" (UID: \"fbf6d84b-d129-481a-80a4-b38c4eb9051b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx"
Mar 19 10:04:30 crc kubenswrapper[4835]: I0319 10:04:30.967653 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx"
Mar 19 10:04:31 crc kubenswrapper[4835]: I0319 10:04:31.552798 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx"]
Mar 19 10:04:32 crc kubenswrapper[4835]: I0319 10:04:32.510071 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx" event={"ID":"fbf6d84b-d129-481a-80a4-b38c4eb9051b","Type":"ContainerStarted","Data":"28d6bbc6e61d93d64e16a42dd191aa9d2c876994208c0e1dde541c77f18fdd35"}
Mar 19 10:04:32 crc kubenswrapper[4835]: I0319 10:04:32.510518 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx" event={"ID":"fbf6d84b-d129-481a-80a4-b38c4eb9051b","Type":"ContainerStarted","Data":"cbb271c8ee8e10240cd9c94ba04b78e8b7467d31dcfb4efc6cff710300ae74c5"}
Mar 19 10:04:32 crc kubenswrapper[4835]: I0319 10:04:32.542246 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx" podStartSLOduration=2.015761466 podStartE2EDuration="2.542214727s" podCreationTimestamp="2026-03-19 10:04:30 +0000 UTC" firstStartedPulling="2026-03-19 10:04:31.561574016 +0000 UTC m=+2526.410172603" lastFinishedPulling="2026-03-19 10:04:32.088027267 +0000 UTC m=+2526.936625864" observedRunningTime="2026-03-19 10:04:32.530276888 +0000 UTC m=+2527.378875475" watchObservedRunningTime="2026-03-19 10:04:32.542214727 +0000 UTC m=+2527.390813314"
Mar 19 10:04:33 crc kubenswrapper[4835]: I0319 10:04:33.402108 4835 scope.go:117] "RemoveContainer" containerID="3c5b8661f3f050fbf34728c9e88ed710ff7c292af499338ff01295afa288217f"
Mar 19 10:04:33 crc kubenswrapper[4835]: E0319 10:04:33.402647 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353"
Mar 19 10:04:47 crc kubenswrapper[4835]: I0319 10:04:47.402951 4835 scope.go:117] "RemoveContainer" containerID="3c5b8661f3f050fbf34728c9e88ed710ff7c292af499338ff01295afa288217f"
Mar 19 10:04:47 crc kubenswrapper[4835]: E0319 10:04:47.403874 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353"
Mar 19 10:04:56 crc kubenswrapper[4835]: I0319 10:04:56.527992 4835 scope.go:117] "RemoveContainer" containerID="f6b75343ca481c064668c749bb81ab1be942e8b7b81f8ead4f5c6b377818aba7"
Mar 19 10:05:01 crc kubenswrapper[4835]: I0319 10:05:01.402189 4835 scope.go:117] "RemoveContainer" containerID="3c5b8661f3f050fbf34728c9e88ed710ff7c292af499338ff01295afa288217f"
Mar 19 10:05:01 crc kubenswrapper[4835]: E0319 10:05:01.403028 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353"
Mar 19 10:05:12 crc kubenswrapper[4835]: I0319 10:05:12.401779 4835 scope.go:117] "RemoveContainer" containerID="3c5b8661f3f050fbf34728c9e88ed710ff7c292af499338ff01295afa288217f"
Mar 19 10:05:12 crc kubenswrapper[4835]: E0319 10:05:12.403672 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353"
Mar 19 10:05:24 crc kubenswrapper[4835]: I0319 10:05:24.402926 4835 scope.go:117] "RemoveContainer" containerID="3c5b8661f3f050fbf34728c9e88ed710ff7c292af499338ff01295afa288217f"
Mar 19 10:05:24 crc kubenswrapper[4835]: E0319 10:05:24.405166 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353"
Mar 19 10:05:38 crc kubenswrapper[4835]: I0319 10:05:38.402767 4835 scope.go:117] "RemoveContainer" containerID="3c5b8661f3f050fbf34728c9e88ed710ff7c292af499338ff01295afa288217f"
Mar 19 10:05:39 crc kubenswrapper[4835]: I0319 10:05:39.251845 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerStarted","Data":"79a0595f92818bd7cbb6ee721d4dfb55e83bf1ecd1c54ba0b51c369cda200970"}
Mar 19 10:06:00 crc kubenswrapper[4835]: I0319 10:06:00.145766 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565246-sw968"]
Mar 19 10:06:00 crc kubenswrapper[4835]: I0319 10:06:00.148026 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565246-sw968"
Mar 19 10:06:00 crc kubenswrapper[4835]: I0319 10:06:00.149970 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw"
Mar 19 10:06:00 crc kubenswrapper[4835]: I0319 10:06:00.150981 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 10:06:00 crc kubenswrapper[4835]: I0319 10:06:00.155885 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 10:06:00 crc kubenswrapper[4835]: I0319 10:06:00.175382 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565246-sw968"]
Mar 19 10:06:00 crc kubenswrapper[4835]: I0319 10:06:00.234319 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-295b8\" (UniqueName: \"kubernetes.io/projected/1ba8c157-3eaa-4cc4-8e66-9cdab565c653-kube-api-access-295b8\") pod \"auto-csr-approver-29565246-sw968\" (UID: \"1ba8c157-3eaa-4cc4-8e66-9cdab565c653\") " pod="openshift-infra/auto-csr-approver-29565246-sw968"
Mar 19 10:06:00 crc kubenswrapper[4835]: I0319 10:06:00.336771 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-295b8\" (UniqueName: \"kubernetes.io/projected/1ba8c157-3eaa-4cc4-8e66-9cdab565c653-kube-api-access-295b8\") pod \"auto-csr-approver-29565246-sw968\" (UID: \"1ba8c157-3eaa-4cc4-8e66-9cdab565c653\") " pod="openshift-infra/auto-csr-approver-29565246-sw968"
Mar 19 10:06:00 crc kubenswrapper[4835]: I0319 10:06:00.361655 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-295b8\" (UniqueName: \"kubernetes.io/projected/1ba8c157-3eaa-4cc4-8e66-9cdab565c653-kube-api-access-295b8\") pod \"auto-csr-approver-29565246-sw968\" (UID: \"1ba8c157-3eaa-4cc4-8e66-9cdab565c653\") " pod="openshift-infra/auto-csr-approver-29565246-sw968"
Mar 19 10:06:00 crc kubenswrapper[4835]: I0319 10:06:00.471424 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565246-sw968"
Mar 19 10:06:00 crc kubenswrapper[4835]: I0319 10:06:00.930281 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565246-sw968"]
Mar 19 10:06:01 crc kubenswrapper[4835]: I0319 10:06:01.496237 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565246-sw968" event={"ID":"1ba8c157-3eaa-4cc4-8e66-9cdab565c653","Type":"ContainerStarted","Data":"6d3060d6d1ac21ed3d4acdb29227069eedf2f3a3e6f7d2278539d6028a3ef62f"}
Mar 19 10:06:02 crc kubenswrapper[4835]: I0319 10:06:02.509734 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565246-sw968" event={"ID":"1ba8c157-3eaa-4cc4-8e66-9cdab565c653","Type":"ContainerStarted","Data":"29ba2cc3838173af5b229dd30378b14bcaaea6655f7a112b2dedf56ab8dc3711"}
Mar 19 10:06:02 crc kubenswrapper[4835]: I0319 10:06:02.529379 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565246-sw968" podStartSLOduration=1.615945084 podStartE2EDuration="2.529359532s" podCreationTimestamp="2026-03-19 10:06:00 +0000 UTC" firstStartedPulling="2026-03-19 10:06:00.927076471 +0000 UTC m=+2615.775675058" lastFinishedPulling="2026-03-19 10:06:01.840490919 +0000 UTC m=+2616.689089506" observedRunningTime="2026-03-19 10:06:02.522114954 +0000 UTC m=+2617.370713541" watchObservedRunningTime="2026-03-19 10:06:02.529359532 +0000 UTC m=+2617.377958119"
Mar 19 10:06:03 crc kubenswrapper[4835]: I0319 10:06:03.521955 4835 generic.go:334] "Generic (PLEG): container finished" podID="1ba8c157-3eaa-4cc4-8e66-9cdab565c653" containerID="29ba2cc3838173af5b229dd30378b14bcaaea6655f7a112b2dedf56ab8dc3711" exitCode=0
Mar 19 10:06:03 crc kubenswrapper[4835]: I0319 10:06:03.522016 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565246-sw968" event={"ID":"1ba8c157-3eaa-4cc4-8e66-9cdab565c653","Type":"ContainerDied","Data":"29ba2cc3838173af5b229dd30378b14bcaaea6655f7a112b2dedf56ab8dc3711"}
Mar 19 10:06:04 crc kubenswrapper[4835]: I0319 10:06:04.962912 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565246-sw968"
Mar 19 10:06:05 crc kubenswrapper[4835]: I0319 10:06:05.061319 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-295b8\" (UniqueName: \"kubernetes.io/projected/1ba8c157-3eaa-4cc4-8e66-9cdab565c653-kube-api-access-295b8\") pod \"1ba8c157-3eaa-4cc4-8e66-9cdab565c653\" (UID: \"1ba8c157-3eaa-4cc4-8e66-9cdab565c653\") "
Mar 19 10:06:05 crc kubenswrapper[4835]: I0319 10:06:05.085319 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ba8c157-3eaa-4cc4-8e66-9cdab565c653-kube-api-access-295b8" (OuterVolumeSpecName: "kube-api-access-295b8") pod "1ba8c157-3eaa-4cc4-8e66-9cdab565c653" (UID: "1ba8c157-3eaa-4cc4-8e66-9cdab565c653"). InnerVolumeSpecName "kube-api-access-295b8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 10:06:05 crc kubenswrapper[4835]: I0319 10:06:05.164814 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-295b8\" (UniqueName: \"kubernetes.io/projected/1ba8c157-3eaa-4cc4-8e66-9cdab565c653-kube-api-access-295b8\") on node \"crc\" DevicePath \"\""
Mar 19 10:06:05 crc kubenswrapper[4835]: I0319 10:06:05.543669 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565246-sw968" event={"ID":"1ba8c157-3eaa-4cc4-8e66-9cdab565c653","Type":"ContainerDied","Data":"6d3060d6d1ac21ed3d4acdb29227069eedf2f3a3e6f7d2278539d6028a3ef62f"}
Mar 19 10:06:05 crc kubenswrapper[4835]: I0319 10:06:05.543717 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d3060d6d1ac21ed3d4acdb29227069eedf2f3a3e6f7d2278539d6028a3ef62f"
Mar 19 10:06:05 crc kubenswrapper[4835]: I0319 10:06:05.543721 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565246-sw968"
Mar 19 10:06:05 crc kubenswrapper[4835]: I0319 10:06:05.611533 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565240-rvtrz"]
Mar 19 10:06:05 crc kubenswrapper[4835]: I0319 10:06:05.622407 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565240-rvtrz"]
Mar 19 10:06:06 crc kubenswrapper[4835]: I0319 10:06:06.418179 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49cc6c72-e73e-47a0-9f4d-f66f7a47bf48" path="/var/lib/kubelet/pods/49cc6c72-e73e-47a0-9f4d-f66f7a47bf48/volumes"
Mar 19 10:06:56 crc kubenswrapper[4835]: I0319 10:06:56.629921 4835 scope.go:117] "RemoveContainer" containerID="f5ddcb8280af6869e8aab956ca92dccdfbe1e9ecd1410e663126b231e0bd8dcd"
Mar 19 10:08:00 crc kubenswrapper[4835]: I0319 10:08:00.150674 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565248-rgkz9"]
Mar 19 10:08:00 crc kubenswrapper[4835]: E0319 10:08:00.153203 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ba8c157-3eaa-4cc4-8e66-9cdab565c653" containerName="oc"
Mar 19 10:08:00 crc kubenswrapper[4835]: I0319 10:08:00.153296 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ba8c157-3eaa-4cc4-8e66-9cdab565c653" containerName="oc"
Mar 19 10:08:00 crc kubenswrapper[4835]: I0319 10:08:00.153610 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ba8c157-3eaa-4cc4-8e66-9cdab565c653" containerName="oc"
Mar 19 10:08:00 crc kubenswrapper[4835]: I0319 10:08:00.154490 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565248-rgkz9"
Mar 19 10:08:00 crc kubenswrapper[4835]: I0319 10:08:00.156902 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 10:08:00 crc kubenswrapper[4835]: I0319 10:08:00.157038 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw"
Mar 19 10:08:00 crc kubenswrapper[4835]: I0319 10:08:00.157136 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 10:08:00 crc kubenswrapper[4835]: I0319 10:08:00.188231 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565248-rgkz9"]
Mar 19 10:08:00 crc kubenswrapper[4835]: I0319 10:08:00.282508 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ttk7\" (UniqueName: \"kubernetes.io/projected/ec4c69ae-8beb-4db1-b676-1dd6701aa6d2-kube-api-access-7ttk7\") pod \"auto-csr-approver-29565248-rgkz9\" (UID: \"ec4c69ae-8beb-4db1-b676-1dd6701aa6d2\") " pod="openshift-infra/auto-csr-approver-29565248-rgkz9"
Mar 19 10:08:00 crc kubenswrapper[4835]: I0319 10:08:00.384713 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ttk7\" (UniqueName: \"kubernetes.io/projected/ec4c69ae-8beb-4db1-b676-1dd6701aa6d2-kube-api-access-7ttk7\") pod \"auto-csr-approver-29565248-rgkz9\" (UID: \"ec4c69ae-8beb-4db1-b676-1dd6701aa6d2\") " pod="openshift-infra/auto-csr-approver-29565248-rgkz9"
Mar 19 10:08:00 crc kubenswrapper[4835]: I0319 10:08:00.409237 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ttk7\" (UniqueName: \"kubernetes.io/projected/ec4c69ae-8beb-4db1-b676-1dd6701aa6d2-kube-api-access-7ttk7\") pod \"auto-csr-approver-29565248-rgkz9\" (UID: \"ec4c69ae-8beb-4db1-b676-1dd6701aa6d2\") " pod="openshift-infra/auto-csr-approver-29565248-rgkz9"
Mar 19 10:08:00 crc kubenswrapper[4835]: I0319 10:08:00.473267 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565248-rgkz9"
Mar 19 10:08:00 crc kubenswrapper[4835]: I0319 10:08:00.923814 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565248-rgkz9"]
Mar 19 10:08:00 crc kubenswrapper[4835]: W0319 10:08:00.925327 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec4c69ae_8beb_4db1_b676_1dd6701aa6d2.slice/crio-1d0dccfa3c20520dd8efd7736ee084aa32a8bbbbcbc9211dbfb0d5833ba6ac04 WatchSource:0}: Error finding container 1d0dccfa3c20520dd8efd7736ee084aa32a8bbbbcbc9211dbfb0d5833ba6ac04: Status 404 returned error can't find the container with id 1d0dccfa3c20520dd8efd7736ee084aa32a8bbbbcbc9211dbfb0d5833ba6ac04
Mar 19 10:08:01 crc kubenswrapper[4835]: I0319 10:08:01.678226 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565248-rgkz9" event={"ID":"ec4c69ae-8beb-4db1-b676-1dd6701aa6d2","Type":"ContainerStarted","Data":"1d0dccfa3c20520dd8efd7736ee084aa32a8bbbbcbc9211dbfb0d5833ba6ac04"}
Mar 19 10:08:01 crc kubenswrapper[4835]: I0319 10:08:01.709084 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2cwcm"]
Mar 19 10:08:01 crc kubenswrapper[4835]: I0319 10:08:01.713914 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2cwcm"
Mar 19 10:08:01 crc kubenswrapper[4835]: I0319 10:08:01.751285 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2cwcm"]
Mar 19 10:08:01 crc kubenswrapper[4835]: I0319 10:08:01.819229 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10133653-6eb3-4fa3-ac17-e9144421775f-utilities\") pod \"certified-operators-2cwcm\" (UID: \"10133653-6eb3-4fa3-ac17-e9144421775f\") " pod="openshift-marketplace/certified-operators-2cwcm"
Mar 19 10:08:01 crc kubenswrapper[4835]: I0319 10:08:01.819426 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10133653-6eb3-4fa3-ac17-e9144421775f-catalog-content\") pod \"certified-operators-2cwcm\" (UID: \"10133653-6eb3-4fa3-ac17-e9144421775f\") " pod="openshift-marketplace/certified-operators-2cwcm"
Mar 19 10:08:01 crc kubenswrapper[4835]: I0319 10:08:01.819480 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh8h9\" (UniqueName: \"kubernetes.io/projected/10133653-6eb3-4fa3-ac17-e9144421775f-kube-api-access-qh8h9\") pod \"certified-operators-2cwcm\" (UID: \"10133653-6eb3-4fa3-ac17-e9144421775f\") " pod="openshift-marketplace/certified-operators-2cwcm"
Mar 19 10:08:01 crc kubenswrapper[4835]: I0319 10:08:01.921497 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10133653-6eb3-4fa3-ac17-e9144421775f-catalog-content\") pod \"certified-operators-2cwcm\" (UID: \"10133653-6eb3-4fa3-ac17-e9144421775f\") " pod="openshift-marketplace/certified-operators-2cwcm"
Mar 19 10:08:01 crc kubenswrapper[4835]: I0319 10:08:01.921590 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh8h9\" (UniqueName: \"kubernetes.io/projected/10133653-6eb3-4fa3-ac17-e9144421775f-kube-api-access-qh8h9\") pod \"certified-operators-2cwcm\" (UID: \"10133653-6eb3-4fa3-ac17-e9144421775f\") " pod="openshift-marketplace/certified-operators-2cwcm"
Mar 19 10:08:01 crc kubenswrapper[4835]: I0319 10:08:01.921701 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10133653-6eb3-4fa3-ac17-e9144421775f-utilities\") pod \"certified-operators-2cwcm\" (UID: \"10133653-6eb3-4fa3-ac17-e9144421775f\") " pod="openshift-marketplace/certified-operators-2cwcm"
Mar 19 10:08:01 crc kubenswrapper[4835]: I0319 10:08:01.922051 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10133653-6eb3-4fa3-ac17-e9144421775f-catalog-content\") pod \"certified-operators-2cwcm\" (UID: \"10133653-6eb3-4fa3-ac17-e9144421775f\") " pod="openshift-marketplace/certified-operators-2cwcm"
Mar 19 10:08:01 crc kubenswrapper[4835]: I0319 10:08:01.922245 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10133653-6eb3-4fa3-ac17-e9144421775f-utilities\") pod \"certified-operators-2cwcm\" (UID: \"10133653-6eb3-4fa3-ac17-e9144421775f\") " pod="openshift-marketplace/certified-operators-2cwcm"
Mar 19 10:08:01 crc kubenswrapper[4835]: I0319 10:08:01.944831 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh8h9\" (UniqueName: \"kubernetes.io/projected/10133653-6eb3-4fa3-ac17-e9144421775f-kube-api-access-qh8h9\") pod \"certified-operators-2cwcm\" (UID: \"10133653-6eb3-4fa3-ac17-e9144421775f\") " pod="openshift-marketplace/certified-operators-2cwcm"
Mar 19 10:08:02 crc kubenswrapper[4835]: I0319 10:08:02.049421 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2cwcm"
Mar 19 10:08:02 crc kubenswrapper[4835]: I0319 10:08:02.648487 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2cwcm"]
Mar 19 10:08:02 crc kubenswrapper[4835]: I0319 10:08:02.755929 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565248-rgkz9" event={"ID":"ec4c69ae-8beb-4db1-b676-1dd6701aa6d2","Type":"ContainerStarted","Data":"36815eb4e86923a5b03874c63d9f97eff61679746796609fd2e6d059968c3d87"}
Mar 19 10:08:03 crc kubenswrapper[4835]: I0319 10:08:03.773947 4835 generic.go:334] "Generic (PLEG): container finished" podID="ec4c69ae-8beb-4db1-b676-1dd6701aa6d2" containerID="36815eb4e86923a5b03874c63d9f97eff61679746796609fd2e6d059968c3d87" exitCode=0
Mar 19 10:08:03 crc kubenswrapper[4835]: I0319 10:08:03.774052 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565248-rgkz9" event={"ID":"ec4c69ae-8beb-4db1-b676-1dd6701aa6d2","Type":"ContainerDied","Data":"36815eb4e86923a5b03874c63d9f97eff61679746796609fd2e6d059968c3d87"}
Mar 19 10:08:03 crc kubenswrapper[4835]: I0319 10:08:03.789979 4835 generic.go:334] "Generic (PLEG): container finished" podID="10133653-6eb3-4fa3-ac17-e9144421775f" containerID="85e8d07384435b36d8d610f991723b10e767aea9f7717050ee8d71d66da1af20" exitCode=0
Mar 19 10:08:03 crc kubenswrapper[4835]: I0319 10:08:03.790040 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2cwcm" event={"ID":"10133653-6eb3-4fa3-ac17-e9144421775f","Type":"ContainerDied","Data":"85e8d07384435b36d8d610f991723b10e767aea9f7717050ee8d71d66da1af20"}
Mar 19 10:08:03 crc kubenswrapper[4835]: I0319 10:08:03.790066 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2cwcm" event={"ID":"10133653-6eb3-4fa3-ac17-e9144421775f","Type":"ContainerStarted","Data":"9f1e650d7a03e2ef7fb8d90b78f810a992b0035079d06f76deaed8cda095f25c"}
Mar 19 10:08:05 crc kubenswrapper[4835]: I0319 10:08:05.228984 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565248-rgkz9"
Mar 19 10:08:05 crc kubenswrapper[4835]: I0319 10:08:05.312044 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ttk7\" (UniqueName: \"kubernetes.io/projected/ec4c69ae-8beb-4db1-b676-1dd6701aa6d2-kube-api-access-7ttk7\") pod \"ec4c69ae-8beb-4db1-b676-1dd6701aa6d2\" (UID: \"ec4c69ae-8beb-4db1-b676-1dd6701aa6d2\") "
Mar 19 10:08:05 crc kubenswrapper[4835]: I0319 10:08:05.318573 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec4c69ae-8beb-4db1-b676-1dd6701aa6d2-kube-api-access-7ttk7" (OuterVolumeSpecName: "kube-api-access-7ttk7") pod "ec4c69ae-8beb-4db1-b676-1dd6701aa6d2" (UID: "ec4c69ae-8beb-4db1-b676-1dd6701aa6d2"). InnerVolumeSpecName "kube-api-access-7ttk7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 10:08:05 crc kubenswrapper[4835]: I0319 10:08:05.418268 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ttk7\" (UniqueName: \"kubernetes.io/projected/ec4c69ae-8beb-4db1-b676-1dd6701aa6d2-kube-api-access-7ttk7\") on node \"crc\" DevicePath \"\""
Mar 19 10:08:05 crc kubenswrapper[4835]: I0319 10:08:05.810601 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565248-rgkz9" event={"ID":"ec4c69ae-8beb-4db1-b676-1dd6701aa6d2","Type":"ContainerDied","Data":"1d0dccfa3c20520dd8efd7736ee084aa32a8bbbbcbc9211dbfb0d5833ba6ac04"}
Mar 19 10:08:05 crc kubenswrapper[4835]: I0319 10:08:05.810636 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d0dccfa3c20520dd8efd7736ee084aa32a8bbbbcbc9211dbfb0d5833ba6ac04"
Mar 19 10:08:05 crc kubenswrapper[4835]: I0319 10:08:05.810670 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565248-rgkz9"
Mar 19 10:08:05 crc kubenswrapper[4835]: I0319 10:08:05.813127 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2cwcm" event={"ID":"10133653-6eb3-4fa3-ac17-e9144421775f","Type":"ContainerStarted","Data":"03ccca78218a04cc5a264cf4b540128506fb64fa3ba0a06aa564fb71fa720441"}
Mar 19 10:08:05 crc kubenswrapper[4835]: I0319 10:08:05.860920 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565242-58qhn"]
Mar 19 10:08:05 crc kubenswrapper[4835]: I0319 10:08:05.870551 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565242-58qhn"]
Mar 19 10:08:06 crc kubenswrapper[4835]: I0319 10:08:06.419459 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c6df688-8714-4569-90ee-ef7e8c1d2a8c" path="/var/lib/kubelet/pods/0c6df688-8714-4569-90ee-ef7e8c1d2a8c/volumes"
Mar 19 10:08:06 crc kubenswrapper[4835]: I0319 10:08:06.422371 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 10:08:06 crc kubenswrapper[4835]: I0319 10:08:06.422424 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 10:08:07 crc kubenswrapper[4835]: I0319 10:08:07.843910 4835 generic.go:334] "Generic (PLEG): container finished" podID="10133653-6eb3-4fa3-ac17-e9144421775f" containerID="03ccca78218a04cc5a264cf4b540128506fb64fa3ba0a06aa564fb71fa720441" exitCode=0
Mar 19 10:08:07 crc kubenswrapper[4835]: I0319 10:08:07.843950 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2cwcm" event={"ID":"10133653-6eb3-4fa3-ac17-e9144421775f","Type":"ContainerDied","Data":"03ccca78218a04cc5a264cf4b540128506fb64fa3ba0a06aa564fb71fa720441"}
Mar 19 10:08:08 crc kubenswrapper[4835]: I0319 10:08:08.860601 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2cwcm" event={"ID":"10133653-6eb3-4fa3-ac17-e9144421775f","Type":"ContainerStarted","Data":"27385a86158c5175620fac6eb4fda37a1aef91374ddfd9480100eb7189624eb2"}
Mar 19 10:08:08 crc kubenswrapper[4835]: I0319 10:08:08.899015 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2cwcm" podStartSLOduration=3.410061409 podStartE2EDuration="7.898983419s" podCreationTimestamp="2026-03-19 10:08:01 +0000 UTC" firstStartedPulling="2026-03-19 10:08:03.796370454 +0000 UTC m=+2738.644969041" lastFinishedPulling="2026-03-19 10:08:08.285292464 +0000 UTC m=+2743.133891051" observedRunningTime="2026-03-19 10:08:08.885479675 +0000 UTC m=+2743.734078272" watchObservedRunningTime="2026-03-19 10:08:08.898983419 +0000 UTC m=+2743.747582016"
Mar 19 10:08:09 crc kubenswrapper[4835]: I0319 10:08:09.892063 4835 generic.go:334] "Generic (PLEG): container finished" podID="fbf6d84b-d129-481a-80a4-b38c4eb9051b" containerID="28d6bbc6e61d93d64e16a42dd191aa9d2c876994208c0e1dde541c77f18fdd35" exitCode=0
Mar 19 10:08:09 crc kubenswrapper[4835]: I0319 10:08:09.892124 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx" event={"ID":"fbf6d84b-d129-481a-80a4-b38c4eb9051b","Type":"ContainerDied","Data":"28d6bbc6e61d93d64e16a42dd191aa9d2c876994208c0e1dde541c77f18fdd35"}
Mar 19 10:08:11 crc kubenswrapper[4835]: I0319 10:08:11.429143 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx"
Mar 19 10:08:11 crc kubenswrapper[4835]: I0319 10:08:11.446360 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fbf6d84b-d129-481a-80a4-b38c4eb9051b-ssh-key-openstack-edpm-ipam\") pod \"fbf6d84b-d129-481a-80a4-b38c4eb9051b\" (UID: \"fbf6d84b-d129-481a-80a4-b38c4eb9051b\") "
Mar 19 10:08:11 crc kubenswrapper[4835]: I0319 10:08:11.446449 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fbf6d84b-d129-481a-80a4-b38c4eb9051b-inventory\") pod \"fbf6d84b-d129-481a-80a4-b38c4eb9051b\" (UID: \"fbf6d84b-d129-481a-80a4-b38c4eb9051b\") "
Mar 19 10:08:11 crc kubenswrapper[4835]: I0319 10:08:11.446478 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/fbf6d84b-d129-481a-80a4-b38c4eb9051b-libvirt-secret-0\") pod \"fbf6d84b-d129-481a-80a4-b38c4eb9051b\" (UID: \"fbf6d84b-d129-481a-80a4-b38c4eb9051b\") "
Mar 19 10:08:11 crc kubenswrapper[4835]: I0319 10:08:11.446601 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wn64j\" (UniqueName: \"kubernetes.io/projected/fbf6d84b-d129-481a-80a4-b38c4eb9051b-kube-api-access-wn64j\") pod \"fbf6d84b-d129-481a-80a4-b38c4eb9051b\" (UID: \"fbf6d84b-d129-481a-80a4-b38c4eb9051b\") "
Mar 19 10:08:11 crc kubenswrapper[4835]: I0319 10:08:11.452197 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbf6d84b-d129-481a-80a4-b38c4eb9051b-kube-api-access-wn64j" (OuterVolumeSpecName: "kube-api-access-wn64j") pod "fbf6d84b-d129-481a-80a4-b38c4eb9051b" (UID: "fbf6d84b-d129-481a-80a4-b38c4eb9051b"). InnerVolumeSpecName "kube-api-access-wn64j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 10:08:11 crc kubenswrapper[4835]: I0319 10:08:11.477275 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbf6d84b-d129-481a-80a4-b38c4eb9051b-inventory" (OuterVolumeSpecName: "inventory") pod "fbf6d84b-d129-481a-80a4-b38c4eb9051b" (UID: "fbf6d84b-d129-481a-80a4-b38c4eb9051b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 10:08:11 crc kubenswrapper[4835]: I0319 10:08:11.485025 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbf6d84b-d129-481a-80a4-b38c4eb9051b-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "fbf6d84b-d129-481a-80a4-b38c4eb9051b" (UID: "fbf6d84b-d129-481a-80a4-b38c4eb9051b"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 10:08:11 crc kubenswrapper[4835]: I0319 10:08:11.486196 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbf6d84b-d129-481a-80a4-b38c4eb9051b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fbf6d84b-d129-481a-80a4-b38c4eb9051b" (UID: "fbf6d84b-d129-481a-80a4-b38c4eb9051b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 10:08:11 crc kubenswrapper[4835]: I0319 10:08:11.548622 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbf6d84b-d129-481a-80a4-b38c4eb9051b-libvirt-combined-ca-bundle\") pod \"fbf6d84b-d129-481a-80a4-b38c4eb9051b\" (UID: \"fbf6d84b-d129-481a-80a4-b38c4eb9051b\") "
Mar 19 10:08:11 crc kubenswrapper[4835]: I0319 10:08:11.549074 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fbf6d84b-d129-481a-80a4-b38c4eb9051b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 19 10:08:11 crc kubenswrapper[4835]: I0319 10:08:11.549093 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fbf6d84b-d129-481a-80a4-b38c4eb9051b-inventory\") on node \"crc\" DevicePath \"\""
Mar 19 10:08:11 crc kubenswrapper[4835]: I0319 10:08:11.549142 4835 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/fbf6d84b-d129-481a-80a4-b38c4eb9051b-libvirt-secret-0\") on node \"crc\" DevicePath \"\""
Mar 19 10:08:11 crc kubenswrapper[4835]: I0319 10:08:11.549154 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wn64j\" (UniqueName: \"kubernetes.io/projected/fbf6d84b-d129-481a-80a4-b38c4eb9051b-kube-api-access-wn64j\") on node \"crc\" DevicePath \"\""
Mar 19 10:08:11 crc kubenswrapper[4835]: I0319 10:08:11.553867 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbf6d84b-d129-481a-80a4-b38c4eb9051b-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "fbf6d84b-d129-481a-80a4-b38c4eb9051b" (UID: "fbf6d84b-d129-481a-80a4-b38c4eb9051b"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:08:11 crc kubenswrapper[4835]: I0319 10:08:11.650607 4835 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbf6d84b-d129-481a-80a4-b38c4eb9051b-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:08:11 crc kubenswrapper[4835]: I0319 10:08:11.916349 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx" event={"ID":"fbf6d84b-d129-481a-80a4-b38c4eb9051b","Type":"ContainerDied","Data":"cbb271c8ee8e10240cd9c94ba04b78e8b7467d31dcfb4efc6cff710300ae74c5"} Mar 19 10:08:11 crc kubenswrapper[4835]: I0319 10:08:11.916640 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbb271c8ee8e10240cd9c94ba04b78e8b7467d31dcfb4efc6cff710300ae74c5" Mar 19 10:08:11 crc kubenswrapper[4835]: I0319 10:08:11.916406 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.001203 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t"] Mar 19 10:08:12 crc kubenswrapper[4835]: E0319 10:08:12.002144 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbf6d84b-d129-481a-80a4-b38c4eb9051b" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.002169 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbf6d84b-d129-481a-80a4-b38c4eb9051b" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 19 10:08:12 crc kubenswrapper[4835]: E0319 10:08:12.002224 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec4c69ae-8beb-4db1-b676-1dd6701aa6d2" containerName="oc" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.002236 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec4c69ae-8beb-4db1-b676-1dd6701aa6d2" containerName="oc" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.002565 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbf6d84b-d129-481a-80a4-b38c4eb9051b" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.002580 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec4c69ae-8beb-4db1-b676-1dd6701aa6d2" containerName="oc" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.003433 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.011782 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.011900 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ldz2g" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.011803 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.011987 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.012133 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.012324 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.012407 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.015282 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t"] Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.049977 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2cwcm" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.050583 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2cwcm" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.163291 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gw97t\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.163356 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gw97t\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.163397 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gw97t\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.163460 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gw97t\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.163503 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gw97t\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.163522 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gw97t\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.163575 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q9rn\" (UniqueName: \"kubernetes.io/projected/316e9619-05d7-4845-84e5-78700d3318d9-kube-api-access-6q9rn\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gw97t\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.163599 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/316e9619-05d7-4845-84e5-78700d3318d9-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gw97t\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.163652 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gw97t\" (UID: 
\"316e9619-05d7-4845-84e5-78700d3318d9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.163675 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gw97t\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.163703 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gw97t\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.266173 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q9rn\" (UniqueName: \"kubernetes.io/projected/316e9619-05d7-4845-84e5-78700d3318d9-kube-api-access-6q9rn\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gw97t\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.266247 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/316e9619-05d7-4845-84e5-78700d3318d9-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gw97t\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.266342 4835 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gw97t\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.266400 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gw97t\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.266471 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gw97t\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.266525 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gw97t\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.266594 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-ssh-key-openstack-edpm-ipam\") 
pod \"nova-edpm-deployment-openstack-edpm-ipam-gw97t\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.266679 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gw97t\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.266857 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gw97t\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.266932 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gw97t\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.266968 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gw97t\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.267552 4835 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/316e9619-05d7-4845-84e5-78700d3318d9-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gw97t\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.270975 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gw97t\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.271433 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gw97t\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.272579 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gw97t\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.277326 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gw97t\" (UID: 
\"316e9619-05d7-4845-84e5-78700d3318d9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.282377 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gw97t\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.282436 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gw97t\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.282701 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gw97t\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.282712 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gw97t\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.282843 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" 
(UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gw97t\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.285533 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q9rn\" (UniqueName: \"kubernetes.io/projected/316e9619-05d7-4845-84e5-78700d3318d9-kube-api-access-6q9rn\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gw97t\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:08:12 crc kubenswrapper[4835]: I0319 10:08:12.334643 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:08:13 crc kubenswrapper[4835]: I0319 10:08:13.003947 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t"] Mar 19 10:08:13 crc kubenswrapper[4835]: I0319 10:08:13.114209 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-2cwcm" podUID="10133653-6eb3-4fa3-ac17-e9144421775f" containerName="registry-server" probeResult="failure" output=< Mar 19 10:08:13 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:08:13 crc kubenswrapper[4835]: > Mar 19 10:08:13 crc kubenswrapper[4835]: I0319 10:08:13.943265 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" event={"ID":"316e9619-05d7-4845-84e5-78700d3318d9","Type":"ContainerStarted","Data":"cd07d3d6194c9506c998ac88f652f97d4e0372c7f3ab5d0c9cb8f4d37f8f8e7a"} Mar 19 10:08:13 crc kubenswrapper[4835]: I0319 10:08:13.943898 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" event={"ID":"316e9619-05d7-4845-84e5-78700d3318d9","Type":"ContainerStarted","Data":"ea5f3cd558919a514dfe534cceed7cb720496975b6dff3b4997c77e3e6bb64db"} Mar 19 10:08:13 crc kubenswrapper[4835]: I0319 10:08:13.976398 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" podStartSLOduration=2.509189662 podStartE2EDuration="2.976375904s" podCreationTimestamp="2026-03-19 10:08:11 +0000 UTC" firstStartedPulling="2026-03-19 10:08:13.003096293 +0000 UTC m=+2747.851694880" lastFinishedPulling="2026-03-19 10:08:13.470282525 +0000 UTC m=+2748.318881122" observedRunningTime="2026-03-19 10:08:13.966620762 +0000 UTC m=+2748.815219359" watchObservedRunningTime="2026-03-19 10:08:13.976375904 +0000 UTC m=+2748.824974491" Mar 19 10:08:22 crc kubenswrapper[4835]: I0319 10:08:22.097690 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2cwcm" Mar 19 10:08:22 crc kubenswrapper[4835]: I0319 10:08:22.147923 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2cwcm" Mar 19 10:08:22 crc kubenswrapper[4835]: I0319 10:08:22.340225 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2cwcm"] Mar 19 10:08:23 crc kubenswrapper[4835]: I0319 10:08:23.547297 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2cwcm" podUID="10133653-6eb3-4fa3-ac17-e9144421775f" containerName="registry-server" containerID="cri-o://27385a86158c5175620fac6eb4fda37a1aef91374ddfd9480100eb7189624eb2" gracePeriod=2 Mar 19 10:08:24 crc kubenswrapper[4835]: I0319 10:08:24.046086 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2cwcm" Mar 19 10:08:24 crc kubenswrapper[4835]: I0319 10:08:24.108383 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10133653-6eb3-4fa3-ac17-e9144421775f-utilities\") pod \"10133653-6eb3-4fa3-ac17-e9144421775f\" (UID: \"10133653-6eb3-4fa3-ac17-e9144421775f\") " Mar 19 10:08:24 crc kubenswrapper[4835]: I0319 10:08:24.108475 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10133653-6eb3-4fa3-ac17-e9144421775f-catalog-content\") pod \"10133653-6eb3-4fa3-ac17-e9144421775f\" (UID: \"10133653-6eb3-4fa3-ac17-e9144421775f\") " Mar 19 10:08:24 crc kubenswrapper[4835]: I0319 10:08:24.108698 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qh8h9\" (UniqueName: \"kubernetes.io/projected/10133653-6eb3-4fa3-ac17-e9144421775f-kube-api-access-qh8h9\") pod \"10133653-6eb3-4fa3-ac17-e9144421775f\" (UID: \"10133653-6eb3-4fa3-ac17-e9144421775f\") " Mar 19 10:08:24 crc kubenswrapper[4835]: I0319 10:08:24.109646 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10133653-6eb3-4fa3-ac17-e9144421775f-utilities" (OuterVolumeSpecName: "utilities") pod "10133653-6eb3-4fa3-ac17-e9144421775f" (UID: "10133653-6eb3-4fa3-ac17-e9144421775f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:08:24 crc kubenswrapper[4835]: I0319 10:08:24.118212 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10133653-6eb3-4fa3-ac17-e9144421775f-kube-api-access-qh8h9" (OuterVolumeSpecName: "kube-api-access-qh8h9") pod "10133653-6eb3-4fa3-ac17-e9144421775f" (UID: "10133653-6eb3-4fa3-ac17-e9144421775f"). InnerVolumeSpecName "kube-api-access-qh8h9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:08:24 crc kubenswrapper[4835]: I0319 10:08:24.171396 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10133653-6eb3-4fa3-ac17-e9144421775f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "10133653-6eb3-4fa3-ac17-e9144421775f" (UID: "10133653-6eb3-4fa3-ac17-e9144421775f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:08:24 crc kubenswrapper[4835]: I0319 10:08:24.212793 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qh8h9\" (UniqueName: \"kubernetes.io/projected/10133653-6eb3-4fa3-ac17-e9144421775f-kube-api-access-qh8h9\") on node \"crc\" DevicePath \"\"" Mar 19 10:08:24 crc kubenswrapper[4835]: I0319 10:08:24.212833 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10133653-6eb3-4fa3-ac17-e9144421775f-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 10:08:24 crc kubenswrapper[4835]: I0319 10:08:24.212845 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10133653-6eb3-4fa3-ac17-e9144421775f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 10:08:24 crc kubenswrapper[4835]: I0319 10:08:24.564543 4835 generic.go:334] "Generic (PLEG): container finished" podID="10133653-6eb3-4fa3-ac17-e9144421775f" containerID="27385a86158c5175620fac6eb4fda37a1aef91374ddfd9480100eb7189624eb2" exitCode=0 Mar 19 10:08:24 crc kubenswrapper[4835]: I0319 10:08:24.564605 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2cwcm" event={"ID":"10133653-6eb3-4fa3-ac17-e9144421775f","Type":"ContainerDied","Data":"27385a86158c5175620fac6eb4fda37a1aef91374ddfd9480100eb7189624eb2"} Mar 19 10:08:24 crc kubenswrapper[4835]: I0319 10:08:24.564618 4835 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2cwcm" Mar 19 10:08:24 crc kubenswrapper[4835]: I0319 10:08:24.564660 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2cwcm" event={"ID":"10133653-6eb3-4fa3-ac17-e9144421775f","Type":"ContainerDied","Data":"9f1e650d7a03e2ef7fb8d90b78f810a992b0035079d06f76deaed8cda095f25c"} Mar 19 10:08:24 crc kubenswrapper[4835]: I0319 10:08:24.564688 4835 scope.go:117] "RemoveContainer" containerID="27385a86158c5175620fac6eb4fda37a1aef91374ddfd9480100eb7189624eb2" Mar 19 10:08:24 crc kubenswrapper[4835]: I0319 10:08:24.593391 4835 scope.go:117] "RemoveContainer" containerID="03ccca78218a04cc5a264cf4b540128506fb64fa3ba0a06aa564fb71fa720441" Mar 19 10:08:24 crc kubenswrapper[4835]: I0319 10:08:24.603810 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2cwcm"] Mar 19 10:08:24 crc kubenswrapper[4835]: I0319 10:08:24.612594 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2cwcm"] Mar 19 10:08:24 crc kubenswrapper[4835]: I0319 10:08:24.629933 4835 scope.go:117] "RemoveContainer" containerID="85e8d07384435b36d8d610f991723b10e767aea9f7717050ee8d71d66da1af20" Mar 19 10:08:24 crc kubenswrapper[4835]: I0319 10:08:24.702410 4835 scope.go:117] "RemoveContainer" containerID="27385a86158c5175620fac6eb4fda37a1aef91374ddfd9480100eb7189624eb2" Mar 19 10:08:24 crc kubenswrapper[4835]: E0319 10:08:24.702987 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27385a86158c5175620fac6eb4fda37a1aef91374ddfd9480100eb7189624eb2\": container with ID starting with 27385a86158c5175620fac6eb4fda37a1aef91374ddfd9480100eb7189624eb2 not found: ID does not exist" containerID="27385a86158c5175620fac6eb4fda37a1aef91374ddfd9480100eb7189624eb2" Mar 19 10:08:24 crc kubenswrapper[4835]: I0319 10:08:24.703023 
4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27385a86158c5175620fac6eb4fda37a1aef91374ddfd9480100eb7189624eb2"} err="failed to get container status \"27385a86158c5175620fac6eb4fda37a1aef91374ddfd9480100eb7189624eb2\": rpc error: code = NotFound desc = could not find container \"27385a86158c5175620fac6eb4fda37a1aef91374ddfd9480100eb7189624eb2\": container with ID starting with 27385a86158c5175620fac6eb4fda37a1aef91374ddfd9480100eb7189624eb2 not found: ID does not exist" Mar 19 10:08:24 crc kubenswrapper[4835]: I0319 10:08:24.703041 4835 scope.go:117] "RemoveContainer" containerID="03ccca78218a04cc5a264cf4b540128506fb64fa3ba0a06aa564fb71fa720441" Mar 19 10:08:24 crc kubenswrapper[4835]: E0319 10:08:24.703418 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03ccca78218a04cc5a264cf4b540128506fb64fa3ba0a06aa564fb71fa720441\": container with ID starting with 03ccca78218a04cc5a264cf4b540128506fb64fa3ba0a06aa564fb71fa720441 not found: ID does not exist" containerID="03ccca78218a04cc5a264cf4b540128506fb64fa3ba0a06aa564fb71fa720441" Mar 19 10:08:24 crc kubenswrapper[4835]: I0319 10:08:24.703449 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03ccca78218a04cc5a264cf4b540128506fb64fa3ba0a06aa564fb71fa720441"} err="failed to get container status \"03ccca78218a04cc5a264cf4b540128506fb64fa3ba0a06aa564fb71fa720441\": rpc error: code = NotFound desc = could not find container \"03ccca78218a04cc5a264cf4b540128506fb64fa3ba0a06aa564fb71fa720441\": container with ID starting with 03ccca78218a04cc5a264cf4b540128506fb64fa3ba0a06aa564fb71fa720441 not found: ID does not exist" Mar 19 10:08:24 crc kubenswrapper[4835]: I0319 10:08:24.703463 4835 scope.go:117] "RemoveContainer" containerID="85e8d07384435b36d8d610f991723b10e767aea9f7717050ee8d71d66da1af20" Mar 19 10:08:24 crc kubenswrapper[4835]: E0319 
10:08:24.703844 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85e8d07384435b36d8d610f991723b10e767aea9f7717050ee8d71d66da1af20\": container with ID starting with 85e8d07384435b36d8d610f991723b10e767aea9f7717050ee8d71d66da1af20 not found: ID does not exist" containerID="85e8d07384435b36d8d610f991723b10e767aea9f7717050ee8d71d66da1af20" Mar 19 10:08:24 crc kubenswrapper[4835]: I0319 10:08:24.703872 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85e8d07384435b36d8d610f991723b10e767aea9f7717050ee8d71d66da1af20"} err="failed to get container status \"85e8d07384435b36d8d610f991723b10e767aea9f7717050ee8d71d66da1af20\": rpc error: code = NotFound desc = could not find container \"85e8d07384435b36d8d610f991723b10e767aea9f7717050ee8d71d66da1af20\": container with ID starting with 85e8d07384435b36d8d610f991723b10e767aea9f7717050ee8d71d66da1af20 not found: ID does not exist" Mar 19 10:08:26 crc kubenswrapper[4835]: I0319 10:08:26.432355 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10133653-6eb3-4fa3-ac17-e9144421775f" path="/var/lib/kubelet/pods/10133653-6eb3-4fa3-ac17-e9144421775f/volumes" Mar 19 10:08:36 crc kubenswrapper[4835]: I0319 10:08:36.421971 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:08:36 crc kubenswrapper[4835]: I0319 10:08:36.422618 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 19 10:08:56 crc kubenswrapper[4835]: I0319 10:08:56.742915 4835 scope.go:117] "RemoveContainer" containerID="2fc30a46820a5b741aa124b55e9f2c6fedecf7ea5538addeecdffb0cb1a92882" Mar 19 10:09:06 crc kubenswrapper[4835]: I0319 10:09:06.421766 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:09:06 crc kubenswrapper[4835]: I0319 10:09:06.422356 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:09:06 crc kubenswrapper[4835]: I0319 10:09:06.422401 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" Mar 19 10:09:07 crc kubenswrapper[4835]: I0319 10:09:07.060338 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"79a0595f92818bd7cbb6ee721d4dfb55e83bf1ecd1c54ba0b51c369cda200970"} pod="openshift-machine-config-operator/machine-config-daemon-bk84k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 10:09:07 crc kubenswrapper[4835]: I0319 10:09:07.060410 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" containerID="cri-o://79a0595f92818bd7cbb6ee721d4dfb55e83bf1ecd1c54ba0b51c369cda200970" gracePeriod=600 Mar 19 10:09:08 crc 
kubenswrapper[4835]: I0319 10:09:08.070842 4835 generic.go:334] "Generic (PLEG): container finished" podID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerID="79a0595f92818bd7cbb6ee721d4dfb55e83bf1ecd1c54ba0b51c369cda200970" exitCode=0 Mar 19 10:09:08 crc kubenswrapper[4835]: I0319 10:09:08.070909 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerDied","Data":"79a0595f92818bd7cbb6ee721d4dfb55e83bf1ecd1c54ba0b51c369cda200970"} Mar 19 10:09:08 crc kubenswrapper[4835]: I0319 10:09:08.071515 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerStarted","Data":"7b9042d97bd75eb24c603b3b25bfd42fbe319b3290c7851e15e79438ba4bccb7"} Mar 19 10:09:08 crc kubenswrapper[4835]: I0319 10:09:08.071537 4835 scope.go:117] "RemoveContainer" containerID="3c5b8661f3f050fbf34728c9e88ed710ff7c292af499338ff01295afa288217f" Mar 19 10:09:15 crc kubenswrapper[4835]: I0319 10:09:15.795450 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nhwjp"] Mar 19 10:09:15 crc kubenswrapper[4835]: E0319 10:09:15.796707 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10133653-6eb3-4fa3-ac17-e9144421775f" containerName="registry-server" Mar 19 10:09:15 crc kubenswrapper[4835]: I0319 10:09:15.796728 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="10133653-6eb3-4fa3-ac17-e9144421775f" containerName="registry-server" Mar 19 10:09:15 crc kubenswrapper[4835]: E0319 10:09:15.796769 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10133653-6eb3-4fa3-ac17-e9144421775f" containerName="extract-utilities" Mar 19 10:09:15 crc kubenswrapper[4835]: I0319 10:09:15.796779 4835 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="10133653-6eb3-4fa3-ac17-e9144421775f" containerName="extract-utilities" Mar 19 10:09:15 crc kubenswrapper[4835]: E0319 10:09:15.796824 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10133653-6eb3-4fa3-ac17-e9144421775f" containerName="extract-content" Mar 19 10:09:15 crc kubenswrapper[4835]: I0319 10:09:15.796833 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="10133653-6eb3-4fa3-ac17-e9144421775f" containerName="extract-content" Mar 19 10:09:15 crc kubenswrapper[4835]: I0319 10:09:15.797101 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="10133653-6eb3-4fa3-ac17-e9144421775f" containerName="registry-server" Mar 19 10:09:15 crc kubenswrapper[4835]: I0319 10:09:15.799304 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nhwjp" Mar 19 10:09:15 crc kubenswrapper[4835]: I0319 10:09:15.811807 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nhwjp"] Mar 19 10:09:15 crc kubenswrapper[4835]: I0319 10:09:15.971826 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2608989-3d54-4714-a277-e21e5164d067-utilities\") pod \"redhat-operators-nhwjp\" (UID: \"a2608989-3d54-4714-a277-e21e5164d067\") " pod="openshift-marketplace/redhat-operators-nhwjp" Mar 19 10:09:15 crc kubenswrapper[4835]: I0319 10:09:15.971901 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2608989-3d54-4714-a277-e21e5164d067-catalog-content\") pod \"redhat-operators-nhwjp\" (UID: \"a2608989-3d54-4714-a277-e21e5164d067\") " pod="openshift-marketplace/redhat-operators-nhwjp" Mar 19 10:09:15 crc kubenswrapper[4835]: I0319 10:09:15.972023 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-jdvnp\" (UniqueName: \"kubernetes.io/projected/a2608989-3d54-4714-a277-e21e5164d067-kube-api-access-jdvnp\") pod \"redhat-operators-nhwjp\" (UID: \"a2608989-3d54-4714-a277-e21e5164d067\") " pod="openshift-marketplace/redhat-operators-nhwjp" Mar 19 10:09:16 crc kubenswrapper[4835]: I0319 10:09:16.074174 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdvnp\" (UniqueName: \"kubernetes.io/projected/a2608989-3d54-4714-a277-e21e5164d067-kube-api-access-jdvnp\") pod \"redhat-operators-nhwjp\" (UID: \"a2608989-3d54-4714-a277-e21e5164d067\") " pod="openshift-marketplace/redhat-operators-nhwjp" Mar 19 10:09:16 crc kubenswrapper[4835]: I0319 10:09:16.074431 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2608989-3d54-4714-a277-e21e5164d067-utilities\") pod \"redhat-operators-nhwjp\" (UID: \"a2608989-3d54-4714-a277-e21e5164d067\") " pod="openshift-marketplace/redhat-operators-nhwjp" Mar 19 10:09:16 crc kubenswrapper[4835]: I0319 10:09:16.074484 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2608989-3d54-4714-a277-e21e5164d067-catalog-content\") pod \"redhat-operators-nhwjp\" (UID: \"a2608989-3d54-4714-a277-e21e5164d067\") " pod="openshift-marketplace/redhat-operators-nhwjp" Mar 19 10:09:16 crc kubenswrapper[4835]: I0319 10:09:16.074999 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2608989-3d54-4714-a277-e21e5164d067-utilities\") pod \"redhat-operators-nhwjp\" (UID: \"a2608989-3d54-4714-a277-e21e5164d067\") " pod="openshift-marketplace/redhat-operators-nhwjp" Mar 19 10:09:16 crc kubenswrapper[4835]: I0319 10:09:16.075021 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a2608989-3d54-4714-a277-e21e5164d067-catalog-content\") pod \"redhat-operators-nhwjp\" (UID: \"a2608989-3d54-4714-a277-e21e5164d067\") " pod="openshift-marketplace/redhat-operators-nhwjp" Mar 19 10:09:16 crc kubenswrapper[4835]: I0319 10:09:16.101601 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdvnp\" (UniqueName: \"kubernetes.io/projected/a2608989-3d54-4714-a277-e21e5164d067-kube-api-access-jdvnp\") pod \"redhat-operators-nhwjp\" (UID: \"a2608989-3d54-4714-a277-e21e5164d067\") " pod="openshift-marketplace/redhat-operators-nhwjp" Mar 19 10:09:16 crc kubenswrapper[4835]: I0319 10:09:16.129257 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nhwjp" Mar 19 10:09:16 crc kubenswrapper[4835]: I0319 10:09:16.695089 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nhwjp"] Mar 19 10:09:17 crc kubenswrapper[4835]: I0319 10:09:17.182493 4835 generic.go:334] "Generic (PLEG): container finished" podID="a2608989-3d54-4714-a277-e21e5164d067" containerID="0fba17163ebb5c6090d9e7c71a1975e268474e71d18bd93148d2385dcb414fe2" exitCode=0 Mar 19 10:09:17 crc kubenswrapper[4835]: I0319 10:09:17.182560 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhwjp" event={"ID":"a2608989-3d54-4714-a277-e21e5164d067","Type":"ContainerDied","Data":"0fba17163ebb5c6090d9e7c71a1975e268474e71d18bd93148d2385dcb414fe2"} Mar 19 10:09:17 crc kubenswrapper[4835]: I0319 10:09:17.182880 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhwjp" event={"ID":"a2608989-3d54-4714-a277-e21e5164d067","Type":"ContainerStarted","Data":"75510e29e8f4aef10d4780ed041019ab30528e7fe57bbaaac9ae838c2842d329"} Mar 19 10:09:17 crc kubenswrapper[4835]: I0319 10:09:17.185078 4835 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Mar 19 10:09:19 crc kubenswrapper[4835]: I0319 10:09:19.216898 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhwjp" event={"ID":"a2608989-3d54-4714-a277-e21e5164d067","Type":"ContainerStarted","Data":"db4506cd525fe12cbdfa39f02924598602e85650bef45151fe785f62ab7ec6ff"} Mar 19 10:09:25 crc kubenswrapper[4835]: I0319 10:09:25.301601 4835 generic.go:334] "Generic (PLEG): container finished" podID="a2608989-3d54-4714-a277-e21e5164d067" containerID="db4506cd525fe12cbdfa39f02924598602e85650bef45151fe785f62ab7ec6ff" exitCode=0 Mar 19 10:09:25 crc kubenswrapper[4835]: I0319 10:09:25.301690 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhwjp" event={"ID":"a2608989-3d54-4714-a277-e21e5164d067","Type":"ContainerDied","Data":"db4506cd525fe12cbdfa39f02924598602e85650bef45151fe785f62ab7ec6ff"} Mar 19 10:09:26 crc kubenswrapper[4835]: I0319 10:09:26.313913 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhwjp" event={"ID":"a2608989-3d54-4714-a277-e21e5164d067","Type":"ContainerStarted","Data":"92ed4400c014eca64687eebd582c2ba31290c277debeaad82ec5dad1e7a5c189"} Mar 19 10:09:26 crc kubenswrapper[4835]: I0319 10:09:26.344723 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nhwjp" podStartSLOduration=2.586089241 podStartE2EDuration="11.34470332s" podCreationTimestamp="2026-03-19 10:09:15 +0000 UTC" firstStartedPulling="2026-03-19 10:09:17.184722531 +0000 UTC m=+2812.033321118" lastFinishedPulling="2026-03-19 10:09:25.94333658 +0000 UTC m=+2820.791935197" observedRunningTime="2026-03-19 10:09:26.333366805 +0000 UTC m=+2821.181965402" watchObservedRunningTime="2026-03-19 10:09:26.34470332 +0000 UTC m=+2821.193301907" Mar 19 10:09:36 crc kubenswrapper[4835]: I0319 10:09:36.130375 4835 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nhwjp" Mar 19 10:09:36 crc kubenswrapper[4835]: I0319 10:09:36.130899 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nhwjp" Mar 19 10:09:37 crc kubenswrapper[4835]: I0319 10:09:37.177864 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nhwjp" podUID="a2608989-3d54-4714-a277-e21e5164d067" containerName="registry-server" probeResult="failure" output=< Mar 19 10:09:37 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:09:37 crc kubenswrapper[4835]: > Mar 19 10:09:47 crc kubenswrapper[4835]: I0319 10:09:47.190048 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nhwjp" podUID="a2608989-3d54-4714-a277-e21e5164d067" containerName="registry-server" probeResult="failure" output=< Mar 19 10:09:47 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:09:47 crc kubenswrapper[4835]: > Mar 19 10:09:57 crc kubenswrapper[4835]: I0319 10:09:57.184428 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nhwjp" podUID="a2608989-3d54-4714-a277-e21e5164d067" containerName="registry-server" probeResult="failure" output=< Mar 19 10:09:57 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:09:57 crc kubenswrapper[4835]: > Mar 19 10:10:00 crc kubenswrapper[4835]: I0319 10:10:00.151836 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565250-bdm26"] Mar 19 10:10:00 crc kubenswrapper[4835]: I0319 10:10:00.155598 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565250-bdm26" Mar 19 10:10:00 crc kubenswrapper[4835]: I0319 10:10:00.158073 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:10:00 crc kubenswrapper[4835]: I0319 10:10:00.158499 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 10:10:00 crc kubenswrapper[4835]: I0319 10:10:00.158551 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:10:00 crc kubenswrapper[4835]: I0319 10:10:00.184706 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565250-bdm26"] Mar 19 10:10:00 crc kubenswrapper[4835]: I0319 10:10:00.305787 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs4k2\" (UniqueName: \"kubernetes.io/projected/e4232fb3-0e4c-40a6-b0c4-ac32dea524d8-kube-api-access-zs4k2\") pod \"auto-csr-approver-29565250-bdm26\" (UID: \"e4232fb3-0e4c-40a6-b0c4-ac32dea524d8\") " pod="openshift-infra/auto-csr-approver-29565250-bdm26" Mar 19 10:10:00 crc kubenswrapper[4835]: I0319 10:10:00.407812 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs4k2\" (UniqueName: \"kubernetes.io/projected/e4232fb3-0e4c-40a6-b0c4-ac32dea524d8-kube-api-access-zs4k2\") pod \"auto-csr-approver-29565250-bdm26\" (UID: \"e4232fb3-0e4c-40a6-b0c4-ac32dea524d8\") " pod="openshift-infra/auto-csr-approver-29565250-bdm26" Mar 19 10:10:00 crc kubenswrapper[4835]: I0319 10:10:00.433225 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs4k2\" (UniqueName: \"kubernetes.io/projected/e4232fb3-0e4c-40a6-b0c4-ac32dea524d8-kube-api-access-zs4k2\") pod \"auto-csr-approver-29565250-bdm26\" (UID: \"e4232fb3-0e4c-40a6-b0c4-ac32dea524d8\") " 
pod="openshift-infra/auto-csr-approver-29565250-bdm26" Mar 19 10:10:00 crc kubenswrapper[4835]: I0319 10:10:00.484182 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565250-bdm26" Mar 19 10:10:01 crc kubenswrapper[4835]: I0319 10:10:01.007792 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565250-bdm26"] Mar 19 10:10:01 crc kubenswrapper[4835]: I0319 10:10:01.720991 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565250-bdm26" event={"ID":"e4232fb3-0e4c-40a6-b0c4-ac32dea524d8","Type":"ContainerStarted","Data":"8b5849a8684532e6ac04a6f472e203a917ac32f1d7e9a6bb51f684a04007e020"} Mar 19 10:10:03 crc kubenswrapper[4835]: I0319 10:10:03.745371 4835 generic.go:334] "Generic (PLEG): container finished" podID="e4232fb3-0e4c-40a6-b0c4-ac32dea524d8" containerID="6a9c24f4c7e436ccc9d6bf45977fe9b7aeee71a0c249b317db620097ce733e76" exitCode=0 Mar 19 10:10:03 crc kubenswrapper[4835]: I0319 10:10:03.745425 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565250-bdm26" event={"ID":"e4232fb3-0e4c-40a6-b0c4-ac32dea524d8","Type":"ContainerDied","Data":"6a9c24f4c7e436ccc9d6bf45977fe9b7aeee71a0c249b317db620097ce733e76"} Mar 19 10:10:05 crc kubenswrapper[4835]: I0319 10:10:05.280827 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565250-bdm26" Mar 19 10:10:05 crc kubenswrapper[4835]: I0319 10:10:05.431097 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zs4k2\" (UniqueName: \"kubernetes.io/projected/e4232fb3-0e4c-40a6-b0c4-ac32dea524d8-kube-api-access-zs4k2\") pod \"e4232fb3-0e4c-40a6-b0c4-ac32dea524d8\" (UID: \"e4232fb3-0e4c-40a6-b0c4-ac32dea524d8\") " Mar 19 10:10:05 crc kubenswrapper[4835]: I0319 10:10:05.440770 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4232fb3-0e4c-40a6-b0c4-ac32dea524d8-kube-api-access-zs4k2" (OuterVolumeSpecName: "kube-api-access-zs4k2") pod "e4232fb3-0e4c-40a6-b0c4-ac32dea524d8" (UID: "e4232fb3-0e4c-40a6-b0c4-ac32dea524d8"). InnerVolumeSpecName "kube-api-access-zs4k2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:10:05 crc kubenswrapper[4835]: I0319 10:10:05.535562 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zs4k2\" (UniqueName: \"kubernetes.io/projected/e4232fb3-0e4c-40a6-b0c4-ac32dea524d8-kube-api-access-zs4k2\") on node \"crc\" DevicePath \"\"" Mar 19 10:10:05 crc kubenswrapper[4835]: I0319 10:10:05.772692 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565250-bdm26" event={"ID":"e4232fb3-0e4c-40a6-b0c4-ac32dea524d8","Type":"ContainerDied","Data":"8b5849a8684532e6ac04a6f472e203a917ac32f1d7e9a6bb51f684a04007e020"} Mar 19 10:10:05 crc kubenswrapper[4835]: I0319 10:10:05.772803 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565250-bdm26" Mar 19 10:10:05 crc kubenswrapper[4835]: I0319 10:10:05.772813 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b5849a8684532e6ac04a6f472e203a917ac32f1d7e9a6bb51f684a04007e020" Mar 19 10:10:06 crc kubenswrapper[4835]: I0319 10:10:06.184484 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nhwjp" Mar 19 10:10:06 crc kubenswrapper[4835]: I0319 10:10:06.238641 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nhwjp" Mar 19 10:10:06 crc kubenswrapper[4835]: I0319 10:10:06.354958 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565244-dbl8x"] Mar 19 10:10:06 crc kubenswrapper[4835]: I0319 10:10:06.365893 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565244-dbl8x"] Mar 19 10:10:06 crc kubenswrapper[4835]: I0319 10:10:06.421026 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4ef0143-6ad8-4c19-80a9-6590e5aa8943" path="/var/lib/kubelet/pods/c4ef0143-6ad8-4c19-80a9-6590e5aa8943/volumes" Mar 19 10:10:06 crc kubenswrapper[4835]: I0319 10:10:06.436052 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nhwjp"] Mar 19 10:10:07 crc kubenswrapper[4835]: I0319 10:10:07.789900 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nhwjp" podUID="a2608989-3d54-4714-a277-e21e5164d067" containerName="registry-server" containerID="cri-o://92ed4400c014eca64687eebd582c2ba31290c277debeaad82ec5dad1e7a5c189" gracePeriod=2 Mar 19 10:10:08 crc kubenswrapper[4835]: I0319 10:10:08.406400 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nhwjp" Mar 19 10:10:08 crc kubenswrapper[4835]: I0319 10:10:08.509367 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2608989-3d54-4714-a277-e21e5164d067-utilities\") pod \"a2608989-3d54-4714-a277-e21e5164d067\" (UID: \"a2608989-3d54-4714-a277-e21e5164d067\") " Mar 19 10:10:08 crc kubenswrapper[4835]: I0319 10:10:08.509442 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdvnp\" (UniqueName: \"kubernetes.io/projected/a2608989-3d54-4714-a277-e21e5164d067-kube-api-access-jdvnp\") pod \"a2608989-3d54-4714-a277-e21e5164d067\" (UID: \"a2608989-3d54-4714-a277-e21e5164d067\") " Mar 19 10:10:08 crc kubenswrapper[4835]: I0319 10:10:08.509608 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2608989-3d54-4714-a277-e21e5164d067-catalog-content\") pod \"a2608989-3d54-4714-a277-e21e5164d067\" (UID: \"a2608989-3d54-4714-a277-e21e5164d067\") " Mar 19 10:10:08 crc kubenswrapper[4835]: I0319 10:10:08.510439 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2608989-3d54-4714-a277-e21e5164d067-utilities" (OuterVolumeSpecName: "utilities") pod "a2608989-3d54-4714-a277-e21e5164d067" (UID: "a2608989-3d54-4714-a277-e21e5164d067"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:10:08 crc kubenswrapper[4835]: I0319 10:10:08.515978 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2608989-3d54-4714-a277-e21e5164d067-kube-api-access-jdvnp" (OuterVolumeSpecName: "kube-api-access-jdvnp") pod "a2608989-3d54-4714-a277-e21e5164d067" (UID: "a2608989-3d54-4714-a277-e21e5164d067"). InnerVolumeSpecName "kube-api-access-jdvnp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:10:08 crc kubenswrapper[4835]: I0319 10:10:08.614031 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2608989-3d54-4714-a277-e21e5164d067-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 10:10:08 crc kubenswrapper[4835]: I0319 10:10:08.614068 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdvnp\" (UniqueName: \"kubernetes.io/projected/a2608989-3d54-4714-a277-e21e5164d067-kube-api-access-jdvnp\") on node \"crc\" DevicePath \"\"" Mar 19 10:10:08 crc kubenswrapper[4835]: I0319 10:10:08.642451 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2608989-3d54-4714-a277-e21e5164d067-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a2608989-3d54-4714-a277-e21e5164d067" (UID: "a2608989-3d54-4714-a277-e21e5164d067"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:10:08 crc kubenswrapper[4835]: I0319 10:10:08.716476 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2608989-3d54-4714-a277-e21e5164d067-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 10:10:08 crc kubenswrapper[4835]: I0319 10:10:08.807927 4835 generic.go:334] "Generic (PLEG): container finished" podID="a2608989-3d54-4714-a277-e21e5164d067" containerID="92ed4400c014eca64687eebd582c2ba31290c277debeaad82ec5dad1e7a5c189" exitCode=0 Mar 19 10:10:08 crc kubenswrapper[4835]: I0319 10:10:08.807972 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nhwjp" event={"ID":"a2608989-3d54-4714-a277-e21e5164d067","Type":"ContainerDied","Data":"92ed4400c014eca64687eebd582c2ba31290c277debeaad82ec5dad1e7a5c189"} Mar 19 10:10:08 crc kubenswrapper[4835]: I0319 10:10:08.807998 4835 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-nhwjp" event={"ID":"a2608989-3d54-4714-a277-e21e5164d067","Type":"ContainerDied","Data":"75510e29e8f4aef10d4780ed041019ab30528e7fe57bbaaac9ae838c2842d329"} Mar 19 10:10:08 crc kubenswrapper[4835]: I0319 10:10:08.808017 4835 scope.go:117] "RemoveContainer" containerID="92ed4400c014eca64687eebd582c2ba31290c277debeaad82ec5dad1e7a5c189" Mar 19 10:10:08 crc kubenswrapper[4835]: I0319 10:10:08.808156 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nhwjp" Mar 19 10:10:08 crc kubenswrapper[4835]: I0319 10:10:08.857847 4835 scope.go:117] "RemoveContainer" containerID="db4506cd525fe12cbdfa39f02924598602e85650bef45151fe785f62ab7ec6ff" Mar 19 10:10:08 crc kubenswrapper[4835]: I0319 10:10:08.886797 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nhwjp"] Mar 19 10:10:08 crc kubenswrapper[4835]: I0319 10:10:08.950904 4835 scope.go:117] "RemoveContainer" containerID="0fba17163ebb5c6090d9e7c71a1975e268474e71d18bd93148d2385dcb414fe2" Mar 19 10:10:08 crc kubenswrapper[4835]: I0319 10:10:08.972084 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nhwjp"] Mar 19 10:10:09 crc kubenswrapper[4835]: I0319 10:10:09.051891 4835 scope.go:117] "RemoveContainer" containerID="92ed4400c014eca64687eebd582c2ba31290c277debeaad82ec5dad1e7a5c189" Mar 19 10:10:09 crc kubenswrapper[4835]: E0319 10:10:09.052320 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92ed4400c014eca64687eebd582c2ba31290c277debeaad82ec5dad1e7a5c189\": container with ID starting with 92ed4400c014eca64687eebd582c2ba31290c277debeaad82ec5dad1e7a5c189 not found: ID does not exist" containerID="92ed4400c014eca64687eebd582c2ba31290c277debeaad82ec5dad1e7a5c189" Mar 19 10:10:09 crc kubenswrapper[4835]: I0319 10:10:09.052346 4835 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92ed4400c014eca64687eebd582c2ba31290c277debeaad82ec5dad1e7a5c189"} err="failed to get container status \"92ed4400c014eca64687eebd582c2ba31290c277debeaad82ec5dad1e7a5c189\": rpc error: code = NotFound desc = could not find container \"92ed4400c014eca64687eebd582c2ba31290c277debeaad82ec5dad1e7a5c189\": container with ID starting with 92ed4400c014eca64687eebd582c2ba31290c277debeaad82ec5dad1e7a5c189 not found: ID does not exist" Mar 19 10:10:09 crc kubenswrapper[4835]: I0319 10:10:09.052376 4835 scope.go:117] "RemoveContainer" containerID="db4506cd525fe12cbdfa39f02924598602e85650bef45151fe785f62ab7ec6ff" Mar 19 10:10:09 crc kubenswrapper[4835]: E0319 10:10:09.052541 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db4506cd525fe12cbdfa39f02924598602e85650bef45151fe785f62ab7ec6ff\": container with ID starting with db4506cd525fe12cbdfa39f02924598602e85650bef45151fe785f62ab7ec6ff not found: ID does not exist" containerID="db4506cd525fe12cbdfa39f02924598602e85650bef45151fe785f62ab7ec6ff" Mar 19 10:10:09 crc kubenswrapper[4835]: I0319 10:10:09.052558 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db4506cd525fe12cbdfa39f02924598602e85650bef45151fe785f62ab7ec6ff"} err="failed to get container status \"db4506cd525fe12cbdfa39f02924598602e85650bef45151fe785f62ab7ec6ff\": rpc error: code = NotFound desc = could not find container \"db4506cd525fe12cbdfa39f02924598602e85650bef45151fe785f62ab7ec6ff\": container with ID starting with db4506cd525fe12cbdfa39f02924598602e85650bef45151fe785f62ab7ec6ff not found: ID does not exist" Mar 19 10:10:09 crc kubenswrapper[4835]: I0319 10:10:09.052570 4835 scope.go:117] "RemoveContainer" containerID="0fba17163ebb5c6090d9e7c71a1975e268474e71d18bd93148d2385dcb414fe2" Mar 19 10:10:09 crc kubenswrapper[4835]: E0319 
10:10:09.052731 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fba17163ebb5c6090d9e7c71a1975e268474e71d18bd93148d2385dcb414fe2\": container with ID starting with 0fba17163ebb5c6090d9e7c71a1975e268474e71d18bd93148d2385dcb414fe2 not found: ID does not exist" containerID="0fba17163ebb5c6090d9e7c71a1975e268474e71d18bd93148d2385dcb414fe2" Mar 19 10:10:09 crc kubenswrapper[4835]: I0319 10:10:09.052769 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fba17163ebb5c6090d9e7c71a1975e268474e71d18bd93148d2385dcb414fe2"} err="failed to get container status \"0fba17163ebb5c6090d9e7c71a1975e268474e71d18bd93148d2385dcb414fe2\": rpc error: code = NotFound desc = could not find container \"0fba17163ebb5c6090d9e7c71a1975e268474e71d18bd93148d2385dcb414fe2\": container with ID starting with 0fba17163ebb5c6090d9e7c71a1975e268474e71d18bd93148d2385dcb414fe2 not found: ID does not exist" Mar 19 10:10:10 crc kubenswrapper[4835]: I0319 10:10:10.415002 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2608989-3d54-4714-a277-e21e5164d067" path="/var/lib/kubelet/pods/a2608989-3d54-4714-a277-e21e5164d067/volumes" Mar 19 10:10:25 crc kubenswrapper[4835]: I0319 10:10:25.998249 4835 generic.go:334] "Generic (PLEG): container finished" podID="316e9619-05d7-4845-84e5-78700d3318d9" containerID="cd07d3d6194c9506c998ac88f652f97d4e0372c7f3ab5d0c9cb8f4d37f8f8e7a" exitCode=0 Mar 19 10:10:25 crc kubenswrapper[4835]: I0319 10:10:25.998358 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" event={"ID":"316e9619-05d7-4845-84e5-78700d3318d9","Type":"ContainerDied","Data":"cd07d3d6194c9506c998ac88f652f97d4e0372c7f3ab5d0c9cb8f4d37f8f8e7a"} Mar 19 10:10:27 crc kubenswrapper[4835]: I0319 10:10:27.526708 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:10:27 crc kubenswrapper[4835]: I0319 10:10:27.626104 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-cell1-compute-config-0\") pod \"316e9619-05d7-4845-84e5-78700d3318d9\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " Mar 19 10:10:27 crc kubenswrapper[4835]: I0319 10:10:27.626220 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-combined-ca-bundle\") pod \"316e9619-05d7-4845-84e5-78700d3318d9\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " Mar 19 10:10:27 crc kubenswrapper[4835]: I0319 10:10:27.626277 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-cell1-compute-config-1\") pod \"316e9619-05d7-4845-84e5-78700d3318d9\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " Mar 19 10:10:27 crc kubenswrapper[4835]: I0319 10:10:27.626298 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-ssh-key-openstack-edpm-ipam\") pod \"316e9619-05d7-4845-84e5-78700d3318d9\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " Mar 19 10:10:27 crc kubenswrapper[4835]: I0319 10:10:27.626329 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-migration-ssh-key-0\") pod \"316e9619-05d7-4845-84e5-78700d3318d9\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " Mar 19 10:10:27 crc kubenswrapper[4835]: 
I0319 10:10:27.626407 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-inventory\") pod \"316e9619-05d7-4845-84e5-78700d3318d9\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " Mar 19 10:10:27 crc kubenswrapper[4835]: I0319 10:10:27.626452 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6q9rn\" (UniqueName: \"kubernetes.io/projected/316e9619-05d7-4845-84e5-78700d3318d9-kube-api-access-6q9rn\") pod \"316e9619-05d7-4845-84e5-78700d3318d9\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " Mar 19 10:10:27 crc kubenswrapper[4835]: I0319 10:10:27.626475 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/316e9619-05d7-4845-84e5-78700d3318d9-nova-extra-config-0\") pod \"316e9619-05d7-4845-84e5-78700d3318d9\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " Mar 19 10:10:27 crc kubenswrapper[4835]: I0319 10:10:27.626510 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-cell1-compute-config-2\") pod \"316e9619-05d7-4845-84e5-78700d3318d9\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " Mar 19 10:10:27 crc kubenswrapper[4835]: I0319 10:10:27.626552 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-cell1-compute-config-3\") pod \"316e9619-05d7-4845-84e5-78700d3318d9\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " Mar 19 10:10:27 crc kubenswrapper[4835]: I0319 10:10:27.626574 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-migration-ssh-key-1\") pod \"316e9619-05d7-4845-84e5-78700d3318d9\" (UID: \"316e9619-05d7-4845-84e5-78700d3318d9\") " Mar 19 10:10:27 crc kubenswrapper[4835]: I0319 10:10:27.631790 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "316e9619-05d7-4845-84e5-78700d3318d9" (UID: "316e9619-05d7-4845-84e5-78700d3318d9"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:10:27 crc kubenswrapper[4835]: I0319 10:10:27.661861 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/316e9619-05d7-4845-84e5-78700d3318d9-kube-api-access-6q9rn" (OuterVolumeSpecName: "kube-api-access-6q9rn") pod "316e9619-05d7-4845-84e5-78700d3318d9" (UID: "316e9619-05d7-4845-84e5-78700d3318d9"). InnerVolumeSpecName "kube-api-access-6q9rn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:10:27 crc kubenswrapper[4835]: I0319 10:10:27.665486 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "316e9619-05d7-4845-84e5-78700d3318d9" (UID: "316e9619-05d7-4845-84e5-78700d3318d9"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:10:27 crc kubenswrapper[4835]: I0319 10:10:27.671695 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "316e9619-05d7-4845-84e5-78700d3318d9" (UID: "316e9619-05d7-4845-84e5-78700d3318d9"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:10:27 crc kubenswrapper[4835]: I0319 10:10:27.673979 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "316e9619-05d7-4845-84e5-78700d3318d9" (UID: "316e9619-05d7-4845-84e5-78700d3318d9"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:10:27 crc kubenswrapper[4835]: I0319 10:10:27.674971 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "316e9619-05d7-4845-84e5-78700d3318d9" (UID: "316e9619-05d7-4845-84e5-78700d3318d9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:10:27 crc kubenswrapper[4835]: I0319 10:10:27.677551 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-inventory" (OuterVolumeSpecName: "inventory") pod "316e9619-05d7-4845-84e5-78700d3318d9" (UID: "316e9619-05d7-4845-84e5-78700d3318d9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:10:27 crc kubenswrapper[4835]: I0319 10:10:27.681142 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/316e9619-05d7-4845-84e5-78700d3318d9-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "316e9619-05d7-4845-84e5-78700d3318d9" (UID: "316e9619-05d7-4845-84e5-78700d3318d9"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:10:27 crc kubenswrapper[4835]: I0319 10:10:27.690895 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "316e9619-05d7-4845-84e5-78700d3318d9" (UID: "316e9619-05d7-4845-84e5-78700d3318d9"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:10:27 crc kubenswrapper[4835]: I0319 10:10:27.695133 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "316e9619-05d7-4845-84e5-78700d3318d9" (UID: "316e9619-05d7-4845-84e5-78700d3318d9"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:10:27 crc kubenswrapper[4835]: I0319 10:10:27.700231 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "316e9619-05d7-4845-84e5-78700d3318d9" (UID: "316e9619-05d7-4845-84e5-78700d3318d9"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:10:27 crc kubenswrapper[4835]: I0319 10:10:27.730616 4835 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:10:27 crc kubenswrapper[4835]: I0319 10:10:27.730663 4835 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 19 10:10:27 crc kubenswrapper[4835]: I0319 10:10:27.730677 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 10:10:27 crc kubenswrapper[4835]: I0319 10:10:27.730689 4835 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 19 10:10:27 crc kubenswrapper[4835]: I0319 10:10:27.730703 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 10:10:27 crc kubenswrapper[4835]: I0319 10:10:27.730717 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6q9rn\" (UniqueName: \"kubernetes.io/projected/316e9619-05d7-4845-84e5-78700d3318d9-kube-api-access-6q9rn\") on node \"crc\" DevicePath \"\"" Mar 19 10:10:27 crc kubenswrapper[4835]: I0319 10:10:27.730729 4835 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/316e9619-05d7-4845-84e5-78700d3318d9-nova-extra-config-0\") on node 
\"crc\" DevicePath \"\"" Mar 19 10:10:27 crc kubenswrapper[4835]: I0319 10:10:27.730757 4835 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 19 10:10:27 crc kubenswrapper[4835]: I0319 10:10:27.730773 4835 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 19 10:10:27 crc kubenswrapper[4835]: I0319 10:10:27.730787 4835 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 19 10:10:27 crc kubenswrapper[4835]: I0319 10:10:27.730799 4835 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/316e9619-05d7-4845-84e5-78700d3318d9-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.031058 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" event={"ID":"316e9619-05d7-4845-84e5-78700d3318d9","Type":"ContainerDied","Data":"ea5f3cd558919a514dfe534cceed7cb720496975b6dff3b4997c77e3e6bb64db"} Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.031108 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea5f3cd558919a514dfe534cceed7cb720496975b6dff3b4997c77e3e6bb64db" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.031069 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gw97t" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.152863 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm"] Mar 19 10:10:28 crc kubenswrapper[4835]: E0319 10:10:28.153421 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2608989-3d54-4714-a277-e21e5164d067" containerName="extract-utilities" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.153439 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2608989-3d54-4714-a277-e21e5164d067" containerName="extract-utilities" Mar 19 10:10:28 crc kubenswrapper[4835]: E0319 10:10:28.153454 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2608989-3d54-4714-a277-e21e5164d067" containerName="extract-content" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.153463 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2608989-3d54-4714-a277-e21e5164d067" containerName="extract-content" Mar 19 10:10:28 crc kubenswrapper[4835]: E0319 10:10:28.153496 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="316e9619-05d7-4845-84e5-78700d3318d9" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.153504 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="316e9619-05d7-4845-84e5-78700d3318d9" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 19 10:10:28 crc kubenswrapper[4835]: E0319 10:10:28.153532 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2608989-3d54-4714-a277-e21e5164d067" containerName="registry-server" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.153538 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2608989-3d54-4714-a277-e21e5164d067" containerName="registry-server" Mar 19 10:10:28 crc kubenswrapper[4835]: E0319 10:10:28.153547 4835 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4232fb3-0e4c-40a6-b0c4-ac32dea524d8" containerName="oc" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.153553 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4232fb3-0e4c-40a6-b0c4-ac32dea524d8" containerName="oc" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.153817 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2608989-3d54-4714-a277-e21e5164d067" containerName="registry-server" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.153846 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="316e9619-05d7-4845-84e5-78700d3318d9" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.153877 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4232fb3-0e4c-40a6-b0c4-ac32dea524d8" containerName="oc" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.154980 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.158024 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.158051 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.158286 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ldz2g" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.160614 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.160804 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.168963 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm"] Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.243091 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm\" (UID: \"f507ac6e-c9ef-4e19-a67c-f5358fe950b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.243143 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8qpn\" (UniqueName: \"kubernetes.io/projected/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-kube-api-access-r8qpn\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm\" (UID: \"f507ac6e-c9ef-4e19-a67c-f5358fe950b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.243167 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm\" (UID: \"f507ac6e-c9ef-4e19-a67c-f5358fe950b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.243373 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm\" (UID: \"f507ac6e-c9ef-4e19-a67c-f5358fe950b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.243510 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm\" (UID: \"f507ac6e-c9ef-4e19-a67c-f5358fe950b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.243553 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm\" (UID: \"f507ac6e-c9ef-4e19-a67c-f5358fe950b2\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.243829 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm\" (UID: \"f507ac6e-c9ef-4e19-a67c-f5358fe950b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.347503 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm\" (UID: \"f507ac6e-c9ef-4e19-a67c-f5358fe950b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.347994 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm\" (UID: \"f507ac6e-c9ef-4e19-a67c-f5358fe950b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.348023 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8qpn\" (UniqueName: \"kubernetes.io/projected/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-kube-api-access-r8qpn\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm\" (UID: \"f507ac6e-c9ef-4e19-a67c-f5358fe950b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 
10:10:28.348047 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm\" (UID: \"f507ac6e-c9ef-4e19-a67c-f5358fe950b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.348091 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm\" (UID: \"f507ac6e-c9ef-4e19-a67c-f5358fe950b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.348131 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm\" (UID: \"f507ac6e-c9ef-4e19-a67c-f5358fe950b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.348154 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm\" (UID: \"f507ac6e-c9ef-4e19-a67c-f5358fe950b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.352223 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: 
\"kubernetes.io/secret/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm\" (UID: \"f507ac6e-c9ef-4e19-a67c-f5358fe950b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.352221 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm\" (UID: \"f507ac6e-c9ef-4e19-a67c-f5358fe950b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.352317 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm\" (UID: \"f507ac6e-c9ef-4e19-a67c-f5358fe950b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.352855 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm\" (UID: \"f507ac6e-c9ef-4e19-a67c-f5358fe950b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.353269 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm\" (UID: \"f507ac6e-c9ef-4e19-a67c-f5358fe950b2\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.355370 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm\" (UID: \"f507ac6e-c9ef-4e19-a67c-f5358fe950b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.366597 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8qpn\" (UniqueName: \"kubernetes.io/projected/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-kube-api-access-r8qpn\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm\" (UID: \"f507ac6e-c9ef-4e19-a67c-f5358fe950b2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm" Mar 19 10:10:28 crc kubenswrapper[4835]: I0319 10:10:28.521141 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm" Mar 19 10:10:29 crc kubenswrapper[4835]: I0319 10:10:29.104844 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm"] Mar 19 10:10:30 crc kubenswrapper[4835]: I0319 10:10:30.058489 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm" event={"ID":"f507ac6e-c9ef-4e19-a67c-f5358fe950b2","Type":"ContainerStarted","Data":"3b9b5c3e48c5935cc8b32cd9154befc7bb10bc88f398e4972cb20524533eabd8"} Mar 19 10:10:31 crc kubenswrapper[4835]: I0319 10:10:31.084317 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm" event={"ID":"f507ac6e-c9ef-4e19-a67c-f5358fe950b2","Type":"ContainerStarted","Data":"2964e3268e33f98812223c79483154857b413dee21039b57c95a323fdb37dff8"} Mar 19 10:10:31 crc kubenswrapper[4835]: I0319 10:10:31.110337 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm" podStartSLOduration=2.415098778 podStartE2EDuration="3.110316715s" podCreationTimestamp="2026-03-19 10:10:28 +0000 UTC" firstStartedPulling="2026-03-19 10:10:29.104731612 +0000 UTC m=+2883.953330199" lastFinishedPulling="2026-03-19 10:10:29.799949549 +0000 UTC m=+2884.648548136" observedRunningTime="2026-03-19 10:10:31.102637519 +0000 UTC m=+2885.951236106" watchObservedRunningTime="2026-03-19 10:10:31.110316715 +0000 UTC m=+2885.958915312" Mar 19 10:10:56 crc kubenswrapper[4835]: I0319 10:10:56.892135 4835 scope.go:117] "RemoveContainer" containerID="9c62d5d999c5b3fe94e59b31a99f6fed50cf1478bbda4cd28024838af8868935" Mar 19 10:11:36 crc kubenswrapper[4835]: I0319 10:11:36.421644 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:11:36 crc kubenswrapper[4835]: I0319 10:11:36.422087 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:12:00 crc kubenswrapper[4835]: I0319 10:12:00.146973 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565252-jnz9j"] Mar 19 10:12:00 crc kubenswrapper[4835]: I0319 10:12:00.149915 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565252-jnz9j" Mar 19 10:12:00 crc kubenswrapper[4835]: I0319 10:12:00.154282 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 10:12:00 crc kubenswrapper[4835]: I0319 10:12:00.154540 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:12:00 crc kubenswrapper[4835]: I0319 10:12:00.155242 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:12:00 crc kubenswrapper[4835]: I0319 10:12:00.217449 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565252-jnz9j"] Mar 19 10:12:00 crc kubenswrapper[4835]: I0319 10:12:00.244753 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmrgb\" (UniqueName: \"kubernetes.io/projected/0d33c0f9-aaab-4564-b0b1-590ff7356f4e-kube-api-access-lmrgb\") pod \"auto-csr-approver-29565252-jnz9j\" (UID: \"0d33c0f9-aaab-4564-b0b1-590ff7356f4e\") " 
pod="openshift-infra/auto-csr-approver-29565252-jnz9j" Mar 19 10:12:00 crc kubenswrapper[4835]: I0319 10:12:00.346417 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmrgb\" (UniqueName: \"kubernetes.io/projected/0d33c0f9-aaab-4564-b0b1-590ff7356f4e-kube-api-access-lmrgb\") pod \"auto-csr-approver-29565252-jnz9j\" (UID: \"0d33c0f9-aaab-4564-b0b1-590ff7356f4e\") " pod="openshift-infra/auto-csr-approver-29565252-jnz9j" Mar 19 10:12:00 crc kubenswrapper[4835]: I0319 10:12:00.367349 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmrgb\" (UniqueName: \"kubernetes.io/projected/0d33c0f9-aaab-4564-b0b1-590ff7356f4e-kube-api-access-lmrgb\") pod \"auto-csr-approver-29565252-jnz9j\" (UID: \"0d33c0f9-aaab-4564-b0b1-590ff7356f4e\") " pod="openshift-infra/auto-csr-approver-29565252-jnz9j" Mar 19 10:12:00 crc kubenswrapper[4835]: I0319 10:12:00.492267 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565252-jnz9j" Mar 19 10:12:00 crc kubenswrapper[4835]: I0319 10:12:00.987613 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565252-jnz9j"] Mar 19 10:12:01 crc kubenswrapper[4835]: I0319 10:12:01.231721 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565252-jnz9j" event={"ID":"0d33c0f9-aaab-4564-b0b1-590ff7356f4e","Type":"ContainerStarted","Data":"2cbf5a2f06d8cda78ff66e83f91c85f667ceac7912f4fc859e9b7cbc36e6b0fd"} Mar 19 10:12:03 crc kubenswrapper[4835]: I0319 10:12:03.288948 4835 generic.go:334] "Generic (PLEG): container finished" podID="0d33c0f9-aaab-4564-b0b1-590ff7356f4e" containerID="9a652076a279cfd6ef336ab1ef072fd15cb5ef044db42634bebc4e302b7e537b" exitCode=0 Mar 19 10:12:03 crc kubenswrapper[4835]: I0319 10:12:03.289547 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29565252-jnz9j" event={"ID":"0d33c0f9-aaab-4564-b0b1-590ff7356f4e","Type":"ContainerDied","Data":"9a652076a279cfd6ef336ab1ef072fd15cb5ef044db42634bebc4e302b7e537b"} Mar 19 10:12:04 crc kubenswrapper[4835]: I0319 10:12:04.677593 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565252-jnz9j" Mar 19 10:12:04 crc kubenswrapper[4835]: I0319 10:12:04.789594 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmrgb\" (UniqueName: \"kubernetes.io/projected/0d33c0f9-aaab-4564-b0b1-590ff7356f4e-kube-api-access-lmrgb\") pod \"0d33c0f9-aaab-4564-b0b1-590ff7356f4e\" (UID: \"0d33c0f9-aaab-4564-b0b1-590ff7356f4e\") " Mar 19 10:12:04 crc kubenswrapper[4835]: I0319 10:12:04.796294 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d33c0f9-aaab-4564-b0b1-590ff7356f4e-kube-api-access-lmrgb" (OuterVolumeSpecName: "kube-api-access-lmrgb") pod "0d33c0f9-aaab-4564-b0b1-590ff7356f4e" (UID: "0d33c0f9-aaab-4564-b0b1-590ff7356f4e"). InnerVolumeSpecName "kube-api-access-lmrgb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:12:04 crc kubenswrapper[4835]: I0319 10:12:04.892983 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmrgb\" (UniqueName: \"kubernetes.io/projected/0d33c0f9-aaab-4564-b0b1-590ff7356f4e-kube-api-access-lmrgb\") on node \"crc\" DevicePath \"\"" Mar 19 10:12:05 crc kubenswrapper[4835]: I0319 10:12:05.313649 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565252-jnz9j" event={"ID":"0d33c0f9-aaab-4564-b0b1-590ff7356f4e","Type":"ContainerDied","Data":"2cbf5a2f06d8cda78ff66e83f91c85f667ceac7912f4fc859e9b7cbc36e6b0fd"} Mar 19 10:12:05 crc kubenswrapper[4835]: I0319 10:12:05.313698 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cbf5a2f06d8cda78ff66e83f91c85f667ceac7912f4fc859e9b7cbc36e6b0fd" Mar 19 10:12:05 crc kubenswrapper[4835]: I0319 10:12:05.313778 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565252-jnz9j" Mar 19 10:12:05 crc kubenswrapper[4835]: I0319 10:12:05.763951 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565246-sw968"] Mar 19 10:12:05 crc kubenswrapper[4835]: I0319 10:12:05.775205 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565246-sw968"] Mar 19 10:12:06 crc kubenswrapper[4835]: I0319 10:12:06.417421 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ba8c157-3eaa-4cc4-8e66-9cdab565c653" path="/var/lib/kubelet/pods/1ba8c157-3eaa-4cc4-8e66-9cdab565c653/volumes" Mar 19 10:12:06 crc kubenswrapper[4835]: I0319 10:12:06.421850 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 19 10:12:06 crc kubenswrapper[4835]: I0319 10:12:06.422060 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:12:11 crc kubenswrapper[4835]: I0319 10:12:11.275631 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x2tsn"] Mar 19 10:12:11 crc kubenswrapper[4835]: E0319 10:12:11.276600 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d33c0f9-aaab-4564-b0b1-590ff7356f4e" containerName="oc" Mar 19 10:12:11 crc kubenswrapper[4835]: I0319 10:12:11.276612 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d33c0f9-aaab-4564-b0b1-590ff7356f4e" containerName="oc" Mar 19 10:12:11 crc kubenswrapper[4835]: I0319 10:12:11.276867 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d33c0f9-aaab-4564-b0b1-590ff7356f4e" containerName="oc" Mar 19 10:12:11 crc kubenswrapper[4835]: I0319 10:12:11.278802 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2tsn" Mar 19 10:12:11 crc kubenswrapper[4835]: I0319 10:12:11.309115 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2tsn"] Mar 19 10:12:11 crc kubenswrapper[4835]: I0319 10:12:11.358388 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spxfw\" (UniqueName: \"kubernetes.io/projected/1db004d2-6ed5-49ec-b9ba-e65560c27e1b-kube-api-access-spxfw\") pod \"redhat-marketplace-x2tsn\" (UID: \"1db004d2-6ed5-49ec-b9ba-e65560c27e1b\") " pod="openshift-marketplace/redhat-marketplace-x2tsn" Mar 19 10:12:11 crc kubenswrapper[4835]: I0319 10:12:11.358656 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1db004d2-6ed5-49ec-b9ba-e65560c27e1b-catalog-content\") pod \"redhat-marketplace-x2tsn\" (UID: \"1db004d2-6ed5-49ec-b9ba-e65560c27e1b\") " pod="openshift-marketplace/redhat-marketplace-x2tsn" Mar 19 10:12:11 crc kubenswrapper[4835]: I0319 10:12:11.358845 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1db004d2-6ed5-49ec-b9ba-e65560c27e1b-utilities\") pod \"redhat-marketplace-x2tsn\" (UID: \"1db004d2-6ed5-49ec-b9ba-e65560c27e1b\") " pod="openshift-marketplace/redhat-marketplace-x2tsn" Mar 19 10:12:11 crc kubenswrapper[4835]: I0319 10:12:11.459846 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1db004d2-6ed5-49ec-b9ba-e65560c27e1b-catalog-content\") pod \"redhat-marketplace-x2tsn\" (UID: \"1db004d2-6ed5-49ec-b9ba-e65560c27e1b\") " pod="openshift-marketplace/redhat-marketplace-x2tsn" Mar 19 10:12:11 crc kubenswrapper[4835]: I0319 10:12:11.460310 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1db004d2-6ed5-49ec-b9ba-e65560c27e1b-utilities\") pod \"redhat-marketplace-x2tsn\" (UID: \"1db004d2-6ed5-49ec-b9ba-e65560c27e1b\") " pod="openshift-marketplace/redhat-marketplace-x2tsn" Mar 19 10:12:11 crc kubenswrapper[4835]: I0319 10:12:11.460340 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spxfw\" (UniqueName: \"kubernetes.io/projected/1db004d2-6ed5-49ec-b9ba-e65560c27e1b-kube-api-access-spxfw\") pod \"redhat-marketplace-x2tsn\" (UID: \"1db004d2-6ed5-49ec-b9ba-e65560c27e1b\") " pod="openshift-marketplace/redhat-marketplace-x2tsn" Mar 19 10:12:11 crc kubenswrapper[4835]: I0319 10:12:11.460372 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1db004d2-6ed5-49ec-b9ba-e65560c27e1b-catalog-content\") pod \"redhat-marketplace-x2tsn\" (UID: \"1db004d2-6ed5-49ec-b9ba-e65560c27e1b\") " pod="openshift-marketplace/redhat-marketplace-x2tsn" Mar 19 10:12:11 crc kubenswrapper[4835]: I0319 10:12:11.461015 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1db004d2-6ed5-49ec-b9ba-e65560c27e1b-utilities\") pod \"redhat-marketplace-x2tsn\" (UID: \"1db004d2-6ed5-49ec-b9ba-e65560c27e1b\") " pod="openshift-marketplace/redhat-marketplace-x2tsn" Mar 19 10:12:11 crc kubenswrapper[4835]: I0319 10:12:11.484715 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spxfw\" (UniqueName: \"kubernetes.io/projected/1db004d2-6ed5-49ec-b9ba-e65560c27e1b-kube-api-access-spxfw\") pod \"redhat-marketplace-x2tsn\" (UID: \"1db004d2-6ed5-49ec-b9ba-e65560c27e1b\") " pod="openshift-marketplace/redhat-marketplace-x2tsn" Mar 19 10:12:11 crc kubenswrapper[4835]: I0319 10:12:11.610040 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2tsn" Mar 19 10:12:12 crc kubenswrapper[4835]: I0319 10:12:12.106666 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2tsn"] Mar 19 10:12:12 crc kubenswrapper[4835]: I0319 10:12:12.380622 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2tsn" event={"ID":"1db004d2-6ed5-49ec-b9ba-e65560c27e1b","Type":"ContainerStarted","Data":"dd6db63dcba07aa172e0e9cda1a897d717a1f128eb0675ff71f7710ef6603c52"} Mar 19 10:12:12 crc kubenswrapper[4835]: I0319 10:12:12.380885 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2tsn" event={"ID":"1db004d2-6ed5-49ec-b9ba-e65560c27e1b","Type":"ContainerStarted","Data":"39793baf530ce3612751ea7c27a6274b072ba790e69dfaa167b9fe765b0f263e"} Mar 19 10:12:13 crc kubenswrapper[4835]: I0319 10:12:13.395774 4835 generic.go:334] "Generic (PLEG): container finished" podID="1db004d2-6ed5-49ec-b9ba-e65560c27e1b" containerID="dd6db63dcba07aa172e0e9cda1a897d717a1f128eb0675ff71f7710ef6603c52" exitCode=0 Mar 19 10:12:13 crc kubenswrapper[4835]: I0319 10:12:13.395874 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2tsn" event={"ID":"1db004d2-6ed5-49ec-b9ba-e65560c27e1b","Type":"ContainerDied","Data":"dd6db63dcba07aa172e0e9cda1a897d717a1f128eb0675ff71f7710ef6603c52"} Mar 19 10:12:13 crc kubenswrapper[4835]: I0319 10:12:13.396179 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2tsn" event={"ID":"1db004d2-6ed5-49ec-b9ba-e65560c27e1b","Type":"ContainerStarted","Data":"b2fe4995be9af1c1dd8722fcd5dad6bce80b857ea335e2c2c684b85240592905"} Mar 19 10:12:14 crc kubenswrapper[4835]: I0319 10:12:14.407437 4835 generic.go:334] "Generic (PLEG): container finished" podID="1db004d2-6ed5-49ec-b9ba-e65560c27e1b" 
containerID="b2fe4995be9af1c1dd8722fcd5dad6bce80b857ea335e2c2c684b85240592905" exitCode=0 Mar 19 10:12:14 crc kubenswrapper[4835]: I0319 10:12:14.417213 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2tsn" event={"ID":"1db004d2-6ed5-49ec-b9ba-e65560c27e1b","Type":"ContainerDied","Data":"b2fe4995be9af1c1dd8722fcd5dad6bce80b857ea335e2c2c684b85240592905"} Mar 19 10:12:14 crc kubenswrapper[4835]: I0319 10:12:14.481934 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6mgg7"] Mar 19 10:12:14 crc kubenswrapper[4835]: I0319 10:12:14.485278 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6mgg7" Mar 19 10:12:14 crc kubenswrapper[4835]: I0319 10:12:14.498840 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6mgg7"] Mar 19 10:12:14 crc kubenswrapper[4835]: I0319 10:12:14.541116 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8f17f81-d3ac-4c40-b346-c3eac9cc70d2-utilities\") pod \"community-operators-6mgg7\" (UID: \"e8f17f81-d3ac-4c40-b346-c3eac9cc70d2\") " pod="openshift-marketplace/community-operators-6mgg7" Mar 19 10:12:14 crc kubenswrapper[4835]: I0319 10:12:14.541177 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4v2m\" (UniqueName: \"kubernetes.io/projected/e8f17f81-d3ac-4c40-b346-c3eac9cc70d2-kube-api-access-q4v2m\") pod \"community-operators-6mgg7\" (UID: \"e8f17f81-d3ac-4c40-b346-c3eac9cc70d2\") " pod="openshift-marketplace/community-operators-6mgg7" Mar 19 10:12:14 crc kubenswrapper[4835]: I0319 10:12:14.541507 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e8f17f81-d3ac-4c40-b346-c3eac9cc70d2-catalog-content\") pod \"community-operators-6mgg7\" (UID: \"e8f17f81-d3ac-4c40-b346-c3eac9cc70d2\") " pod="openshift-marketplace/community-operators-6mgg7" Mar 19 10:12:14 crc kubenswrapper[4835]: I0319 10:12:14.644220 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8f17f81-d3ac-4c40-b346-c3eac9cc70d2-utilities\") pod \"community-operators-6mgg7\" (UID: \"e8f17f81-d3ac-4c40-b346-c3eac9cc70d2\") " pod="openshift-marketplace/community-operators-6mgg7" Mar 19 10:12:14 crc kubenswrapper[4835]: I0319 10:12:14.644281 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4v2m\" (UniqueName: \"kubernetes.io/projected/e8f17f81-d3ac-4c40-b346-c3eac9cc70d2-kube-api-access-q4v2m\") pod \"community-operators-6mgg7\" (UID: \"e8f17f81-d3ac-4c40-b346-c3eac9cc70d2\") " pod="openshift-marketplace/community-operators-6mgg7" Mar 19 10:12:14 crc kubenswrapper[4835]: I0319 10:12:14.644380 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8f17f81-d3ac-4c40-b346-c3eac9cc70d2-catalog-content\") pod \"community-operators-6mgg7\" (UID: \"e8f17f81-d3ac-4c40-b346-c3eac9cc70d2\") " pod="openshift-marketplace/community-operators-6mgg7" Mar 19 10:12:14 crc kubenswrapper[4835]: I0319 10:12:14.645084 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8f17f81-d3ac-4c40-b346-c3eac9cc70d2-catalog-content\") pod \"community-operators-6mgg7\" (UID: \"e8f17f81-d3ac-4c40-b346-c3eac9cc70d2\") " pod="openshift-marketplace/community-operators-6mgg7" Mar 19 10:12:14 crc kubenswrapper[4835]: I0319 10:12:14.645126 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e8f17f81-d3ac-4c40-b346-c3eac9cc70d2-utilities\") pod \"community-operators-6mgg7\" (UID: \"e8f17f81-d3ac-4c40-b346-c3eac9cc70d2\") " pod="openshift-marketplace/community-operators-6mgg7" Mar 19 10:12:14 crc kubenswrapper[4835]: I0319 10:12:14.662557 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4v2m\" (UniqueName: \"kubernetes.io/projected/e8f17f81-d3ac-4c40-b346-c3eac9cc70d2-kube-api-access-q4v2m\") pod \"community-operators-6mgg7\" (UID: \"e8f17f81-d3ac-4c40-b346-c3eac9cc70d2\") " pod="openshift-marketplace/community-operators-6mgg7" Mar 19 10:12:14 crc kubenswrapper[4835]: I0319 10:12:14.815122 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6mgg7" Mar 19 10:12:15 crc kubenswrapper[4835]: I0319 10:12:15.657140 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6mgg7"] Mar 19 10:12:16 crc kubenswrapper[4835]: I0319 10:12:16.431248 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2tsn" event={"ID":"1db004d2-6ed5-49ec-b9ba-e65560c27e1b","Type":"ContainerStarted","Data":"d71c877691aa38ad68398f8800a2a5d2c9373257098a39b038407cd841e8c98b"} Mar 19 10:12:16 crc kubenswrapper[4835]: I0319 10:12:16.433489 4835 generic.go:334] "Generic (PLEG): container finished" podID="e8f17f81-d3ac-4c40-b346-c3eac9cc70d2" containerID="6107856139d49db459a57814ff93f344a97da0da3bf3b28e03925d08e4ce4b6c" exitCode=0 Mar 19 10:12:16 crc kubenswrapper[4835]: I0319 10:12:16.433522 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6mgg7" event={"ID":"e8f17f81-d3ac-4c40-b346-c3eac9cc70d2","Type":"ContainerDied","Data":"6107856139d49db459a57814ff93f344a97da0da3bf3b28e03925d08e4ce4b6c"} Mar 19 10:12:16 crc kubenswrapper[4835]: I0319 10:12:16.433543 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-6mgg7" event={"ID":"e8f17f81-d3ac-4c40-b346-c3eac9cc70d2","Type":"ContainerStarted","Data":"768162cd5b1101ecd1dcdb932f5bb6143a0aac230e39d10e4713fe1e5eba2e1b"} Mar 19 10:12:16 crc kubenswrapper[4835]: I0319 10:12:16.468355 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x2tsn" podStartSLOduration=2.26074018 podStartE2EDuration="5.468336514s" podCreationTimestamp="2026-03-19 10:12:11 +0000 UTC" firstStartedPulling="2026-03-19 10:12:12.382855652 +0000 UTC m=+2987.231454239" lastFinishedPulling="2026-03-19 10:12:15.590451986 +0000 UTC m=+2990.439050573" observedRunningTime="2026-03-19 10:12:16.459544347 +0000 UTC m=+2991.308142934" watchObservedRunningTime="2026-03-19 10:12:16.468336514 +0000 UTC m=+2991.316935101" Mar 19 10:12:21 crc kubenswrapper[4835]: I0319 10:12:21.610315 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x2tsn" Mar 19 10:12:21 crc kubenswrapper[4835]: I0319 10:12:21.611135 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x2tsn" Mar 19 10:12:21 crc kubenswrapper[4835]: I0319 10:12:21.673734 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x2tsn" Mar 19 10:12:22 crc kubenswrapper[4835]: I0319 10:12:22.518994 4835 generic.go:334] "Generic (PLEG): container finished" podID="e8f17f81-d3ac-4c40-b346-c3eac9cc70d2" containerID="4ee81425764876e6d9864153776ac5dcb15b1c51a680814dfb018e448562bc08" exitCode=0 Mar 19 10:12:22 crc kubenswrapper[4835]: I0319 10:12:22.519113 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6mgg7" event={"ID":"e8f17f81-d3ac-4c40-b346-c3eac9cc70d2","Type":"ContainerDied","Data":"4ee81425764876e6d9864153776ac5dcb15b1c51a680814dfb018e448562bc08"} Mar 19 10:12:22 
crc kubenswrapper[4835]: I0319 10:12:22.573534 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x2tsn" Mar 19 10:12:23 crc kubenswrapper[4835]: I0319 10:12:23.473384 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2tsn"] Mar 19 10:12:24 crc kubenswrapper[4835]: I0319 10:12:24.541077 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x2tsn" podUID="1db004d2-6ed5-49ec-b9ba-e65560c27e1b" containerName="registry-server" containerID="cri-o://d71c877691aa38ad68398f8800a2a5d2c9373257098a39b038407cd841e8c98b" gracePeriod=2 Mar 19 10:12:25 crc kubenswrapper[4835]: I0319 10:12:25.116770 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2tsn" Mar 19 10:12:25 crc kubenswrapper[4835]: I0319 10:12:25.233208 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1db004d2-6ed5-49ec-b9ba-e65560c27e1b-utilities\") pod \"1db004d2-6ed5-49ec-b9ba-e65560c27e1b\" (UID: \"1db004d2-6ed5-49ec-b9ba-e65560c27e1b\") " Mar 19 10:12:25 crc kubenswrapper[4835]: I0319 10:12:25.233336 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1db004d2-6ed5-49ec-b9ba-e65560c27e1b-catalog-content\") pod \"1db004d2-6ed5-49ec-b9ba-e65560c27e1b\" (UID: \"1db004d2-6ed5-49ec-b9ba-e65560c27e1b\") " Mar 19 10:12:25 crc kubenswrapper[4835]: I0319 10:12:25.233596 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spxfw\" (UniqueName: \"kubernetes.io/projected/1db004d2-6ed5-49ec-b9ba-e65560c27e1b-kube-api-access-spxfw\") pod \"1db004d2-6ed5-49ec-b9ba-e65560c27e1b\" (UID: \"1db004d2-6ed5-49ec-b9ba-e65560c27e1b\") " Mar 19 10:12:25 
crc kubenswrapper[4835]: I0319 10:12:25.234141 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1db004d2-6ed5-49ec-b9ba-e65560c27e1b-utilities" (OuterVolumeSpecName: "utilities") pod "1db004d2-6ed5-49ec-b9ba-e65560c27e1b" (UID: "1db004d2-6ed5-49ec-b9ba-e65560c27e1b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:12:25 crc kubenswrapper[4835]: I0319 10:12:25.234397 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1db004d2-6ed5-49ec-b9ba-e65560c27e1b-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 10:12:25 crc kubenswrapper[4835]: I0319 10:12:25.249734 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1db004d2-6ed5-49ec-b9ba-e65560c27e1b-kube-api-access-spxfw" (OuterVolumeSpecName: "kube-api-access-spxfw") pod "1db004d2-6ed5-49ec-b9ba-e65560c27e1b" (UID: "1db004d2-6ed5-49ec-b9ba-e65560c27e1b"). InnerVolumeSpecName "kube-api-access-spxfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:12:25 crc kubenswrapper[4835]: I0319 10:12:25.257617 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1db004d2-6ed5-49ec-b9ba-e65560c27e1b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1db004d2-6ed5-49ec-b9ba-e65560c27e1b" (UID: "1db004d2-6ed5-49ec-b9ba-e65560c27e1b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:12:25 crc kubenswrapper[4835]: I0319 10:12:25.337428 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spxfw\" (UniqueName: \"kubernetes.io/projected/1db004d2-6ed5-49ec-b9ba-e65560c27e1b-kube-api-access-spxfw\") on node \"crc\" DevicePath \"\"" Mar 19 10:12:25 crc kubenswrapper[4835]: I0319 10:12:25.338459 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1db004d2-6ed5-49ec-b9ba-e65560c27e1b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 10:12:25 crc kubenswrapper[4835]: I0319 10:12:25.554401 4835 generic.go:334] "Generic (PLEG): container finished" podID="1db004d2-6ed5-49ec-b9ba-e65560c27e1b" containerID="d71c877691aa38ad68398f8800a2a5d2c9373257098a39b038407cd841e8c98b" exitCode=0 Mar 19 10:12:25 crc kubenswrapper[4835]: I0319 10:12:25.554573 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2tsn" event={"ID":"1db004d2-6ed5-49ec-b9ba-e65560c27e1b","Type":"ContainerDied","Data":"d71c877691aa38ad68398f8800a2a5d2c9373257098a39b038407cd841e8c98b"} Mar 19 10:12:25 crc kubenswrapper[4835]: I0319 10:12:25.555786 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2tsn" event={"ID":"1db004d2-6ed5-49ec-b9ba-e65560c27e1b","Type":"ContainerDied","Data":"39793baf530ce3612751ea7c27a6274b072ba790e69dfaa167b9fe765b0f263e"} Mar 19 10:12:25 crc kubenswrapper[4835]: I0319 10:12:25.554642 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2tsn" Mar 19 10:12:25 crc kubenswrapper[4835]: I0319 10:12:25.555913 4835 scope.go:117] "RemoveContainer" containerID="d71c877691aa38ad68398f8800a2a5d2c9373257098a39b038407cd841e8c98b" Mar 19 10:12:25 crc kubenswrapper[4835]: I0319 10:12:25.581094 4835 scope.go:117] "RemoveContainer" containerID="b2fe4995be9af1c1dd8722fcd5dad6bce80b857ea335e2c2c684b85240592905" Mar 19 10:12:25 crc kubenswrapper[4835]: I0319 10:12:25.603388 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2tsn"] Mar 19 10:12:25 crc kubenswrapper[4835]: I0319 10:12:25.613883 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2tsn"] Mar 19 10:12:25 crc kubenswrapper[4835]: I0319 10:12:25.620388 4835 scope.go:117] "RemoveContainer" containerID="dd6db63dcba07aa172e0e9cda1a897d717a1f128eb0675ff71f7710ef6603c52" Mar 19 10:12:25 crc kubenswrapper[4835]: I0319 10:12:25.654578 4835 scope.go:117] "RemoveContainer" containerID="d71c877691aa38ad68398f8800a2a5d2c9373257098a39b038407cd841e8c98b" Mar 19 10:12:25 crc kubenswrapper[4835]: E0319 10:12:25.655223 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d71c877691aa38ad68398f8800a2a5d2c9373257098a39b038407cd841e8c98b\": container with ID starting with d71c877691aa38ad68398f8800a2a5d2c9373257098a39b038407cd841e8c98b not found: ID does not exist" containerID="d71c877691aa38ad68398f8800a2a5d2c9373257098a39b038407cd841e8c98b" Mar 19 10:12:25 crc kubenswrapper[4835]: I0319 10:12:25.655364 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d71c877691aa38ad68398f8800a2a5d2c9373257098a39b038407cd841e8c98b"} err="failed to get container status \"d71c877691aa38ad68398f8800a2a5d2c9373257098a39b038407cd841e8c98b\": rpc error: code = NotFound desc = could not find container 
\"d71c877691aa38ad68398f8800a2a5d2c9373257098a39b038407cd841e8c98b\": container with ID starting with d71c877691aa38ad68398f8800a2a5d2c9373257098a39b038407cd841e8c98b not found: ID does not exist" Mar 19 10:12:25 crc kubenswrapper[4835]: I0319 10:12:25.655463 4835 scope.go:117] "RemoveContainer" containerID="b2fe4995be9af1c1dd8722fcd5dad6bce80b857ea335e2c2c684b85240592905" Mar 19 10:12:25 crc kubenswrapper[4835]: E0319 10:12:25.656128 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2fe4995be9af1c1dd8722fcd5dad6bce80b857ea335e2c2c684b85240592905\": container with ID starting with b2fe4995be9af1c1dd8722fcd5dad6bce80b857ea335e2c2c684b85240592905 not found: ID does not exist" containerID="b2fe4995be9af1c1dd8722fcd5dad6bce80b857ea335e2c2c684b85240592905" Mar 19 10:12:25 crc kubenswrapper[4835]: I0319 10:12:25.656181 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2fe4995be9af1c1dd8722fcd5dad6bce80b857ea335e2c2c684b85240592905"} err="failed to get container status \"b2fe4995be9af1c1dd8722fcd5dad6bce80b857ea335e2c2c684b85240592905\": rpc error: code = NotFound desc = could not find container \"b2fe4995be9af1c1dd8722fcd5dad6bce80b857ea335e2c2c684b85240592905\": container with ID starting with b2fe4995be9af1c1dd8722fcd5dad6bce80b857ea335e2c2c684b85240592905 not found: ID does not exist" Mar 19 10:12:25 crc kubenswrapper[4835]: I0319 10:12:25.656210 4835 scope.go:117] "RemoveContainer" containerID="dd6db63dcba07aa172e0e9cda1a897d717a1f128eb0675ff71f7710ef6603c52" Mar 19 10:12:25 crc kubenswrapper[4835]: E0319 10:12:25.656539 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd6db63dcba07aa172e0e9cda1a897d717a1f128eb0675ff71f7710ef6603c52\": container with ID starting with dd6db63dcba07aa172e0e9cda1a897d717a1f128eb0675ff71f7710ef6603c52 not found: ID does not exist" 
containerID="dd6db63dcba07aa172e0e9cda1a897d717a1f128eb0675ff71f7710ef6603c52" Mar 19 10:12:25 crc kubenswrapper[4835]: I0319 10:12:25.656567 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd6db63dcba07aa172e0e9cda1a897d717a1f128eb0675ff71f7710ef6603c52"} err="failed to get container status \"dd6db63dcba07aa172e0e9cda1a897d717a1f128eb0675ff71f7710ef6603c52\": rpc error: code = NotFound desc = could not find container \"dd6db63dcba07aa172e0e9cda1a897d717a1f128eb0675ff71f7710ef6603c52\": container with ID starting with dd6db63dcba07aa172e0e9cda1a897d717a1f128eb0675ff71f7710ef6603c52 not found: ID does not exist" Mar 19 10:12:26 crc kubenswrapper[4835]: I0319 10:12:26.418447 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1db004d2-6ed5-49ec-b9ba-e65560c27e1b" path="/var/lib/kubelet/pods/1db004d2-6ed5-49ec-b9ba-e65560c27e1b/volumes" Mar 19 10:12:27 crc kubenswrapper[4835]: I0319 10:12:27.586427 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6mgg7" event={"ID":"e8f17f81-d3ac-4c40-b346-c3eac9cc70d2","Type":"ContainerStarted","Data":"b4f76f5deedf47e1bcf6b97488562a22b55c6553015dcacaa009a0e6e1097dfc"} Mar 19 10:12:27 crc kubenswrapper[4835]: I0319 10:12:27.608206 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6mgg7" podStartSLOduration=2.979537614 podStartE2EDuration="13.608187159s" podCreationTimestamp="2026-03-19 10:12:14 +0000 UTC" firstStartedPulling="2026-03-19 10:12:16.439138427 +0000 UTC m=+2991.287737014" lastFinishedPulling="2026-03-19 10:12:27.067787972 +0000 UTC m=+3001.916386559" observedRunningTime="2026-03-19 10:12:27.604926651 +0000 UTC m=+3002.453525248" watchObservedRunningTime="2026-03-19 10:12:27.608187159 +0000 UTC m=+3002.456785746" Mar 19 10:12:34 crc kubenswrapper[4835]: I0319 10:12:34.816829 4835 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-6mgg7" Mar 19 10:12:34 crc kubenswrapper[4835]: I0319 10:12:34.817449 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6mgg7" Mar 19 10:12:34 crc kubenswrapper[4835]: I0319 10:12:34.878577 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6mgg7" Mar 19 10:12:35 crc kubenswrapper[4835]: I0319 10:12:35.730217 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6mgg7" Mar 19 10:12:35 crc kubenswrapper[4835]: I0319 10:12:35.821802 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6mgg7"] Mar 19 10:12:35 crc kubenswrapper[4835]: I0319 10:12:35.880432 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rnqsm"] Mar 19 10:12:35 crc kubenswrapper[4835]: I0319 10:12:35.880730 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rnqsm" podUID="82216f27-30b5-4c41-a5d1-a0523d8c2aca" containerName="registry-server" containerID="cri-o://0886f7dc4ea2254dbebd6736657514c46c982a4d425a2531307ed7dbc1f58583" gracePeriod=2 Mar 19 10:12:36 crc kubenswrapper[4835]: E0319 10:12:36.057650 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0886f7dc4ea2254dbebd6736657514c46c982a4d425a2531307ed7dbc1f58583 is running failed: container process not found" containerID="0886f7dc4ea2254dbebd6736657514c46c982a4d425a2531307ed7dbc1f58583" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 10:12:36 crc kubenswrapper[4835]: E0319 10:12:36.058086 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is 
not created or running: checking if PID of 0886f7dc4ea2254dbebd6736657514c46c982a4d425a2531307ed7dbc1f58583 is running failed: container process not found" containerID="0886f7dc4ea2254dbebd6736657514c46c982a4d425a2531307ed7dbc1f58583" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 10:12:36 crc kubenswrapper[4835]: E0319 10:12:36.058337 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0886f7dc4ea2254dbebd6736657514c46c982a4d425a2531307ed7dbc1f58583 is running failed: container process not found" containerID="0886f7dc4ea2254dbebd6736657514c46c982a4d425a2531307ed7dbc1f58583" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 10:12:36 crc kubenswrapper[4835]: E0319 10:12:36.058370 4835 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0886f7dc4ea2254dbebd6736657514c46c982a4d425a2531307ed7dbc1f58583 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-rnqsm" podUID="82216f27-30b5-4c41-a5d1-a0523d8c2aca" containerName="registry-server" Mar 19 10:12:36 crc kubenswrapper[4835]: I0319 10:12:36.421698 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:12:36 crc kubenswrapper[4835]: I0319 10:12:36.422118 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:12:36 crc kubenswrapper[4835]: I0319 10:12:36.422160 
4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" Mar 19 10:12:36 crc kubenswrapper[4835]: I0319 10:12:36.423129 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7b9042d97bd75eb24c603b3b25bfd42fbe319b3290c7851e15e79438ba4bccb7"} pod="openshift-machine-config-operator/machine-config-daemon-bk84k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 10:12:36 crc kubenswrapper[4835]: I0319 10:12:36.423182 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" containerID="cri-o://7b9042d97bd75eb24c603b3b25bfd42fbe319b3290c7851e15e79438ba4bccb7" gracePeriod=600 Mar 19 10:12:36 crc kubenswrapper[4835]: I0319 10:12:36.450713 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rnqsm" Mar 19 10:12:36 crc kubenswrapper[4835]: E0319 10:12:36.550237 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:12:36 crc kubenswrapper[4835]: I0319 10:12:36.566844 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g9xg\" (UniqueName: \"kubernetes.io/projected/82216f27-30b5-4c41-a5d1-a0523d8c2aca-kube-api-access-6g9xg\") pod \"82216f27-30b5-4c41-a5d1-a0523d8c2aca\" (UID: \"82216f27-30b5-4c41-a5d1-a0523d8c2aca\") " Mar 19 10:12:36 crc kubenswrapper[4835]: I0319 10:12:36.566905 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82216f27-30b5-4c41-a5d1-a0523d8c2aca-utilities\") pod \"82216f27-30b5-4c41-a5d1-a0523d8c2aca\" (UID: \"82216f27-30b5-4c41-a5d1-a0523d8c2aca\") " Mar 19 10:12:36 crc kubenswrapper[4835]: I0319 10:12:36.566940 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82216f27-30b5-4c41-a5d1-a0523d8c2aca-catalog-content\") pod \"82216f27-30b5-4c41-a5d1-a0523d8c2aca\" (UID: \"82216f27-30b5-4c41-a5d1-a0523d8c2aca\") " Mar 19 10:12:36 crc kubenswrapper[4835]: I0319 10:12:36.568962 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82216f27-30b5-4c41-a5d1-a0523d8c2aca-utilities" (OuterVolumeSpecName: "utilities") pod "82216f27-30b5-4c41-a5d1-a0523d8c2aca" (UID: "82216f27-30b5-4c41-a5d1-a0523d8c2aca"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:12:36 crc kubenswrapper[4835]: I0319 10:12:36.574162 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82216f27-30b5-4c41-a5d1-a0523d8c2aca-kube-api-access-6g9xg" (OuterVolumeSpecName: "kube-api-access-6g9xg") pod "82216f27-30b5-4c41-a5d1-a0523d8c2aca" (UID: "82216f27-30b5-4c41-a5d1-a0523d8c2aca"). InnerVolumeSpecName "kube-api-access-6g9xg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:12:36 crc kubenswrapper[4835]: E0319 10:12:36.575105 4835 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadf367e5_fedd_4d9e_a7af_345df1f08353.slice/crio-7b9042d97bd75eb24c603b3b25bfd42fbe319b3290c7851e15e79438ba4bccb7.scope\": RecentStats: unable to find data in memory cache]" Mar 19 10:12:36 crc kubenswrapper[4835]: I0319 10:12:36.643552 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82216f27-30b5-4c41-a5d1-a0523d8c2aca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "82216f27-30b5-4c41-a5d1-a0523d8c2aca" (UID: "82216f27-30b5-4c41-a5d1-a0523d8c2aca"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:12:36 crc kubenswrapper[4835]: I0319 10:12:36.670153 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g9xg\" (UniqueName: \"kubernetes.io/projected/82216f27-30b5-4c41-a5d1-a0523d8c2aca-kube-api-access-6g9xg\") on node \"crc\" DevicePath \"\"" Mar 19 10:12:36 crc kubenswrapper[4835]: I0319 10:12:36.670191 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82216f27-30b5-4c41-a5d1-a0523d8c2aca-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 10:12:36 crc kubenswrapper[4835]: I0319 10:12:36.670200 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82216f27-30b5-4c41-a5d1-a0523d8c2aca-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 10:12:36 crc kubenswrapper[4835]: I0319 10:12:36.689448 4835 generic.go:334] "Generic (PLEG): container finished" podID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerID="7b9042d97bd75eb24c603b3b25bfd42fbe319b3290c7851e15e79438ba4bccb7" exitCode=0 Mar 19 10:12:36 crc kubenswrapper[4835]: I0319 10:12:36.689488 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerDied","Data":"7b9042d97bd75eb24c603b3b25bfd42fbe319b3290c7851e15e79438ba4bccb7"} Mar 19 10:12:36 crc kubenswrapper[4835]: I0319 10:12:36.689567 4835 scope.go:117] "RemoveContainer" containerID="79a0595f92818bd7cbb6ee721d4dfb55e83bf1ecd1c54ba0b51c369cda200970" Mar 19 10:12:36 crc kubenswrapper[4835]: I0319 10:12:36.690407 4835 scope.go:117] "RemoveContainer" containerID="7b9042d97bd75eb24c603b3b25bfd42fbe319b3290c7851e15e79438ba4bccb7" Mar 19 10:12:36 crc kubenswrapper[4835]: E0319 10:12:36.690871 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:12:36 crc kubenswrapper[4835]: I0319 10:12:36.692951 4835 generic.go:334] "Generic (PLEG): container finished" podID="82216f27-30b5-4c41-a5d1-a0523d8c2aca" containerID="0886f7dc4ea2254dbebd6736657514c46c982a4d425a2531307ed7dbc1f58583" exitCode=0 Mar 19 10:12:36 crc kubenswrapper[4835]: I0319 10:12:36.694112 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rnqsm" Mar 19 10:12:36 crc kubenswrapper[4835]: I0319 10:12:36.704813 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rnqsm" event={"ID":"82216f27-30b5-4c41-a5d1-a0523d8c2aca","Type":"ContainerDied","Data":"0886f7dc4ea2254dbebd6736657514c46c982a4d425a2531307ed7dbc1f58583"} Mar 19 10:12:36 crc kubenswrapper[4835]: I0319 10:12:36.704865 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rnqsm" event={"ID":"82216f27-30b5-4c41-a5d1-a0523d8c2aca","Type":"ContainerDied","Data":"19f63dcfc145b659c0b098fb09f56e395a69cb54dfb1f084c8deeca4b8a3cd83"} Mar 19 10:12:36 crc kubenswrapper[4835]: I0319 10:12:36.751107 4835 scope.go:117] "RemoveContainer" containerID="0886f7dc4ea2254dbebd6736657514c46c982a4d425a2531307ed7dbc1f58583" Mar 19 10:12:36 crc kubenswrapper[4835]: I0319 10:12:36.785908 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rnqsm"] Mar 19 10:12:36 crc kubenswrapper[4835]: I0319 10:12:36.803093 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rnqsm"] Mar 19 10:12:36 crc kubenswrapper[4835]: I0319 10:12:36.811182 4835 
scope.go:117] "RemoveContainer" containerID="3c5bd2e1ba833732a14c2e84582c5ef446ae8e3acc86efa499abafe1785b2305" Mar 19 10:12:36 crc kubenswrapper[4835]: I0319 10:12:36.850279 4835 scope.go:117] "RemoveContainer" containerID="cf982179eca343c4c0a5c80253cdb2b7f5708ea0d3ab41f847bddd9062dc6869" Mar 19 10:12:36 crc kubenswrapper[4835]: I0319 10:12:36.883048 4835 scope.go:117] "RemoveContainer" containerID="0886f7dc4ea2254dbebd6736657514c46c982a4d425a2531307ed7dbc1f58583" Mar 19 10:12:36 crc kubenswrapper[4835]: E0319 10:12:36.888234 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0886f7dc4ea2254dbebd6736657514c46c982a4d425a2531307ed7dbc1f58583\": container with ID starting with 0886f7dc4ea2254dbebd6736657514c46c982a4d425a2531307ed7dbc1f58583 not found: ID does not exist" containerID="0886f7dc4ea2254dbebd6736657514c46c982a4d425a2531307ed7dbc1f58583" Mar 19 10:12:36 crc kubenswrapper[4835]: I0319 10:12:36.888478 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0886f7dc4ea2254dbebd6736657514c46c982a4d425a2531307ed7dbc1f58583"} err="failed to get container status \"0886f7dc4ea2254dbebd6736657514c46c982a4d425a2531307ed7dbc1f58583\": rpc error: code = NotFound desc = could not find container \"0886f7dc4ea2254dbebd6736657514c46c982a4d425a2531307ed7dbc1f58583\": container with ID starting with 0886f7dc4ea2254dbebd6736657514c46c982a4d425a2531307ed7dbc1f58583 not found: ID does not exist" Mar 19 10:12:36 crc kubenswrapper[4835]: I0319 10:12:36.888609 4835 scope.go:117] "RemoveContainer" containerID="3c5bd2e1ba833732a14c2e84582c5ef446ae8e3acc86efa499abafe1785b2305" Mar 19 10:12:36 crc kubenswrapper[4835]: E0319 10:12:36.891458 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c5bd2e1ba833732a14c2e84582c5ef446ae8e3acc86efa499abafe1785b2305\": container with ID starting with 
3c5bd2e1ba833732a14c2e84582c5ef446ae8e3acc86efa499abafe1785b2305 not found: ID does not exist" containerID="3c5bd2e1ba833732a14c2e84582c5ef446ae8e3acc86efa499abafe1785b2305" Mar 19 10:12:36 crc kubenswrapper[4835]: I0319 10:12:36.891497 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c5bd2e1ba833732a14c2e84582c5ef446ae8e3acc86efa499abafe1785b2305"} err="failed to get container status \"3c5bd2e1ba833732a14c2e84582c5ef446ae8e3acc86efa499abafe1785b2305\": rpc error: code = NotFound desc = could not find container \"3c5bd2e1ba833732a14c2e84582c5ef446ae8e3acc86efa499abafe1785b2305\": container with ID starting with 3c5bd2e1ba833732a14c2e84582c5ef446ae8e3acc86efa499abafe1785b2305 not found: ID does not exist" Mar 19 10:12:36 crc kubenswrapper[4835]: I0319 10:12:36.891523 4835 scope.go:117] "RemoveContainer" containerID="cf982179eca343c4c0a5c80253cdb2b7f5708ea0d3ab41f847bddd9062dc6869" Mar 19 10:12:36 crc kubenswrapper[4835]: E0319 10:12:36.894658 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf982179eca343c4c0a5c80253cdb2b7f5708ea0d3ab41f847bddd9062dc6869\": container with ID starting with cf982179eca343c4c0a5c80253cdb2b7f5708ea0d3ab41f847bddd9062dc6869 not found: ID does not exist" containerID="cf982179eca343c4c0a5c80253cdb2b7f5708ea0d3ab41f847bddd9062dc6869" Mar 19 10:12:36 crc kubenswrapper[4835]: I0319 10:12:36.894708 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf982179eca343c4c0a5c80253cdb2b7f5708ea0d3ab41f847bddd9062dc6869"} err="failed to get container status \"cf982179eca343c4c0a5c80253cdb2b7f5708ea0d3ab41f847bddd9062dc6869\": rpc error: code = NotFound desc = could not find container \"cf982179eca343c4c0a5c80253cdb2b7f5708ea0d3ab41f847bddd9062dc6869\": container with ID starting with cf982179eca343c4c0a5c80253cdb2b7f5708ea0d3ab41f847bddd9062dc6869 not found: ID does not 
exist" Mar 19 10:12:38 crc kubenswrapper[4835]: I0319 10:12:38.416331 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82216f27-30b5-4c41-a5d1-a0523d8c2aca" path="/var/lib/kubelet/pods/82216f27-30b5-4c41-a5d1-a0523d8c2aca/volumes" Mar 19 10:12:47 crc kubenswrapper[4835]: I0319 10:12:47.843952 4835 generic.go:334] "Generic (PLEG): container finished" podID="f507ac6e-c9ef-4e19-a67c-f5358fe950b2" containerID="2964e3268e33f98812223c79483154857b413dee21039b57c95a323fdb37dff8" exitCode=0 Mar 19 10:12:47 crc kubenswrapper[4835]: I0319 10:12:47.844046 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm" event={"ID":"f507ac6e-c9ef-4e19-a67c-f5358fe950b2","Type":"ContainerDied","Data":"2964e3268e33f98812223c79483154857b413dee21039b57c95a323fdb37dff8"} Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.456011 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm" Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.495669 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-ceilometer-compute-config-data-2\") pod \"f507ac6e-c9ef-4e19-a67c-f5358fe950b2\" (UID: \"f507ac6e-c9ef-4e19-a67c-f5358fe950b2\") " Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.496008 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-telemetry-combined-ca-bundle\") pod \"f507ac6e-c9ef-4e19-a67c-f5358fe950b2\" (UID: \"f507ac6e-c9ef-4e19-a67c-f5358fe950b2\") " Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.496072 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-ceilometer-compute-config-data-0\") pod \"f507ac6e-c9ef-4e19-a67c-f5358fe950b2\" (UID: \"f507ac6e-c9ef-4e19-a67c-f5358fe950b2\") " Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.496098 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8qpn\" (UniqueName: \"kubernetes.io/projected/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-kube-api-access-r8qpn\") pod \"f507ac6e-c9ef-4e19-a67c-f5358fe950b2\" (UID: \"f507ac6e-c9ef-4e19-a67c-f5358fe950b2\") " Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.496145 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-ssh-key-openstack-edpm-ipam\") pod \"f507ac6e-c9ef-4e19-a67c-f5358fe950b2\" (UID: \"f507ac6e-c9ef-4e19-a67c-f5358fe950b2\") " Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.496175 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-ceilometer-compute-config-data-1\") pod \"f507ac6e-c9ef-4e19-a67c-f5358fe950b2\" (UID: \"f507ac6e-c9ef-4e19-a67c-f5358fe950b2\") " Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.496251 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-inventory\") pod \"f507ac6e-c9ef-4e19-a67c-f5358fe950b2\" (UID: \"f507ac6e-c9ef-4e19-a67c-f5358fe950b2\") " Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.515200 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-kube-api-access-r8qpn" (OuterVolumeSpecName: "kube-api-access-r8qpn") pod 
"f507ac6e-c9ef-4e19-a67c-f5358fe950b2" (UID: "f507ac6e-c9ef-4e19-a67c-f5358fe950b2"). InnerVolumeSpecName "kube-api-access-r8qpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.523044 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "f507ac6e-c9ef-4e19-a67c-f5358fe950b2" (UID: "f507ac6e-c9ef-4e19-a67c-f5358fe950b2"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.554201 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f507ac6e-c9ef-4e19-a67c-f5358fe950b2" (UID: "f507ac6e-c9ef-4e19-a67c-f5358fe950b2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.554782 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-inventory" (OuterVolumeSpecName: "inventory") pod "f507ac6e-c9ef-4e19-a67c-f5358fe950b2" (UID: "f507ac6e-c9ef-4e19-a67c-f5358fe950b2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.556993 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "f507ac6e-c9ef-4e19-a67c-f5358fe950b2" (UID: "f507ac6e-c9ef-4e19-a67c-f5358fe950b2"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.566335 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "f507ac6e-c9ef-4e19-a67c-f5358fe950b2" (UID: "f507ac6e-c9ef-4e19-a67c-f5358fe950b2"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.575886 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "f507ac6e-c9ef-4e19-a67c-f5358fe950b2" (UID: "f507ac6e-c9ef-4e19-a67c-f5358fe950b2"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.598984 4835 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.599049 4835 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.599064 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8qpn\" (UniqueName: \"kubernetes.io/projected/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-kube-api-access-r8qpn\") on node \"crc\" DevicePath \"\"" Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.599076 4835 
reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.599091 4835 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.599104 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.599117 4835 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/f507ac6e-c9ef-4e19-a67c-f5358fe950b2-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.868612 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm" event={"ID":"f507ac6e-c9ef-4e19-a67c-f5358fe950b2","Type":"ContainerDied","Data":"3b9b5c3e48c5935cc8b32cd9154befc7bb10bc88f398e4972cb20524533eabd8"} Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.869016 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b9b5c3e48c5935cc8b32cd9154befc7bb10bc88f398e4972cb20524533eabd8" Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.868674 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm" Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.966683 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc"] Mar 19 10:12:49 crc kubenswrapper[4835]: E0319 10:12:49.967504 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f507ac6e-c9ef-4e19-a67c-f5358fe950b2" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.967541 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f507ac6e-c9ef-4e19-a67c-f5358fe950b2" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 19 10:12:49 crc kubenswrapper[4835]: E0319 10:12:49.967575 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82216f27-30b5-4c41-a5d1-a0523d8c2aca" containerName="registry-server" Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.967586 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="82216f27-30b5-4c41-a5d1-a0523d8c2aca" containerName="registry-server" Mar 19 10:12:49 crc kubenswrapper[4835]: E0319 10:12:49.967607 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1db004d2-6ed5-49ec-b9ba-e65560c27e1b" containerName="registry-server" Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.967618 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1db004d2-6ed5-49ec-b9ba-e65560c27e1b" containerName="registry-server" Mar 19 10:12:49 crc kubenswrapper[4835]: E0319 10:12:49.967643 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1db004d2-6ed5-49ec-b9ba-e65560c27e1b" containerName="extract-content" Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.967654 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1db004d2-6ed5-49ec-b9ba-e65560c27e1b" containerName="extract-content" Mar 19 10:12:49 crc kubenswrapper[4835]: E0319 10:12:49.967670 
4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82216f27-30b5-4c41-a5d1-a0523d8c2aca" containerName="extract-utilities" Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.967683 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="82216f27-30b5-4c41-a5d1-a0523d8c2aca" containerName="extract-utilities" Mar 19 10:12:49 crc kubenswrapper[4835]: E0319 10:12:49.967710 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1db004d2-6ed5-49ec-b9ba-e65560c27e1b" containerName="extract-utilities" Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.967721 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1db004d2-6ed5-49ec-b9ba-e65560c27e1b" containerName="extract-utilities" Mar 19 10:12:49 crc kubenswrapper[4835]: E0319 10:12:49.967770 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82216f27-30b5-4c41-a5d1-a0523d8c2aca" containerName="extract-content" Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.967783 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="82216f27-30b5-4c41-a5d1-a0523d8c2aca" containerName="extract-content" Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.968237 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="1db004d2-6ed5-49ec-b9ba-e65560c27e1b" containerName="registry-server" Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.968304 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="82216f27-30b5-4c41-a5d1-a0523d8c2aca" containerName="registry-server" Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.968328 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="f507ac6e-c9ef-4e19-a67c-f5358fe950b2" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.969497 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc" Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.972468 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.972533 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.972761 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-ipmi-config-data" Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.975391 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.976611 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ldz2g" Mar 19 10:12:49 crc kubenswrapper[4835]: I0319 10:12:49.982442 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc"] Mar 19 10:12:50 crc kubenswrapper[4835]: I0319 10:12:50.110707 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc\" (UID: \"df6bbc12-578a-4c6e-a199-06baaf1ac6b8\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc" Mar 19 10:12:50 crc kubenswrapper[4835]: I0319 10:12:50.110964 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: 
\"kubernetes.io/secret/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc\" (UID: \"df6bbc12-578a-4c6e-a199-06baaf1ac6b8\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc" Mar 19 10:12:50 crc kubenswrapper[4835]: I0319 10:12:50.111014 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc\" (UID: \"df6bbc12-578a-4c6e-a199-06baaf1ac6b8\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc" Mar 19 10:12:50 crc kubenswrapper[4835]: I0319 10:12:50.111210 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-ssh-key-openstack-edpm-ipam\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc\" (UID: \"df6bbc12-578a-4c6e-a199-06baaf1ac6b8\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc" Mar 19 10:12:50 crc kubenswrapper[4835]: I0319 10:12:50.111667 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdb5v\" (UniqueName: \"kubernetes.io/projected/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-kube-api-access-wdb5v\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc\" (UID: \"df6bbc12-578a-4c6e-a199-06baaf1ac6b8\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc" Mar 19 10:12:50 crc kubenswrapper[4835]: I0319 10:12:50.111878 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc\" (UID: \"df6bbc12-578a-4c6e-a199-06baaf1ac6b8\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc" Mar 19 10:12:50 crc kubenswrapper[4835]: I0319 10:12:50.111927 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc\" (UID: \"df6bbc12-578a-4c6e-a199-06baaf1ac6b8\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc" Mar 19 10:12:50 crc kubenswrapper[4835]: I0319 10:12:50.213506 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdb5v\" (UniqueName: \"kubernetes.io/projected/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-kube-api-access-wdb5v\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc\" (UID: \"df6bbc12-578a-4c6e-a199-06baaf1ac6b8\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc" Mar 19 10:12:50 crc kubenswrapper[4835]: I0319 10:12:50.213573 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc\" (UID: \"df6bbc12-578a-4c6e-a199-06baaf1ac6b8\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc" Mar 19 10:12:50 crc kubenswrapper[4835]: I0319 10:12:50.213619 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc\" (UID: \"df6bbc12-578a-4c6e-a199-06baaf1ac6b8\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc" Mar 19 10:12:50 crc kubenswrapper[4835]: I0319 10:12:50.214907 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc\" (UID: \"df6bbc12-578a-4c6e-a199-06baaf1ac6b8\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc" Mar 19 10:12:50 crc kubenswrapper[4835]: I0319 10:12:50.215064 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc\" (UID: \"df6bbc12-578a-4c6e-a199-06baaf1ac6b8\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc" Mar 19 10:12:50 crc kubenswrapper[4835]: I0319 10:12:50.215139 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc\" (UID: \"df6bbc12-578a-4c6e-a199-06baaf1ac6b8\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc" Mar 19 10:12:50 crc kubenswrapper[4835]: I0319 10:12:50.215437 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-ssh-key-openstack-edpm-ipam\") pod 
\"telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc\" (UID: \"df6bbc12-578a-4c6e-a199-06baaf1ac6b8\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc" Mar 19 10:12:50 crc kubenswrapper[4835]: I0319 10:12:50.222605 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc\" (UID: \"df6bbc12-578a-4c6e-a199-06baaf1ac6b8\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc" Mar 19 10:12:50 crc kubenswrapper[4835]: I0319 10:12:50.222835 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc\" (UID: \"df6bbc12-578a-4c6e-a199-06baaf1ac6b8\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc" Mar 19 10:12:50 crc kubenswrapper[4835]: I0319 10:12:50.225658 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc\" (UID: \"df6bbc12-578a-4c6e-a199-06baaf1ac6b8\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc" Mar 19 10:12:50 crc kubenswrapper[4835]: I0319 10:12:50.225731 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc\" (UID: 
\"df6bbc12-578a-4c6e-a199-06baaf1ac6b8\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc" Mar 19 10:12:50 crc kubenswrapper[4835]: I0319 10:12:50.226499 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-ssh-key-openstack-edpm-ipam\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc\" (UID: \"df6bbc12-578a-4c6e-a199-06baaf1ac6b8\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc" Mar 19 10:12:50 crc kubenswrapper[4835]: I0319 10:12:50.227880 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc\" (UID: \"df6bbc12-578a-4c6e-a199-06baaf1ac6b8\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc" Mar 19 10:12:50 crc kubenswrapper[4835]: I0319 10:12:50.230433 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdb5v\" (UniqueName: \"kubernetes.io/projected/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-kube-api-access-wdb5v\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc\" (UID: \"df6bbc12-578a-4c6e-a199-06baaf1ac6b8\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc" Mar 19 10:12:50 crc kubenswrapper[4835]: I0319 10:12:50.288358 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc" Mar 19 10:12:50 crc kubenswrapper[4835]: I0319 10:12:50.405797 4835 scope.go:117] "RemoveContainer" containerID="7b9042d97bd75eb24c603b3b25bfd42fbe319b3290c7851e15e79438ba4bccb7" Mar 19 10:12:50 crc kubenswrapper[4835]: E0319 10:12:50.406813 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:12:50 crc kubenswrapper[4835]: I0319 10:12:50.861894 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc"] Mar 19 10:12:50 crc kubenswrapper[4835]: I0319 10:12:50.881666 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc" event={"ID":"df6bbc12-578a-4c6e-a199-06baaf1ac6b8","Type":"ContainerStarted","Data":"47bcfcc50db79a290aa24f0f54133822cd79b3e35e89aa57c2f1ccedd5c5e55c"} Mar 19 10:12:52 crc kubenswrapper[4835]: I0319 10:12:52.904423 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc" event={"ID":"df6bbc12-578a-4c6e-a199-06baaf1ac6b8","Type":"ContainerStarted","Data":"a98501c60fa84d1d23a54cea91436d5535ffd9fe38bc80e8b6dd2e90af64c526"} Mar 19 10:12:52 crc kubenswrapper[4835]: I0319 10:12:52.926948 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc" podStartSLOduration=3.148693037 podStartE2EDuration="3.926928341s" podCreationTimestamp="2026-03-19 
10:12:49 +0000 UTC" firstStartedPulling="2026-03-19 10:12:50.86504614 +0000 UTC m=+3025.713644727" lastFinishedPulling="2026-03-19 10:12:51.643281444 +0000 UTC m=+3026.491880031" observedRunningTime="2026-03-19 10:12:52.920438627 +0000 UTC m=+3027.769037214" watchObservedRunningTime="2026-03-19 10:12:52.926928341 +0000 UTC m=+3027.775526928" Mar 19 10:12:57 crc kubenswrapper[4835]: I0319 10:12:57.072576 4835 scope.go:117] "RemoveContainer" containerID="29ba2cc3838173af5b229dd30378b14bcaaea6655f7a112b2dedf56ab8dc3711" Mar 19 10:13:03 crc kubenswrapper[4835]: I0319 10:13:03.402598 4835 scope.go:117] "RemoveContainer" containerID="7b9042d97bd75eb24c603b3b25bfd42fbe319b3290c7851e15e79438ba4bccb7" Mar 19 10:13:03 crc kubenswrapper[4835]: E0319 10:13:03.403521 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:13:14 crc kubenswrapper[4835]: I0319 10:13:14.403106 4835 scope.go:117] "RemoveContainer" containerID="7b9042d97bd75eb24c603b3b25bfd42fbe319b3290c7851e15e79438ba4bccb7" Mar 19 10:13:14 crc kubenswrapper[4835]: E0319 10:13:14.404040 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:13:26 crc kubenswrapper[4835]: I0319 10:13:26.409680 4835 scope.go:117] "RemoveContainer" 
containerID="7b9042d97bd75eb24c603b3b25bfd42fbe319b3290c7851e15e79438ba4bccb7" Mar 19 10:13:26 crc kubenswrapper[4835]: E0319 10:13:26.410897 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:13:41 crc kubenswrapper[4835]: I0319 10:13:41.402812 4835 scope.go:117] "RemoveContainer" containerID="7b9042d97bd75eb24c603b3b25bfd42fbe319b3290c7851e15e79438ba4bccb7" Mar 19 10:13:41 crc kubenswrapper[4835]: E0319 10:13:41.403524 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:13:54 crc kubenswrapper[4835]: I0319 10:13:54.405380 4835 scope.go:117] "RemoveContainer" containerID="7b9042d97bd75eb24c603b3b25bfd42fbe319b3290c7851e15e79438ba4bccb7" Mar 19 10:13:54 crc kubenswrapper[4835]: E0319 10:13:54.406305 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:14:00 crc kubenswrapper[4835]: I0319 10:14:00.143959 4835 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565254-58j5z"] Mar 19 10:14:00 crc kubenswrapper[4835]: I0319 10:14:00.146664 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565254-58j5z" Mar 19 10:14:00 crc kubenswrapper[4835]: I0319 10:14:00.149679 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:14:00 crc kubenswrapper[4835]: I0319 10:14:00.149824 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 10:14:00 crc kubenswrapper[4835]: I0319 10:14:00.150076 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:14:00 crc kubenswrapper[4835]: I0319 10:14:00.158938 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565254-58j5z"] Mar 19 10:14:00 crc kubenswrapper[4835]: I0319 10:14:00.221730 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b287\" (UniqueName: \"kubernetes.io/projected/2da1b929-dd4b-44ca-8485-c895d36b5c9e-kube-api-access-4b287\") pod \"auto-csr-approver-29565254-58j5z\" (UID: \"2da1b929-dd4b-44ca-8485-c895d36b5c9e\") " pod="openshift-infra/auto-csr-approver-29565254-58j5z" Mar 19 10:14:00 crc kubenswrapper[4835]: I0319 10:14:00.323461 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b287\" (UniqueName: \"kubernetes.io/projected/2da1b929-dd4b-44ca-8485-c895d36b5c9e-kube-api-access-4b287\") pod \"auto-csr-approver-29565254-58j5z\" (UID: \"2da1b929-dd4b-44ca-8485-c895d36b5c9e\") " pod="openshift-infra/auto-csr-approver-29565254-58j5z" Mar 19 10:14:00 crc kubenswrapper[4835]: I0319 10:14:00.344871 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4b287\" (UniqueName: \"kubernetes.io/projected/2da1b929-dd4b-44ca-8485-c895d36b5c9e-kube-api-access-4b287\") pod \"auto-csr-approver-29565254-58j5z\" (UID: \"2da1b929-dd4b-44ca-8485-c895d36b5c9e\") " pod="openshift-infra/auto-csr-approver-29565254-58j5z" Mar 19 10:14:00 crc kubenswrapper[4835]: I0319 10:14:00.468332 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565254-58j5z" Mar 19 10:14:01 crc kubenswrapper[4835]: I0319 10:14:01.014361 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565254-58j5z"] Mar 19 10:14:01 crc kubenswrapper[4835]: I0319 10:14:01.685186 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565254-58j5z" event={"ID":"2da1b929-dd4b-44ca-8485-c895d36b5c9e","Type":"ContainerStarted","Data":"c806f6be65d7ad7ac8e267e8e2f87cab09aa630f9ebde2150788ca1f74b671aa"} Mar 19 10:14:03 crc kubenswrapper[4835]: I0319 10:14:03.709046 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565254-58j5z" event={"ID":"2da1b929-dd4b-44ca-8485-c895d36b5c9e","Type":"ContainerStarted","Data":"84297e854009523d8f43b674b5021567985ef089dbccdfb962d38758a0ef7c4a"} Mar 19 10:14:03 crc kubenswrapper[4835]: I0319 10:14:03.725319 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565254-58j5z" podStartSLOduration=1.601854464 podStartE2EDuration="3.725301103s" podCreationTimestamp="2026-03-19 10:14:00 +0000 UTC" firstStartedPulling="2026-03-19 10:14:01.016337502 +0000 UTC m=+3095.864936089" lastFinishedPulling="2026-03-19 10:14:03.139784141 +0000 UTC m=+3097.988382728" observedRunningTime="2026-03-19 10:14:03.723551936 +0000 UTC m=+3098.572150523" watchObservedRunningTime="2026-03-19 10:14:03.725301103 +0000 UTC m=+3098.573899690" Mar 19 10:14:04 crc kubenswrapper[4835]: I0319 10:14:04.721861 4835 
generic.go:334] "Generic (PLEG): container finished" podID="2da1b929-dd4b-44ca-8485-c895d36b5c9e" containerID="84297e854009523d8f43b674b5021567985ef089dbccdfb962d38758a0ef7c4a" exitCode=0 Mar 19 10:14:04 crc kubenswrapper[4835]: I0319 10:14:04.722014 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565254-58j5z" event={"ID":"2da1b929-dd4b-44ca-8485-c895d36b5c9e","Type":"ContainerDied","Data":"84297e854009523d8f43b674b5021567985ef089dbccdfb962d38758a0ef7c4a"} Mar 19 10:14:06 crc kubenswrapper[4835]: I0319 10:14:06.149830 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565254-58j5z" Mar 19 10:14:06 crc kubenswrapper[4835]: I0319 10:14:06.180899 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4b287\" (UniqueName: \"kubernetes.io/projected/2da1b929-dd4b-44ca-8485-c895d36b5c9e-kube-api-access-4b287\") pod \"2da1b929-dd4b-44ca-8485-c895d36b5c9e\" (UID: \"2da1b929-dd4b-44ca-8485-c895d36b5c9e\") " Mar 19 10:14:06 crc kubenswrapper[4835]: I0319 10:14:06.187071 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2da1b929-dd4b-44ca-8485-c895d36b5c9e-kube-api-access-4b287" (OuterVolumeSpecName: "kube-api-access-4b287") pod "2da1b929-dd4b-44ca-8485-c895d36b5c9e" (UID: "2da1b929-dd4b-44ca-8485-c895d36b5c9e"). InnerVolumeSpecName "kube-api-access-4b287". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:14:06 crc kubenswrapper[4835]: I0319 10:14:06.285832 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4b287\" (UniqueName: \"kubernetes.io/projected/2da1b929-dd4b-44ca-8485-c895d36b5c9e-kube-api-access-4b287\") on node \"crc\" DevicePath \"\"" Mar 19 10:14:06 crc kubenswrapper[4835]: I0319 10:14:06.410682 4835 scope.go:117] "RemoveContainer" containerID="7b9042d97bd75eb24c603b3b25bfd42fbe319b3290c7851e15e79438ba4bccb7" Mar 19 10:14:06 crc kubenswrapper[4835]: E0319 10:14:06.411283 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:14:06 crc kubenswrapper[4835]: I0319 10:14:06.746596 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565254-58j5z" event={"ID":"2da1b929-dd4b-44ca-8485-c895d36b5c9e","Type":"ContainerDied","Data":"c806f6be65d7ad7ac8e267e8e2f87cab09aa630f9ebde2150788ca1f74b671aa"} Mar 19 10:14:06 crc kubenswrapper[4835]: I0319 10:14:06.747011 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c806f6be65d7ad7ac8e267e8e2f87cab09aa630f9ebde2150788ca1f74b671aa" Mar 19 10:14:06 crc kubenswrapper[4835]: I0319 10:14:06.746768 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565254-58j5z" Mar 19 10:14:06 crc kubenswrapper[4835]: I0319 10:14:06.803270 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565248-rgkz9"] Mar 19 10:14:06 crc kubenswrapper[4835]: I0319 10:14:06.814910 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565248-rgkz9"] Mar 19 10:14:08 crc kubenswrapper[4835]: I0319 10:14:08.423503 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec4c69ae-8beb-4db1-b676-1dd6701aa6d2" path="/var/lib/kubelet/pods/ec4c69ae-8beb-4db1-b676-1dd6701aa6d2/volumes" Mar 19 10:14:17 crc kubenswrapper[4835]: I0319 10:14:17.403183 4835 scope.go:117] "RemoveContainer" containerID="7b9042d97bd75eb24c603b3b25bfd42fbe319b3290c7851e15e79438ba4bccb7" Mar 19 10:14:17 crc kubenswrapper[4835]: E0319 10:14:17.404145 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:14:30 crc kubenswrapper[4835]: I0319 10:14:30.402151 4835 scope.go:117] "RemoveContainer" containerID="7b9042d97bd75eb24c603b3b25bfd42fbe319b3290c7851e15e79438ba4bccb7" Mar 19 10:14:30 crc kubenswrapper[4835]: E0319 10:14:30.403201 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" 
podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:14:36 crc kubenswrapper[4835]: I0319 10:14:36.079309 4835 generic.go:334] "Generic (PLEG): container finished" podID="df6bbc12-578a-4c6e-a199-06baaf1ac6b8" containerID="a98501c60fa84d1d23a54cea91436d5535ffd9fe38bc80e8b6dd2e90af64c526" exitCode=0 Mar 19 10:14:36 crc kubenswrapper[4835]: I0319 10:14:36.079410 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc" event={"ID":"df6bbc12-578a-4c6e-a199-06baaf1ac6b8","Type":"ContainerDied","Data":"a98501c60fa84d1d23a54cea91436d5535ffd9fe38bc80e8b6dd2e90af64c526"} Mar 19 10:14:37 crc kubenswrapper[4835]: I0319 10:14:37.643861 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc" Mar 19 10:14:37 crc kubenswrapper[4835]: I0319 10:14:37.720904 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-ceilometer-ipmi-config-data-1\") pod \"df6bbc12-578a-4c6e-a199-06baaf1ac6b8\" (UID: \"df6bbc12-578a-4c6e-a199-06baaf1ac6b8\") " Mar 19 10:14:37 crc kubenswrapper[4835]: I0319 10:14:37.721897 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-ssh-key-openstack-edpm-ipam\") pod \"df6bbc12-578a-4c6e-a199-06baaf1ac6b8\" (UID: \"df6bbc12-578a-4c6e-a199-06baaf1ac6b8\") " Mar 19 10:14:37 crc kubenswrapper[4835]: I0319 10:14:37.721937 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-inventory\") pod \"df6bbc12-578a-4c6e-a199-06baaf1ac6b8\" (UID: \"df6bbc12-578a-4c6e-a199-06baaf1ac6b8\") " Mar 19 10:14:37 crc 
kubenswrapper[4835]: I0319 10:14:37.721982 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdb5v\" (UniqueName: \"kubernetes.io/projected/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-kube-api-access-wdb5v\") pod \"df6bbc12-578a-4c6e-a199-06baaf1ac6b8\" (UID: \"df6bbc12-578a-4c6e-a199-06baaf1ac6b8\") " Mar 19 10:14:37 crc kubenswrapper[4835]: I0319 10:14:37.722112 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-ceilometer-ipmi-config-data-0\") pod \"df6bbc12-578a-4c6e-a199-06baaf1ac6b8\" (UID: \"df6bbc12-578a-4c6e-a199-06baaf1ac6b8\") " Mar 19 10:14:37 crc kubenswrapper[4835]: I0319 10:14:37.722156 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-telemetry-power-monitoring-combined-ca-bundle\") pod \"df6bbc12-578a-4c6e-a199-06baaf1ac6b8\" (UID: \"df6bbc12-578a-4c6e-a199-06baaf1ac6b8\") " Mar 19 10:14:37 crc kubenswrapper[4835]: I0319 10:14:37.722179 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-ceilometer-ipmi-config-data-2\") pod \"df6bbc12-578a-4c6e-a199-06baaf1ac6b8\" (UID: \"df6bbc12-578a-4c6e-a199-06baaf1ac6b8\") " Mar 19 10:14:37 crc kubenswrapper[4835]: I0319 10:14:37.740853 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-kube-api-access-wdb5v" (OuterVolumeSpecName: "kube-api-access-wdb5v") pod "df6bbc12-578a-4c6e-a199-06baaf1ac6b8" (UID: "df6bbc12-578a-4c6e-a199-06baaf1ac6b8"). InnerVolumeSpecName "kube-api-access-wdb5v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:14:37 crc kubenswrapper[4835]: I0319 10:14:37.751923 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "df6bbc12-578a-4c6e-a199-06baaf1ac6b8" (UID: "df6bbc12-578a-4c6e-a199-06baaf1ac6b8"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:14:37 crc kubenswrapper[4835]: I0319 10:14:37.755416 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-ceilometer-ipmi-config-data-2" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-2") pod "df6bbc12-578a-4c6e-a199-06baaf1ac6b8" (UID: "df6bbc12-578a-4c6e-a199-06baaf1ac6b8"). InnerVolumeSpecName "ceilometer-ipmi-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:14:37 crc kubenswrapper[4835]: I0319 10:14:37.757987 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "df6bbc12-578a-4c6e-a199-06baaf1ac6b8" (UID: "df6bbc12-578a-4c6e-a199-06baaf1ac6b8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:14:37 crc kubenswrapper[4835]: I0319 10:14:37.769424 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-ceilometer-ipmi-config-data-1" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-1") pod "df6bbc12-578a-4c6e-a199-06baaf1ac6b8" (UID: "df6bbc12-578a-4c6e-a199-06baaf1ac6b8"). InnerVolumeSpecName "ceilometer-ipmi-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:14:37 crc kubenswrapper[4835]: I0319 10:14:37.777415 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-inventory" (OuterVolumeSpecName: "inventory") pod "df6bbc12-578a-4c6e-a199-06baaf1ac6b8" (UID: "df6bbc12-578a-4c6e-a199-06baaf1ac6b8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:14:37 crc kubenswrapper[4835]: I0319 10:14:37.785613 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-ceilometer-ipmi-config-data-0" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-0") pod "df6bbc12-578a-4c6e-a199-06baaf1ac6b8" (UID: "df6bbc12-578a-4c6e-a199-06baaf1ac6b8"). InnerVolumeSpecName "ceilometer-ipmi-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:14:37 crc kubenswrapper[4835]: I0319 10:14:37.825854 4835 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-ceilometer-ipmi-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 19 10:14:37 crc kubenswrapper[4835]: I0319 10:14:37.825891 4835 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:14:37 crc kubenswrapper[4835]: I0319 10:14:37.825902 4835 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-ceilometer-ipmi-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 19 10:14:37 crc kubenswrapper[4835]: I0319 10:14:37.825912 4835 reconciler_common.go:293] "Volume detached for volume 
\"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-ceilometer-ipmi-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 19 10:14:37 crc kubenswrapper[4835]: I0319 10:14:37.825921 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 10:14:37 crc kubenswrapper[4835]: I0319 10:14:37.825930 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 10:14:37 crc kubenswrapper[4835]: I0319 10:14:37.825938 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdb5v\" (UniqueName: \"kubernetes.io/projected/df6bbc12-578a-4c6e-a199-06baaf1ac6b8-kube-api-access-wdb5v\") on node \"crc\" DevicePath \"\"" Mar 19 10:14:38 crc kubenswrapper[4835]: I0319 10:14:38.103443 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc" event={"ID":"df6bbc12-578a-4c6e-a199-06baaf1ac6b8","Type":"ContainerDied","Data":"47bcfcc50db79a290aa24f0f54133822cd79b3e35e89aa57c2f1ccedd5c5e55c"} Mar 19 10:14:38 crc kubenswrapper[4835]: I0319 10:14:38.103488 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47bcfcc50db79a290aa24f0f54133822cd79b3e35e89aa57c2f1ccedd5c5e55c" Mar 19 10:14:38 crc kubenswrapper[4835]: I0319 10:14:38.103522 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc" Mar 19 10:14:38 crc kubenswrapper[4835]: I0319 10:14:38.207670 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-hp9k6"] Mar 19 10:14:38 crc kubenswrapper[4835]: E0319 10:14:38.208235 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df6bbc12-578a-4c6e-a199-06baaf1ac6b8" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Mar 19 10:14:38 crc kubenswrapper[4835]: I0319 10:14:38.208349 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="df6bbc12-578a-4c6e-a199-06baaf1ac6b8" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Mar 19 10:14:38 crc kubenswrapper[4835]: E0319 10:14:38.208402 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2da1b929-dd4b-44ca-8485-c895d36b5c9e" containerName="oc" Mar 19 10:14:38 crc kubenswrapper[4835]: I0319 10:14:38.208411 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2da1b929-dd4b-44ca-8485-c895d36b5c9e" containerName="oc" Mar 19 10:14:38 crc kubenswrapper[4835]: I0319 10:14:38.213768 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="2da1b929-dd4b-44ca-8485-c895d36b5c9e" containerName="oc" Mar 19 10:14:38 crc kubenswrapper[4835]: I0319 10:14:38.213804 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="df6bbc12-578a-4c6e-a199-06baaf1ac6b8" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Mar 19 10:14:38 crc kubenswrapper[4835]: I0319 10:14:38.214668 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-hp9k6" Mar 19 10:14:38 crc kubenswrapper[4835]: I0319 10:14:38.217042 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 10:14:38 crc kubenswrapper[4835]: I0319 10:14:38.217441 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 10:14:38 crc kubenswrapper[4835]: I0319 10:14:38.221484 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"logging-compute-config-data" Mar 19 10:14:38 crc kubenswrapper[4835]: I0319 10:14:38.221562 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 10:14:38 crc kubenswrapper[4835]: I0319 10:14:38.221609 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-ldz2g" Mar 19 10:14:38 crc kubenswrapper[4835]: I0319 10:14:38.231353 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-hp9k6"] Mar 19 10:14:38 crc kubenswrapper[4835]: I0319 10:14:38.235370 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7e28c8e5-047a-4ea3-8887-7fbca62a9104-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-hp9k6\" (UID: \"7e28c8e5-047a-4ea3-8887-7fbca62a9104\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-hp9k6" Mar 19 10:14:38 crc kubenswrapper[4835]: I0319 10:14:38.236172 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbgpl\" (UniqueName: \"kubernetes.io/projected/7e28c8e5-047a-4ea3-8887-7fbca62a9104-kube-api-access-nbgpl\") pod \"logging-edpm-deployment-openstack-edpm-ipam-hp9k6\" (UID: 
\"7e28c8e5-047a-4ea3-8887-7fbca62a9104\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-hp9k6" Mar 19 10:14:38 crc kubenswrapper[4835]: I0319 10:14:38.236563 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7e28c8e5-047a-4ea3-8887-7fbca62a9104-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-hp9k6\" (UID: \"7e28c8e5-047a-4ea3-8887-7fbca62a9104\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-hp9k6" Mar 19 10:14:38 crc kubenswrapper[4835]: I0319 10:14:38.236768 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e28c8e5-047a-4ea3-8887-7fbca62a9104-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-hp9k6\" (UID: \"7e28c8e5-047a-4ea3-8887-7fbca62a9104\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-hp9k6" Mar 19 10:14:38 crc kubenswrapper[4835]: I0319 10:14:38.236947 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e28c8e5-047a-4ea3-8887-7fbca62a9104-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-hp9k6\" (UID: \"7e28c8e5-047a-4ea3-8887-7fbca62a9104\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-hp9k6" Mar 19 10:14:38 crc kubenswrapper[4835]: I0319 10:14:38.339585 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7e28c8e5-047a-4ea3-8887-7fbca62a9104-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-hp9k6\" (UID: \"7e28c8e5-047a-4ea3-8887-7fbca62a9104\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-hp9k6" Mar 19 10:14:38 crc kubenswrapper[4835]: 
I0319 10:14:38.339706 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e28c8e5-047a-4ea3-8887-7fbca62a9104-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-hp9k6\" (UID: \"7e28c8e5-047a-4ea3-8887-7fbca62a9104\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-hp9k6" Mar 19 10:14:38 crc kubenswrapper[4835]: I0319 10:14:38.339729 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e28c8e5-047a-4ea3-8887-7fbca62a9104-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-hp9k6\" (UID: \"7e28c8e5-047a-4ea3-8887-7fbca62a9104\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-hp9k6" Mar 19 10:14:38 crc kubenswrapper[4835]: I0319 10:14:38.339937 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7e28c8e5-047a-4ea3-8887-7fbca62a9104-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-hp9k6\" (UID: \"7e28c8e5-047a-4ea3-8887-7fbca62a9104\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-hp9k6" Mar 19 10:14:38 crc kubenswrapper[4835]: I0319 10:14:38.340110 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbgpl\" (UniqueName: \"kubernetes.io/projected/7e28c8e5-047a-4ea3-8887-7fbca62a9104-kube-api-access-nbgpl\") pod \"logging-edpm-deployment-openstack-edpm-ipam-hp9k6\" (UID: \"7e28c8e5-047a-4ea3-8887-7fbca62a9104\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-hp9k6" Mar 19 10:14:38 crc kubenswrapper[4835]: I0319 10:14:38.344630 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/7e28c8e5-047a-4ea3-8887-7fbca62a9104-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-hp9k6\" (UID: \"7e28c8e5-047a-4ea3-8887-7fbca62a9104\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-hp9k6" Mar 19 10:14:38 crc kubenswrapper[4835]: I0319 10:14:38.344665 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e28c8e5-047a-4ea3-8887-7fbca62a9104-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-hp9k6\" (UID: \"7e28c8e5-047a-4ea3-8887-7fbca62a9104\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-hp9k6" Mar 19 10:14:38 crc kubenswrapper[4835]: I0319 10:14:38.346506 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e28c8e5-047a-4ea3-8887-7fbca62a9104-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-hp9k6\" (UID: \"7e28c8e5-047a-4ea3-8887-7fbca62a9104\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-hp9k6" Mar 19 10:14:38 crc kubenswrapper[4835]: I0319 10:14:38.348382 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7e28c8e5-047a-4ea3-8887-7fbca62a9104-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-hp9k6\" (UID: \"7e28c8e5-047a-4ea3-8887-7fbca62a9104\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-hp9k6" Mar 19 10:14:38 crc kubenswrapper[4835]: I0319 10:14:38.364776 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbgpl\" (UniqueName: \"kubernetes.io/projected/7e28c8e5-047a-4ea3-8887-7fbca62a9104-kube-api-access-nbgpl\") pod \"logging-edpm-deployment-openstack-edpm-ipam-hp9k6\" (UID: \"7e28c8e5-047a-4ea3-8887-7fbca62a9104\") " 
pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-hp9k6" Mar 19 10:14:38 crc kubenswrapper[4835]: I0319 10:14:38.546264 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-hp9k6" Mar 19 10:14:39 crc kubenswrapper[4835]: I0319 10:14:39.112316 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-hp9k6"] Mar 19 10:14:39 crc kubenswrapper[4835]: I0319 10:14:39.123074 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 10:14:40 crc kubenswrapper[4835]: I0319 10:14:40.125726 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-hp9k6" event={"ID":"7e28c8e5-047a-4ea3-8887-7fbca62a9104","Type":"ContainerStarted","Data":"3e5318cafc8dd2e268b14b72d094b60383c2e40de281f3c4d4331aa758d54da3"} Mar 19 10:14:40 crc kubenswrapper[4835]: I0319 10:14:40.125969 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-hp9k6" event={"ID":"7e28c8e5-047a-4ea3-8887-7fbca62a9104","Type":"ContainerStarted","Data":"b4c97ada4deca0c0a2fa8044e961f44fc3046be0b414ddc2066cf19751ea7cb2"} Mar 19 10:14:40 crc kubenswrapper[4835]: I0319 10:14:40.147329 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-hp9k6" podStartSLOduration=1.59624452 podStartE2EDuration="2.147307096s" podCreationTimestamp="2026-03-19 10:14:38 +0000 UTC" firstStartedPulling="2026-03-19 10:14:39.122837979 +0000 UTC m=+3133.971436566" lastFinishedPulling="2026-03-19 10:14:39.673900555 +0000 UTC m=+3134.522499142" observedRunningTime="2026-03-19 10:14:40.140537506 +0000 UTC m=+3134.989136113" watchObservedRunningTime="2026-03-19 10:14:40.147307096 +0000 UTC m=+3134.995905683" Mar 19 10:14:41 crc kubenswrapper[4835]: I0319 
10:14:41.402365 4835 scope.go:117] "RemoveContainer" containerID="7b9042d97bd75eb24c603b3b25bfd42fbe319b3290c7851e15e79438ba4bccb7" Mar 19 10:14:41 crc kubenswrapper[4835]: E0319 10:14:41.403342 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:14:52 crc kubenswrapper[4835]: I0319 10:14:52.403568 4835 scope.go:117] "RemoveContainer" containerID="7b9042d97bd75eb24c603b3b25bfd42fbe319b3290c7851e15e79438ba4bccb7" Mar 19 10:14:52 crc kubenswrapper[4835]: E0319 10:14:52.405127 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:14:54 crc kubenswrapper[4835]: I0319 10:14:54.310191 4835 generic.go:334] "Generic (PLEG): container finished" podID="7e28c8e5-047a-4ea3-8887-7fbca62a9104" containerID="3e5318cafc8dd2e268b14b72d094b60383c2e40de281f3c4d4331aa758d54da3" exitCode=0 Mar 19 10:14:54 crc kubenswrapper[4835]: I0319 10:14:54.310383 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-hp9k6" event={"ID":"7e28c8e5-047a-4ea3-8887-7fbca62a9104","Type":"ContainerDied","Data":"3e5318cafc8dd2e268b14b72d094b60383c2e40de281f3c4d4331aa758d54da3"} Mar 19 10:14:55 crc kubenswrapper[4835]: I0319 10:14:55.904829 4835 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-hp9k6" Mar 19 10:14:56 crc kubenswrapper[4835]: I0319 10:14:56.027727 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7e28c8e5-047a-4ea3-8887-7fbca62a9104-logging-compute-config-data-1\") pod \"7e28c8e5-047a-4ea3-8887-7fbca62a9104\" (UID: \"7e28c8e5-047a-4ea3-8887-7fbca62a9104\") " Mar 19 10:14:56 crc kubenswrapper[4835]: I0319 10:14:56.028550 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbgpl\" (UniqueName: \"kubernetes.io/projected/7e28c8e5-047a-4ea3-8887-7fbca62a9104-kube-api-access-nbgpl\") pod \"7e28c8e5-047a-4ea3-8887-7fbca62a9104\" (UID: \"7e28c8e5-047a-4ea3-8887-7fbca62a9104\") " Mar 19 10:14:56 crc kubenswrapper[4835]: I0319 10:14:56.028583 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e28c8e5-047a-4ea3-8887-7fbca62a9104-ssh-key-openstack-edpm-ipam\") pod \"7e28c8e5-047a-4ea3-8887-7fbca62a9104\" (UID: \"7e28c8e5-047a-4ea3-8887-7fbca62a9104\") " Mar 19 10:14:56 crc kubenswrapper[4835]: I0319 10:14:56.028979 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7e28c8e5-047a-4ea3-8887-7fbca62a9104-logging-compute-config-data-0\") pod \"7e28c8e5-047a-4ea3-8887-7fbca62a9104\" (UID: \"7e28c8e5-047a-4ea3-8887-7fbca62a9104\") " Mar 19 10:14:56 crc kubenswrapper[4835]: I0319 10:14:56.029067 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e28c8e5-047a-4ea3-8887-7fbca62a9104-inventory\") pod \"7e28c8e5-047a-4ea3-8887-7fbca62a9104\" (UID: \"7e28c8e5-047a-4ea3-8887-7fbca62a9104\") " Mar 19 10:14:56 crc kubenswrapper[4835]: I0319 
10:14:56.038064 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e28c8e5-047a-4ea3-8887-7fbca62a9104-kube-api-access-nbgpl" (OuterVolumeSpecName: "kube-api-access-nbgpl") pod "7e28c8e5-047a-4ea3-8887-7fbca62a9104" (UID: "7e28c8e5-047a-4ea3-8887-7fbca62a9104"). InnerVolumeSpecName "kube-api-access-nbgpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:14:56 crc kubenswrapper[4835]: I0319 10:14:56.075576 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e28c8e5-047a-4ea3-8887-7fbca62a9104-inventory" (OuterVolumeSpecName: "inventory") pod "7e28c8e5-047a-4ea3-8887-7fbca62a9104" (UID: "7e28c8e5-047a-4ea3-8887-7fbca62a9104"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:14:56 crc kubenswrapper[4835]: I0319 10:14:56.089599 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e28c8e5-047a-4ea3-8887-7fbca62a9104-logging-compute-config-data-1" (OuterVolumeSpecName: "logging-compute-config-data-1") pod "7e28c8e5-047a-4ea3-8887-7fbca62a9104" (UID: "7e28c8e5-047a-4ea3-8887-7fbca62a9104"). InnerVolumeSpecName "logging-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:14:56 crc kubenswrapper[4835]: I0319 10:14:56.092354 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e28c8e5-047a-4ea3-8887-7fbca62a9104-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7e28c8e5-047a-4ea3-8887-7fbca62a9104" (UID: "7e28c8e5-047a-4ea3-8887-7fbca62a9104"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:14:56 crc kubenswrapper[4835]: I0319 10:14:56.110177 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e28c8e5-047a-4ea3-8887-7fbca62a9104-logging-compute-config-data-0" (OuterVolumeSpecName: "logging-compute-config-data-0") pod "7e28c8e5-047a-4ea3-8887-7fbca62a9104" (UID: "7e28c8e5-047a-4ea3-8887-7fbca62a9104"). InnerVolumeSpecName "logging-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:14:56 crc kubenswrapper[4835]: I0319 10:14:56.136093 4835 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7e28c8e5-047a-4ea3-8887-7fbca62a9104-logging-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 19 10:14:56 crc kubenswrapper[4835]: I0319 10:14:56.136134 4835 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e28c8e5-047a-4ea3-8887-7fbca62a9104-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 10:14:56 crc kubenswrapper[4835]: I0319 10:14:56.136147 4835 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7e28c8e5-047a-4ea3-8887-7fbca62a9104-logging-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 19 10:14:56 crc kubenswrapper[4835]: I0319 10:14:56.136157 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbgpl\" (UniqueName: \"kubernetes.io/projected/7e28c8e5-047a-4ea3-8887-7fbca62a9104-kube-api-access-nbgpl\") on node \"crc\" DevicePath \"\"" Mar 19 10:14:56 crc kubenswrapper[4835]: I0319 10:14:56.136167 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e28c8e5-047a-4ea3-8887-7fbca62a9104-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 10:14:56 crc kubenswrapper[4835]: I0319 
10:14:56.335344 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-hp9k6" event={"ID":"7e28c8e5-047a-4ea3-8887-7fbca62a9104","Type":"ContainerDied","Data":"b4c97ada4deca0c0a2fa8044e961f44fc3046be0b414ddc2066cf19751ea7cb2"} Mar 19 10:14:56 crc kubenswrapper[4835]: I0319 10:14:56.335412 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4c97ada4deca0c0a2fa8044e961f44fc3046be0b414ddc2066cf19751ea7cb2" Mar 19 10:14:56 crc kubenswrapper[4835]: I0319 10:14:56.335421 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-hp9k6" Mar 19 10:14:57 crc kubenswrapper[4835]: I0319 10:14:57.194187 4835 scope.go:117] "RemoveContainer" containerID="36815eb4e86923a5b03874c63d9f97eff61679746796609fd2e6d059968c3d87" Mar 19 10:15:00 crc kubenswrapper[4835]: I0319 10:15:00.159357 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565255-8v442"] Mar 19 10:15:00 crc kubenswrapper[4835]: E0319 10:15:00.161140 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e28c8e5-047a-4ea3-8887-7fbca62a9104" containerName="logging-edpm-deployment-openstack-edpm-ipam" Mar 19 10:15:00 crc kubenswrapper[4835]: I0319 10:15:00.161160 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e28c8e5-047a-4ea3-8887-7fbca62a9104" containerName="logging-edpm-deployment-openstack-edpm-ipam" Mar 19 10:15:00 crc kubenswrapper[4835]: I0319 10:15:00.162558 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e28c8e5-047a-4ea3-8887-7fbca62a9104" containerName="logging-edpm-deployment-openstack-edpm-ipam" Mar 19 10:15:00 crc kubenswrapper[4835]: I0319 10:15:00.164077 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565255-8v442" Mar 19 10:15:00 crc kubenswrapper[4835]: I0319 10:15:00.167052 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 10:15:00 crc kubenswrapper[4835]: I0319 10:15:00.167244 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 10:15:00 crc kubenswrapper[4835]: I0319 10:15:00.176549 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565255-8v442"] Mar 19 10:15:00 crc kubenswrapper[4835]: I0319 10:15:00.254874 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrrcf\" (UniqueName: \"kubernetes.io/projected/2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6-kube-api-access-hrrcf\") pod \"collect-profiles-29565255-8v442\" (UID: \"2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565255-8v442" Mar 19 10:15:00 crc kubenswrapper[4835]: I0319 10:15:00.255044 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6-config-volume\") pod \"collect-profiles-29565255-8v442\" (UID: \"2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565255-8v442" Mar 19 10:15:00 crc kubenswrapper[4835]: I0319 10:15:00.255137 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6-secret-volume\") pod \"collect-profiles-29565255-8v442\" (UID: \"2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29565255-8v442" Mar 19 10:15:00 crc kubenswrapper[4835]: I0319 10:15:00.358175 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrrcf\" (UniqueName: \"kubernetes.io/projected/2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6-kube-api-access-hrrcf\") pod \"collect-profiles-29565255-8v442\" (UID: \"2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565255-8v442" Mar 19 10:15:00 crc kubenswrapper[4835]: I0319 10:15:00.358329 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6-config-volume\") pod \"collect-profiles-29565255-8v442\" (UID: \"2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565255-8v442" Mar 19 10:15:00 crc kubenswrapper[4835]: I0319 10:15:00.358421 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6-secret-volume\") pod \"collect-profiles-29565255-8v442\" (UID: \"2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565255-8v442" Mar 19 10:15:00 crc kubenswrapper[4835]: I0319 10:15:00.360767 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6-config-volume\") pod \"collect-profiles-29565255-8v442\" (UID: \"2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565255-8v442" Mar 19 10:15:00 crc kubenswrapper[4835]: I0319 10:15:00.366677 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6-secret-volume\") pod \"collect-profiles-29565255-8v442\" (UID: \"2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565255-8v442" Mar 19 10:15:00 crc kubenswrapper[4835]: I0319 10:15:00.384977 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrrcf\" (UniqueName: \"kubernetes.io/projected/2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6-kube-api-access-hrrcf\") pod \"collect-profiles-29565255-8v442\" (UID: \"2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565255-8v442" Mar 19 10:15:00 crc kubenswrapper[4835]: I0319 10:15:00.493299 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565255-8v442" Mar 19 10:15:01 crc kubenswrapper[4835]: I0319 10:15:01.022715 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565255-8v442"] Mar 19 10:15:01 crc kubenswrapper[4835]: I0319 10:15:01.411591 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565255-8v442" event={"ID":"2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6","Type":"ContainerStarted","Data":"887c71a0c50755c1c2c5bc9ae0e6a404a3f50cc6c8492f90ccd210dc9a8865a7"} Mar 19 10:15:01 crc kubenswrapper[4835]: I0319 10:15:01.411923 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565255-8v442" event={"ID":"2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6","Type":"ContainerStarted","Data":"b95396af935e8b70632ec477044356687207b663cc96fcaa511b236f85b43633"} Mar 19 10:15:01 crc kubenswrapper[4835]: I0319 10:15:01.441531 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29565255-8v442" 
podStartSLOduration=1.441509307 podStartE2EDuration="1.441509307s" podCreationTimestamp="2026-03-19 10:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:15:01.434455427 +0000 UTC m=+3156.283054024" watchObservedRunningTime="2026-03-19 10:15:01.441509307 +0000 UTC m=+3156.290107894" Mar 19 10:15:02 crc kubenswrapper[4835]: I0319 10:15:02.422895 4835 generic.go:334] "Generic (PLEG): container finished" podID="2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6" containerID="887c71a0c50755c1c2c5bc9ae0e6a404a3f50cc6c8492f90ccd210dc9a8865a7" exitCode=0 Mar 19 10:15:02 crc kubenswrapper[4835]: I0319 10:15:02.422928 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565255-8v442" event={"ID":"2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6","Type":"ContainerDied","Data":"887c71a0c50755c1c2c5bc9ae0e6a404a3f50cc6c8492f90ccd210dc9a8865a7"} Mar 19 10:15:03 crc kubenswrapper[4835]: I0319 10:15:03.403112 4835 scope.go:117] "RemoveContainer" containerID="7b9042d97bd75eb24c603b3b25bfd42fbe319b3290c7851e15e79438ba4bccb7" Mar 19 10:15:03 crc kubenswrapper[4835]: E0319 10:15:03.403391 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:15:03 crc kubenswrapper[4835]: I0319 10:15:03.827750 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565255-8v442" Mar 19 10:15:03 crc kubenswrapper[4835]: I0319 10:15:03.950256 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6-config-volume\") pod \"2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6\" (UID: \"2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6\") " Mar 19 10:15:03 crc kubenswrapper[4835]: I0319 10:15:03.950618 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrrcf\" (UniqueName: \"kubernetes.io/projected/2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6-kube-api-access-hrrcf\") pod \"2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6\" (UID: \"2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6\") " Mar 19 10:15:03 crc kubenswrapper[4835]: I0319 10:15:03.950718 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6-secret-volume\") pod \"2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6\" (UID: \"2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6\") " Mar 19 10:15:03 crc kubenswrapper[4835]: I0319 10:15:03.951144 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6-config-volume" (OuterVolumeSpecName: "config-volume") pod "2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6" (UID: "2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:15:03 crc kubenswrapper[4835]: I0319 10:15:03.951965 4835 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 10:15:03 crc kubenswrapper[4835]: I0319 10:15:03.959466 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6" (UID: "2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:15:03 crc kubenswrapper[4835]: I0319 10:15:03.959466 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6-kube-api-access-hrrcf" (OuterVolumeSpecName: "kube-api-access-hrrcf") pod "2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6" (UID: "2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6"). InnerVolumeSpecName "kube-api-access-hrrcf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:15:04 crc kubenswrapper[4835]: I0319 10:15:04.055539 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrrcf\" (UniqueName: \"kubernetes.io/projected/2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6-kube-api-access-hrrcf\") on node \"crc\" DevicePath \"\"" Mar 19 10:15:04 crc kubenswrapper[4835]: I0319 10:15:04.055612 4835 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 10:15:04 crc kubenswrapper[4835]: I0319 10:15:04.452541 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565255-8v442" event={"ID":"2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6","Type":"ContainerDied","Data":"b95396af935e8b70632ec477044356687207b663cc96fcaa511b236f85b43633"} Mar 19 10:15:04 crc kubenswrapper[4835]: I0319 10:15:04.453179 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b95396af935e8b70632ec477044356687207b663cc96fcaa511b236f85b43633" Mar 19 10:15:04 crc kubenswrapper[4835]: I0319 10:15:04.452669 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565255-8v442" Mar 19 10:15:04 crc kubenswrapper[4835]: I0319 10:15:04.518163 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565210-v9lr4"] Mar 19 10:15:04 crc kubenswrapper[4835]: I0319 10:15:04.528272 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565210-v9lr4"] Mar 19 10:15:06 crc kubenswrapper[4835]: I0319 10:15:06.417232 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4572ae2-65b7-406a-9113-9ac2447a00e5" path="/var/lib/kubelet/pods/b4572ae2-65b7-406a-9113-9ac2447a00e5/volumes" Mar 19 10:15:17 crc kubenswrapper[4835]: I0319 10:15:17.402585 4835 scope.go:117] "RemoveContainer" containerID="7b9042d97bd75eb24c603b3b25bfd42fbe319b3290c7851e15e79438ba4bccb7" Mar 19 10:15:17 crc kubenswrapper[4835]: E0319 10:15:17.403535 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:15:30 crc kubenswrapper[4835]: I0319 10:15:30.403813 4835 scope.go:117] "RemoveContainer" containerID="7b9042d97bd75eb24c603b3b25bfd42fbe319b3290c7851e15e79438ba4bccb7" Mar 19 10:15:30 crc kubenswrapper[4835]: E0319 10:15:30.404961 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:15:41 crc kubenswrapper[4835]: E0319 10:15:41.628605 4835 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.116:39134->38.129.56.116:37913: write tcp 38.129.56.116:39134->38.129.56.116:37913: write: broken pipe Mar 19 10:15:43 crc kubenswrapper[4835]: I0319 10:15:43.403534 4835 scope.go:117] "RemoveContainer" containerID="7b9042d97bd75eb24c603b3b25bfd42fbe319b3290c7851e15e79438ba4bccb7" Mar 19 10:15:43 crc kubenswrapper[4835]: E0319 10:15:43.404086 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:15:56 crc kubenswrapper[4835]: I0319 10:15:56.409621 4835 scope.go:117] "RemoveContainer" containerID="7b9042d97bd75eb24c603b3b25bfd42fbe319b3290c7851e15e79438ba4bccb7" Mar 19 10:15:56 crc kubenswrapper[4835]: E0319 10:15:56.410489 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:15:57 crc kubenswrapper[4835]: I0319 10:15:57.301812 4835 scope.go:117] "RemoveContainer" containerID="2cf8203a278fd2fddce79a3dd6f22a01a7a8e460a75e0649b883480f9b12088f" Mar 19 10:16:00 crc kubenswrapper[4835]: I0319 10:16:00.150137 4835 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-infra/auto-csr-approver-29565256-vsjg5"] Mar 19 10:16:00 crc kubenswrapper[4835]: E0319 10:16:00.151564 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6" containerName="collect-profiles" Mar 19 10:16:00 crc kubenswrapper[4835]: I0319 10:16:00.151598 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6" containerName="collect-profiles" Mar 19 10:16:00 crc kubenswrapper[4835]: I0319 10:16:00.151914 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6" containerName="collect-profiles" Mar 19 10:16:00 crc kubenswrapper[4835]: I0319 10:16:00.152997 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565256-vsjg5" Mar 19 10:16:00 crc kubenswrapper[4835]: I0319 10:16:00.160957 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:16:00 crc kubenswrapper[4835]: I0319 10:16:00.161115 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:16:00 crc kubenswrapper[4835]: I0319 10:16:00.161187 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 10:16:00 crc kubenswrapper[4835]: I0319 10:16:00.161420 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565256-vsjg5"] Mar 19 10:16:00 crc kubenswrapper[4835]: I0319 10:16:00.281604 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv56w\" (UniqueName: \"kubernetes.io/projected/4e90d8d2-fb84-434a-9386-e527e339364e-kube-api-access-dv56w\") pod \"auto-csr-approver-29565256-vsjg5\" (UID: \"4e90d8d2-fb84-434a-9386-e527e339364e\") " 
pod="openshift-infra/auto-csr-approver-29565256-vsjg5" Mar 19 10:16:00 crc kubenswrapper[4835]: I0319 10:16:00.383831 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv56w\" (UniqueName: \"kubernetes.io/projected/4e90d8d2-fb84-434a-9386-e527e339364e-kube-api-access-dv56w\") pod \"auto-csr-approver-29565256-vsjg5\" (UID: \"4e90d8d2-fb84-434a-9386-e527e339364e\") " pod="openshift-infra/auto-csr-approver-29565256-vsjg5" Mar 19 10:16:00 crc kubenswrapper[4835]: I0319 10:16:00.402298 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv56w\" (UniqueName: \"kubernetes.io/projected/4e90d8d2-fb84-434a-9386-e527e339364e-kube-api-access-dv56w\") pod \"auto-csr-approver-29565256-vsjg5\" (UID: \"4e90d8d2-fb84-434a-9386-e527e339364e\") " pod="openshift-infra/auto-csr-approver-29565256-vsjg5" Mar 19 10:16:00 crc kubenswrapper[4835]: I0319 10:16:00.513039 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565256-vsjg5" Mar 19 10:16:01 crc kubenswrapper[4835]: I0319 10:16:01.011097 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565256-vsjg5"] Mar 19 10:16:01 crc kubenswrapper[4835]: I0319 10:16:01.172059 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565256-vsjg5" event={"ID":"4e90d8d2-fb84-434a-9386-e527e339364e","Type":"ContainerStarted","Data":"dc69caa56e174d3a51794ca8075d1662b218c19b1aa3d39d2e5f859849d93c77"} Mar 19 10:16:03 crc kubenswrapper[4835]: I0319 10:16:03.222544 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565256-vsjg5" event={"ID":"4e90d8d2-fb84-434a-9386-e527e339364e","Type":"ContainerStarted","Data":"ef8af7b352d8576ebadc7473b141cb5ba77c31fb98ff534885e24cde8b4240e9"} Mar 19 10:16:03 crc kubenswrapper[4835]: I0319 10:16:03.252873 4835 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565256-vsjg5" podStartSLOduration=1.792966237 podStartE2EDuration="3.252851065s" podCreationTimestamp="2026-03-19 10:16:00 +0000 UTC" firstStartedPulling="2026-03-19 10:16:00.998605665 +0000 UTC m=+3215.847204252" lastFinishedPulling="2026-03-19 10:16:02.458490493 +0000 UTC m=+3217.307089080" observedRunningTime="2026-03-19 10:16:03.2491311 +0000 UTC m=+3218.097729687" watchObservedRunningTime="2026-03-19 10:16:03.252851065 +0000 UTC m=+3218.101449652" Mar 19 10:16:04 crc kubenswrapper[4835]: I0319 10:16:04.235180 4835 generic.go:334] "Generic (PLEG): container finished" podID="4e90d8d2-fb84-434a-9386-e527e339364e" containerID="ef8af7b352d8576ebadc7473b141cb5ba77c31fb98ff534885e24cde8b4240e9" exitCode=0 Mar 19 10:16:04 crc kubenswrapper[4835]: I0319 10:16:04.235639 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565256-vsjg5" event={"ID":"4e90d8d2-fb84-434a-9386-e527e339364e","Type":"ContainerDied","Data":"ef8af7b352d8576ebadc7473b141cb5ba77c31fb98ff534885e24cde8b4240e9"} Mar 19 10:16:05 crc kubenswrapper[4835]: I0319 10:16:05.651222 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565256-vsjg5" Mar 19 10:16:05 crc kubenswrapper[4835]: I0319 10:16:05.765382 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dv56w\" (UniqueName: \"kubernetes.io/projected/4e90d8d2-fb84-434a-9386-e527e339364e-kube-api-access-dv56w\") pod \"4e90d8d2-fb84-434a-9386-e527e339364e\" (UID: \"4e90d8d2-fb84-434a-9386-e527e339364e\") " Mar 19 10:16:05 crc kubenswrapper[4835]: I0319 10:16:05.776856 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e90d8d2-fb84-434a-9386-e527e339364e-kube-api-access-dv56w" (OuterVolumeSpecName: "kube-api-access-dv56w") pod "4e90d8d2-fb84-434a-9386-e527e339364e" (UID: "4e90d8d2-fb84-434a-9386-e527e339364e"). InnerVolumeSpecName "kube-api-access-dv56w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:16:05 crc kubenswrapper[4835]: I0319 10:16:05.869361 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dv56w\" (UniqueName: \"kubernetes.io/projected/4e90d8d2-fb84-434a-9386-e527e339364e-kube-api-access-dv56w\") on node \"crc\" DevicePath \"\"" Mar 19 10:16:06 crc kubenswrapper[4835]: I0319 10:16:06.258554 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565256-vsjg5" event={"ID":"4e90d8d2-fb84-434a-9386-e527e339364e","Type":"ContainerDied","Data":"dc69caa56e174d3a51794ca8075d1662b218c19b1aa3d39d2e5f859849d93c77"} Mar 19 10:16:06 crc kubenswrapper[4835]: I0319 10:16:06.258937 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc69caa56e174d3a51794ca8075d1662b218c19b1aa3d39d2e5f859849d93c77" Mar 19 10:16:06 crc kubenswrapper[4835]: I0319 10:16:06.258620 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565256-vsjg5" Mar 19 10:16:06 crc kubenswrapper[4835]: I0319 10:16:06.769946 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565250-bdm26"] Mar 19 10:16:06 crc kubenswrapper[4835]: I0319 10:16:06.783219 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565250-bdm26"] Mar 19 10:16:08 crc kubenswrapper[4835]: I0319 10:16:08.417209 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4232fb3-0e4c-40a6-b0c4-ac32dea524d8" path="/var/lib/kubelet/pods/e4232fb3-0e4c-40a6-b0c4-ac32dea524d8/volumes" Mar 19 10:16:11 crc kubenswrapper[4835]: I0319 10:16:11.402021 4835 scope.go:117] "RemoveContainer" containerID="7b9042d97bd75eb24c603b3b25bfd42fbe319b3290c7851e15e79438ba4bccb7" Mar 19 10:16:11 crc kubenswrapper[4835]: E0319 10:16:11.403139 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:16:26 crc kubenswrapper[4835]: I0319 10:16:26.414507 4835 scope.go:117] "RemoveContainer" containerID="7b9042d97bd75eb24c603b3b25bfd42fbe319b3290c7851e15e79438ba4bccb7" Mar 19 10:16:26 crc kubenswrapper[4835]: E0319 10:16:26.415553 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" 
podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:16:38 crc kubenswrapper[4835]: I0319 10:16:38.402431 4835 scope.go:117] "RemoveContainer" containerID="7b9042d97bd75eb24c603b3b25bfd42fbe319b3290c7851e15e79438ba4bccb7" Mar 19 10:16:38 crc kubenswrapper[4835]: E0319 10:16:38.403228 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:16:49 crc kubenswrapper[4835]: I0319 10:16:49.402833 4835 scope.go:117] "RemoveContainer" containerID="7b9042d97bd75eb24c603b3b25bfd42fbe319b3290c7851e15e79438ba4bccb7" Mar 19 10:16:49 crc kubenswrapper[4835]: E0319 10:16:49.403815 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:16:57 crc kubenswrapper[4835]: I0319 10:16:57.385346 4835 scope.go:117] "RemoveContainer" containerID="6a9c24f4c7e436ccc9d6bf45977fe9b7aeee71a0c249b317db620097ce733e76" Mar 19 10:17:04 crc kubenswrapper[4835]: I0319 10:17:04.403481 4835 scope.go:117] "RemoveContainer" containerID="7b9042d97bd75eb24c603b3b25bfd42fbe319b3290c7851e15e79438ba4bccb7" Mar 19 10:17:04 crc kubenswrapper[4835]: E0319 10:17:04.404283 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:17:16 crc kubenswrapper[4835]: I0319 10:17:16.410505 4835 scope.go:117] "RemoveContainer" containerID="7b9042d97bd75eb24c603b3b25bfd42fbe319b3290c7851e15e79438ba4bccb7" Mar 19 10:17:16 crc kubenswrapper[4835]: E0319 10:17:16.411350 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:17:27 crc kubenswrapper[4835]: I0319 10:17:27.403052 4835 scope.go:117] "RemoveContainer" containerID="7b9042d97bd75eb24c603b3b25bfd42fbe319b3290c7851e15e79438ba4bccb7" Mar 19 10:17:27 crc kubenswrapper[4835]: E0319 10:17:27.403875 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:17:41 crc kubenswrapper[4835]: I0319 10:17:41.402615 4835 scope.go:117] "RemoveContainer" containerID="7b9042d97bd75eb24c603b3b25bfd42fbe319b3290c7851e15e79438ba4bccb7" Mar 19 10:17:42 crc kubenswrapper[4835]: I0319 10:17:42.449387 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" 
event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerStarted","Data":"c517b2f16187a1eb29912009e0c7cd2697f65ca802ec095e400837a3df1ceee7"} Mar 19 10:18:00 crc kubenswrapper[4835]: I0319 10:18:00.182976 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565258-r7xqx"] Mar 19 10:18:00 crc kubenswrapper[4835]: E0319 10:18:00.185177 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e90d8d2-fb84-434a-9386-e527e339364e" containerName="oc" Mar 19 10:18:00 crc kubenswrapper[4835]: I0319 10:18:00.185285 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e90d8d2-fb84-434a-9386-e527e339364e" containerName="oc" Mar 19 10:18:00 crc kubenswrapper[4835]: I0319 10:18:00.185735 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e90d8d2-fb84-434a-9386-e527e339364e" containerName="oc" Mar 19 10:18:00 crc kubenswrapper[4835]: I0319 10:18:00.186804 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565258-r7xqx" Mar 19 10:18:00 crc kubenswrapper[4835]: I0319 10:18:00.189892 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 10:18:00 crc kubenswrapper[4835]: I0319 10:18:00.190127 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:18:00 crc kubenswrapper[4835]: I0319 10:18:00.190394 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:18:00 crc kubenswrapper[4835]: I0319 10:18:00.197088 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565258-r7xqx"] Mar 19 10:18:00 crc kubenswrapper[4835]: I0319 10:18:00.357148 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8vsg\" (UniqueName: 
\"kubernetes.io/projected/eb6e4244-9452-4d83-b537-4ba82d156d89-kube-api-access-c8vsg\") pod \"auto-csr-approver-29565258-r7xqx\" (UID: \"eb6e4244-9452-4d83-b537-4ba82d156d89\") " pod="openshift-infra/auto-csr-approver-29565258-r7xqx" Mar 19 10:18:00 crc kubenswrapper[4835]: I0319 10:18:00.459989 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8vsg\" (UniqueName: \"kubernetes.io/projected/eb6e4244-9452-4d83-b537-4ba82d156d89-kube-api-access-c8vsg\") pod \"auto-csr-approver-29565258-r7xqx\" (UID: \"eb6e4244-9452-4d83-b537-4ba82d156d89\") " pod="openshift-infra/auto-csr-approver-29565258-r7xqx" Mar 19 10:18:00 crc kubenswrapper[4835]: I0319 10:18:00.482400 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8vsg\" (UniqueName: \"kubernetes.io/projected/eb6e4244-9452-4d83-b537-4ba82d156d89-kube-api-access-c8vsg\") pod \"auto-csr-approver-29565258-r7xqx\" (UID: \"eb6e4244-9452-4d83-b537-4ba82d156d89\") " pod="openshift-infra/auto-csr-approver-29565258-r7xqx" Mar 19 10:18:00 crc kubenswrapper[4835]: I0319 10:18:00.530273 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565258-r7xqx" Mar 19 10:18:01 crc kubenswrapper[4835]: I0319 10:18:01.117059 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565258-r7xqx"] Mar 19 10:18:01 crc kubenswrapper[4835]: I0319 10:18:01.699079 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565258-r7xqx" event={"ID":"eb6e4244-9452-4d83-b537-4ba82d156d89","Type":"ContainerStarted","Data":"7e7428a0220f35d48ea4252481e392a7c247ca6c0a46fae75f12fe23a963634b"} Mar 19 10:18:02 crc kubenswrapper[4835]: I0319 10:18:02.712237 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565258-r7xqx" event={"ID":"eb6e4244-9452-4d83-b537-4ba82d156d89","Type":"ContainerStarted","Data":"23b89b2a7495618d7d02381e9c58dd1e43abbc29f2ba7ab67696ba18e37ec81c"} Mar 19 10:18:02 crc kubenswrapper[4835]: I0319 10:18:02.747959 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565258-r7xqx" podStartSLOduration=1.752110321 podStartE2EDuration="2.74793381s" podCreationTimestamp="2026-03-19 10:18:00 +0000 UTC" firstStartedPulling="2026-03-19 10:18:01.126965122 +0000 UTC m=+3335.975563709" lastFinishedPulling="2026-03-19 10:18:02.122788611 +0000 UTC m=+3336.971387198" observedRunningTime="2026-03-19 10:18:02.729085189 +0000 UTC m=+3337.577683786" watchObservedRunningTime="2026-03-19 10:18:02.74793381 +0000 UTC m=+3337.596532457" Mar 19 10:18:03 crc kubenswrapper[4835]: I0319 10:18:03.726057 4835 generic.go:334] "Generic (PLEG): container finished" podID="eb6e4244-9452-4d83-b537-4ba82d156d89" containerID="23b89b2a7495618d7d02381e9c58dd1e43abbc29f2ba7ab67696ba18e37ec81c" exitCode=0 Mar 19 10:18:03 crc kubenswrapper[4835]: I0319 10:18:03.726170 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565258-r7xqx" 
event={"ID":"eb6e4244-9452-4d83-b537-4ba82d156d89","Type":"ContainerDied","Data":"23b89b2a7495618d7d02381e9c58dd1e43abbc29f2ba7ab67696ba18e37ec81c"} Mar 19 10:18:05 crc kubenswrapper[4835]: I0319 10:18:05.171211 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565258-r7xqx" Mar 19 10:18:05 crc kubenswrapper[4835]: I0319 10:18:05.340253 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8vsg\" (UniqueName: \"kubernetes.io/projected/eb6e4244-9452-4d83-b537-4ba82d156d89-kube-api-access-c8vsg\") pod \"eb6e4244-9452-4d83-b537-4ba82d156d89\" (UID: \"eb6e4244-9452-4d83-b537-4ba82d156d89\") " Mar 19 10:18:05 crc kubenswrapper[4835]: I0319 10:18:05.348983 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb6e4244-9452-4d83-b537-4ba82d156d89-kube-api-access-c8vsg" (OuterVolumeSpecName: "kube-api-access-c8vsg") pod "eb6e4244-9452-4d83-b537-4ba82d156d89" (UID: "eb6e4244-9452-4d83-b537-4ba82d156d89"). InnerVolumeSpecName "kube-api-access-c8vsg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:18:05 crc kubenswrapper[4835]: I0319 10:18:05.442925 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8vsg\" (UniqueName: \"kubernetes.io/projected/eb6e4244-9452-4d83-b537-4ba82d156d89-kube-api-access-c8vsg\") on node \"crc\" DevicePath \"\"" Mar 19 10:18:05 crc kubenswrapper[4835]: I0319 10:18:05.753462 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565258-r7xqx" event={"ID":"eb6e4244-9452-4d83-b537-4ba82d156d89","Type":"ContainerDied","Data":"7e7428a0220f35d48ea4252481e392a7c247ca6c0a46fae75f12fe23a963634b"} Mar 19 10:18:05 crc kubenswrapper[4835]: I0319 10:18:05.753507 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e7428a0220f35d48ea4252481e392a7c247ca6c0a46fae75f12fe23a963634b" Mar 19 10:18:05 crc kubenswrapper[4835]: I0319 10:18:05.753520 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565258-r7xqx" Mar 19 10:18:05 crc kubenswrapper[4835]: I0319 10:18:05.815010 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565252-jnz9j"] Mar 19 10:18:05 crc kubenswrapper[4835]: I0319 10:18:05.831536 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565252-jnz9j"] Mar 19 10:18:06 crc kubenswrapper[4835]: I0319 10:18:06.418728 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d33c0f9-aaab-4564-b0b1-590ff7356f4e" path="/var/lib/kubelet/pods/0d33c0f9-aaab-4564-b0b1-590ff7356f4e/volumes" Mar 19 10:18:27 crc kubenswrapper[4835]: I0319 10:18:27.318595 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vgsf6"] Mar 19 10:18:27 crc kubenswrapper[4835]: E0319 10:18:27.319600 4835 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="eb6e4244-9452-4d83-b537-4ba82d156d89" containerName="oc" Mar 19 10:18:27 crc kubenswrapper[4835]: I0319 10:18:27.319613 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb6e4244-9452-4d83-b537-4ba82d156d89" containerName="oc" Mar 19 10:18:27 crc kubenswrapper[4835]: I0319 10:18:27.319883 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb6e4244-9452-4d83-b537-4ba82d156d89" containerName="oc" Mar 19 10:18:27 crc kubenswrapper[4835]: I0319 10:18:27.321728 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vgsf6" Mar 19 10:18:27 crc kubenswrapper[4835]: I0319 10:18:27.343283 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vgsf6"] Mar 19 10:18:27 crc kubenswrapper[4835]: I0319 10:18:27.433038 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/432a47e6-40d0-4d2e-a02c-b9d524a61c42-utilities\") pod \"certified-operators-vgsf6\" (UID: \"432a47e6-40d0-4d2e-a02c-b9d524a61c42\") " pod="openshift-marketplace/certified-operators-vgsf6" Mar 19 10:18:27 crc kubenswrapper[4835]: I0319 10:18:27.433181 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg2bt\" (UniqueName: \"kubernetes.io/projected/432a47e6-40d0-4d2e-a02c-b9d524a61c42-kube-api-access-qg2bt\") pod \"certified-operators-vgsf6\" (UID: \"432a47e6-40d0-4d2e-a02c-b9d524a61c42\") " pod="openshift-marketplace/certified-operators-vgsf6" Mar 19 10:18:27 crc kubenswrapper[4835]: I0319 10:18:27.433325 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/432a47e6-40d0-4d2e-a02c-b9d524a61c42-catalog-content\") pod \"certified-operators-vgsf6\" (UID: \"432a47e6-40d0-4d2e-a02c-b9d524a61c42\") " 
pod="openshift-marketplace/certified-operators-vgsf6" Mar 19 10:18:27 crc kubenswrapper[4835]: I0319 10:18:27.535548 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg2bt\" (UniqueName: \"kubernetes.io/projected/432a47e6-40d0-4d2e-a02c-b9d524a61c42-kube-api-access-qg2bt\") pod \"certified-operators-vgsf6\" (UID: \"432a47e6-40d0-4d2e-a02c-b9d524a61c42\") " pod="openshift-marketplace/certified-operators-vgsf6" Mar 19 10:18:27 crc kubenswrapper[4835]: I0319 10:18:27.535670 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/432a47e6-40d0-4d2e-a02c-b9d524a61c42-catalog-content\") pod \"certified-operators-vgsf6\" (UID: \"432a47e6-40d0-4d2e-a02c-b9d524a61c42\") " pod="openshift-marketplace/certified-operators-vgsf6" Mar 19 10:18:27 crc kubenswrapper[4835]: I0319 10:18:27.535873 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/432a47e6-40d0-4d2e-a02c-b9d524a61c42-utilities\") pod \"certified-operators-vgsf6\" (UID: \"432a47e6-40d0-4d2e-a02c-b9d524a61c42\") " pod="openshift-marketplace/certified-operators-vgsf6" Mar 19 10:18:27 crc kubenswrapper[4835]: I0319 10:18:27.536380 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/432a47e6-40d0-4d2e-a02c-b9d524a61c42-catalog-content\") pod \"certified-operators-vgsf6\" (UID: \"432a47e6-40d0-4d2e-a02c-b9d524a61c42\") " pod="openshift-marketplace/certified-operators-vgsf6" Mar 19 10:18:27 crc kubenswrapper[4835]: I0319 10:18:27.536406 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/432a47e6-40d0-4d2e-a02c-b9d524a61c42-utilities\") pod \"certified-operators-vgsf6\" (UID: \"432a47e6-40d0-4d2e-a02c-b9d524a61c42\") " 
pod="openshift-marketplace/certified-operators-vgsf6" Mar 19 10:18:27 crc kubenswrapper[4835]: I0319 10:18:27.566461 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg2bt\" (UniqueName: \"kubernetes.io/projected/432a47e6-40d0-4d2e-a02c-b9d524a61c42-kube-api-access-qg2bt\") pod \"certified-operators-vgsf6\" (UID: \"432a47e6-40d0-4d2e-a02c-b9d524a61c42\") " pod="openshift-marketplace/certified-operators-vgsf6" Mar 19 10:18:27 crc kubenswrapper[4835]: I0319 10:18:27.647247 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vgsf6" Mar 19 10:18:28 crc kubenswrapper[4835]: I0319 10:18:28.248954 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vgsf6"] Mar 19 10:18:29 crc kubenswrapper[4835]: I0319 10:18:29.046150 4835 generic.go:334] "Generic (PLEG): container finished" podID="432a47e6-40d0-4d2e-a02c-b9d524a61c42" containerID="4a23c8f02f7ddb636b2d9ec357f8baf8223c1481f6bad0e2e8329111fd94a79b" exitCode=0 Mar 19 10:18:29 crc kubenswrapper[4835]: I0319 10:18:29.046448 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgsf6" event={"ID":"432a47e6-40d0-4d2e-a02c-b9d524a61c42","Type":"ContainerDied","Data":"4a23c8f02f7ddb636b2d9ec357f8baf8223c1481f6bad0e2e8329111fd94a79b"} Mar 19 10:18:29 crc kubenswrapper[4835]: I0319 10:18:29.046481 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgsf6" event={"ID":"432a47e6-40d0-4d2e-a02c-b9d524a61c42","Type":"ContainerStarted","Data":"b2e2b94516e4e4bf4cf069fbef3f3ae9ab26af6fd0abf8b4048f927767f58f31"} Mar 19 10:18:31 crc kubenswrapper[4835]: I0319 10:18:31.071273 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgsf6" 
event={"ID":"432a47e6-40d0-4d2e-a02c-b9d524a61c42","Type":"ContainerStarted","Data":"c2063002beba3f9339471412e947f6917658bf0b6cbeb71e356b9b02d6007412"} Mar 19 10:18:33 crc kubenswrapper[4835]: I0319 10:18:33.092907 4835 generic.go:334] "Generic (PLEG): container finished" podID="432a47e6-40d0-4d2e-a02c-b9d524a61c42" containerID="c2063002beba3f9339471412e947f6917658bf0b6cbeb71e356b9b02d6007412" exitCode=0 Mar 19 10:18:33 crc kubenswrapper[4835]: I0319 10:18:33.092956 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgsf6" event={"ID":"432a47e6-40d0-4d2e-a02c-b9d524a61c42","Type":"ContainerDied","Data":"c2063002beba3f9339471412e947f6917658bf0b6cbeb71e356b9b02d6007412"} Mar 19 10:18:34 crc kubenswrapper[4835]: I0319 10:18:34.104161 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgsf6" event={"ID":"432a47e6-40d0-4d2e-a02c-b9d524a61c42","Type":"ContainerStarted","Data":"5eae628a7cd89d38dd100429adb500580f89f13e618b6f999de48a8590fcdfd3"} Mar 19 10:18:34 crc kubenswrapper[4835]: I0319 10:18:34.129103 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vgsf6" podStartSLOduration=2.44977802 podStartE2EDuration="7.129083625s" podCreationTimestamp="2026-03-19 10:18:27 +0000 UTC" firstStartedPulling="2026-03-19 10:18:29.049888388 +0000 UTC m=+3363.898486975" lastFinishedPulling="2026-03-19 10:18:33.729193993 +0000 UTC m=+3368.577792580" observedRunningTime="2026-03-19 10:18:34.119631548 +0000 UTC m=+3368.968230155" watchObservedRunningTime="2026-03-19 10:18:34.129083625 +0000 UTC m=+3368.977682212" Mar 19 10:18:37 crc kubenswrapper[4835]: I0319 10:18:37.648971 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vgsf6" Mar 19 10:18:37 crc kubenswrapper[4835]: I0319 10:18:37.649510 4835 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-vgsf6" Mar 19 10:18:38 crc kubenswrapper[4835]: I0319 10:18:38.696754 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-vgsf6" podUID="432a47e6-40d0-4d2e-a02c-b9d524a61c42" containerName="registry-server" probeResult="failure" output=< Mar 19 10:18:38 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:18:38 crc kubenswrapper[4835]: > Mar 19 10:18:47 crc kubenswrapper[4835]: I0319 10:18:47.713322 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vgsf6" Mar 19 10:18:47 crc kubenswrapper[4835]: I0319 10:18:47.773815 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vgsf6" Mar 19 10:18:47 crc kubenswrapper[4835]: I0319 10:18:47.966525 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vgsf6"] Mar 19 10:18:49 crc kubenswrapper[4835]: I0319 10:18:49.272012 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vgsf6" podUID="432a47e6-40d0-4d2e-a02c-b9d524a61c42" containerName="registry-server" containerID="cri-o://5eae628a7cd89d38dd100429adb500580f89f13e618b6f999de48a8590fcdfd3" gracePeriod=2 Mar 19 10:18:50 crc kubenswrapper[4835]: I0319 10:18:50.036718 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vgsf6" Mar 19 10:18:50 crc kubenswrapper[4835]: I0319 10:18:50.096036 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg2bt\" (UniqueName: \"kubernetes.io/projected/432a47e6-40d0-4d2e-a02c-b9d524a61c42-kube-api-access-qg2bt\") pod \"432a47e6-40d0-4d2e-a02c-b9d524a61c42\" (UID: \"432a47e6-40d0-4d2e-a02c-b9d524a61c42\") " Mar 19 10:18:50 crc kubenswrapper[4835]: I0319 10:18:50.096516 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/432a47e6-40d0-4d2e-a02c-b9d524a61c42-utilities\") pod \"432a47e6-40d0-4d2e-a02c-b9d524a61c42\" (UID: \"432a47e6-40d0-4d2e-a02c-b9d524a61c42\") " Mar 19 10:18:50 crc kubenswrapper[4835]: I0319 10:18:50.096705 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/432a47e6-40d0-4d2e-a02c-b9d524a61c42-catalog-content\") pod \"432a47e6-40d0-4d2e-a02c-b9d524a61c42\" (UID: \"432a47e6-40d0-4d2e-a02c-b9d524a61c42\") " Mar 19 10:18:50 crc kubenswrapper[4835]: I0319 10:18:50.098366 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/432a47e6-40d0-4d2e-a02c-b9d524a61c42-utilities" (OuterVolumeSpecName: "utilities") pod "432a47e6-40d0-4d2e-a02c-b9d524a61c42" (UID: "432a47e6-40d0-4d2e-a02c-b9d524a61c42"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:18:50 crc kubenswrapper[4835]: I0319 10:18:50.106730 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/432a47e6-40d0-4d2e-a02c-b9d524a61c42-kube-api-access-qg2bt" (OuterVolumeSpecName: "kube-api-access-qg2bt") pod "432a47e6-40d0-4d2e-a02c-b9d524a61c42" (UID: "432a47e6-40d0-4d2e-a02c-b9d524a61c42"). InnerVolumeSpecName "kube-api-access-qg2bt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:18:50 crc kubenswrapper[4835]: I0319 10:18:50.162713 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/432a47e6-40d0-4d2e-a02c-b9d524a61c42-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "432a47e6-40d0-4d2e-a02c-b9d524a61c42" (UID: "432a47e6-40d0-4d2e-a02c-b9d524a61c42"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:18:50 crc kubenswrapper[4835]: I0319 10:18:50.199174 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/432a47e6-40d0-4d2e-a02c-b9d524a61c42-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 10:18:50 crc kubenswrapper[4835]: I0319 10:18:50.199202 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/432a47e6-40d0-4d2e-a02c-b9d524a61c42-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 10:18:50 crc kubenswrapper[4835]: I0319 10:18:50.199216 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg2bt\" (UniqueName: \"kubernetes.io/projected/432a47e6-40d0-4d2e-a02c-b9d524a61c42-kube-api-access-qg2bt\") on node \"crc\" DevicePath \"\"" Mar 19 10:18:50 crc kubenswrapper[4835]: I0319 10:18:50.283255 4835 generic.go:334] "Generic (PLEG): container finished" podID="432a47e6-40d0-4d2e-a02c-b9d524a61c42" containerID="5eae628a7cd89d38dd100429adb500580f89f13e618b6f999de48a8590fcdfd3" exitCode=0 Mar 19 10:18:50 crc kubenswrapper[4835]: I0319 10:18:50.283301 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vgsf6" Mar 19 10:18:50 crc kubenswrapper[4835]: I0319 10:18:50.283302 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgsf6" event={"ID":"432a47e6-40d0-4d2e-a02c-b9d524a61c42","Type":"ContainerDied","Data":"5eae628a7cd89d38dd100429adb500580f89f13e618b6f999de48a8590fcdfd3"} Mar 19 10:18:50 crc kubenswrapper[4835]: I0319 10:18:50.284130 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vgsf6" event={"ID":"432a47e6-40d0-4d2e-a02c-b9d524a61c42","Type":"ContainerDied","Data":"b2e2b94516e4e4bf4cf069fbef3f3ae9ab26af6fd0abf8b4048f927767f58f31"} Mar 19 10:18:50 crc kubenswrapper[4835]: I0319 10:18:50.284150 4835 scope.go:117] "RemoveContainer" containerID="5eae628a7cd89d38dd100429adb500580f89f13e618b6f999de48a8590fcdfd3" Mar 19 10:18:50 crc kubenswrapper[4835]: I0319 10:18:50.311227 4835 scope.go:117] "RemoveContainer" containerID="c2063002beba3f9339471412e947f6917658bf0b6cbeb71e356b9b02d6007412" Mar 19 10:18:50 crc kubenswrapper[4835]: I0319 10:18:50.327949 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vgsf6"] Mar 19 10:18:50 crc kubenswrapper[4835]: I0319 10:18:50.339487 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vgsf6"] Mar 19 10:18:50 crc kubenswrapper[4835]: I0319 10:18:50.348555 4835 scope.go:117] "RemoveContainer" containerID="4a23c8f02f7ddb636b2d9ec357f8baf8223c1481f6bad0e2e8329111fd94a79b" Mar 19 10:18:50 crc kubenswrapper[4835]: I0319 10:18:50.408092 4835 scope.go:117] "RemoveContainer" containerID="5eae628a7cd89d38dd100429adb500580f89f13e618b6f999de48a8590fcdfd3" Mar 19 10:18:50 crc kubenswrapper[4835]: E0319 10:18:50.408361 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5eae628a7cd89d38dd100429adb500580f89f13e618b6f999de48a8590fcdfd3\": container with ID starting with 5eae628a7cd89d38dd100429adb500580f89f13e618b6f999de48a8590fcdfd3 not found: ID does not exist" containerID="5eae628a7cd89d38dd100429adb500580f89f13e618b6f999de48a8590fcdfd3" Mar 19 10:18:50 crc kubenswrapper[4835]: I0319 10:18:50.408389 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eae628a7cd89d38dd100429adb500580f89f13e618b6f999de48a8590fcdfd3"} err="failed to get container status \"5eae628a7cd89d38dd100429adb500580f89f13e618b6f999de48a8590fcdfd3\": rpc error: code = NotFound desc = could not find container \"5eae628a7cd89d38dd100429adb500580f89f13e618b6f999de48a8590fcdfd3\": container with ID starting with 5eae628a7cd89d38dd100429adb500580f89f13e618b6f999de48a8590fcdfd3 not found: ID does not exist" Mar 19 10:18:50 crc kubenswrapper[4835]: I0319 10:18:50.408410 4835 scope.go:117] "RemoveContainer" containerID="c2063002beba3f9339471412e947f6917658bf0b6cbeb71e356b9b02d6007412" Mar 19 10:18:50 crc kubenswrapper[4835]: E0319 10:18:50.408576 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2063002beba3f9339471412e947f6917658bf0b6cbeb71e356b9b02d6007412\": container with ID starting with c2063002beba3f9339471412e947f6917658bf0b6cbeb71e356b9b02d6007412 not found: ID does not exist" containerID="c2063002beba3f9339471412e947f6917658bf0b6cbeb71e356b9b02d6007412" Mar 19 10:18:50 crc kubenswrapper[4835]: I0319 10:18:50.408597 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2063002beba3f9339471412e947f6917658bf0b6cbeb71e356b9b02d6007412"} err="failed to get container status \"c2063002beba3f9339471412e947f6917658bf0b6cbeb71e356b9b02d6007412\": rpc error: code = NotFound desc = could not find container \"c2063002beba3f9339471412e947f6917658bf0b6cbeb71e356b9b02d6007412\": container with ID 
starting with c2063002beba3f9339471412e947f6917658bf0b6cbeb71e356b9b02d6007412 not found: ID does not exist" Mar 19 10:18:50 crc kubenswrapper[4835]: I0319 10:18:50.408610 4835 scope.go:117] "RemoveContainer" containerID="4a23c8f02f7ddb636b2d9ec357f8baf8223c1481f6bad0e2e8329111fd94a79b" Mar 19 10:18:50 crc kubenswrapper[4835]: E0319 10:18:50.410369 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a23c8f02f7ddb636b2d9ec357f8baf8223c1481f6bad0e2e8329111fd94a79b\": container with ID starting with 4a23c8f02f7ddb636b2d9ec357f8baf8223c1481f6bad0e2e8329111fd94a79b not found: ID does not exist" containerID="4a23c8f02f7ddb636b2d9ec357f8baf8223c1481f6bad0e2e8329111fd94a79b" Mar 19 10:18:50 crc kubenswrapper[4835]: I0319 10:18:50.410401 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a23c8f02f7ddb636b2d9ec357f8baf8223c1481f6bad0e2e8329111fd94a79b"} err="failed to get container status \"4a23c8f02f7ddb636b2d9ec357f8baf8223c1481f6bad0e2e8329111fd94a79b\": rpc error: code = NotFound desc = could not find container \"4a23c8f02f7ddb636b2d9ec357f8baf8223c1481f6bad0e2e8329111fd94a79b\": container with ID starting with 4a23c8f02f7ddb636b2d9ec357f8baf8223c1481f6bad0e2e8329111fd94a79b not found: ID does not exist" Mar 19 10:18:50 crc kubenswrapper[4835]: I0319 10:18:50.416271 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="432a47e6-40d0-4d2e-a02c-b9d524a61c42" path="/var/lib/kubelet/pods/432a47e6-40d0-4d2e-a02c-b9d524a61c42/volumes" Mar 19 10:18:57 crc kubenswrapper[4835]: I0319 10:18:57.519068 4835 scope.go:117] "RemoveContainer" containerID="9a652076a279cfd6ef336ab1ef072fd15cb5ef044db42634bebc4e302b7e537b" Mar 19 10:19:30 crc kubenswrapper[4835]: I0319 10:19:30.344264 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pvlsr"] Mar 19 10:19:30 crc kubenswrapper[4835]: E0319 
10:19:30.345484 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="432a47e6-40d0-4d2e-a02c-b9d524a61c42" containerName="extract-content" Mar 19 10:19:30 crc kubenswrapper[4835]: I0319 10:19:30.345501 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="432a47e6-40d0-4d2e-a02c-b9d524a61c42" containerName="extract-content" Mar 19 10:19:30 crc kubenswrapper[4835]: E0319 10:19:30.345530 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="432a47e6-40d0-4d2e-a02c-b9d524a61c42" containerName="registry-server" Mar 19 10:19:30 crc kubenswrapper[4835]: I0319 10:19:30.345539 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="432a47e6-40d0-4d2e-a02c-b9d524a61c42" containerName="registry-server" Mar 19 10:19:30 crc kubenswrapper[4835]: E0319 10:19:30.345577 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="432a47e6-40d0-4d2e-a02c-b9d524a61c42" containerName="extract-utilities" Mar 19 10:19:30 crc kubenswrapper[4835]: I0319 10:19:30.345587 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="432a47e6-40d0-4d2e-a02c-b9d524a61c42" containerName="extract-utilities" Mar 19 10:19:30 crc kubenswrapper[4835]: I0319 10:19:30.345930 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="432a47e6-40d0-4d2e-a02c-b9d524a61c42" containerName="registry-server" Mar 19 10:19:30 crc kubenswrapper[4835]: I0319 10:19:30.348155 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pvlsr" Mar 19 10:19:30 crc kubenswrapper[4835]: I0319 10:19:30.358691 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pvlsr"] Mar 19 10:19:30 crc kubenswrapper[4835]: I0319 10:19:30.447051 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49266950-f18f-41bc-bca0-3d0b60d1a9d8-utilities\") pod \"redhat-operators-pvlsr\" (UID: \"49266950-f18f-41bc-bca0-3d0b60d1a9d8\") " pod="openshift-marketplace/redhat-operators-pvlsr" Mar 19 10:19:30 crc kubenswrapper[4835]: I0319 10:19:30.447623 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k27hb\" (UniqueName: \"kubernetes.io/projected/49266950-f18f-41bc-bca0-3d0b60d1a9d8-kube-api-access-k27hb\") pod \"redhat-operators-pvlsr\" (UID: \"49266950-f18f-41bc-bca0-3d0b60d1a9d8\") " pod="openshift-marketplace/redhat-operators-pvlsr" Mar 19 10:19:30 crc kubenswrapper[4835]: I0319 10:19:30.447855 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49266950-f18f-41bc-bca0-3d0b60d1a9d8-catalog-content\") pod \"redhat-operators-pvlsr\" (UID: \"49266950-f18f-41bc-bca0-3d0b60d1a9d8\") " pod="openshift-marketplace/redhat-operators-pvlsr" Mar 19 10:19:30 crc kubenswrapper[4835]: I0319 10:19:30.549436 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k27hb\" (UniqueName: \"kubernetes.io/projected/49266950-f18f-41bc-bca0-3d0b60d1a9d8-kube-api-access-k27hb\") pod \"redhat-operators-pvlsr\" (UID: \"49266950-f18f-41bc-bca0-3d0b60d1a9d8\") " pod="openshift-marketplace/redhat-operators-pvlsr" Mar 19 10:19:30 crc kubenswrapper[4835]: I0319 10:19:30.549526 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49266950-f18f-41bc-bca0-3d0b60d1a9d8-catalog-content\") pod \"redhat-operators-pvlsr\" (UID: \"49266950-f18f-41bc-bca0-3d0b60d1a9d8\") " pod="openshift-marketplace/redhat-operators-pvlsr" Mar 19 10:19:30 crc kubenswrapper[4835]: I0319 10:19:30.549601 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49266950-f18f-41bc-bca0-3d0b60d1a9d8-utilities\") pod \"redhat-operators-pvlsr\" (UID: \"49266950-f18f-41bc-bca0-3d0b60d1a9d8\") " pod="openshift-marketplace/redhat-operators-pvlsr" Mar 19 10:19:30 crc kubenswrapper[4835]: I0319 10:19:30.550021 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49266950-f18f-41bc-bca0-3d0b60d1a9d8-catalog-content\") pod \"redhat-operators-pvlsr\" (UID: \"49266950-f18f-41bc-bca0-3d0b60d1a9d8\") " pod="openshift-marketplace/redhat-operators-pvlsr" Mar 19 10:19:30 crc kubenswrapper[4835]: I0319 10:19:30.550453 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49266950-f18f-41bc-bca0-3d0b60d1a9d8-utilities\") pod \"redhat-operators-pvlsr\" (UID: \"49266950-f18f-41bc-bca0-3d0b60d1a9d8\") " pod="openshift-marketplace/redhat-operators-pvlsr" Mar 19 10:19:30 crc kubenswrapper[4835]: I0319 10:19:30.575023 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k27hb\" (UniqueName: \"kubernetes.io/projected/49266950-f18f-41bc-bca0-3d0b60d1a9d8-kube-api-access-k27hb\") pod \"redhat-operators-pvlsr\" (UID: \"49266950-f18f-41bc-bca0-3d0b60d1a9d8\") " pod="openshift-marketplace/redhat-operators-pvlsr" Mar 19 10:19:30 crc kubenswrapper[4835]: I0319 10:19:30.679160 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pvlsr" Mar 19 10:19:31 crc kubenswrapper[4835]: I0319 10:19:31.195921 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pvlsr"] Mar 19 10:19:32 crc kubenswrapper[4835]: I0319 10:19:32.080448 4835 generic.go:334] "Generic (PLEG): container finished" podID="49266950-f18f-41bc-bca0-3d0b60d1a9d8" containerID="e63e6d79b2062da5cd56c7c04c2739bd902b939613ac91bc568cd86d22546f5b" exitCode=0 Mar 19 10:19:32 crc kubenswrapper[4835]: I0319 10:19:32.080592 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvlsr" event={"ID":"49266950-f18f-41bc-bca0-3d0b60d1a9d8","Type":"ContainerDied","Data":"e63e6d79b2062da5cd56c7c04c2739bd902b939613ac91bc568cd86d22546f5b"} Mar 19 10:19:32 crc kubenswrapper[4835]: I0319 10:19:32.081078 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvlsr" event={"ID":"49266950-f18f-41bc-bca0-3d0b60d1a9d8","Type":"ContainerStarted","Data":"350b64244e993fda25250ce2eacf8baa0d71009a8628dc7ebe2ca84eee000950"} Mar 19 10:19:33 crc kubenswrapper[4835]: I0319 10:19:33.093964 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvlsr" event={"ID":"49266950-f18f-41bc-bca0-3d0b60d1a9d8","Type":"ContainerStarted","Data":"9a8a2a5445ba62fc4e64959222b8dd8ccfe3fa757bfcb0fc74d481b5e2151a00"} Mar 19 10:19:39 crc kubenswrapper[4835]: I0319 10:19:39.164004 4835 generic.go:334] "Generic (PLEG): container finished" podID="49266950-f18f-41bc-bca0-3d0b60d1a9d8" containerID="9a8a2a5445ba62fc4e64959222b8dd8ccfe3fa757bfcb0fc74d481b5e2151a00" exitCode=0 Mar 19 10:19:39 crc kubenswrapper[4835]: I0319 10:19:39.164073 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvlsr" 
event={"ID":"49266950-f18f-41bc-bca0-3d0b60d1a9d8","Type":"ContainerDied","Data":"9a8a2a5445ba62fc4e64959222b8dd8ccfe3fa757bfcb0fc74d481b5e2151a00"} Mar 19 10:19:39 crc kubenswrapper[4835]: I0319 10:19:39.167223 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 10:19:40 crc kubenswrapper[4835]: I0319 10:19:40.181287 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvlsr" event={"ID":"49266950-f18f-41bc-bca0-3d0b60d1a9d8","Type":"ContainerStarted","Data":"357a70fdc1d3ca6fb5c23508ff6eb55598074398e6646bd0ea4923a45e41258d"} Mar 19 10:19:40 crc kubenswrapper[4835]: I0319 10:19:40.216761 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pvlsr" podStartSLOduration=2.457884534 podStartE2EDuration="10.216722741s" podCreationTimestamp="2026-03-19 10:19:30 +0000 UTC" firstStartedPulling="2026-03-19 10:19:32.084381967 +0000 UTC m=+3426.932980554" lastFinishedPulling="2026-03-19 10:19:39.843220164 +0000 UTC m=+3434.691818761" observedRunningTime="2026-03-19 10:19:40.20045613 +0000 UTC m=+3435.049054717" watchObservedRunningTime="2026-03-19 10:19:40.216722741 +0000 UTC m=+3435.065321338" Mar 19 10:19:40 crc kubenswrapper[4835]: I0319 10:19:40.679667 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pvlsr" Mar 19 10:19:40 crc kubenswrapper[4835]: I0319 10:19:40.679725 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pvlsr" Mar 19 10:19:41 crc kubenswrapper[4835]: I0319 10:19:41.728453 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pvlsr" podUID="49266950-f18f-41bc-bca0-3d0b60d1a9d8" containerName="registry-server" probeResult="failure" output=< Mar 19 10:19:41 crc kubenswrapper[4835]: timeout: failed to connect service 
":50051" within 1s Mar 19 10:19:41 crc kubenswrapper[4835]: > Mar 19 10:19:51 crc kubenswrapper[4835]: I0319 10:19:51.737650 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pvlsr" podUID="49266950-f18f-41bc-bca0-3d0b60d1a9d8" containerName="registry-server" probeResult="failure" output=< Mar 19 10:19:51 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:19:51 crc kubenswrapper[4835]: > Mar 19 10:20:00 crc kubenswrapper[4835]: I0319 10:20:00.145721 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565260-4zs5v"] Mar 19 10:20:00 crc kubenswrapper[4835]: I0319 10:20:00.151889 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565260-4zs5v" Mar 19 10:20:00 crc kubenswrapper[4835]: I0319 10:20:00.156165 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:20:00 crc kubenswrapper[4835]: I0319 10:20:00.156585 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 10:20:00 crc kubenswrapper[4835]: I0319 10:20:00.157769 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:20:00 crc kubenswrapper[4835]: I0319 10:20:00.160110 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565260-4zs5v"] Mar 19 10:20:00 crc kubenswrapper[4835]: I0319 10:20:00.239526 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bldmh\" (UniqueName: \"kubernetes.io/projected/f434d384-d6c2-4d66-9c3d-bc8eeb89a314-kube-api-access-bldmh\") pod \"auto-csr-approver-29565260-4zs5v\" (UID: \"f434d384-d6c2-4d66-9c3d-bc8eeb89a314\") " pod="openshift-infra/auto-csr-approver-29565260-4zs5v" Mar 19 10:20:00 
crc kubenswrapper[4835]: I0319 10:20:00.343172 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bldmh\" (UniqueName: \"kubernetes.io/projected/f434d384-d6c2-4d66-9c3d-bc8eeb89a314-kube-api-access-bldmh\") pod \"auto-csr-approver-29565260-4zs5v\" (UID: \"f434d384-d6c2-4d66-9c3d-bc8eeb89a314\") " pod="openshift-infra/auto-csr-approver-29565260-4zs5v" Mar 19 10:20:00 crc kubenswrapper[4835]: I0319 10:20:00.364767 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bldmh\" (UniqueName: \"kubernetes.io/projected/f434d384-d6c2-4d66-9c3d-bc8eeb89a314-kube-api-access-bldmh\") pod \"auto-csr-approver-29565260-4zs5v\" (UID: \"f434d384-d6c2-4d66-9c3d-bc8eeb89a314\") " pod="openshift-infra/auto-csr-approver-29565260-4zs5v" Mar 19 10:20:00 crc kubenswrapper[4835]: I0319 10:20:00.474892 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565260-4zs5v" Mar 19 10:20:00 crc kubenswrapper[4835]: I0319 10:20:00.960847 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565260-4zs5v"] Mar 19 10:20:01 crc kubenswrapper[4835]: I0319 10:20:01.443355 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565260-4zs5v" event={"ID":"f434d384-d6c2-4d66-9c3d-bc8eeb89a314","Type":"ContainerStarted","Data":"fd6380d5343dd72e1b8789afe608c6842080853bfb251d0006a904c2436daea2"} Mar 19 10:20:01 crc kubenswrapper[4835]: I0319 10:20:01.731492 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pvlsr" podUID="49266950-f18f-41bc-bca0-3d0b60d1a9d8" containerName="registry-server" probeResult="failure" output=< Mar 19 10:20:01 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:20:01 crc kubenswrapper[4835]: > Mar 19 10:20:03 crc kubenswrapper[4835]: I0319 10:20:03.470703 4835 
generic.go:334] "Generic (PLEG): container finished" podID="f434d384-d6c2-4d66-9c3d-bc8eeb89a314" containerID="31268b6fc5442bfb003fcc01f6204c4cf2a2ff6507d63233f4328dacd4649a69" exitCode=0 Mar 19 10:20:03 crc kubenswrapper[4835]: I0319 10:20:03.470773 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565260-4zs5v" event={"ID":"f434d384-d6c2-4d66-9c3d-bc8eeb89a314","Type":"ContainerDied","Data":"31268b6fc5442bfb003fcc01f6204c4cf2a2ff6507d63233f4328dacd4649a69"} Mar 19 10:20:05 crc kubenswrapper[4835]: I0319 10:20:05.173251 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565260-4zs5v" Mar 19 10:20:05 crc kubenswrapper[4835]: I0319 10:20:05.206529 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bldmh\" (UniqueName: \"kubernetes.io/projected/f434d384-d6c2-4d66-9c3d-bc8eeb89a314-kube-api-access-bldmh\") pod \"f434d384-d6c2-4d66-9c3d-bc8eeb89a314\" (UID: \"f434d384-d6c2-4d66-9c3d-bc8eeb89a314\") " Mar 19 10:20:05 crc kubenswrapper[4835]: I0319 10:20:05.213348 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f434d384-d6c2-4d66-9c3d-bc8eeb89a314-kube-api-access-bldmh" (OuterVolumeSpecName: "kube-api-access-bldmh") pod "f434d384-d6c2-4d66-9c3d-bc8eeb89a314" (UID: "f434d384-d6c2-4d66-9c3d-bc8eeb89a314"). InnerVolumeSpecName "kube-api-access-bldmh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:20:05 crc kubenswrapper[4835]: I0319 10:20:05.310091 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bldmh\" (UniqueName: \"kubernetes.io/projected/f434d384-d6c2-4d66-9c3d-bc8eeb89a314-kube-api-access-bldmh\") on node \"crc\" DevicePath \"\"" Mar 19 10:20:05 crc kubenswrapper[4835]: I0319 10:20:05.514220 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565260-4zs5v" event={"ID":"f434d384-d6c2-4d66-9c3d-bc8eeb89a314","Type":"ContainerDied","Data":"fd6380d5343dd72e1b8789afe608c6842080853bfb251d0006a904c2436daea2"} Mar 19 10:20:05 crc kubenswrapper[4835]: I0319 10:20:05.514271 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd6380d5343dd72e1b8789afe608c6842080853bfb251d0006a904c2436daea2" Mar 19 10:20:05 crc kubenswrapper[4835]: I0319 10:20:05.514347 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565260-4zs5v" Mar 19 10:20:06 crc kubenswrapper[4835]: I0319 10:20:06.256039 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565254-58j5z"] Mar 19 10:20:06 crc kubenswrapper[4835]: I0319 10:20:06.266336 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565254-58j5z"] Mar 19 10:20:06 crc kubenswrapper[4835]: I0319 10:20:06.416736 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2da1b929-dd4b-44ca-8485-c895d36b5c9e" path="/var/lib/kubelet/pods/2da1b929-dd4b-44ca-8485-c895d36b5c9e/volumes" Mar 19 10:20:06 crc kubenswrapper[4835]: I0319 10:20:06.422392 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 19 10:20:06 crc kubenswrapper[4835]: I0319 10:20:06.422458 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:20:10 crc kubenswrapper[4835]: I0319 10:20:10.758625 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pvlsr" Mar 19 10:20:10 crc kubenswrapper[4835]: I0319 10:20:10.817599 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pvlsr" Mar 19 10:20:11 crc kubenswrapper[4835]: I0319 10:20:11.000336 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pvlsr"] Mar 19 10:20:12 crc kubenswrapper[4835]: I0319 10:20:12.584384 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pvlsr" podUID="49266950-f18f-41bc-bca0-3d0b60d1a9d8" containerName="registry-server" containerID="cri-o://357a70fdc1d3ca6fb5c23508ff6eb55598074398e6646bd0ea4923a45e41258d" gracePeriod=2 Mar 19 10:20:13 crc kubenswrapper[4835]: I0319 10:20:13.166366 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pvlsr" Mar 19 10:20:13 crc kubenswrapper[4835]: I0319 10:20:13.237356 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49266950-f18f-41bc-bca0-3d0b60d1a9d8-catalog-content\") pod \"49266950-f18f-41bc-bca0-3d0b60d1a9d8\" (UID: \"49266950-f18f-41bc-bca0-3d0b60d1a9d8\") " Mar 19 10:20:13 crc kubenswrapper[4835]: I0319 10:20:13.237466 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k27hb\" (UniqueName: \"kubernetes.io/projected/49266950-f18f-41bc-bca0-3d0b60d1a9d8-kube-api-access-k27hb\") pod \"49266950-f18f-41bc-bca0-3d0b60d1a9d8\" (UID: \"49266950-f18f-41bc-bca0-3d0b60d1a9d8\") " Mar 19 10:20:13 crc kubenswrapper[4835]: I0319 10:20:13.237617 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49266950-f18f-41bc-bca0-3d0b60d1a9d8-utilities\") pod \"49266950-f18f-41bc-bca0-3d0b60d1a9d8\" (UID: \"49266950-f18f-41bc-bca0-3d0b60d1a9d8\") " Mar 19 10:20:13 crc kubenswrapper[4835]: I0319 10:20:13.238309 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49266950-f18f-41bc-bca0-3d0b60d1a9d8-utilities" (OuterVolumeSpecName: "utilities") pod "49266950-f18f-41bc-bca0-3d0b60d1a9d8" (UID: "49266950-f18f-41bc-bca0-3d0b60d1a9d8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:20:13 crc kubenswrapper[4835]: I0319 10:20:13.251591 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49266950-f18f-41bc-bca0-3d0b60d1a9d8-kube-api-access-k27hb" (OuterVolumeSpecName: "kube-api-access-k27hb") pod "49266950-f18f-41bc-bca0-3d0b60d1a9d8" (UID: "49266950-f18f-41bc-bca0-3d0b60d1a9d8"). InnerVolumeSpecName "kube-api-access-k27hb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:20:13 crc kubenswrapper[4835]: I0319 10:20:13.340403 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k27hb\" (UniqueName: \"kubernetes.io/projected/49266950-f18f-41bc-bca0-3d0b60d1a9d8-kube-api-access-k27hb\") on node \"crc\" DevicePath \"\"" Mar 19 10:20:13 crc kubenswrapper[4835]: I0319 10:20:13.340436 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49266950-f18f-41bc-bca0-3d0b60d1a9d8-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 10:20:13 crc kubenswrapper[4835]: I0319 10:20:13.375435 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49266950-f18f-41bc-bca0-3d0b60d1a9d8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49266950-f18f-41bc-bca0-3d0b60d1a9d8" (UID: "49266950-f18f-41bc-bca0-3d0b60d1a9d8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:20:13 crc kubenswrapper[4835]: I0319 10:20:13.443109 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49266950-f18f-41bc-bca0-3d0b60d1a9d8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 10:20:13 crc kubenswrapper[4835]: I0319 10:20:13.600312 4835 generic.go:334] "Generic (PLEG): container finished" podID="49266950-f18f-41bc-bca0-3d0b60d1a9d8" containerID="357a70fdc1d3ca6fb5c23508ff6eb55598074398e6646bd0ea4923a45e41258d" exitCode=0 Mar 19 10:20:13 crc kubenswrapper[4835]: I0319 10:20:13.600737 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pvlsr" event={"ID":"49266950-f18f-41bc-bca0-3d0b60d1a9d8","Type":"ContainerDied","Data":"357a70fdc1d3ca6fb5c23508ff6eb55598074398e6646bd0ea4923a45e41258d"} Mar 19 10:20:13 crc kubenswrapper[4835]: I0319 10:20:13.600790 4835 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-pvlsr" event={"ID":"49266950-f18f-41bc-bca0-3d0b60d1a9d8","Type":"ContainerDied","Data":"350b64244e993fda25250ce2eacf8baa0d71009a8628dc7ebe2ca84eee000950"} Mar 19 10:20:13 crc kubenswrapper[4835]: I0319 10:20:13.600812 4835 scope.go:117] "RemoveContainer" containerID="357a70fdc1d3ca6fb5c23508ff6eb55598074398e6646bd0ea4923a45e41258d" Mar 19 10:20:13 crc kubenswrapper[4835]: I0319 10:20:13.600997 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pvlsr" Mar 19 10:20:13 crc kubenswrapper[4835]: I0319 10:20:13.643002 4835 scope.go:117] "RemoveContainer" containerID="9a8a2a5445ba62fc4e64959222b8dd8ccfe3fa757bfcb0fc74d481b5e2151a00" Mar 19 10:20:13 crc kubenswrapper[4835]: I0319 10:20:13.653265 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pvlsr"] Mar 19 10:20:13 crc kubenswrapper[4835]: I0319 10:20:13.689222 4835 scope.go:117] "RemoveContainer" containerID="e63e6d79b2062da5cd56c7c04c2739bd902b939613ac91bc568cd86d22546f5b" Mar 19 10:20:13 crc kubenswrapper[4835]: I0319 10:20:13.691290 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pvlsr"] Mar 19 10:20:13 crc kubenswrapper[4835]: I0319 10:20:13.735995 4835 scope.go:117] "RemoveContainer" containerID="357a70fdc1d3ca6fb5c23508ff6eb55598074398e6646bd0ea4923a45e41258d" Mar 19 10:20:13 crc kubenswrapper[4835]: E0319 10:20:13.736787 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"357a70fdc1d3ca6fb5c23508ff6eb55598074398e6646bd0ea4923a45e41258d\": container with ID starting with 357a70fdc1d3ca6fb5c23508ff6eb55598074398e6646bd0ea4923a45e41258d not found: ID does not exist" containerID="357a70fdc1d3ca6fb5c23508ff6eb55598074398e6646bd0ea4923a45e41258d" Mar 19 10:20:13 crc kubenswrapper[4835]: I0319 10:20:13.736825 4835 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"357a70fdc1d3ca6fb5c23508ff6eb55598074398e6646bd0ea4923a45e41258d"} err="failed to get container status \"357a70fdc1d3ca6fb5c23508ff6eb55598074398e6646bd0ea4923a45e41258d\": rpc error: code = NotFound desc = could not find container \"357a70fdc1d3ca6fb5c23508ff6eb55598074398e6646bd0ea4923a45e41258d\": container with ID starting with 357a70fdc1d3ca6fb5c23508ff6eb55598074398e6646bd0ea4923a45e41258d not found: ID does not exist" Mar 19 10:20:13 crc kubenswrapper[4835]: I0319 10:20:13.736876 4835 scope.go:117] "RemoveContainer" containerID="9a8a2a5445ba62fc4e64959222b8dd8ccfe3fa757bfcb0fc74d481b5e2151a00" Mar 19 10:20:13 crc kubenswrapper[4835]: E0319 10:20:13.737354 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a8a2a5445ba62fc4e64959222b8dd8ccfe3fa757bfcb0fc74d481b5e2151a00\": container with ID starting with 9a8a2a5445ba62fc4e64959222b8dd8ccfe3fa757bfcb0fc74d481b5e2151a00 not found: ID does not exist" containerID="9a8a2a5445ba62fc4e64959222b8dd8ccfe3fa757bfcb0fc74d481b5e2151a00" Mar 19 10:20:13 crc kubenswrapper[4835]: I0319 10:20:13.737398 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a8a2a5445ba62fc4e64959222b8dd8ccfe3fa757bfcb0fc74d481b5e2151a00"} err="failed to get container status \"9a8a2a5445ba62fc4e64959222b8dd8ccfe3fa757bfcb0fc74d481b5e2151a00\": rpc error: code = NotFound desc = could not find container \"9a8a2a5445ba62fc4e64959222b8dd8ccfe3fa757bfcb0fc74d481b5e2151a00\": container with ID starting with 9a8a2a5445ba62fc4e64959222b8dd8ccfe3fa757bfcb0fc74d481b5e2151a00 not found: ID does not exist" Mar 19 10:20:13 crc kubenswrapper[4835]: I0319 10:20:13.737450 4835 scope.go:117] "RemoveContainer" containerID="e63e6d79b2062da5cd56c7c04c2739bd902b939613ac91bc568cd86d22546f5b" Mar 19 10:20:13 crc kubenswrapper[4835]: E0319 
10:20:13.737892 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e63e6d79b2062da5cd56c7c04c2739bd902b939613ac91bc568cd86d22546f5b\": container with ID starting with e63e6d79b2062da5cd56c7c04c2739bd902b939613ac91bc568cd86d22546f5b not found: ID does not exist" containerID="e63e6d79b2062da5cd56c7c04c2739bd902b939613ac91bc568cd86d22546f5b" Mar 19 10:20:13 crc kubenswrapper[4835]: I0319 10:20:13.737947 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e63e6d79b2062da5cd56c7c04c2739bd902b939613ac91bc568cd86d22546f5b"} err="failed to get container status \"e63e6d79b2062da5cd56c7c04c2739bd902b939613ac91bc568cd86d22546f5b\": rpc error: code = NotFound desc = could not find container \"e63e6d79b2062da5cd56c7c04c2739bd902b939613ac91bc568cd86d22546f5b\": container with ID starting with e63e6d79b2062da5cd56c7c04c2739bd902b939613ac91bc568cd86d22546f5b not found: ID does not exist" Mar 19 10:20:14 crc kubenswrapper[4835]: I0319 10:20:14.418919 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49266950-f18f-41bc-bca0-3d0b60d1a9d8" path="/var/lib/kubelet/pods/49266950-f18f-41bc-bca0-3d0b60d1a9d8/volumes" Mar 19 10:20:36 crc kubenswrapper[4835]: I0319 10:20:36.421762 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:20:36 crc kubenswrapper[4835]: I0319 10:20:36.422269 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 19 10:20:57 crc kubenswrapper[4835]: I0319 10:20:57.673469 4835 scope.go:117] "RemoveContainer" containerID="84297e854009523d8f43b674b5021567985ef089dbccdfb962d38758a0ef7c4a" Mar 19 10:21:06 crc kubenswrapper[4835]: I0319 10:21:06.422732 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:21:06 crc kubenswrapper[4835]: I0319 10:21:06.424641 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:21:06 crc kubenswrapper[4835]: I0319 10:21:06.424857 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" Mar 19 10:21:06 crc kubenswrapper[4835]: I0319 10:21:06.426060 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c517b2f16187a1eb29912009e0c7cd2697f65ca802ec095e400837a3df1ceee7"} pod="openshift-machine-config-operator/machine-config-daemon-bk84k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 10:21:06 crc kubenswrapper[4835]: I0319 10:21:06.426241 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" containerID="cri-o://c517b2f16187a1eb29912009e0c7cd2697f65ca802ec095e400837a3df1ceee7" gracePeriod=600 Mar 19 10:21:07 crc 
kubenswrapper[4835]: I0319 10:21:07.204890 4835 generic.go:334] "Generic (PLEG): container finished" podID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerID="c517b2f16187a1eb29912009e0c7cd2697f65ca802ec095e400837a3df1ceee7" exitCode=0 Mar 19 10:21:07 crc kubenswrapper[4835]: I0319 10:21:07.204954 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerDied","Data":"c517b2f16187a1eb29912009e0c7cd2697f65ca802ec095e400837a3df1ceee7"} Mar 19 10:21:07 crc kubenswrapper[4835]: I0319 10:21:07.205286 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerStarted","Data":"049ab495373838cc6452b89e97abd4272ac73f52f2dcf346477b2cddb7b2086a"} Mar 19 10:21:07 crc kubenswrapper[4835]: I0319 10:21:07.205309 4835 scope.go:117] "RemoveContainer" containerID="7b9042d97bd75eb24c603b3b25bfd42fbe319b3290c7851e15e79438ba4bccb7" Mar 19 10:22:00 crc kubenswrapper[4835]: I0319 10:22:00.147757 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565262-5x6sj"] Mar 19 10:22:00 crc kubenswrapper[4835]: E0319 10:22:00.148919 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49266950-f18f-41bc-bca0-3d0b60d1a9d8" containerName="registry-server" Mar 19 10:22:00 crc kubenswrapper[4835]: I0319 10:22:00.148933 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="49266950-f18f-41bc-bca0-3d0b60d1a9d8" containerName="registry-server" Mar 19 10:22:00 crc kubenswrapper[4835]: E0319 10:22:00.148955 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f434d384-d6c2-4d66-9c3d-bc8eeb89a314" containerName="oc" Mar 19 10:22:00 crc kubenswrapper[4835]: I0319 10:22:00.148961 4835 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f434d384-d6c2-4d66-9c3d-bc8eeb89a314" containerName="oc" Mar 19 10:22:00 crc kubenswrapper[4835]: E0319 10:22:00.148988 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49266950-f18f-41bc-bca0-3d0b60d1a9d8" containerName="extract-content" Mar 19 10:22:00 crc kubenswrapper[4835]: I0319 10:22:00.148994 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="49266950-f18f-41bc-bca0-3d0b60d1a9d8" containerName="extract-content" Mar 19 10:22:00 crc kubenswrapper[4835]: E0319 10:22:00.149015 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49266950-f18f-41bc-bca0-3d0b60d1a9d8" containerName="extract-utilities" Mar 19 10:22:00 crc kubenswrapper[4835]: I0319 10:22:00.149021 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="49266950-f18f-41bc-bca0-3d0b60d1a9d8" containerName="extract-utilities" Mar 19 10:22:00 crc kubenswrapper[4835]: I0319 10:22:00.149238 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="f434d384-d6c2-4d66-9c3d-bc8eeb89a314" containerName="oc" Mar 19 10:22:00 crc kubenswrapper[4835]: I0319 10:22:00.149253 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="49266950-f18f-41bc-bca0-3d0b60d1a9d8" containerName="registry-server" Mar 19 10:22:00 crc kubenswrapper[4835]: I0319 10:22:00.150088 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565262-5x6sj" Mar 19 10:22:00 crc kubenswrapper[4835]: I0319 10:22:00.152388 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 10:22:00 crc kubenswrapper[4835]: I0319 10:22:00.152398 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:22:00 crc kubenswrapper[4835]: I0319 10:22:00.152492 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:22:00 crc kubenswrapper[4835]: I0319 10:22:00.160997 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565262-5x6sj"] Mar 19 10:22:00 crc kubenswrapper[4835]: I0319 10:22:00.226201 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qwnf\" (UniqueName: \"kubernetes.io/projected/5b0680c3-1a8d-4fa8-ba5c-ff20fe44ea10-kube-api-access-2qwnf\") pod \"auto-csr-approver-29565262-5x6sj\" (UID: \"5b0680c3-1a8d-4fa8-ba5c-ff20fe44ea10\") " pod="openshift-infra/auto-csr-approver-29565262-5x6sj" Mar 19 10:22:00 crc kubenswrapper[4835]: I0319 10:22:00.328963 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qwnf\" (UniqueName: \"kubernetes.io/projected/5b0680c3-1a8d-4fa8-ba5c-ff20fe44ea10-kube-api-access-2qwnf\") pod \"auto-csr-approver-29565262-5x6sj\" (UID: \"5b0680c3-1a8d-4fa8-ba5c-ff20fe44ea10\") " pod="openshift-infra/auto-csr-approver-29565262-5x6sj" Mar 19 10:22:00 crc kubenswrapper[4835]: I0319 10:22:00.347945 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qwnf\" (UniqueName: \"kubernetes.io/projected/5b0680c3-1a8d-4fa8-ba5c-ff20fe44ea10-kube-api-access-2qwnf\") pod \"auto-csr-approver-29565262-5x6sj\" (UID: \"5b0680c3-1a8d-4fa8-ba5c-ff20fe44ea10\") " 
pod="openshift-infra/auto-csr-approver-29565262-5x6sj" Mar 19 10:22:00 crc kubenswrapper[4835]: I0319 10:22:00.475368 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565262-5x6sj" Mar 19 10:22:00 crc kubenswrapper[4835]: I0319 10:22:00.959557 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565262-5x6sj"] Mar 19 10:22:00 crc kubenswrapper[4835]: W0319 10:22:00.966542 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b0680c3_1a8d_4fa8_ba5c_ff20fe44ea10.slice/crio-f1af50b1f3d857514c4c220a2e13b3646f5b862745eb6527acb87a07da885ec0 WatchSource:0}: Error finding container f1af50b1f3d857514c4c220a2e13b3646f5b862745eb6527acb87a07da885ec0: Status 404 returned error can't find the container with id f1af50b1f3d857514c4c220a2e13b3646f5b862745eb6527acb87a07da885ec0 Mar 19 10:22:01 crc kubenswrapper[4835]: I0319 10:22:01.869793 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565262-5x6sj" event={"ID":"5b0680c3-1a8d-4fa8-ba5c-ff20fe44ea10","Type":"ContainerStarted","Data":"f1af50b1f3d857514c4c220a2e13b3646f5b862745eb6527acb87a07da885ec0"} Mar 19 10:22:02 crc kubenswrapper[4835]: I0319 10:22:02.892489 4835 generic.go:334] "Generic (PLEG): container finished" podID="5b0680c3-1a8d-4fa8-ba5c-ff20fe44ea10" containerID="9ccd8cae4b26ecbb3573fe3ed3c0cfb375c95b4c29d7c3019265ad7c0c8623e2" exitCode=0 Mar 19 10:22:02 crc kubenswrapper[4835]: I0319 10:22:02.893112 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565262-5x6sj" event={"ID":"5b0680c3-1a8d-4fa8-ba5c-ff20fe44ea10","Type":"ContainerDied","Data":"9ccd8cae4b26ecbb3573fe3ed3c0cfb375c95b4c29d7c3019265ad7c0c8623e2"} Mar 19 10:22:04 crc kubenswrapper[4835]: I0319 10:22:04.303395 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565262-5x6sj" Mar 19 10:22:04 crc kubenswrapper[4835]: I0319 10:22:04.338988 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qwnf\" (UniqueName: \"kubernetes.io/projected/5b0680c3-1a8d-4fa8-ba5c-ff20fe44ea10-kube-api-access-2qwnf\") pod \"5b0680c3-1a8d-4fa8-ba5c-ff20fe44ea10\" (UID: \"5b0680c3-1a8d-4fa8-ba5c-ff20fe44ea10\") " Mar 19 10:22:04 crc kubenswrapper[4835]: I0319 10:22:04.346226 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b0680c3-1a8d-4fa8-ba5c-ff20fe44ea10-kube-api-access-2qwnf" (OuterVolumeSpecName: "kube-api-access-2qwnf") pod "5b0680c3-1a8d-4fa8-ba5c-ff20fe44ea10" (UID: "5b0680c3-1a8d-4fa8-ba5c-ff20fe44ea10"). InnerVolumeSpecName "kube-api-access-2qwnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:22:04 crc kubenswrapper[4835]: I0319 10:22:04.441756 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qwnf\" (UniqueName: \"kubernetes.io/projected/5b0680c3-1a8d-4fa8-ba5c-ff20fe44ea10-kube-api-access-2qwnf\") on node \"crc\" DevicePath \"\"" Mar 19 10:22:04 crc kubenswrapper[4835]: I0319 10:22:04.914231 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565262-5x6sj" event={"ID":"5b0680c3-1a8d-4fa8-ba5c-ff20fe44ea10","Type":"ContainerDied","Data":"f1af50b1f3d857514c4c220a2e13b3646f5b862745eb6527acb87a07da885ec0"} Mar 19 10:22:04 crc kubenswrapper[4835]: I0319 10:22:04.914613 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1af50b1f3d857514c4c220a2e13b3646f5b862745eb6527acb87a07da885ec0" Mar 19 10:22:04 crc kubenswrapper[4835]: I0319 10:22:04.914321 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565262-5x6sj" Mar 19 10:22:05 crc kubenswrapper[4835]: I0319 10:22:05.412084 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565256-vsjg5"] Mar 19 10:22:05 crc kubenswrapper[4835]: I0319 10:22:05.423572 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565256-vsjg5"] Mar 19 10:22:06 crc kubenswrapper[4835]: I0319 10:22:06.418258 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e90d8d2-fb84-434a-9386-e527e339364e" path="/var/lib/kubelet/pods/4e90d8d2-fb84-434a-9386-e527e339364e/volumes" Mar 19 10:22:40 crc kubenswrapper[4835]: I0319 10:22:40.570452 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8l84d"] Mar 19 10:22:40 crc kubenswrapper[4835]: E0319 10:22:40.586541 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b0680c3-1a8d-4fa8-ba5c-ff20fe44ea10" containerName="oc" Mar 19 10:22:40 crc kubenswrapper[4835]: I0319 10:22:40.586566 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b0680c3-1a8d-4fa8-ba5c-ff20fe44ea10" containerName="oc" Mar 19 10:22:40 crc kubenswrapper[4835]: I0319 10:22:40.586883 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b0680c3-1a8d-4fa8-ba5c-ff20fe44ea10" containerName="oc" Mar 19 10:22:40 crc kubenswrapper[4835]: I0319 10:22:40.588667 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8l84d"] Mar 19 10:22:40 crc kubenswrapper[4835]: I0319 10:22:40.588829 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8l84d" Mar 19 10:22:40 crc kubenswrapper[4835]: I0319 10:22:40.663120 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d120cb17-2764-49e1-a203-d33e32c87aee-utilities\") pod \"redhat-marketplace-8l84d\" (UID: \"d120cb17-2764-49e1-a203-d33e32c87aee\") " pod="openshift-marketplace/redhat-marketplace-8l84d" Mar 19 10:22:40 crc kubenswrapper[4835]: I0319 10:22:40.663240 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bb8k\" (UniqueName: \"kubernetes.io/projected/d120cb17-2764-49e1-a203-d33e32c87aee-kube-api-access-5bb8k\") pod \"redhat-marketplace-8l84d\" (UID: \"d120cb17-2764-49e1-a203-d33e32c87aee\") " pod="openshift-marketplace/redhat-marketplace-8l84d" Mar 19 10:22:40 crc kubenswrapper[4835]: I0319 10:22:40.663389 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d120cb17-2764-49e1-a203-d33e32c87aee-catalog-content\") pod \"redhat-marketplace-8l84d\" (UID: \"d120cb17-2764-49e1-a203-d33e32c87aee\") " pod="openshift-marketplace/redhat-marketplace-8l84d" Mar 19 10:22:40 crc kubenswrapper[4835]: I0319 10:22:40.766293 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bb8k\" (UniqueName: \"kubernetes.io/projected/d120cb17-2764-49e1-a203-d33e32c87aee-kube-api-access-5bb8k\") pod \"redhat-marketplace-8l84d\" (UID: \"d120cb17-2764-49e1-a203-d33e32c87aee\") " pod="openshift-marketplace/redhat-marketplace-8l84d" Mar 19 10:22:40 crc kubenswrapper[4835]: I0319 10:22:40.766414 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d120cb17-2764-49e1-a203-d33e32c87aee-catalog-content\") pod 
\"redhat-marketplace-8l84d\" (UID: \"d120cb17-2764-49e1-a203-d33e32c87aee\") " pod="openshift-marketplace/redhat-marketplace-8l84d" Mar 19 10:22:40 crc kubenswrapper[4835]: I0319 10:22:40.766594 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d120cb17-2764-49e1-a203-d33e32c87aee-utilities\") pod \"redhat-marketplace-8l84d\" (UID: \"d120cb17-2764-49e1-a203-d33e32c87aee\") " pod="openshift-marketplace/redhat-marketplace-8l84d" Mar 19 10:22:40 crc kubenswrapper[4835]: I0319 10:22:40.766945 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d120cb17-2764-49e1-a203-d33e32c87aee-catalog-content\") pod \"redhat-marketplace-8l84d\" (UID: \"d120cb17-2764-49e1-a203-d33e32c87aee\") " pod="openshift-marketplace/redhat-marketplace-8l84d" Mar 19 10:22:40 crc kubenswrapper[4835]: I0319 10:22:40.767094 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d120cb17-2764-49e1-a203-d33e32c87aee-utilities\") pod \"redhat-marketplace-8l84d\" (UID: \"d120cb17-2764-49e1-a203-d33e32c87aee\") " pod="openshift-marketplace/redhat-marketplace-8l84d" Mar 19 10:22:40 crc kubenswrapper[4835]: I0319 10:22:40.786137 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bb8k\" (UniqueName: \"kubernetes.io/projected/d120cb17-2764-49e1-a203-d33e32c87aee-kube-api-access-5bb8k\") pod \"redhat-marketplace-8l84d\" (UID: \"d120cb17-2764-49e1-a203-d33e32c87aee\") " pod="openshift-marketplace/redhat-marketplace-8l84d" Mar 19 10:22:40 crc kubenswrapper[4835]: I0319 10:22:40.919773 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8l84d" Mar 19 10:22:41 crc kubenswrapper[4835]: I0319 10:22:41.464076 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8l84d"] Mar 19 10:22:42 crc kubenswrapper[4835]: I0319 10:22:42.366171 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fcddz"] Mar 19 10:22:42 crc kubenswrapper[4835]: I0319 10:22:42.373062 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fcddz" Mar 19 10:22:42 crc kubenswrapper[4835]: I0319 10:22:42.386880 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fcddz"] Mar 19 10:22:42 crc kubenswrapper[4835]: I0319 10:22:42.406716 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ce75b5a-81a3-4910-bde3-fa298d09c564-catalog-content\") pod \"community-operators-fcddz\" (UID: \"9ce75b5a-81a3-4910-bde3-fa298d09c564\") " pod="openshift-marketplace/community-operators-fcddz" Mar 19 10:22:42 crc kubenswrapper[4835]: I0319 10:22:42.406860 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ce75b5a-81a3-4910-bde3-fa298d09c564-utilities\") pod \"community-operators-fcddz\" (UID: \"9ce75b5a-81a3-4910-bde3-fa298d09c564\") " pod="openshift-marketplace/community-operators-fcddz" Mar 19 10:22:42 crc kubenswrapper[4835]: I0319 10:22:42.406981 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqgk2\" (UniqueName: \"kubernetes.io/projected/9ce75b5a-81a3-4910-bde3-fa298d09c564-kube-api-access-gqgk2\") pod \"community-operators-fcddz\" (UID: \"9ce75b5a-81a3-4910-bde3-fa298d09c564\") " 
pod="openshift-marketplace/community-operators-fcddz" Mar 19 10:22:42 crc kubenswrapper[4835]: I0319 10:22:42.421058 4835 generic.go:334] "Generic (PLEG): container finished" podID="d120cb17-2764-49e1-a203-d33e32c87aee" containerID="f8fd7ddd1c583fc50b02a24c99b386d27673abe893d58e4773396472dc804621" exitCode=0 Mar 19 10:22:42 crc kubenswrapper[4835]: I0319 10:22:42.421110 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8l84d" event={"ID":"d120cb17-2764-49e1-a203-d33e32c87aee","Type":"ContainerDied","Data":"f8fd7ddd1c583fc50b02a24c99b386d27673abe893d58e4773396472dc804621"} Mar 19 10:22:42 crc kubenswrapper[4835]: I0319 10:22:42.421141 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8l84d" event={"ID":"d120cb17-2764-49e1-a203-d33e32c87aee","Type":"ContainerStarted","Data":"51273f0489776e45629020e87c28314d8f2098ac44e77477bce46ee636437390"} Mar 19 10:22:42 crc kubenswrapper[4835]: I0319 10:22:42.509358 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqgk2\" (UniqueName: \"kubernetes.io/projected/9ce75b5a-81a3-4910-bde3-fa298d09c564-kube-api-access-gqgk2\") pod \"community-operators-fcddz\" (UID: \"9ce75b5a-81a3-4910-bde3-fa298d09c564\") " pod="openshift-marketplace/community-operators-fcddz" Mar 19 10:22:42 crc kubenswrapper[4835]: I0319 10:22:42.510936 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ce75b5a-81a3-4910-bde3-fa298d09c564-catalog-content\") pod \"community-operators-fcddz\" (UID: \"9ce75b5a-81a3-4910-bde3-fa298d09c564\") " pod="openshift-marketplace/community-operators-fcddz" Mar 19 10:22:42 crc kubenswrapper[4835]: I0319 10:22:42.511226 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/9ce75b5a-81a3-4910-bde3-fa298d09c564-utilities\") pod \"community-operators-fcddz\" (UID: \"9ce75b5a-81a3-4910-bde3-fa298d09c564\") " pod="openshift-marketplace/community-operators-fcddz" Mar 19 10:22:42 crc kubenswrapper[4835]: I0319 10:22:42.511963 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ce75b5a-81a3-4910-bde3-fa298d09c564-catalog-content\") pod \"community-operators-fcddz\" (UID: \"9ce75b5a-81a3-4910-bde3-fa298d09c564\") " pod="openshift-marketplace/community-operators-fcddz" Mar 19 10:22:42 crc kubenswrapper[4835]: I0319 10:22:42.512604 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ce75b5a-81a3-4910-bde3-fa298d09c564-utilities\") pod \"community-operators-fcddz\" (UID: \"9ce75b5a-81a3-4910-bde3-fa298d09c564\") " pod="openshift-marketplace/community-operators-fcddz" Mar 19 10:22:42 crc kubenswrapper[4835]: I0319 10:22:42.532440 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqgk2\" (UniqueName: \"kubernetes.io/projected/9ce75b5a-81a3-4910-bde3-fa298d09c564-kube-api-access-gqgk2\") pod \"community-operators-fcddz\" (UID: \"9ce75b5a-81a3-4910-bde3-fa298d09c564\") " pod="openshift-marketplace/community-operators-fcddz" Mar 19 10:22:42 crc kubenswrapper[4835]: I0319 10:22:42.695902 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fcddz" Mar 19 10:22:43 crc kubenswrapper[4835]: I0319 10:22:43.294603 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fcddz"] Mar 19 10:22:43 crc kubenswrapper[4835]: I0319 10:22:43.433396 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8l84d" event={"ID":"d120cb17-2764-49e1-a203-d33e32c87aee","Type":"ContainerStarted","Data":"1eae2a2fcd2be27fb5ea4c00dfc9c436dcc98b2707fc4f2f3b5b91e8f703d433"} Mar 19 10:22:43 crc kubenswrapper[4835]: I0319 10:22:43.438814 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcddz" event={"ID":"9ce75b5a-81a3-4910-bde3-fa298d09c564","Type":"ContainerStarted","Data":"a06fd94aef48e1f188690f1f60566bccdb8519e15a0fd556a7196add2f7a7920"} Mar 19 10:22:44 crc kubenswrapper[4835]: I0319 10:22:44.450836 4835 generic.go:334] "Generic (PLEG): container finished" podID="9ce75b5a-81a3-4910-bde3-fa298d09c564" containerID="ed0aae4f07fc38c84869a778bd4fdba7e4f70b4c35b7da3085d5c25ad04459eb" exitCode=0 Mar 19 10:22:44 crc kubenswrapper[4835]: I0319 10:22:44.451190 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcddz" event={"ID":"9ce75b5a-81a3-4910-bde3-fa298d09c564","Type":"ContainerDied","Data":"ed0aae4f07fc38c84869a778bd4fdba7e4f70b4c35b7da3085d5c25ad04459eb"} Mar 19 10:22:44 crc kubenswrapper[4835]: I0319 10:22:44.454855 4835 generic.go:334] "Generic (PLEG): container finished" podID="d120cb17-2764-49e1-a203-d33e32c87aee" containerID="1eae2a2fcd2be27fb5ea4c00dfc9c436dcc98b2707fc4f2f3b5b91e8f703d433" exitCode=0 Mar 19 10:22:44 crc kubenswrapper[4835]: I0319 10:22:44.454899 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8l84d" 
event={"ID":"d120cb17-2764-49e1-a203-d33e32c87aee","Type":"ContainerDied","Data":"1eae2a2fcd2be27fb5ea4c00dfc9c436dcc98b2707fc4f2f3b5b91e8f703d433"} Mar 19 10:22:45 crc kubenswrapper[4835]: I0319 10:22:45.466960 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcddz" event={"ID":"9ce75b5a-81a3-4910-bde3-fa298d09c564","Type":"ContainerStarted","Data":"5ff0a0fdbc0265d1d2c29b499ba8485786fec15844cdd2ee8fc4656b82c650bd"} Mar 19 10:22:45 crc kubenswrapper[4835]: I0319 10:22:45.469376 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8l84d" event={"ID":"d120cb17-2764-49e1-a203-d33e32c87aee","Type":"ContainerStarted","Data":"e5f339f8a49ba1ee9a6f0777aaf903c4488ad62efb19862312eeacf3d7f24ce9"} Mar 19 10:22:45 crc kubenswrapper[4835]: I0319 10:22:45.504106 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8l84d" podStartSLOduration=2.884130411 podStartE2EDuration="5.504084933s" podCreationTimestamp="2026-03-19 10:22:40 +0000 UTC" firstStartedPulling="2026-03-19 10:22:42.422967658 +0000 UTC m=+3617.271566245" lastFinishedPulling="2026-03-19 10:22:45.04292219 +0000 UTC m=+3619.891520767" observedRunningTime="2026-03-19 10:22:45.502371577 +0000 UTC m=+3620.350970184" watchObservedRunningTime="2026-03-19 10:22:45.504084933 +0000 UTC m=+3620.352683530" Mar 19 10:22:47 crc kubenswrapper[4835]: I0319 10:22:47.491094 4835 generic.go:334] "Generic (PLEG): container finished" podID="9ce75b5a-81a3-4910-bde3-fa298d09c564" containerID="5ff0a0fdbc0265d1d2c29b499ba8485786fec15844cdd2ee8fc4656b82c650bd" exitCode=0 Mar 19 10:22:47 crc kubenswrapper[4835]: I0319 10:22:47.491164 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcddz" 
event={"ID":"9ce75b5a-81a3-4910-bde3-fa298d09c564","Type":"ContainerDied","Data":"5ff0a0fdbc0265d1d2c29b499ba8485786fec15844cdd2ee8fc4656b82c650bd"} Mar 19 10:22:48 crc kubenswrapper[4835]: I0319 10:22:48.510144 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcddz" event={"ID":"9ce75b5a-81a3-4910-bde3-fa298d09c564","Type":"ContainerStarted","Data":"4e191c4de886cb4e621a8ff52f94e5fb6b97403a2936963fcefe7e98bad5dace"} Mar 19 10:22:48 crc kubenswrapper[4835]: I0319 10:22:48.559736 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fcddz" podStartSLOduration=2.921747216 podStartE2EDuration="6.559713548s" podCreationTimestamp="2026-03-19 10:22:42 +0000 UTC" firstStartedPulling="2026-03-19 10:22:44.45276723 +0000 UTC m=+3619.301365817" lastFinishedPulling="2026-03-19 10:22:48.090733562 +0000 UTC m=+3622.939332149" observedRunningTime="2026-03-19 10:22:48.54872367 +0000 UTC m=+3623.397322277" watchObservedRunningTime="2026-03-19 10:22:48.559713548 +0000 UTC m=+3623.408312145" Mar 19 10:22:50 crc kubenswrapper[4835]: I0319 10:22:50.920906 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8l84d" Mar 19 10:22:50 crc kubenswrapper[4835]: I0319 10:22:50.921431 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8l84d" Mar 19 10:22:50 crc kubenswrapper[4835]: I0319 10:22:50.980524 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8l84d" Mar 19 10:22:51 crc kubenswrapper[4835]: I0319 10:22:51.589326 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8l84d" Mar 19 10:22:52 crc kubenswrapper[4835]: I0319 10:22:52.560433 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-8l84d"] Mar 19 10:22:52 crc kubenswrapper[4835]: I0319 10:22:52.696628 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fcddz" Mar 19 10:22:52 crc kubenswrapper[4835]: I0319 10:22:52.696973 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fcddz" Mar 19 10:22:52 crc kubenswrapper[4835]: I0319 10:22:52.744265 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fcddz" Mar 19 10:22:53 crc kubenswrapper[4835]: I0319 10:22:53.565125 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8l84d" podUID="d120cb17-2764-49e1-a203-d33e32c87aee" containerName="registry-server" containerID="cri-o://e5f339f8a49ba1ee9a6f0777aaf903c4488ad62efb19862312eeacf3d7f24ce9" gracePeriod=2 Mar 19 10:22:53 crc kubenswrapper[4835]: I0319 10:22:53.626977 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fcddz" Mar 19 10:22:54 crc kubenswrapper[4835]: I0319 10:22:54.159907 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8l84d" Mar 19 10:22:54 crc kubenswrapper[4835]: I0319 10:22:54.236373 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d120cb17-2764-49e1-a203-d33e32c87aee-utilities\") pod \"d120cb17-2764-49e1-a203-d33e32c87aee\" (UID: \"d120cb17-2764-49e1-a203-d33e32c87aee\") " Mar 19 10:22:54 crc kubenswrapper[4835]: I0319 10:22:54.236670 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bb8k\" (UniqueName: \"kubernetes.io/projected/d120cb17-2764-49e1-a203-d33e32c87aee-kube-api-access-5bb8k\") pod \"d120cb17-2764-49e1-a203-d33e32c87aee\" (UID: \"d120cb17-2764-49e1-a203-d33e32c87aee\") " Mar 19 10:22:54 crc kubenswrapper[4835]: I0319 10:22:54.236841 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d120cb17-2764-49e1-a203-d33e32c87aee-catalog-content\") pod \"d120cb17-2764-49e1-a203-d33e32c87aee\" (UID: \"d120cb17-2764-49e1-a203-d33e32c87aee\") " Mar 19 10:22:54 crc kubenswrapper[4835]: I0319 10:22:54.237479 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d120cb17-2764-49e1-a203-d33e32c87aee-utilities" (OuterVolumeSpecName: "utilities") pod "d120cb17-2764-49e1-a203-d33e32c87aee" (UID: "d120cb17-2764-49e1-a203-d33e32c87aee"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:22:54 crc kubenswrapper[4835]: I0319 10:22:54.238089 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d120cb17-2764-49e1-a203-d33e32c87aee-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 10:22:54 crc kubenswrapper[4835]: I0319 10:22:54.245099 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d120cb17-2764-49e1-a203-d33e32c87aee-kube-api-access-5bb8k" (OuterVolumeSpecName: "kube-api-access-5bb8k") pod "d120cb17-2764-49e1-a203-d33e32c87aee" (UID: "d120cb17-2764-49e1-a203-d33e32c87aee"). InnerVolumeSpecName "kube-api-access-5bb8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:22:54 crc kubenswrapper[4835]: I0319 10:22:54.290002 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d120cb17-2764-49e1-a203-d33e32c87aee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d120cb17-2764-49e1-a203-d33e32c87aee" (UID: "d120cb17-2764-49e1-a203-d33e32c87aee"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:22:54 crc kubenswrapper[4835]: I0319 10:22:54.340916 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bb8k\" (UniqueName: \"kubernetes.io/projected/d120cb17-2764-49e1-a203-d33e32c87aee-kube-api-access-5bb8k\") on node \"crc\" DevicePath \"\"" Mar 19 10:22:54 crc kubenswrapper[4835]: I0319 10:22:54.341126 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d120cb17-2764-49e1-a203-d33e32c87aee-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 10:22:54 crc kubenswrapper[4835]: I0319 10:22:54.588993 4835 generic.go:334] "Generic (PLEG): container finished" podID="d120cb17-2764-49e1-a203-d33e32c87aee" containerID="e5f339f8a49ba1ee9a6f0777aaf903c4488ad62efb19862312eeacf3d7f24ce9" exitCode=0 Mar 19 10:22:54 crc kubenswrapper[4835]: I0319 10:22:54.589129 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8l84d" event={"ID":"d120cb17-2764-49e1-a203-d33e32c87aee","Type":"ContainerDied","Data":"e5f339f8a49ba1ee9a6f0777aaf903c4488ad62efb19862312eeacf3d7f24ce9"} Mar 19 10:22:54 crc kubenswrapper[4835]: I0319 10:22:54.589178 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8l84d" event={"ID":"d120cb17-2764-49e1-a203-d33e32c87aee","Type":"ContainerDied","Data":"51273f0489776e45629020e87c28314d8f2098ac44e77477bce46ee636437390"} Mar 19 10:22:54 crc kubenswrapper[4835]: I0319 10:22:54.589213 4835 scope.go:117] "RemoveContainer" containerID="e5f339f8a49ba1ee9a6f0777aaf903c4488ad62efb19862312eeacf3d7f24ce9" Mar 19 10:22:54 crc kubenswrapper[4835]: I0319 10:22:54.589239 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8l84d" Mar 19 10:22:54 crc kubenswrapper[4835]: I0319 10:22:54.621721 4835 scope.go:117] "RemoveContainer" containerID="1eae2a2fcd2be27fb5ea4c00dfc9c436dcc98b2707fc4f2f3b5b91e8f703d433" Mar 19 10:22:54 crc kubenswrapper[4835]: I0319 10:22:54.625515 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8l84d"] Mar 19 10:22:54 crc kubenswrapper[4835]: I0319 10:22:54.637868 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8l84d"] Mar 19 10:22:54 crc kubenswrapper[4835]: I0319 10:22:54.649220 4835 scope.go:117] "RemoveContainer" containerID="f8fd7ddd1c583fc50b02a24c99b386d27673abe893d58e4773396472dc804621" Mar 19 10:22:54 crc kubenswrapper[4835]: I0319 10:22:54.700004 4835 scope.go:117] "RemoveContainer" containerID="e5f339f8a49ba1ee9a6f0777aaf903c4488ad62efb19862312eeacf3d7f24ce9" Mar 19 10:22:54 crc kubenswrapper[4835]: E0319 10:22:54.700927 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5f339f8a49ba1ee9a6f0777aaf903c4488ad62efb19862312eeacf3d7f24ce9\": container with ID starting with e5f339f8a49ba1ee9a6f0777aaf903c4488ad62efb19862312eeacf3d7f24ce9 not found: ID does not exist" containerID="e5f339f8a49ba1ee9a6f0777aaf903c4488ad62efb19862312eeacf3d7f24ce9" Mar 19 10:22:54 crc kubenswrapper[4835]: I0319 10:22:54.700971 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5f339f8a49ba1ee9a6f0777aaf903c4488ad62efb19862312eeacf3d7f24ce9"} err="failed to get container status \"e5f339f8a49ba1ee9a6f0777aaf903c4488ad62efb19862312eeacf3d7f24ce9\": rpc error: code = NotFound desc = could not find container \"e5f339f8a49ba1ee9a6f0777aaf903c4488ad62efb19862312eeacf3d7f24ce9\": container with ID starting with e5f339f8a49ba1ee9a6f0777aaf903c4488ad62efb19862312eeacf3d7f24ce9 not found: 
ID does not exist" Mar 19 10:22:54 crc kubenswrapper[4835]: I0319 10:22:54.701000 4835 scope.go:117] "RemoveContainer" containerID="1eae2a2fcd2be27fb5ea4c00dfc9c436dcc98b2707fc4f2f3b5b91e8f703d433" Mar 19 10:22:54 crc kubenswrapper[4835]: E0319 10:22:54.701375 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eae2a2fcd2be27fb5ea4c00dfc9c436dcc98b2707fc4f2f3b5b91e8f703d433\": container with ID starting with 1eae2a2fcd2be27fb5ea4c00dfc9c436dcc98b2707fc4f2f3b5b91e8f703d433 not found: ID does not exist" containerID="1eae2a2fcd2be27fb5ea4c00dfc9c436dcc98b2707fc4f2f3b5b91e8f703d433" Mar 19 10:22:54 crc kubenswrapper[4835]: I0319 10:22:54.701447 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eae2a2fcd2be27fb5ea4c00dfc9c436dcc98b2707fc4f2f3b5b91e8f703d433"} err="failed to get container status \"1eae2a2fcd2be27fb5ea4c00dfc9c436dcc98b2707fc4f2f3b5b91e8f703d433\": rpc error: code = NotFound desc = could not find container \"1eae2a2fcd2be27fb5ea4c00dfc9c436dcc98b2707fc4f2f3b5b91e8f703d433\": container with ID starting with 1eae2a2fcd2be27fb5ea4c00dfc9c436dcc98b2707fc4f2f3b5b91e8f703d433 not found: ID does not exist" Mar 19 10:22:54 crc kubenswrapper[4835]: I0319 10:22:54.701482 4835 scope.go:117] "RemoveContainer" containerID="f8fd7ddd1c583fc50b02a24c99b386d27673abe893d58e4773396472dc804621" Mar 19 10:22:54 crc kubenswrapper[4835]: E0319 10:22:54.701880 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8fd7ddd1c583fc50b02a24c99b386d27673abe893d58e4773396472dc804621\": container with ID starting with f8fd7ddd1c583fc50b02a24c99b386d27673abe893d58e4773396472dc804621 not found: ID does not exist" containerID="f8fd7ddd1c583fc50b02a24c99b386d27673abe893d58e4773396472dc804621" Mar 19 10:22:54 crc kubenswrapper[4835]: I0319 10:22:54.701905 4835 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8fd7ddd1c583fc50b02a24c99b386d27673abe893d58e4773396472dc804621"} err="failed to get container status \"f8fd7ddd1c583fc50b02a24c99b386d27673abe893d58e4773396472dc804621\": rpc error: code = NotFound desc = could not find container \"f8fd7ddd1c583fc50b02a24c99b386d27673abe893d58e4773396472dc804621\": container with ID starting with f8fd7ddd1c583fc50b02a24c99b386d27673abe893d58e4773396472dc804621 not found: ID does not exist" Mar 19 10:22:55 crc kubenswrapper[4835]: I0319 10:22:55.154873 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fcddz"] Mar 19 10:22:55 crc kubenswrapper[4835]: I0319 10:22:55.609671 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fcddz" podUID="9ce75b5a-81a3-4910-bde3-fa298d09c564" containerName="registry-server" containerID="cri-o://4e191c4de886cb4e621a8ff52f94e5fb6b97403a2936963fcefe7e98bad5dace" gracePeriod=2 Mar 19 10:22:56 crc kubenswrapper[4835]: I0319 10:22:56.149983 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fcddz" Mar 19 10:22:56 crc kubenswrapper[4835]: I0319 10:22:56.304405 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqgk2\" (UniqueName: \"kubernetes.io/projected/9ce75b5a-81a3-4910-bde3-fa298d09c564-kube-api-access-gqgk2\") pod \"9ce75b5a-81a3-4910-bde3-fa298d09c564\" (UID: \"9ce75b5a-81a3-4910-bde3-fa298d09c564\") " Mar 19 10:22:56 crc kubenswrapper[4835]: I0319 10:22:56.304548 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ce75b5a-81a3-4910-bde3-fa298d09c564-utilities\") pod \"9ce75b5a-81a3-4910-bde3-fa298d09c564\" (UID: \"9ce75b5a-81a3-4910-bde3-fa298d09c564\") " Mar 19 10:22:56 crc kubenswrapper[4835]: I0319 10:22:56.304814 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ce75b5a-81a3-4910-bde3-fa298d09c564-catalog-content\") pod \"9ce75b5a-81a3-4910-bde3-fa298d09c564\" (UID: \"9ce75b5a-81a3-4910-bde3-fa298d09c564\") " Mar 19 10:22:56 crc kubenswrapper[4835]: I0319 10:22:56.305840 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ce75b5a-81a3-4910-bde3-fa298d09c564-utilities" (OuterVolumeSpecName: "utilities") pod "9ce75b5a-81a3-4910-bde3-fa298d09c564" (UID: "9ce75b5a-81a3-4910-bde3-fa298d09c564"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:22:56 crc kubenswrapper[4835]: I0319 10:22:56.306052 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ce75b5a-81a3-4910-bde3-fa298d09c564-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 10:22:56 crc kubenswrapper[4835]: I0319 10:22:56.310290 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ce75b5a-81a3-4910-bde3-fa298d09c564-kube-api-access-gqgk2" (OuterVolumeSpecName: "kube-api-access-gqgk2") pod "9ce75b5a-81a3-4910-bde3-fa298d09c564" (UID: "9ce75b5a-81a3-4910-bde3-fa298d09c564"). InnerVolumeSpecName "kube-api-access-gqgk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:22:56 crc kubenswrapper[4835]: I0319 10:22:56.359309 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ce75b5a-81a3-4910-bde3-fa298d09c564-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9ce75b5a-81a3-4910-bde3-fa298d09c564" (UID: "9ce75b5a-81a3-4910-bde3-fa298d09c564"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:22:56 crc kubenswrapper[4835]: I0319 10:22:56.408098 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqgk2\" (UniqueName: \"kubernetes.io/projected/9ce75b5a-81a3-4910-bde3-fa298d09c564-kube-api-access-gqgk2\") on node \"crc\" DevicePath \"\"" Mar 19 10:22:56 crc kubenswrapper[4835]: I0319 10:22:56.408299 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ce75b5a-81a3-4910-bde3-fa298d09c564-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 10:22:56 crc kubenswrapper[4835]: I0319 10:22:56.419591 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d120cb17-2764-49e1-a203-d33e32c87aee" path="/var/lib/kubelet/pods/d120cb17-2764-49e1-a203-d33e32c87aee/volumes" Mar 19 10:22:56 crc kubenswrapper[4835]: I0319 10:22:56.625793 4835 generic.go:334] "Generic (PLEG): container finished" podID="9ce75b5a-81a3-4910-bde3-fa298d09c564" containerID="4e191c4de886cb4e621a8ff52f94e5fb6b97403a2936963fcefe7e98bad5dace" exitCode=0 Mar 19 10:22:56 crc kubenswrapper[4835]: I0319 10:22:56.625834 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcddz" event={"ID":"9ce75b5a-81a3-4910-bde3-fa298d09c564","Type":"ContainerDied","Data":"4e191c4de886cb4e621a8ff52f94e5fb6b97403a2936963fcefe7e98bad5dace"} Mar 19 10:22:56 crc kubenswrapper[4835]: I0319 10:22:56.625860 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcddz" event={"ID":"9ce75b5a-81a3-4910-bde3-fa298d09c564","Type":"ContainerDied","Data":"a06fd94aef48e1f188690f1f60566bccdb8519e15a0fd556a7196add2f7a7920"} Mar 19 10:22:56 crc kubenswrapper[4835]: I0319 10:22:56.625880 4835 scope.go:117] "RemoveContainer" containerID="4e191c4de886cb4e621a8ff52f94e5fb6b97403a2936963fcefe7e98bad5dace" Mar 19 10:22:56 crc kubenswrapper[4835]: I0319 10:22:56.625982 
4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fcddz" Mar 19 10:22:56 crc kubenswrapper[4835]: I0319 10:22:56.656952 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fcddz"] Mar 19 10:22:56 crc kubenswrapper[4835]: I0319 10:22:56.666517 4835 scope.go:117] "RemoveContainer" containerID="5ff0a0fdbc0265d1d2c29b499ba8485786fec15844cdd2ee8fc4656b82c650bd" Mar 19 10:22:56 crc kubenswrapper[4835]: I0319 10:22:56.668583 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fcddz"] Mar 19 10:22:56 crc kubenswrapper[4835]: I0319 10:22:56.686515 4835 scope.go:117] "RemoveContainer" containerID="ed0aae4f07fc38c84869a778bd4fdba7e4f70b4c35b7da3085d5c25ad04459eb" Mar 19 10:22:56 crc kubenswrapper[4835]: I0319 10:22:56.765139 4835 scope.go:117] "RemoveContainer" containerID="4e191c4de886cb4e621a8ff52f94e5fb6b97403a2936963fcefe7e98bad5dace" Mar 19 10:22:56 crc kubenswrapper[4835]: E0319 10:22:56.765602 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e191c4de886cb4e621a8ff52f94e5fb6b97403a2936963fcefe7e98bad5dace\": container with ID starting with 4e191c4de886cb4e621a8ff52f94e5fb6b97403a2936963fcefe7e98bad5dace not found: ID does not exist" containerID="4e191c4de886cb4e621a8ff52f94e5fb6b97403a2936963fcefe7e98bad5dace" Mar 19 10:22:56 crc kubenswrapper[4835]: I0319 10:22:56.765650 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e191c4de886cb4e621a8ff52f94e5fb6b97403a2936963fcefe7e98bad5dace"} err="failed to get container status \"4e191c4de886cb4e621a8ff52f94e5fb6b97403a2936963fcefe7e98bad5dace\": rpc error: code = NotFound desc = could not find container \"4e191c4de886cb4e621a8ff52f94e5fb6b97403a2936963fcefe7e98bad5dace\": container with ID starting with 
4e191c4de886cb4e621a8ff52f94e5fb6b97403a2936963fcefe7e98bad5dace not found: ID does not exist" Mar 19 10:22:56 crc kubenswrapper[4835]: I0319 10:22:56.765676 4835 scope.go:117] "RemoveContainer" containerID="5ff0a0fdbc0265d1d2c29b499ba8485786fec15844cdd2ee8fc4656b82c650bd" Mar 19 10:22:56 crc kubenswrapper[4835]: E0319 10:22:56.766137 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ff0a0fdbc0265d1d2c29b499ba8485786fec15844cdd2ee8fc4656b82c650bd\": container with ID starting with 5ff0a0fdbc0265d1d2c29b499ba8485786fec15844cdd2ee8fc4656b82c650bd not found: ID does not exist" containerID="5ff0a0fdbc0265d1d2c29b499ba8485786fec15844cdd2ee8fc4656b82c650bd" Mar 19 10:22:56 crc kubenswrapper[4835]: I0319 10:22:56.766160 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ff0a0fdbc0265d1d2c29b499ba8485786fec15844cdd2ee8fc4656b82c650bd"} err="failed to get container status \"5ff0a0fdbc0265d1d2c29b499ba8485786fec15844cdd2ee8fc4656b82c650bd\": rpc error: code = NotFound desc = could not find container \"5ff0a0fdbc0265d1d2c29b499ba8485786fec15844cdd2ee8fc4656b82c650bd\": container with ID starting with 5ff0a0fdbc0265d1d2c29b499ba8485786fec15844cdd2ee8fc4656b82c650bd not found: ID does not exist" Mar 19 10:22:56 crc kubenswrapper[4835]: I0319 10:22:56.766174 4835 scope.go:117] "RemoveContainer" containerID="ed0aae4f07fc38c84869a778bd4fdba7e4f70b4c35b7da3085d5c25ad04459eb" Mar 19 10:22:56 crc kubenswrapper[4835]: E0319 10:22:56.766454 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed0aae4f07fc38c84869a778bd4fdba7e4f70b4c35b7da3085d5c25ad04459eb\": container with ID starting with ed0aae4f07fc38c84869a778bd4fdba7e4f70b4c35b7da3085d5c25ad04459eb not found: ID does not exist" containerID="ed0aae4f07fc38c84869a778bd4fdba7e4f70b4c35b7da3085d5c25ad04459eb" Mar 19 10:22:56 crc 
kubenswrapper[4835]: I0319 10:22:56.766484 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed0aae4f07fc38c84869a778bd4fdba7e4f70b4c35b7da3085d5c25ad04459eb"} err="failed to get container status \"ed0aae4f07fc38c84869a778bd4fdba7e4f70b4c35b7da3085d5c25ad04459eb\": rpc error: code = NotFound desc = could not find container \"ed0aae4f07fc38c84869a778bd4fdba7e4f70b4c35b7da3085d5c25ad04459eb\": container with ID starting with ed0aae4f07fc38c84869a778bd4fdba7e4f70b4c35b7da3085d5c25ad04459eb not found: ID does not exist" Mar 19 10:22:57 crc kubenswrapper[4835]: I0319 10:22:57.800335 4835 scope.go:117] "RemoveContainer" containerID="ef8af7b352d8576ebadc7473b141cb5ba77c31fb98ff534885e24cde8b4240e9" Mar 19 10:22:58 crc kubenswrapper[4835]: I0319 10:22:58.416590 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ce75b5a-81a3-4910-bde3-fa298d09c564" path="/var/lib/kubelet/pods/9ce75b5a-81a3-4910-bde3-fa298d09c564/volumes" Mar 19 10:23:06 crc kubenswrapper[4835]: I0319 10:23:06.422348 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:23:06 crc kubenswrapper[4835]: I0319 10:23:06.422970 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:23:36 crc kubenswrapper[4835]: I0319 10:23:36.422164 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:23:36 crc kubenswrapper[4835]: I0319 10:23:36.422723 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:24:00 crc kubenswrapper[4835]: I0319 10:24:00.170773 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565264-n68qv"] Mar 19 10:24:00 crc kubenswrapper[4835]: E0319 10:24:00.171767 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d120cb17-2764-49e1-a203-d33e32c87aee" containerName="extract-utilities" Mar 19 10:24:00 crc kubenswrapper[4835]: I0319 10:24:00.171781 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="d120cb17-2764-49e1-a203-d33e32c87aee" containerName="extract-utilities" Mar 19 10:24:00 crc kubenswrapper[4835]: E0319 10:24:00.171798 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d120cb17-2764-49e1-a203-d33e32c87aee" containerName="registry-server" Mar 19 10:24:00 crc kubenswrapper[4835]: I0319 10:24:00.171804 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="d120cb17-2764-49e1-a203-d33e32c87aee" containerName="registry-server" Mar 19 10:24:00 crc kubenswrapper[4835]: E0319 10:24:00.171815 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d120cb17-2764-49e1-a203-d33e32c87aee" containerName="extract-content" Mar 19 10:24:00 crc kubenswrapper[4835]: I0319 10:24:00.171821 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="d120cb17-2764-49e1-a203-d33e32c87aee" containerName="extract-content" Mar 19 10:24:00 crc kubenswrapper[4835]: E0319 10:24:00.171844 4835 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9ce75b5a-81a3-4910-bde3-fa298d09c564" containerName="extract-utilities" Mar 19 10:24:00 crc kubenswrapper[4835]: I0319 10:24:00.171850 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ce75b5a-81a3-4910-bde3-fa298d09c564" containerName="extract-utilities" Mar 19 10:24:00 crc kubenswrapper[4835]: E0319 10:24:00.171871 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ce75b5a-81a3-4910-bde3-fa298d09c564" containerName="registry-server" Mar 19 10:24:00 crc kubenswrapper[4835]: I0319 10:24:00.171877 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ce75b5a-81a3-4910-bde3-fa298d09c564" containerName="registry-server" Mar 19 10:24:00 crc kubenswrapper[4835]: E0319 10:24:00.171891 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ce75b5a-81a3-4910-bde3-fa298d09c564" containerName="extract-content" Mar 19 10:24:00 crc kubenswrapper[4835]: I0319 10:24:00.171896 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ce75b5a-81a3-4910-bde3-fa298d09c564" containerName="extract-content" Mar 19 10:24:00 crc kubenswrapper[4835]: I0319 10:24:00.172107 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="d120cb17-2764-49e1-a203-d33e32c87aee" containerName="registry-server" Mar 19 10:24:00 crc kubenswrapper[4835]: I0319 10:24:00.172123 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ce75b5a-81a3-4910-bde3-fa298d09c564" containerName="registry-server" Mar 19 10:24:00 crc kubenswrapper[4835]: I0319 10:24:00.172999 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565264-n68qv" Mar 19 10:24:00 crc kubenswrapper[4835]: I0319 10:24:00.175893 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 10:24:00 crc kubenswrapper[4835]: I0319 10:24:00.178935 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:24:00 crc kubenswrapper[4835]: I0319 10:24:00.179162 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:24:00 crc kubenswrapper[4835]: I0319 10:24:00.197395 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565264-n68qv"] Mar 19 10:24:00 crc kubenswrapper[4835]: I0319 10:24:00.305558 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddx25\" (UniqueName: \"kubernetes.io/projected/b08f067a-e10a-4eb1-a223-4d128d174b4e-kube-api-access-ddx25\") pod \"auto-csr-approver-29565264-n68qv\" (UID: \"b08f067a-e10a-4eb1-a223-4d128d174b4e\") " pod="openshift-infra/auto-csr-approver-29565264-n68qv" Mar 19 10:24:00 crc kubenswrapper[4835]: I0319 10:24:00.407698 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddx25\" (UniqueName: \"kubernetes.io/projected/b08f067a-e10a-4eb1-a223-4d128d174b4e-kube-api-access-ddx25\") pod \"auto-csr-approver-29565264-n68qv\" (UID: \"b08f067a-e10a-4eb1-a223-4d128d174b4e\") " pod="openshift-infra/auto-csr-approver-29565264-n68qv" Mar 19 10:24:00 crc kubenswrapper[4835]: I0319 10:24:00.427286 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddx25\" (UniqueName: \"kubernetes.io/projected/b08f067a-e10a-4eb1-a223-4d128d174b4e-kube-api-access-ddx25\") pod \"auto-csr-approver-29565264-n68qv\" (UID: \"b08f067a-e10a-4eb1-a223-4d128d174b4e\") " 
pod="openshift-infra/auto-csr-approver-29565264-n68qv" Mar 19 10:24:00 crc kubenswrapper[4835]: I0319 10:24:00.502492 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565264-n68qv" Mar 19 10:24:01 crc kubenswrapper[4835]: I0319 10:24:01.044078 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565264-n68qv"] Mar 19 10:24:01 crc kubenswrapper[4835]: I0319 10:24:01.380421 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565264-n68qv" event={"ID":"b08f067a-e10a-4eb1-a223-4d128d174b4e","Type":"ContainerStarted","Data":"4324f283a3d8fdf876f85121c87978bdceeaf7cbd93b19d06077d8558585e71f"} Mar 19 10:24:03 crc kubenswrapper[4835]: I0319 10:24:03.405668 4835 generic.go:334] "Generic (PLEG): container finished" podID="b08f067a-e10a-4eb1-a223-4d128d174b4e" containerID="0356e29dbb5c43cf10a7d9540e7e98579f8ca48662ee3d934af5a4e646ffe0fe" exitCode=0 Mar 19 10:24:03 crc kubenswrapper[4835]: I0319 10:24:03.405788 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565264-n68qv" event={"ID":"b08f067a-e10a-4eb1-a223-4d128d174b4e","Type":"ContainerDied","Data":"0356e29dbb5c43cf10a7d9540e7e98579f8ca48662ee3d934af5a4e646ffe0fe"} Mar 19 10:24:04 crc kubenswrapper[4835]: I0319 10:24:04.893332 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565264-n68qv" Mar 19 10:24:05 crc kubenswrapper[4835]: I0319 10:24:05.019181 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddx25\" (UniqueName: \"kubernetes.io/projected/b08f067a-e10a-4eb1-a223-4d128d174b4e-kube-api-access-ddx25\") pod \"b08f067a-e10a-4eb1-a223-4d128d174b4e\" (UID: \"b08f067a-e10a-4eb1-a223-4d128d174b4e\") " Mar 19 10:24:05 crc kubenswrapper[4835]: I0319 10:24:05.026086 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b08f067a-e10a-4eb1-a223-4d128d174b4e-kube-api-access-ddx25" (OuterVolumeSpecName: "kube-api-access-ddx25") pod "b08f067a-e10a-4eb1-a223-4d128d174b4e" (UID: "b08f067a-e10a-4eb1-a223-4d128d174b4e"). InnerVolumeSpecName "kube-api-access-ddx25". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:24:05 crc kubenswrapper[4835]: I0319 10:24:05.123794 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddx25\" (UniqueName: \"kubernetes.io/projected/b08f067a-e10a-4eb1-a223-4d128d174b4e-kube-api-access-ddx25\") on node \"crc\" DevicePath \"\"" Mar 19 10:24:05 crc kubenswrapper[4835]: I0319 10:24:05.427734 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565264-n68qv" event={"ID":"b08f067a-e10a-4eb1-a223-4d128d174b4e","Type":"ContainerDied","Data":"4324f283a3d8fdf876f85121c87978bdceeaf7cbd93b19d06077d8558585e71f"} Mar 19 10:24:05 crc kubenswrapper[4835]: I0319 10:24:05.428046 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4324f283a3d8fdf876f85121c87978bdceeaf7cbd93b19d06077d8558585e71f" Mar 19 10:24:05 crc kubenswrapper[4835]: I0319 10:24:05.427829 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565264-n68qv" Mar 19 10:24:05 crc kubenswrapper[4835]: I0319 10:24:05.996327 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565258-r7xqx"] Mar 19 10:24:06 crc kubenswrapper[4835]: I0319 10:24:06.009210 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565258-r7xqx"] Mar 19 10:24:06 crc kubenswrapper[4835]: I0319 10:24:06.428790 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:24:06 crc kubenswrapper[4835]: I0319 10:24:06.428839 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:24:06 crc kubenswrapper[4835]: I0319 10:24:06.431867 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb6e4244-9452-4d83-b537-4ba82d156d89" path="/var/lib/kubelet/pods/eb6e4244-9452-4d83-b537-4ba82d156d89/volumes" Mar 19 10:24:06 crc kubenswrapper[4835]: I0319 10:24:06.433234 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" Mar 19 10:24:06 crc kubenswrapper[4835]: I0319 10:24:06.436214 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"049ab495373838cc6452b89e97abd4272ac73f52f2dcf346477b2cddb7b2086a"} pod="openshift-machine-config-operator/machine-config-daemon-bk84k" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 10:24:06 crc kubenswrapper[4835]: I0319 10:24:06.436352 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" containerID="cri-o://049ab495373838cc6452b89e97abd4272ac73f52f2dcf346477b2cddb7b2086a" gracePeriod=600 Mar 19 10:24:06 crc kubenswrapper[4835]: E0319 10:24:06.558137 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:24:07 crc kubenswrapper[4835]: I0319 10:24:07.456276 4835 generic.go:334] "Generic (PLEG): container finished" podID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerID="049ab495373838cc6452b89e97abd4272ac73f52f2dcf346477b2cddb7b2086a" exitCode=0 Mar 19 10:24:07 crc kubenswrapper[4835]: I0319 10:24:07.456344 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerDied","Data":"049ab495373838cc6452b89e97abd4272ac73f52f2dcf346477b2cddb7b2086a"} Mar 19 10:24:07 crc kubenswrapper[4835]: I0319 10:24:07.456393 4835 scope.go:117] "RemoveContainer" containerID="c517b2f16187a1eb29912009e0c7cd2697f65ca802ec095e400837a3df1ceee7" Mar 19 10:24:07 crc kubenswrapper[4835]: I0319 10:24:07.458377 4835 scope.go:117] "RemoveContainer" containerID="049ab495373838cc6452b89e97abd4272ac73f52f2dcf346477b2cddb7b2086a" Mar 19 10:24:07 crc kubenswrapper[4835]: E0319 10:24:07.459394 4835 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:24:19 crc kubenswrapper[4835]: I0319 10:24:19.403776 4835 scope.go:117] "RemoveContainer" containerID="049ab495373838cc6452b89e97abd4272ac73f52f2dcf346477b2cddb7b2086a" Mar 19 10:24:19 crc kubenswrapper[4835]: E0319 10:24:19.405262 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:24:32 crc kubenswrapper[4835]: I0319 10:24:32.402479 4835 scope.go:117] "RemoveContainer" containerID="049ab495373838cc6452b89e97abd4272ac73f52f2dcf346477b2cddb7b2086a" Mar 19 10:24:32 crc kubenswrapper[4835]: E0319 10:24:32.403510 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:24:47 crc kubenswrapper[4835]: I0319 10:24:47.402803 4835 scope.go:117] "RemoveContainer" containerID="049ab495373838cc6452b89e97abd4272ac73f52f2dcf346477b2cddb7b2086a" Mar 19 10:24:47 crc kubenswrapper[4835]: E0319 
10:24:47.403563 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:24:57 crc kubenswrapper[4835]: I0319 10:24:57.934096 4835 scope.go:117] "RemoveContainer" containerID="23b89b2a7495618d7d02381e9c58dd1e43abbc29f2ba7ab67696ba18e37ec81c" Mar 19 10:25:02 crc kubenswrapper[4835]: I0319 10:25:02.403833 4835 scope.go:117] "RemoveContainer" containerID="049ab495373838cc6452b89e97abd4272ac73f52f2dcf346477b2cddb7b2086a" Mar 19 10:25:02 crc kubenswrapper[4835]: E0319 10:25:02.404667 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:25:13 crc kubenswrapper[4835]: I0319 10:25:13.401774 4835 scope.go:117] "RemoveContainer" containerID="049ab495373838cc6452b89e97abd4272ac73f52f2dcf346477b2cddb7b2086a" Mar 19 10:25:13 crc kubenswrapper[4835]: E0319 10:25:13.402633 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:25:25 crc 
kubenswrapper[4835]: I0319 10:25:25.402525 4835 scope.go:117] "RemoveContainer" containerID="049ab495373838cc6452b89e97abd4272ac73f52f2dcf346477b2cddb7b2086a" Mar 19 10:25:25 crc kubenswrapper[4835]: E0319 10:25:25.404111 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:25:37 crc kubenswrapper[4835]: I0319 10:25:37.402517 4835 scope.go:117] "RemoveContainer" containerID="049ab495373838cc6452b89e97abd4272ac73f52f2dcf346477b2cddb7b2086a" Mar 19 10:25:37 crc kubenswrapper[4835]: E0319 10:25:37.403404 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:25:52 crc kubenswrapper[4835]: I0319 10:25:52.402951 4835 scope.go:117] "RemoveContainer" containerID="049ab495373838cc6452b89e97abd4272ac73f52f2dcf346477b2cddb7b2086a" Mar 19 10:25:52 crc kubenswrapper[4835]: E0319 10:25:52.403549 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 
19 10:26:00 crc kubenswrapper[4835]: I0319 10:26:00.174652 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565266-z5z6r"] Mar 19 10:26:00 crc kubenswrapper[4835]: E0319 10:26:00.175723 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b08f067a-e10a-4eb1-a223-4d128d174b4e" containerName="oc" Mar 19 10:26:00 crc kubenswrapper[4835]: I0319 10:26:00.175736 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="b08f067a-e10a-4eb1-a223-4d128d174b4e" containerName="oc" Mar 19 10:26:00 crc kubenswrapper[4835]: I0319 10:26:00.176000 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="b08f067a-e10a-4eb1-a223-4d128d174b4e" containerName="oc" Mar 19 10:26:00 crc kubenswrapper[4835]: I0319 10:26:00.176980 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565266-z5z6r" Mar 19 10:26:00 crc kubenswrapper[4835]: I0319 10:26:00.179623 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:26:00 crc kubenswrapper[4835]: I0319 10:26:00.179982 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 10:26:00 crc kubenswrapper[4835]: I0319 10:26:00.180070 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:26:00 crc kubenswrapper[4835]: I0319 10:26:00.211447 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565266-z5z6r"] Mar 19 10:26:00 crc kubenswrapper[4835]: I0319 10:26:00.272147 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96k7l\" (UniqueName: \"kubernetes.io/projected/0b028d50-4e99-426b-9d69-371fa9e45c15-kube-api-access-96k7l\") pod \"auto-csr-approver-29565266-z5z6r\" (UID: 
\"0b028d50-4e99-426b-9d69-371fa9e45c15\") " pod="openshift-infra/auto-csr-approver-29565266-z5z6r" Mar 19 10:26:00 crc kubenswrapper[4835]: I0319 10:26:00.374676 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96k7l\" (UniqueName: \"kubernetes.io/projected/0b028d50-4e99-426b-9d69-371fa9e45c15-kube-api-access-96k7l\") pod \"auto-csr-approver-29565266-z5z6r\" (UID: \"0b028d50-4e99-426b-9d69-371fa9e45c15\") " pod="openshift-infra/auto-csr-approver-29565266-z5z6r" Mar 19 10:26:00 crc kubenswrapper[4835]: I0319 10:26:00.392505 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96k7l\" (UniqueName: \"kubernetes.io/projected/0b028d50-4e99-426b-9d69-371fa9e45c15-kube-api-access-96k7l\") pod \"auto-csr-approver-29565266-z5z6r\" (UID: \"0b028d50-4e99-426b-9d69-371fa9e45c15\") " pod="openshift-infra/auto-csr-approver-29565266-z5z6r" Mar 19 10:26:00 crc kubenswrapper[4835]: I0319 10:26:00.502456 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565266-z5z6r" Mar 19 10:26:00 crc kubenswrapper[4835]: I0319 10:26:00.965186 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565266-z5z6r"] Mar 19 10:26:00 crc kubenswrapper[4835]: I0319 10:26:00.971776 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 10:26:01 crc kubenswrapper[4835]: I0319 10:26:01.798710 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565266-z5z6r" event={"ID":"0b028d50-4e99-426b-9d69-371fa9e45c15","Type":"ContainerStarted","Data":"3f9fa44e9b6d105f80781988c2616924acf9f3cb86184af774fbe8ced7bb7220"} Mar 19 10:26:02 crc kubenswrapper[4835]: I0319 10:26:02.818967 4835 generic.go:334] "Generic (PLEG): container finished" podID="0b028d50-4e99-426b-9d69-371fa9e45c15" containerID="e57e3290e79a8759c3ff4f2094d9b0a239dc0631797205fd72ab1b9722fb50db" exitCode=0 Mar 19 10:26:02 crc kubenswrapper[4835]: I0319 10:26:02.819083 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565266-z5z6r" event={"ID":"0b028d50-4e99-426b-9d69-371fa9e45c15","Type":"ContainerDied","Data":"e57e3290e79a8759c3ff4f2094d9b0a239dc0631797205fd72ab1b9722fb50db"} Mar 19 10:26:03 crc kubenswrapper[4835]: I0319 10:26:03.402716 4835 scope.go:117] "RemoveContainer" containerID="049ab495373838cc6452b89e97abd4272ac73f52f2dcf346477b2cddb7b2086a" Mar 19 10:26:03 crc kubenswrapper[4835]: E0319 10:26:03.403257 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 
10:26:04 crc kubenswrapper[4835]: I0319 10:26:04.343995 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565266-z5z6r" Mar 19 10:26:04 crc kubenswrapper[4835]: I0319 10:26:04.373785 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96k7l\" (UniqueName: \"kubernetes.io/projected/0b028d50-4e99-426b-9d69-371fa9e45c15-kube-api-access-96k7l\") pod \"0b028d50-4e99-426b-9d69-371fa9e45c15\" (UID: \"0b028d50-4e99-426b-9d69-371fa9e45c15\") " Mar 19 10:26:04 crc kubenswrapper[4835]: I0319 10:26:04.385104 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b028d50-4e99-426b-9d69-371fa9e45c15-kube-api-access-96k7l" (OuterVolumeSpecName: "kube-api-access-96k7l") pod "0b028d50-4e99-426b-9d69-371fa9e45c15" (UID: "0b028d50-4e99-426b-9d69-371fa9e45c15"). InnerVolumeSpecName "kube-api-access-96k7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:26:04 crc kubenswrapper[4835]: I0319 10:26:04.477033 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96k7l\" (UniqueName: \"kubernetes.io/projected/0b028d50-4e99-426b-9d69-371fa9e45c15-kube-api-access-96k7l\") on node \"crc\" DevicePath \"\"" Mar 19 10:26:04 crc kubenswrapper[4835]: I0319 10:26:04.842497 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565266-z5z6r" event={"ID":"0b028d50-4e99-426b-9d69-371fa9e45c15","Type":"ContainerDied","Data":"3f9fa44e9b6d105f80781988c2616924acf9f3cb86184af774fbe8ced7bb7220"} Mar 19 10:26:04 crc kubenswrapper[4835]: I0319 10:26:04.842534 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f9fa44e9b6d105f80781988c2616924acf9f3cb86184af774fbe8ced7bb7220" Mar 19 10:26:04 crc kubenswrapper[4835]: I0319 10:26:04.842631 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565266-z5z6r" Mar 19 10:26:05 crc kubenswrapper[4835]: I0319 10:26:05.420326 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565260-4zs5v"] Mar 19 10:26:05 crc kubenswrapper[4835]: I0319 10:26:05.431121 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565260-4zs5v"] Mar 19 10:26:06 crc kubenswrapper[4835]: I0319 10:26:06.419910 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f434d384-d6c2-4d66-9c3d-bc8eeb89a314" path="/var/lib/kubelet/pods/f434d384-d6c2-4d66-9c3d-bc8eeb89a314/volumes" Mar 19 10:26:18 crc kubenswrapper[4835]: I0319 10:26:18.405920 4835 scope.go:117] "RemoveContainer" containerID="049ab495373838cc6452b89e97abd4272ac73f52f2dcf346477b2cddb7b2086a" Mar 19 10:26:18 crc kubenswrapper[4835]: E0319 10:26:18.407490 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:26:33 crc kubenswrapper[4835]: I0319 10:26:33.402710 4835 scope.go:117] "RemoveContainer" containerID="049ab495373838cc6452b89e97abd4272ac73f52f2dcf346477b2cddb7b2086a" Mar 19 10:26:33 crc kubenswrapper[4835]: E0319 10:26:33.403621 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" 
podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:26:44 crc kubenswrapper[4835]: I0319 10:26:44.402122 4835 scope.go:117] "RemoveContainer" containerID="049ab495373838cc6452b89e97abd4272ac73f52f2dcf346477b2cddb7b2086a" Mar 19 10:26:44 crc kubenswrapper[4835]: E0319 10:26:44.403174 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:26:49 crc kubenswrapper[4835]: E0319 10:26:49.479796 4835 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.129.56.116:35706->38.129.56.116:37913: read tcp 38.129.56.116:35706->38.129.56.116:37913: read: connection reset by peer Mar 19 10:26:58 crc kubenswrapper[4835]: I0319 10:26:58.054853 4835 scope.go:117] "RemoveContainer" containerID="31268b6fc5442bfb003fcc01f6204c4cf2a2ff6507d63233f4328dacd4649a69" Mar 19 10:26:58 crc kubenswrapper[4835]: I0319 10:26:58.403342 4835 scope.go:117] "RemoveContainer" containerID="049ab495373838cc6452b89e97abd4272ac73f52f2dcf346477b2cddb7b2086a" Mar 19 10:26:58 crc kubenswrapper[4835]: E0319 10:26:58.403938 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:27:09 crc kubenswrapper[4835]: I0319 10:27:09.404904 4835 scope.go:117] "RemoveContainer" 
containerID="049ab495373838cc6452b89e97abd4272ac73f52f2dcf346477b2cddb7b2086a" Mar 19 10:27:09 crc kubenswrapper[4835]: E0319 10:27:09.406288 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:27:20 crc kubenswrapper[4835]: I0319 10:27:20.403321 4835 scope.go:117] "RemoveContainer" containerID="049ab495373838cc6452b89e97abd4272ac73f52f2dcf346477b2cddb7b2086a" Mar 19 10:27:20 crc kubenswrapper[4835]: E0319 10:27:20.404713 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:27:34 crc kubenswrapper[4835]: I0319 10:27:34.402247 4835 scope.go:117] "RemoveContainer" containerID="049ab495373838cc6452b89e97abd4272ac73f52f2dcf346477b2cddb7b2086a" Mar 19 10:27:34 crc kubenswrapper[4835]: E0319 10:27:34.403089 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:27:45 crc kubenswrapper[4835]: I0319 10:27:45.403115 4835 scope.go:117] 
"RemoveContainer" containerID="049ab495373838cc6452b89e97abd4272ac73f52f2dcf346477b2cddb7b2086a" Mar 19 10:27:45 crc kubenswrapper[4835]: E0319 10:27:45.404175 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:27:57 crc kubenswrapper[4835]: I0319 10:27:57.402531 4835 scope.go:117] "RemoveContainer" containerID="049ab495373838cc6452b89e97abd4272ac73f52f2dcf346477b2cddb7b2086a" Mar 19 10:27:57 crc kubenswrapper[4835]: E0319 10:27:57.403507 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:28:00 crc kubenswrapper[4835]: I0319 10:28:00.152375 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565268-gnmxv"] Mar 19 10:28:00 crc kubenswrapper[4835]: E0319 10:28:00.153805 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b028d50-4e99-426b-9d69-371fa9e45c15" containerName="oc" Mar 19 10:28:00 crc kubenswrapper[4835]: I0319 10:28:00.153823 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b028d50-4e99-426b-9d69-371fa9e45c15" containerName="oc" Mar 19 10:28:00 crc kubenswrapper[4835]: I0319 10:28:00.154151 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b028d50-4e99-426b-9d69-371fa9e45c15" containerName="oc" Mar 
19 10:28:00 crc kubenswrapper[4835]: I0319 10:28:00.155302 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565268-gnmxv" Mar 19 10:28:00 crc kubenswrapper[4835]: I0319 10:28:00.157842 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:28:00 crc kubenswrapper[4835]: I0319 10:28:00.158061 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 10:28:00 crc kubenswrapper[4835]: I0319 10:28:00.158154 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:28:00 crc kubenswrapper[4835]: I0319 10:28:00.169715 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565268-gnmxv"] Mar 19 10:28:00 crc kubenswrapper[4835]: I0319 10:28:00.318706 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82xl5\" (UniqueName: \"kubernetes.io/projected/3e7e82bc-6c5a-4f0c-b426-8090d8919921-kube-api-access-82xl5\") pod \"auto-csr-approver-29565268-gnmxv\" (UID: \"3e7e82bc-6c5a-4f0c-b426-8090d8919921\") " pod="openshift-infra/auto-csr-approver-29565268-gnmxv" Mar 19 10:28:00 crc kubenswrapper[4835]: I0319 10:28:00.421248 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82xl5\" (UniqueName: \"kubernetes.io/projected/3e7e82bc-6c5a-4f0c-b426-8090d8919921-kube-api-access-82xl5\") pod \"auto-csr-approver-29565268-gnmxv\" (UID: \"3e7e82bc-6c5a-4f0c-b426-8090d8919921\") " pod="openshift-infra/auto-csr-approver-29565268-gnmxv" Mar 19 10:28:00 crc kubenswrapper[4835]: I0319 10:28:00.440551 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82xl5\" (UniqueName: 
\"kubernetes.io/projected/3e7e82bc-6c5a-4f0c-b426-8090d8919921-kube-api-access-82xl5\") pod \"auto-csr-approver-29565268-gnmxv\" (UID: \"3e7e82bc-6c5a-4f0c-b426-8090d8919921\") " pod="openshift-infra/auto-csr-approver-29565268-gnmxv" Mar 19 10:28:00 crc kubenswrapper[4835]: I0319 10:28:00.478723 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565268-gnmxv" Mar 19 10:28:02 crc kubenswrapper[4835]: I0319 10:28:02.591245 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565268-gnmxv"] Mar 19 10:28:02 crc kubenswrapper[4835]: I0319 10:28:02.876389 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565268-gnmxv" event={"ID":"3e7e82bc-6c5a-4f0c-b426-8090d8919921","Type":"ContainerStarted","Data":"49d2785aff616af46eed09d5f34641805da234a8921c1040cbe789bf385b629d"} Mar 19 10:28:03 crc kubenswrapper[4835]: I0319 10:28:03.888410 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565268-gnmxv" event={"ID":"3e7e82bc-6c5a-4f0c-b426-8090d8919921","Type":"ContainerStarted","Data":"8e6a00c88b669b34e7271046301d05bb438124a0b4f6125ddea11732bd9c2f86"} Mar 19 10:28:03 crc kubenswrapper[4835]: I0319 10:28:03.904413 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565268-gnmxv" podStartSLOduration=3.021917231 podStartE2EDuration="3.904390952s" podCreationTimestamp="2026-03-19 10:28:00 +0000 UTC" firstStartedPulling="2026-03-19 10:28:02.602292761 +0000 UTC m=+3937.450891338" lastFinishedPulling="2026-03-19 10:28:03.484766482 +0000 UTC m=+3938.333365059" observedRunningTime="2026-03-19 10:28:03.901147532 +0000 UTC m=+3938.749746139" watchObservedRunningTime="2026-03-19 10:28:03.904390952 +0000 UTC m=+3938.752989539" Mar 19 10:28:04 crc kubenswrapper[4835]: I0319 10:28:04.901965 4835 generic.go:334] "Generic (PLEG): container 
finished" podID="3e7e82bc-6c5a-4f0c-b426-8090d8919921" containerID="8e6a00c88b669b34e7271046301d05bb438124a0b4f6125ddea11732bd9c2f86" exitCode=0 Mar 19 10:28:04 crc kubenswrapper[4835]: I0319 10:28:04.902077 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565268-gnmxv" event={"ID":"3e7e82bc-6c5a-4f0c-b426-8090d8919921","Type":"ContainerDied","Data":"8e6a00c88b669b34e7271046301d05bb438124a0b4f6125ddea11732bd9c2f86"} Mar 19 10:28:06 crc kubenswrapper[4835]: I0319 10:28:06.354262 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565268-gnmxv" Mar 19 10:28:06 crc kubenswrapper[4835]: I0319 10:28:06.535140 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82xl5\" (UniqueName: \"kubernetes.io/projected/3e7e82bc-6c5a-4f0c-b426-8090d8919921-kube-api-access-82xl5\") pod \"3e7e82bc-6c5a-4f0c-b426-8090d8919921\" (UID: \"3e7e82bc-6c5a-4f0c-b426-8090d8919921\") " Mar 19 10:28:06 crc kubenswrapper[4835]: I0319 10:28:06.543443 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e7e82bc-6c5a-4f0c-b426-8090d8919921-kube-api-access-82xl5" (OuterVolumeSpecName: "kube-api-access-82xl5") pod "3e7e82bc-6c5a-4f0c-b426-8090d8919921" (UID: "3e7e82bc-6c5a-4f0c-b426-8090d8919921"). InnerVolumeSpecName "kube-api-access-82xl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:28:06 crc kubenswrapper[4835]: I0319 10:28:06.638136 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82xl5\" (UniqueName: \"kubernetes.io/projected/3e7e82bc-6c5a-4f0c-b426-8090d8919921-kube-api-access-82xl5\") on node \"crc\" DevicePath \"\"" Mar 19 10:28:06 crc kubenswrapper[4835]: I0319 10:28:06.935109 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565268-gnmxv" Mar 19 10:28:06 crc kubenswrapper[4835]: I0319 10:28:06.935012 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565268-gnmxv" event={"ID":"3e7e82bc-6c5a-4f0c-b426-8090d8919921","Type":"ContainerDied","Data":"49d2785aff616af46eed09d5f34641805da234a8921c1040cbe789bf385b629d"} Mar 19 10:28:06 crc kubenswrapper[4835]: I0319 10:28:06.943771 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49d2785aff616af46eed09d5f34641805da234a8921c1040cbe789bf385b629d" Mar 19 10:28:06 crc kubenswrapper[4835]: I0319 10:28:06.987548 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565262-5x6sj"] Mar 19 10:28:07 crc kubenswrapper[4835]: I0319 10:28:06.999987 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565262-5x6sj"] Mar 19 10:28:08 crc kubenswrapper[4835]: I0319 10:28:08.402061 4835 scope.go:117] "RemoveContainer" containerID="049ab495373838cc6452b89e97abd4272ac73f52f2dcf346477b2cddb7b2086a" Mar 19 10:28:08 crc kubenswrapper[4835]: E0319 10:28:08.402703 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:28:08 crc kubenswrapper[4835]: I0319 10:28:08.415707 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b0680c3-1a8d-4fa8-ba5c-ff20fe44ea10" path="/var/lib/kubelet/pods/5b0680c3-1a8d-4fa8-ba5c-ff20fe44ea10/volumes" Mar 19 10:28:20 crc kubenswrapper[4835]: I0319 10:28:20.404269 4835 scope.go:117] "RemoveContainer" 
containerID="049ab495373838cc6452b89e97abd4272ac73f52f2dcf346477b2cddb7b2086a" Mar 19 10:28:20 crc kubenswrapper[4835]: E0319 10:28:20.405471 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:28:33 crc kubenswrapper[4835]: I0319 10:28:33.402835 4835 scope.go:117] "RemoveContainer" containerID="049ab495373838cc6452b89e97abd4272ac73f52f2dcf346477b2cddb7b2086a" Mar 19 10:28:33 crc kubenswrapper[4835]: E0319 10:28:33.403818 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:28:45 crc kubenswrapper[4835]: I0319 10:28:45.402893 4835 scope.go:117] "RemoveContainer" containerID="049ab495373838cc6452b89e97abd4272ac73f52f2dcf346477b2cddb7b2086a" Mar 19 10:28:45 crc kubenswrapper[4835]: E0319 10:28:45.403897 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:28:58 crc kubenswrapper[4835]: I0319 10:28:58.182195 4835 scope.go:117] 
"RemoveContainer" containerID="9ccd8cae4b26ecbb3573fe3ed3c0cfb375c95b4c29d7c3019265ad7c0c8623e2" Mar 19 10:28:58 crc kubenswrapper[4835]: I0319 10:28:58.402738 4835 scope.go:117] "RemoveContainer" containerID="049ab495373838cc6452b89e97abd4272ac73f52f2dcf346477b2cddb7b2086a" Mar 19 10:28:58 crc kubenswrapper[4835]: E0319 10:28:58.403442 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:29:10 crc kubenswrapper[4835]: I0319 10:29:10.402908 4835 scope.go:117] "RemoveContainer" containerID="049ab495373838cc6452b89e97abd4272ac73f52f2dcf346477b2cddb7b2086a" Mar 19 10:29:10 crc kubenswrapper[4835]: I0319 10:29:10.724343 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerStarted","Data":"9ecf131f16a5e6b0f5547bfdefea409692cba49068da1b192acca5d89a47395b"} Mar 19 10:29:56 crc kubenswrapper[4835]: I0319 10:29:56.578790 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6tvxj"] Mar 19 10:29:56 crc kubenswrapper[4835]: E0319 10:29:56.580135 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e7e82bc-6c5a-4f0c-b426-8090d8919921" containerName="oc" Mar 19 10:29:56 crc kubenswrapper[4835]: I0319 10:29:56.580155 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e7e82bc-6c5a-4f0c-b426-8090d8919921" containerName="oc" Mar 19 10:29:56 crc kubenswrapper[4835]: I0319 10:29:56.580465 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e7e82bc-6c5a-4f0c-b426-8090d8919921" 
containerName="oc" Mar 19 10:29:56 crc kubenswrapper[4835]: I0319 10:29:56.582527 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6tvxj" Mar 19 10:29:56 crc kubenswrapper[4835]: I0319 10:29:56.593013 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6tvxj"] Mar 19 10:29:56 crc kubenswrapper[4835]: I0319 10:29:56.629205 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bffd4aa5-2723-4faa-b13d-03e6ed95d545-utilities\") pod \"certified-operators-6tvxj\" (UID: \"bffd4aa5-2723-4faa-b13d-03e6ed95d545\") " pod="openshift-marketplace/certified-operators-6tvxj" Mar 19 10:29:56 crc kubenswrapper[4835]: I0319 10:29:56.629244 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bffd4aa5-2723-4faa-b13d-03e6ed95d545-catalog-content\") pod \"certified-operators-6tvxj\" (UID: \"bffd4aa5-2723-4faa-b13d-03e6ed95d545\") " pod="openshift-marketplace/certified-operators-6tvxj" Mar 19 10:29:56 crc kubenswrapper[4835]: I0319 10:29:56.629460 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjdmn\" (UniqueName: \"kubernetes.io/projected/bffd4aa5-2723-4faa-b13d-03e6ed95d545-kube-api-access-rjdmn\") pod \"certified-operators-6tvxj\" (UID: \"bffd4aa5-2723-4faa-b13d-03e6ed95d545\") " pod="openshift-marketplace/certified-operators-6tvxj" Mar 19 10:29:56 crc kubenswrapper[4835]: I0319 10:29:56.731908 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjdmn\" (UniqueName: \"kubernetes.io/projected/bffd4aa5-2723-4faa-b13d-03e6ed95d545-kube-api-access-rjdmn\") pod \"certified-operators-6tvxj\" (UID: \"bffd4aa5-2723-4faa-b13d-03e6ed95d545\") " 
pod="openshift-marketplace/certified-operators-6tvxj" Mar 19 10:29:56 crc kubenswrapper[4835]: I0319 10:29:56.732050 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bffd4aa5-2723-4faa-b13d-03e6ed95d545-catalog-content\") pod \"certified-operators-6tvxj\" (UID: \"bffd4aa5-2723-4faa-b13d-03e6ed95d545\") " pod="openshift-marketplace/certified-operators-6tvxj" Mar 19 10:29:56 crc kubenswrapper[4835]: I0319 10:29:56.732071 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bffd4aa5-2723-4faa-b13d-03e6ed95d545-utilities\") pod \"certified-operators-6tvxj\" (UID: \"bffd4aa5-2723-4faa-b13d-03e6ed95d545\") " pod="openshift-marketplace/certified-operators-6tvxj" Mar 19 10:29:56 crc kubenswrapper[4835]: I0319 10:29:56.732542 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bffd4aa5-2723-4faa-b13d-03e6ed95d545-utilities\") pod \"certified-operators-6tvxj\" (UID: \"bffd4aa5-2723-4faa-b13d-03e6ed95d545\") " pod="openshift-marketplace/certified-operators-6tvxj" Mar 19 10:29:56 crc kubenswrapper[4835]: I0319 10:29:56.732794 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bffd4aa5-2723-4faa-b13d-03e6ed95d545-catalog-content\") pod \"certified-operators-6tvxj\" (UID: \"bffd4aa5-2723-4faa-b13d-03e6ed95d545\") " pod="openshift-marketplace/certified-operators-6tvxj" Mar 19 10:29:56 crc kubenswrapper[4835]: I0319 10:29:56.754224 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjdmn\" (UniqueName: \"kubernetes.io/projected/bffd4aa5-2723-4faa-b13d-03e6ed95d545-kube-api-access-rjdmn\") pod \"certified-operators-6tvxj\" (UID: \"bffd4aa5-2723-4faa-b13d-03e6ed95d545\") " 
pod="openshift-marketplace/certified-operators-6tvxj" Mar 19 10:29:56 crc kubenswrapper[4835]: I0319 10:29:56.903562 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6tvxj" Mar 19 10:29:57 crc kubenswrapper[4835]: I0319 10:29:57.557607 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6tvxj"] Mar 19 10:29:58 crc kubenswrapper[4835]: I0319 10:29:58.261436 4835 generic.go:334] "Generic (PLEG): container finished" podID="bffd4aa5-2723-4faa-b13d-03e6ed95d545" containerID="264f03f073430e9db5aa2cf561baa0705cf56ab7b55ec54db116a4242797b9e4" exitCode=0 Mar 19 10:29:58 crc kubenswrapper[4835]: I0319 10:29:58.262090 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tvxj" event={"ID":"bffd4aa5-2723-4faa-b13d-03e6ed95d545","Type":"ContainerDied","Data":"264f03f073430e9db5aa2cf561baa0705cf56ab7b55ec54db116a4242797b9e4"} Mar 19 10:29:58 crc kubenswrapper[4835]: I0319 10:29:58.262119 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tvxj" event={"ID":"bffd4aa5-2723-4faa-b13d-03e6ed95d545","Type":"ContainerStarted","Data":"4dbcf4c3f229324e2233cd28772a6b2d8b90ae44cda013d2c8c5c384f9f06d2f"} Mar 19 10:29:58 crc kubenswrapper[4835]: I0319 10:29:58.991431 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-q6jrr"] Mar 19 10:29:58 crc kubenswrapper[4835]: I0319 10:29:58.995731 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q6jrr" Mar 19 10:29:59 crc kubenswrapper[4835]: I0319 10:29:59.004680 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q6jrr"] Mar 19 10:29:59 crc kubenswrapper[4835]: I0319 10:29:59.201799 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03881495-d161-4b6b-b67f-7888f340fbde-catalog-content\") pod \"redhat-operators-q6jrr\" (UID: \"03881495-d161-4b6b-b67f-7888f340fbde\") " pod="openshift-marketplace/redhat-operators-q6jrr" Mar 19 10:29:59 crc kubenswrapper[4835]: I0319 10:29:59.201950 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03881495-d161-4b6b-b67f-7888f340fbde-utilities\") pod \"redhat-operators-q6jrr\" (UID: \"03881495-d161-4b6b-b67f-7888f340fbde\") " pod="openshift-marketplace/redhat-operators-q6jrr" Mar 19 10:29:59 crc kubenswrapper[4835]: I0319 10:29:59.202008 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w86qp\" (UniqueName: \"kubernetes.io/projected/03881495-d161-4b6b-b67f-7888f340fbde-kube-api-access-w86qp\") pod \"redhat-operators-q6jrr\" (UID: \"03881495-d161-4b6b-b67f-7888f340fbde\") " pod="openshift-marketplace/redhat-operators-q6jrr" Mar 19 10:29:59 crc kubenswrapper[4835]: I0319 10:29:59.274854 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tvxj" event={"ID":"bffd4aa5-2723-4faa-b13d-03e6ed95d545","Type":"ContainerStarted","Data":"192351b19fb8f7d48626f5a25e3c694f627036c0bf9688e0c3511c73ce0fc0c4"} Mar 19 10:29:59 crc kubenswrapper[4835]: I0319 10:29:59.305010 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/03881495-d161-4b6b-b67f-7888f340fbde-utilities\") pod \"redhat-operators-q6jrr\" (UID: \"03881495-d161-4b6b-b67f-7888f340fbde\") " pod="openshift-marketplace/redhat-operators-q6jrr" Mar 19 10:29:59 crc kubenswrapper[4835]: I0319 10:29:59.305114 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w86qp\" (UniqueName: \"kubernetes.io/projected/03881495-d161-4b6b-b67f-7888f340fbde-kube-api-access-w86qp\") pod \"redhat-operators-q6jrr\" (UID: \"03881495-d161-4b6b-b67f-7888f340fbde\") " pod="openshift-marketplace/redhat-operators-q6jrr" Mar 19 10:29:59 crc kubenswrapper[4835]: I0319 10:29:59.305228 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03881495-d161-4b6b-b67f-7888f340fbde-catalog-content\") pod \"redhat-operators-q6jrr\" (UID: \"03881495-d161-4b6b-b67f-7888f340fbde\") " pod="openshift-marketplace/redhat-operators-q6jrr" Mar 19 10:29:59 crc kubenswrapper[4835]: I0319 10:29:59.305692 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03881495-d161-4b6b-b67f-7888f340fbde-utilities\") pod \"redhat-operators-q6jrr\" (UID: \"03881495-d161-4b6b-b67f-7888f340fbde\") " pod="openshift-marketplace/redhat-operators-q6jrr" Mar 19 10:29:59 crc kubenswrapper[4835]: I0319 10:29:59.305732 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03881495-d161-4b6b-b67f-7888f340fbde-catalog-content\") pod \"redhat-operators-q6jrr\" (UID: \"03881495-d161-4b6b-b67f-7888f340fbde\") " pod="openshift-marketplace/redhat-operators-q6jrr" Mar 19 10:29:59 crc kubenswrapper[4835]: I0319 10:29:59.329658 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w86qp\" (UniqueName: 
\"kubernetes.io/projected/03881495-d161-4b6b-b67f-7888f340fbde-kube-api-access-w86qp\") pod \"redhat-operators-q6jrr\" (UID: \"03881495-d161-4b6b-b67f-7888f340fbde\") " pod="openshift-marketplace/redhat-operators-q6jrr" Mar 19 10:29:59 crc kubenswrapper[4835]: I0319 10:29:59.386709 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q6jrr" Mar 19 10:29:59 crc kubenswrapper[4835]: I0319 10:29:59.971372 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q6jrr"] Mar 19 10:30:00 crc kubenswrapper[4835]: I0319 10:30:00.158120 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565270-pqf2p"] Mar 19 10:30:00 crc kubenswrapper[4835]: I0319 10:30:00.160042 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565270-pqf2p" Mar 19 10:30:00 crc kubenswrapper[4835]: I0319 10:30:00.164005 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 10:30:00 crc kubenswrapper[4835]: I0319 10:30:00.164009 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:30:00 crc kubenswrapper[4835]: I0319 10:30:00.164428 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:30:00 crc kubenswrapper[4835]: I0319 10:30:00.175608 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565270-cgh4s"] Mar 19 10:30:00 crc kubenswrapper[4835]: I0319 10:30:00.177381 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565270-cgh4s" Mar 19 10:30:00 crc kubenswrapper[4835]: I0319 10:30:00.179236 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 10:30:00 crc kubenswrapper[4835]: I0319 10:30:00.179416 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 10:30:00 crc kubenswrapper[4835]: I0319 10:30:00.189445 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565270-pqf2p"] Mar 19 10:30:00 crc kubenswrapper[4835]: I0319 10:30:00.206480 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565270-cgh4s"] Mar 19 10:30:00 crc kubenswrapper[4835]: I0319 10:30:00.228726 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggvt7\" (UniqueName: \"kubernetes.io/projected/48c6d60b-d8fc-459a-a3a8-179644451b1e-kube-api-access-ggvt7\") pod \"collect-profiles-29565270-cgh4s\" (UID: \"48c6d60b-d8fc-459a-a3a8-179644451b1e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565270-cgh4s" Mar 19 10:30:00 crc kubenswrapper[4835]: I0319 10:30:00.228897 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/48c6d60b-d8fc-459a-a3a8-179644451b1e-secret-volume\") pod \"collect-profiles-29565270-cgh4s\" (UID: \"48c6d60b-d8fc-459a-a3a8-179644451b1e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565270-cgh4s" Mar 19 10:30:00 crc kubenswrapper[4835]: I0319 10:30:00.228958 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb4nm\" (UniqueName: 
\"kubernetes.io/projected/34eb65f3-516f-4070-8542-fd87fe4ca728-kube-api-access-sb4nm\") pod \"auto-csr-approver-29565270-pqf2p\" (UID: \"34eb65f3-516f-4070-8542-fd87fe4ca728\") " pod="openshift-infra/auto-csr-approver-29565270-pqf2p" Mar 19 10:30:00 crc kubenswrapper[4835]: I0319 10:30:00.229108 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/48c6d60b-d8fc-459a-a3a8-179644451b1e-config-volume\") pod \"collect-profiles-29565270-cgh4s\" (UID: \"48c6d60b-d8fc-459a-a3a8-179644451b1e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565270-cgh4s" Mar 19 10:30:00 crc kubenswrapper[4835]: I0319 10:30:00.293603 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q6jrr" event={"ID":"03881495-d161-4b6b-b67f-7888f340fbde","Type":"ContainerStarted","Data":"7e5e50408e58021d0a6520ce38e8314fa56e4ce7e3bd818680eeae78df0d069b"} Mar 19 10:30:00 crc kubenswrapper[4835]: I0319 10:30:00.293659 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q6jrr" event={"ID":"03881495-d161-4b6b-b67f-7888f340fbde","Type":"ContainerStarted","Data":"b38c3ae40a15e27b0d00fb14646ce9e3f43b16de324d72d03fad6608307c6719"} Mar 19 10:30:00 crc kubenswrapper[4835]: I0319 10:30:00.332225 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggvt7\" (UniqueName: \"kubernetes.io/projected/48c6d60b-d8fc-459a-a3a8-179644451b1e-kube-api-access-ggvt7\") pod \"collect-profiles-29565270-cgh4s\" (UID: \"48c6d60b-d8fc-459a-a3a8-179644451b1e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565270-cgh4s" Mar 19 10:30:00 crc kubenswrapper[4835]: I0319 10:30:00.333093 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/48c6d60b-d8fc-459a-a3a8-179644451b1e-secret-volume\") pod \"collect-profiles-29565270-cgh4s\" (UID: \"48c6d60b-d8fc-459a-a3a8-179644451b1e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565270-cgh4s" Mar 19 10:30:00 crc kubenswrapper[4835]: I0319 10:30:00.333125 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb4nm\" (UniqueName: \"kubernetes.io/projected/34eb65f3-516f-4070-8542-fd87fe4ca728-kube-api-access-sb4nm\") pod \"auto-csr-approver-29565270-pqf2p\" (UID: \"34eb65f3-516f-4070-8542-fd87fe4ca728\") " pod="openshift-infra/auto-csr-approver-29565270-pqf2p" Mar 19 10:30:00 crc kubenswrapper[4835]: I0319 10:30:00.333978 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/48c6d60b-d8fc-459a-a3a8-179644451b1e-config-volume\") pod \"collect-profiles-29565270-cgh4s\" (UID: \"48c6d60b-d8fc-459a-a3a8-179644451b1e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565270-cgh4s" Mar 19 10:30:00 crc kubenswrapper[4835]: I0319 10:30:00.334660 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/48c6d60b-d8fc-459a-a3a8-179644451b1e-config-volume\") pod \"collect-profiles-29565270-cgh4s\" (UID: \"48c6d60b-d8fc-459a-a3a8-179644451b1e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565270-cgh4s" Mar 19 10:30:00 crc kubenswrapper[4835]: I0319 10:30:00.342936 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/48c6d60b-d8fc-459a-a3a8-179644451b1e-secret-volume\") pod \"collect-profiles-29565270-cgh4s\" (UID: \"48c6d60b-d8fc-459a-a3a8-179644451b1e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565270-cgh4s" Mar 19 10:30:00 crc kubenswrapper[4835]: I0319 10:30:00.351497 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ggvt7\" (UniqueName: \"kubernetes.io/projected/48c6d60b-d8fc-459a-a3a8-179644451b1e-kube-api-access-ggvt7\") pod \"collect-profiles-29565270-cgh4s\" (UID: \"48c6d60b-d8fc-459a-a3a8-179644451b1e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565270-cgh4s" Mar 19 10:30:00 crc kubenswrapper[4835]: I0319 10:30:00.357304 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb4nm\" (UniqueName: \"kubernetes.io/projected/34eb65f3-516f-4070-8542-fd87fe4ca728-kube-api-access-sb4nm\") pod \"auto-csr-approver-29565270-pqf2p\" (UID: \"34eb65f3-516f-4070-8542-fd87fe4ca728\") " pod="openshift-infra/auto-csr-approver-29565270-pqf2p" Mar 19 10:30:00 crc kubenswrapper[4835]: I0319 10:30:00.479786 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565270-pqf2p" Mar 19 10:30:00 crc kubenswrapper[4835]: I0319 10:30:00.501165 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565270-cgh4s" Mar 19 10:30:01 crc kubenswrapper[4835]: I0319 10:30:01.230062 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565270-cgh4s"] Mar 19 10:30:01 crc kubenswrapper[4835]: I0319 10:30:01.308062 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565270-cgh4s" event={"ID":"48c6d60b-d8fc-459a-a3a8-179644451b1e","Type":"ContainerStarted","Data":"09c791dd1dc61ee793704b0da13ed9fb62babb153d80a9434dfd9e81db937df2"} Mar 19 10:30:01 crc kubenswrapper[4835]: I0319 10:30:01.351343 4835 generic.go:334] "Generic (PLEG): container finished" podID="03881495-d161-4b6b-b67f-7888f340fbde" containerID="7e5e50408e58021d0a6520ce38e8314fa56e4ce7e3bd818680eeae78df0d069b" exitCode=0 Mar 19 10:30:01 crc kubenswrapper[4835]: I0319 10:30:01.351411 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q6jrr" event={"ID":"03881495-d161-4b6b-b67f-7888f340fbde","Type":"ContainerDied","Data":"7e5e50408e58021d0a6520ce38e8314fa56e4ce7e3bd818680eeae78df0d069b"} Mar 19 10:30:01 crc kubenswrapper[4835]: I0319 10:30:01.407910 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565270-pqf2p"] Mar 19 10:30:02 crc kubenswrapper[4835]: I0319 10:30:02.362594 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565270-pqf2p" event={"ID":"34eb65f3-516f-4070-8542-fd87fe4ca728","Type":"ContainerStarted","Data":"b0b4b2d7ffd88e00234cededa52a8954e57ea63161c65a6833f1bc062cd18829"} Mar 19 10:30:03 crc kubenswrapper[4835]: I0319 10:30:03.377499 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565270-cgh4s" 
event={"ID":"48c6d60b-d8fc-459a-a3a8-179644451b1e","Type":"ContainerStarted","Data":"a3055adf0ac14fdee8269547d908c009f469b076a1da8241d05b9c4e0fcdf410"} Mar 19 10:30:03 crc kubenswrapper[4835]: I0319 10:30:03.425848 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29565270-cgh4s" podStartSLOduration=3.425825551 podStartE2EDuration="3.425825551s" podCreationTimestamp="2026-03-19 10:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:30:03.399756833 +0000 UTC m=+4058.248355420" watchObservedRunningTime="2026-03-19 10:30:03.425825551 +0000 UTC m=+4058.274424138" Mar 19 10:30:05 crc kubenswrapper[4835]: I0319 10:30:05.402795 4835 generic.go:334] "Generic (PLEG): container finished" podID="48c6d60b-d8fc-459a-a3a8-179644451b1e" containerID="a3055adf0ac14fdee8269547d908c009f469b076a1da8241d05b9c4e0fcdf410" exitCode=0 Mar 19 10:30:05 crc kubenswrapper[4835]: I0319 10:30:05.403302 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565270-cgh4s" event={"ID":"48c6d60b-d8fc-459a-a3a8-179644451b1e","Type":"ContainerDied","Data":"a3055adf0ac14fdee8269547d908c009f469b076a1da8241d05b9c4e0fcdf410"} Mar 19 10:30:06 crc kubenswrapper[4835]: I0319 10:30:06.426288 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565270-pqf2p" event={"ID":"34eb65f3-516f-4070-8542-fd87fe4ca728","Type":"ContainerStarted","Data":"d844383a75e8d5b702ab268ac832bd4b96af94d98933639a0089ac8233b9582b"} Mar 19 10:30:06 crc kubenswrapper[4835]: I0319 10:30:06.426630 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q6jrr" event={"ID":"03881495-d161-4b6b-b67f-7888f340fbde","Type":"ContainerStarted","Data":"decb6973ec3d3a4bb8c862ed63fd7a28f8227f52e43f2904e4aa13e443509655"} Mar 
19 10:30:06 crc kubenswrapper[4835]: I0319 10:30:06.426856 4835 generic.go:334] "Generic (PLEG): container finished" podID="bffd4aa5-2723-4faa-b13d-03e6ed95d545" containerID="192351b19fb8f7d48626f5a25e3c694f627036c0bf9688e0c3511c73ce0fc0c4" exitCode=0 Mar 19 10:30:06 crc kubenswrapper[4835]: I0319 10:30:06.427104 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tvxj" event={"ID":"bffd4aa5-2723-4faa-b13d-03e6ed95d545","Type":"ContainerDied","Data":"192351b19fb8f7d48626f5a25e3c694f627036c0bf9688e0c3511c73ce0fc0c4"} Mar 19 10:30:06 crc kubenswrapper[4835]: I0319 10:30:06.466140 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565270-pqf2p" podStartSLOduration=2.286951708 podStartE2EDuration="6.466122146s" podCreationTimestamp="2026-03-19 10:30:00 +0000 UTC" firstStartedPulling="2026-03-19 10:30:01.44087787 +0000 UTC m=+4056.289476457" lastFinishedPulling="2026-03-19 10:30:05.620048298 +0000 UTC m=+4060.468646895" observedRunningTime="2026-03-19 10:30:06.451320417 +0000 UTC m=+4061.299919004" watchObservedRunningTime="2026-03-19 10:30:06.466122146 +0000 UTC m=+4061.314720733" Mar 19 10:30:07 crc kubenswrapper[4835]: I0319 10:30:07.170340 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565270-cgh4s" Mar 19 10:30:07 crc kubenswrapper[4835]: I0319 10:30:07.286866 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/48c6d60b-d8fc-459a-a3a8-179644451b1e-config-volume\") pod \"48c6d60b-d8fc-459a-a3a8-179644451b1e\" (UID: \"48c6d60b-d8fc-459a-a3a8-179644451b1e\") " Mar 19 10:30:07 crc kubenswrapper[4835]: I0319 10:30:07.288184 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggvt7\" (UniqueName: \"kubernetes.io/projected/48c6d60b-d8fc-459a-a3a8-179644451b1e-kube-api-access-ggvt7\") pod \"48c6d60b-d8fc-459a-a3a8-179644451b1e\" (UID: \"48c6d60b-d8fc-459a-a3a8-179644451b1e\") " Mar 19 10:30:07 crc kubenswrapper[4835]: I0319 10:30:07.288931 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/48c6d60b-d8fc-459a-a3a8-179644451b1e-secret-volume\") pod \"48c6d60b-d8fc-459a-a3a8-179644451b1e\" (UID: \"48c6d60b-d8fc-459a-a3a8-179644451b1e\") " Mar 19 10:30:07 crc kubenswrapper[4835]: I0319 10:30:07.288090 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48c6d60b-d8fc-459a-a3a8-179644451b1e-config-volume" (OuterVolumeSpecName: "config-volume") pod "48c6d60b-d8fc-459a-a3a8-179644451b1e" (UID: "48c6d60b-d8fc-459a-a3a8-179644451b1e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:30:07 crc kubenswrapper[4835]: I0319 10:30:07.298403 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48c6d60b-d8fc-459a-a3a8-179644451b1e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "48c6d60b-d8fc-459a-a3a8-179644451b1e" (UID: "48c6d60b-d8fc-459a-a3a8-179644451b1e"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:30:07 crc kubenswrapper[4835]: I0319 10:30:07.299169 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48c6d60b-d8fc-459a-a3a8-179644451b1e-kube-api-access-ggvt7" (OuterVolumeSpecName: "kube-api-access-ggvt7") pod "48c6d60b-d8fc-459a-a3a8-179644451b1e" (UID: "48c6d60b-d8fc-459a-a3a8-179644451b1e"). InnerVolumeSpecName "kube-api-access-ggvt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:30:07 crc kubenswrapper[4835]: I0319 10:30:07.392098 4835 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/48c6d60b-d8fc-459a-a3a8-179644451b1e-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 10:30:07 crc kubenswrapper[4835]: I0319 10:30:07.392140 4835 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/48c6d60b-d8fc-459a-a3a8-179644451b1e-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 10:30:07 crc kubenswrapper[4835]: I0319 10:30:07.392150 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggvt7\" (UniqueName: \"kubernetes.io/projected/48c6d60b-d8fc-459a-a3a8-179644451b1e-kube-api-access-ggvt7\") on node \"crc\" DevicePath \"\"" Mar 19 10:30:07 crc kubenswrapper[4835]: I0319 10:30:07.440731 4835 generic.go:334] "Generic (PLEG): container finished" podID="34eb65f3-516f-4070-8542-fd87fe4ca728" containerID="d844383a75e8d5b702ab268ac832bd4b96af94d98933639a0089ac8233b9582b" exitCode=0 Mar 19 10:30:07 crc kubenswrapper[4835]: I0319 10:30:07.440883 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565270-pqf2p" event={"ID":"34eb65f3-516f-4070-8542-fd87fe4ca728","Type":"ContainerDied","Data":"d844383a75e8d5b702ab268ac832bd4b96af94d98933639a0089ac8233b9582b"} Mar 19 10:30:07 crc kubenswrapper[4835]: I0319 10:30:07.443446 4835 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tvxj" event={"ID":"bffd4aa5-2723-4faa-b13d-03e6ed95d545","Type":"ContainerStarted","Data":"0c71f2dd26380755f80195160c8bcba31a77962e922adabed42697746af1783d"} Mar 19 10:30:07 crc kubenswrapper[4835]: I0319 10:30:07.445028 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565270-cgh4s" Mar 19 10:30:07 crc kubenswrapper[4835]: I0319 10:30:07.445019 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565270-cgh4s" event={"ID":"48c6d60b-d8fc-459a-a3a8-179644451b1e","Type":"ContainerDied","Data":"09c791dd1dc61ee793704b0da13ed9fb62babb153d80a9434dfd9e81db937df2"} Mar 19 10:30:07 crc kubenswrapper[4835]: I0319 10:30:07.445071 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09c791dd1dc61ee793704b0da13ed9fb62babb153d80a9434dfd9e81db937df2" Mar 19 10:30:07 crc kubenswrapper[4835]: I0319 10:30:07.503478 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6tvxj" podStartSLOduration=2.667705915 podStartE2EDuration="11.503452992s" podCreationTimestamp="2026-03-19 10:29:56 +0000 UTC" firstStartedPulling="2026-03-19 10:29:58.264633201 +0000 UTC m=+4053.113231788" lastFinishedPulling="2026-03-19 10:30:07.100380278 +0000 UTC m=+4061.948978865" observedRunningTime="2026-03-19 10:30:07.481001024 +0000 UTC m=+4062.329599631" watchObservedRunningTime="2026-03-19 10:30:07.503452992 +0000 UTC m=+4062.352051599" Mar 19 10:30:07 crc kubenswrapper[4835]: I0319 10:30:07.520044 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565225-jxftp"] Mar 19 10:30:07 crc kubenswrapper[4835]: I0319 10:30:07.530455 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29565225-jxftp"] Mar 19 10:30:08 crc kubenswrapper[4835]: I0319 10:30:08.470341 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07d4f403-4f7a-408c-bc73-1899c4147d57" path="/var/lib/kubelet/pods/07d4f403-4f7a-408c-bc73-1899c4147d57/volumes" Mar 19 10:30:08 crc kubenswrapper[4835]: I0319 10:30:08.926703 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565270-pqf2p" Mar 19 10:30:09 crc kubenswrapper[4835]: I0319 10:30:09.033948 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb4nm\" (UniqueName: \"kubernetes.io/projected/34eb65f3-516f-4070-8542-fd87fe4ca728-kube-api-access-sb4nm\") pod \"34eb65f3-516f-4070-8542-fd87fe4ca728\" (UID: \"34eb65f3-516f-4070-8542-fd87fe4ca728\") " Mar 19 10:30:09 crc kubenswrapper[4835]: I0319 10:30:09.046959 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34eb65f3-516f-4070-8542-fd87fe4ca728-kube-api-access-sb4nm" (OuterVolumeSpecName: "kube-api-access-sb4nm") pod "34eb65f3-516f-4070-8542-fd87fe4ca728" (UID: "34eb65f3-516f-4070-8542-fd87fe4ca728"). InnerVolumeSpecName "kube-api-access-sb4nm". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 10:30:09 crc kubenswrapper[4835]: I0319 10:30:09.137075 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb4nm\" (UniqueName: \"kubernetes.io/projected/34eb65f3-516f-4070-8542-fd87fe4ca728-kube-api-access-sb4nm\") on node \"crc\" DevicePath \"\""
Mar 19 10:30:09 crc kubenswrapper[4835]: I0319 10:30:09.498823 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565270-pqf2p" event={"ID":"34eb65f3-516f-4070-8542-fd87fe4ca728","Type":"ContainerDied","Data":"b0b4b2d7ffd88e00234cededa52a8954e57ea63161c65a6833f1bc062cd18829"}
Mar 19 10:30:09 crc kubenswrapper[4835]: I0319 10:30:09.498869 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0b4b2d7ffd88e00234cededa52a8954e57ea63161c65a6833f1bc062cd18829"
Mar 19 10:30:09 crc kubenswrapper[4835]: I0319 10:30:09.498956 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565270-pqf2p"
Mar 19 10:30:09 crc kubenswrapper[4835]: I0319 10:30:09.566405 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565264-n68qv"]
Mar 19 10:30:09 crc kubenswrapper[4835]: I0319 10:30:09.584627 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565264-n68qv"]
Mar 19 10:30:10 crc kubenswrapper[4835]: I0319 10:30:10.416721 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b08f067a-e10a-4eb1-a223-4d128d174b4e" path="/var/lib/kubelet/pods/b08f067a-e10a-4eb1-a223-4d128d174b4e/volumes"
Mar 19 10:30:16 crc kubenswrapper[4835]: I0319 10:30:16.904268 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6tvxj"
Mar 19 10:30:16 crc kubenswrapper[4835]: I0319 10:30:16.904916 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6tvxj"
Mar 19 10:30:17 crc kubenswrapper[4835]: I0319 10:30:17.966367 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-6tvxj" podUID="bffd4aa5-2723-4faa-b13d-03e6ed95d545" containerName="registry-server" probeResult="failure" output=<
Mar 19 10:30:17 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s
Mar 19 10:30:17 crc kubenswrapper[4835]: >
Mar 19 10:30:18 crc kubenswrapper[4835]: I0319 10:30:18.643187 4835 generic.go:334] "Generic (PLEG): container finished" podID="03881495-d161-4b6b-b67f-7888f340fbde" containerID="decb6973ec3d3a4bb8c862ed63fd7a28f8227f52e43f2904e4aa13e443509655" exitCode=0
Mar 19 10:30:18 crc kubenswrapper[4835]: I0319 10:30:18.643474 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q6jrr" event={"ID":"03881495-d161-4b6b-b67f-7888f340fbde","Type":"ContainerDied","Data":"decb6973ec3d3a4bb8c862ed63fd7a28f8227f52e43f2904e4aa13e443509655"}
Mar 19 10:30:19 crc kubenswrapper[4835]: I0319 10:30:19.656891 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q6jrr" event={"ID":"03881495-d161-4b6b-b67f-7888f340fbde","Type":"ContainerStarted","Data":"6e298df00016a6cf04bc608e8d39f6911c1a1e2a7adf3c6b9dc6b080b4e97361"}
Mar 19 10:30:20 crc kubenswrapper[4835]: I0319 10:30:20.691340 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-q6jrr" podStartSLOduration=4.688554182 podStartE2EDuration="22.691312102s" podCreationTimestamp="2026-03-19 10:29:58 +0000 UTC" firstStartedPulling="2026-03-19 10:30:01.353676638 +0000 UTC m=+4056.202275225" lastFinishedPulling="2026-03-19 10:30:19.356434558 +0000 UTC m=+4074.205033145" observedRunningTime="2026-03-19 10:30:20.687258869 +0000 UTC m=+4075.535857476" watchObservedRunningTime="2026-03-19 10:30:20.691312102 +0000 UTC m=+4075.539910689"
Mar 19 10:30:27 crc kubenswrapper[4835]: I0319 10:30:27.954350 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-6tvxj" podUID="bffd4aa5-2723-4faa-b13d-03e6ed95d545" containerName="registry-server" probeResult="failure" output=<
Mar 19 10:30:27 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s
Mar 19 10:30:27 crc kubenswrapper[4835]: >
Mar 19 10:30:29 crc kubenswrapper[4835]: I0319 10:30:29.388145 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-q6jrr"
Mar 19 10:30:29 crc kubenswrapper[4835]: I0319 10:30:29.388693 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-q6jrr"
Mar 19 10:30:30 crc kubenswrapper[4835]: I0319 10:30:30.460492 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-q6jrr" podUID="03881495-d161-4b6b-b67f-7888f340fbde" containerName="registry-server" probeResult="failure" output=<
Mar 19 10:30:30 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s
Mar 19 10:30:30 crc kubenswrapper[4835]: >
Mar 19 10:30:36 crc kubenswrapper[4835]: I0319 10:30:36.950966 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6tvxj"
Mar 19 10:30:37 crc kubenswrapper[4835]: I0319 10:30:37.005156 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6tvxj"
Mar 19 10:30:37 crc kubenswrapper[4835]: I0319 10:30:37.192611 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6tvxj"]
Mar 19 10:30:38 crc kubenswrapper[4835]: I0319 10:30:38.844909 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6tvxj" podUID="bffd4aa5-2723-4faa-b13d-03e6ed95d545" containerName="registry-server" containerID="cri-o://0c71f2dd26380755f80195160c8bcba31a77962e922adabed42697746af1783d" gracePeriod=2
Mar 19 10:30:39 crc kubenswrapper[4835]: I0319 10:30:39.444853 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-q6jrr"
Mar 19 10:30:39 crc kubenswrapper[4835]: I0319 10:30:39.453569 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6tvxj"
Mar 19 10:30:39 crc kubenswrapper[4835]: I0319 10:30:39.521723 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-q6jrr"
Mar 19 10:30:39 crc kubenswrapper[4835]: I0319 10:30:39.625476 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bffd4aa5-2723-4faa-b13d-03e6ed95d545-catalog-content\") pod \"bffd4aa5-2723-4faa-b13d-03e6ed95d545\" (UID: \"bffd4aa5-2723-4faa-b13d-03e6ed95d545\") "
Mar 19 10:30:39 crc kubenswrapper[4835]: I0319 10:30:39.625594 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjdmn\" (UniqueName: \"kubernetes.io/projected/bffd4aa5-2723-4faa-b13d-03e6ed95d545-kube-api-access-rjdmn\") pod \"bffd4aa5-2723-4faa-b13d-03e6ed95d545\" (UID: \"bffd4aa5-2723-4faa-b13d-03e6ed95d545\") "
Mar 19 10:30:39 crc kubenswrapper[4835]: I0319 10:30:39.625668 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bffd4aa5-2723-4faa-b13d-03e6ed95d545-utilities\") pod \"bffd4aa5-2723-4faa-b13d-03e6ed95d545\" (UID: \"bffd4aa5-2723-4faa-b13d-03e6ed95d545\") "
Mar 19 10:30:39 crc kubenswrapper[4835]: I0319 10:30:39.626684 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bffd4aa5-2723-4faa-b13d-03e6ed95d545-utilities" (OuterVolumeSpecName: "utilities") pod "bffd4aa5-2723-4faa-b13d-03e6ed95d545" (UID: "bffd4aa5-2723-4faa-b13d-03e6ed95d545"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 10:30:39 crc kubenswrapper[4835]: I0319 10:30:39.633614 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bffd4aa5-2723-4faa-b13d-03e6ed95d545-kube-api-access-rjdmn" (OuterVolumeSpecName: "kube-api-access-rjdmn") pod "bffd4aa5-2723-4faa-b13d-03e6ed95d545" (UID: "bffd4aa5-2723-4faa-b13d-03e6ed95d545"). InnerVolumeSpecName "kube-api-access-rjdmn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 10:30:39 crc kubenswrapper[4835]: I0319 10:30:39.677339 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bffd4aa5-2723-4faa-b13d-03e6ed95d545-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bffd4aa5-2723-4faa-b13d-03e6ed95d545" (UID: "bffd4aa5-2723-4faa-b13d-03e6ed95d545"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 10:30:39 crc kubenswrapper[4835]: I0319 10:30:39.728204 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bffd4aa5-2723-4faa-b13d-03e6ed95d545-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 10:30:39 crc kubenswrapper[4835]: I0319 10:30:39.728248 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjdmn\" (UniqueName: \"kubernetes.io/projected/bffd4aa5-2723-4faa-b13d-03e6ed95d545-kube-api-access-rjdmn\") on node \"crc\" DevicePath \"\""
Mar 19 10:30:39 crc kubenswrapper[4835]: I0319 10:30:39.728264 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bffd4aa5-2723-4faa-b13d-03e6ed95d545-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 10:30:39 crc kubenswrapper[4835]: I0319 10:30:39.861251 4835 generic.go:334] "Generic (PLEG): container finished" podID="bffd4aa5-2723-4faa-b13d-03e6ed95d545" containerID="0c71f2dd26380755f80195160c8bcba31a77962e922adabed42697746af1783d" exitCode=0
Mar 19 10:30:39 crc kubenswrapper[4835]: I0319 10:30:39.861324 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6tvxj"
Mar 19 10:30:39 crc kubenswrapper[4835]: I0319 10:30:39.861352 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tvxj" event={"ID":"bffd4aa5-2723-4faa-b13d-03e6ed95d545","Type":"ContainerDied","Data":"0c71f2dd26380755f80195160c8bcba31a77962e922adabed42697746af1783d"}
Mar 19 10:30:39 crc kubenswrapper[4835]: I0319 10:30:39.861393 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tvxj" event={"ID":"bffd4aa5-2723-4faa-b13d-03e6ed95d545","Type":"ContainerDied","Data":"4dbcf4c3f229324e2233cd28772a6b2d8b90ae44cda013d2c8c5c384f9f06d2f"}
Mar 19 10:30:39 crc kubenswrapper[4835]: I0319 10:30:39.861408 4835 scope.go:117] "RemoveContainer" containerID="0c71f2dd26380755f80195160c8bcba31a77962e922adabed42697746af1783d"
Mar 19 10:30:39 crc kubenswrapper[4835]: I0319 10:30:39.890977 4835 scope.go:117] "RemoveContainer" containerID="192351b19fb8f7d48626f5a25e3c694f627036c0bf9688e0c3511c73ce0fc0c4"
Mar 19 10:30:39 crc kubenswrapper[4835]: I0319 10:30:39.904250 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6tvxj"]
Mar 19 10:30:39 crc kubenswrapper[4835]: I0319 10:30:39.915883 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6tvxj"]
Mar 19 10:30:39 crc kubenswrapper[4835]: I0319 10:30:39.938270 4835 scope.go:117] "RemoveContainer" containerID="264f03f073430e9db5aa2cf561baa0705cf56ab7b55ec54db116a4242797b9e4"
Mar 19 10:30:39 crc kubenswrapper[4835]: I0319 10:30:39.971702 4835 scope.go:117] "RemoveContainer" containerID="0c71f2dd26380755f80195160c8bcba31a77962e922adabed42697746af1783d"
Mar 19 10:30:39 crc kubenswrapper[4835]: E0319 10:30:39.972227 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c71f2dd26380755f80195160c8bcba31a77962e922adabed42697746af1783d\": container with ID starting with 0c71f2dd26380755f80195160c8bcba31a77962e922adabed42697746af1783d not found: ID does not exist" containerID="0c71f2dd26380755f80195160c8bcba31a77962e922adabed42697746af1783d"
Mar 19 10:30:39 crc kubenswrapper[4835]: I0319 10:30:39.972260 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c71f2dd26380755f80195160c8bcba31a77962e922adabed42697746af1783d"} err="failed to get container status \"0c71f2dd26380755f80195160c8bcba31a77962e922adabed42697746af1783d\": rpc error: code = NotFound desc = could not find container \"0c71f2dd26380755f80195160c8bcba31a77962e922adabed42697746af1783d\": container with ID starting with 0c71f2dd26380755f80195160c8bcba31a77962e922adabed42697746af1783d not found: ID does not exist"
Mar 19 10:30:39 crc kubenswrapper[4835]: I0319 10:30:39.972283 4835 scope.go:117] "RemoveContainer" containerID="192351b19fb8f7d48626f5a25e3c694f627036c0bf9688e0c3511c73ce0fc0c4"
Mar 19 10:30:39 crc kubenswrapper[4835]: E0319 10:30:39.973616 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"192351b19fb8f7d48626f5a25e3c694f627036c0bf9688e0c3511c73ce0fc0c4\": container with ID starting with 192351b19fb8f7d48626f5a25e3c694f627036c0bf9688e0c3511c73ce0fc0c4 not found: ID does not exist" containerID="192351b19fb8f7d48626f5a25e3c694f627036c0bf9688e0c3511c73ce0fc0c4"
Mar 19 10:30:39 crc kubenswrapper[4835]: I0319 10:30:39.973648 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"192351b19fb8f7d48626f5a25e3c694f627036c0bf9688e0c3511c73ce0fc0c4"} err="failed to get container status \"192351b19fb8f7d48626f5a25e3c694f627036c0bf9688e0c3511c73ce0fc0c4\": rpc error: code = NotFound desc = could not find container \"192351b19fb8f7d48626f5a25e3c694f627036c0bf9688e0c3511c73ce0fc0c4\": container with ID starting with 192351b19fb8f7d48626f5a25e3c694f627036c0bf9688e0c3511c73ce0fc0c4 not found: ID does not exist"
Mar 19 10:30:39 crc kubenswrapper[4835]: I0319 10:30:39.973669 4835 scope.go:117] "RemoveContainer" containerID="264f03f073430e9db5aa2cf561baa0705cf56ab7b55ec54db116a4242797b9e4"
Mar 19 10:30:39 crc kubenswrapper[4835]: E0319 10:30:39.974089 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"264f03f073430e9db5aa2cf561baa0705cf56ab7b55ec54db116a4242797b9e4\": container with ID starting with 264f03f073430e9db5aa2cf561baa0705cf56ab7b55ec54db116a4242797b9e4 not found: ID does not exist" containerID="264f03f073430e9db5aa2cf561baa0705cf56ab7b55ec54db116a4242797b9e4"
Mar 19 10:30:39 crc kubenswrapper[4835]: I0319 10:30:39.974125 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"264f03f073430e9db5aa2cf561baa0705cf56ab7b55ec54db116a4242797b9e4"} err="failed to get container status \"264f03f073430e9db5aa2cf561baa0705cf56ab7b55ec54db116a4242797b9e4\": rpc error: code = NotFound desc = could not find container \"264f03f073430e9db5aa2cf561baa0705cf56ab7b55ec54db116a4242797b9e4\": container with ID starting with 264f03f073430e9db5aa2cf561baa0705cf56ab7b55ec54db116a4242797b9e4 not found: ID does not exist"
Mar 19 10:30:40 crc kubenswrapper[4835]: I0319 10:30:40.415396 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bffd4aa5-2723-4faa-b13d-03e6ed95d545" path="/var/lib/kubelet/pods/bffd4aa5-2723-4faa-b13d-03e6ed95d545/volumes"
Mar 19 10:30:41 crc kubenswrapper[4835]: I0319 10:30:41.790373 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q6jrr"]
Mar 19 10:30:41 crc kubenswrapper[4835]: I0319 10:30:41.790867 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-q6jrr" podUID="03881495-d161-4b6b-b67f-7888f340fbde" containerName="registry-server" containerID="cri-o://6e298df00016a6cf04bc608e8d39f6911c1a1e2a7adf3c6b9dc6b080b4e97361" gracePeriod=2
Mar 19 10:30:42 crc kubenswrapper[4835]: I0319 10:30:42.347508 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q6jrr"
Mar 19 10:30:42 crc kubenswrapper[4835]: I0319 10:30:42.500832 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03881495-d161-4b6b-b67f-7888f340fbde-utilities\") pod \"03881495-d161-4b6b-b67f-7888f340fbde\" (UID: \"03881495-d161-4b6b-b67f-7888f340fbde\") "
Mar 19 10:30:42 crc kubenswrapper[4835]: I0319 10:30:42.500950 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03881495-d161-4b6b-b67f-7888f340fbde-catalog-content\") pod \"03881495-d161-4b6b-b67f-7888f340fbde\" (UID: \"03881495-d161-4b6b-b67f-7888f340fbde\") "
Mar 19 10:30:42 crc kubenswrapper[4835]: I0319 10:30:42.500969 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w86qp\" (UniqueName: \"kubernetes.io/projected/03881495-d161-4b6b-b67f-7888f340fbde-kube-api-access-w86qp\") pod \"03881495-d161-4b6b-b67f-7888f340fbde\" (UID: \"03881495-d161-4b6b-b67f-7888f340fbde\") "
Mar 19 10:30:42 crc kubenswrapper[4835]: I0319 10:30:42.501787 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03881495-d161-4b6b-b67f-7888f340fbde-utilities" (OuterVolumeSpecName: "utilities") pod "03881495-d161-4b6b-b67f-7888f340fbde" (UID: "03881495-d161-4b6b-b67f-7888f340fbde"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 10:30:42 crc kubenswrapper[4835]: I0319 10:30:42.516512 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03881495-d161-4b6b-b67f-7888f340fbde-kube-api-access-w86qp" (OuterVolumeSpecName: "kube-api-access-w86qp") pod "03881495-d161-4b6b-b67f-7888f340fbde" (UID: "03881495-d161-4b6b-b67f-7888f340fbde"). InnerVolumeSpecName "kube-api-access-w86qp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 10:30:42 crc kubenswrapper[4835]: I0319 10:30:42.517053 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03881495-d161-4b6b-b67f-7888f340fbde-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 10:30:42 crc kubenswrapper[4835]: I0319 10:30:42.517079 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w86qp\" (UniqueName: \"kubernetes.io/projected/03881495-d161-4b6b-b67f-7888f340fbde-kube-api-access-w86qp\") on node \"crc\" DevicePath \"\""
Mar 19 10:30:42 crc kubenswrapper[4835]: I0319 10:30:42.651519 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03881495-d161-4b6b-b67f-7888f340fbde-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "03881495-d161-4b6b-b67f-7888f340fbde" (UID: "03881495-d161-4b6b-b67f-7888f340fbde"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 10:30:42 crc kubenswrapper[4835]: I0319 10:30:42.723803 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03881495-d161-4b6b-b67f-7888f340fbde-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 10:30:42 crc kubenswrapper[4835]: I0319 10:30:42.902625 4835 generic.go:334] "Generic (PLEG): container finished" podID="03881495-d161-4b6b-b67f-7888f340fbde" containerID="6e298df00016a6cf04bc608e8d39f6911c1a1e2a7adf3c6b9dc6b080b4e97361" exitCode=0
Mar 19 10:30:42 crc kubenswrapper[4835]: I0319 10:30:42.902791 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q6jrr"
Mar 19 10:30:42 crc kubenswrapper[4835]: I0319 10:30:42.902978 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q6jrr" event={"ID":"03881495-d161-4b6b-b67f-7888f340fbde","Type":"ContainerDied","Data":"6e298df00016a6cf04bc608e8d39f6911c1a1e2a7adf3c6b9dc6b080b4e97361"}
Mar 19 10:30:42 crc kubenswrapper[4835]: I0319 10:30:42.903023 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q6jrr" event={"ID":"03881495-d161-4b6b-b67f-7888f340fbde","Type":"ContainerDied","Data":"b38c3ae40a15e27b0d00fb14646ce9e3f43b16de324d72d03fad6608307c6719"}
Mar 19 10:30:42 crc kubenswrapper[4835]: I0319 10:30:42.903049 4835 scope.go:117] "RemoveContainer" containerID="6e298df00016a6cf04bc608e8d39f6911c1a1e2a7adf3c6b9dc6b080b4e97361"
Mar 19 10:30:42 crc kubenswrapper[4835]: I0319 10:30:42.933871 4835 scope.go:117] "RemoveContainer" containerID="decb6973ec3d3a4bb8c862ed63fd7a28f8227f52e43f2904e4aa13e443509655"
Mar 19 10:30:42 crc kubenswrapper[4835]: I0319 10:30:42.941100 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q6jrr"]
Mar 19 10:30:42 crc kubenswrapper[4835]: I0319 10:30:42.955375 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-q6jrr"]
Mar 19 10:30:42 crc kubenswrapper[4835]: I0319 10:30:42.962778 4835 scope.go:117] "RemoveContainer" containerID="7e5e50408e58021d0a6520ce38e8314fa56e4ce7e3bd818680eeae78df0d069b"
Mar 19 10:30:43 crc kubenswrapper[4835]: I0319 10:30:43.014592 4835 scope.go:117] "RemoveContainer" containerID="6e298df00016a6cf04bc608e8d39f6911c1a1e2a7adf3c6b9dc6b080b4e97361"
Mar 19 10:30:43 crc kubenswrapper[4835]: E0319 10:30:43.015040 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e298df00016a6cf04bc608e8d39f6911c1a1e2a7adf3c6b9dc6b080b4e97361\": container with ID starting with 6e298df00016a6cf04bc608e8d39f6911c1a1e2a7adf3c6b9dc6b080b4e97361 not found: ID does not exist" containerID="6e298df00016a6cf04bc608e8d39f6911c1a1e2a7adf3c6b9dc6b080b4e97361"
Mar 19 10:30:43 crc kubenswrapper[4835]: I0319 10:30:43.015074 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e298df00016a6cf04bc608e8d39f6911c1a1e2a7adf3c6b9dc6b080b4e97361"} err="failed to get container status \"6e298df00016a6cf04bc608e8d39f6911c1a1e2a7adf3c6b9dc6b080b4e97361\": rpc error: code = NotFound desc = could not find container \"6e298df00016a6cf04bc608e8d39f6911c1a1e2a7adf3c6b9dc6b080b4e97361\": container with ID starting with 6e298df00016a6cf04bc608e8d39f6911c1a1e2a7adf3c6b9dc6b080b4e97361 not found: ID does not exist"
Mar 19 10:30:43 crc kubenswrapper[4835]: I0319 10:30:43.015095 4835 scope.go:117] "RemoveContainer" containerID="decb6973ec3d3a4bb8c862ed63fd7a28f8227f52e43f2904e4aa13e443509655"
Mar 19 10:30:43 crc kubenswrapper[4835]: E0319 10:30:43.015358 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"decb6973ec3d3a4bb8c862ed63fd7a28f8227f52e43f2904e4aa13e443509655\": container with ID starting with decb6973ec3d3a4bb8c862ed63fd7a28f8227f52e43f2904e4aa13e443509655 not found: ID does not exist" containerID="decb6973ec3d3a4bb8c862ed63fd7a28f8227f52e43f2904e4aa13e443509655"
Mar 19 10:30:43 crc kubenswrapper[4835]: I0319 10:30:43.015397 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"decb6973ec3d3a4bb8c862ed63fd7a28f8227f52e43f2904e4aa13e443509655"} err="failed to get container status \"decb6973ec3d3a4bb8c862ed63fd7a28f8227f52e43f2904e4aa13e443509655\": rpc error: code = NotFound desc = could not find container \"decb6973ec3d3a4bb8c862ed63fd7a28f8227f52e43f2904e4aa13e443509655\": container with ID starting with decb6973ec3d3a4bb8c862ed63fd7a28f8227f52e43f2904e4aa13e443509655 not found: ID does not exist"
Mar 19 10:30:43 crc kubenswrapper[4835]: I0319 10:30:43.015425 4835 scope.go:117] "RemoveContainer" containerID="7e5e50408e58021d0a6520ce38e8314fa56e4ce7e3bd818680eeae78df0d069b"
Mar 19 10:30:43 crc kubenswrapper[4835]: E0319 10:30:43.016055 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e5e50408e58021d0a6520ce38e8314fa56e4ce7e3bd818680eeae78df0d069b\": container with ID starting with 7e5e50408e58021d0a6520ce38e8314fa56e4ce7e3bd818680eeae78df0d069b not found: ID does not exist" containerID="7e5e50408e58021d0a6520ce38e8314fa56e4ce7e3bd818680eeae78df0d069b"
Mar 19 10:30:43 crc kubenswrapper[4835]: I0319 10:30:43.016088 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e5e50408e58021d0a6520ce38e8314fa56e4ce7e3bd818680eeae78df0d069b"} err="failed to get container status \"7e5e50408e58021d0a6520ce38e8314fa56e4ce7e3bd818680eeae78df0d069b\": rpc error: code = NotFound desc = could not find container \"7e5e50408e58021d0a6520ce38e8314fa56e4ce7e3bd818680eeae78df0d069b\": container with ID starting with 7e5e50408e58021d0a6520ce38e8314fa56e4ce7e3bd818680eeae78df0d069b not found: ID does not exist"
Mar 19 10:30:44 crc kubenswrapper[4835]: I0319 10:30:44.427570 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03881495-d161-4b6b-b67f-7888f340fbde" path="/var/lib/kubelet/pods/03881495-d161-4b6b-b67f-7888f340fbde/volumes"
Mar 19 10:30:58 crc kubenswrapper[4835]: I0319 10:30:58.332305 4835 scope.go:117] "RemoveContainer" containerID="50bfbca42fdf77c3d9cfc2799a3d230aeec3b5d7d4e83d5ff0054eaf3d1d4324"
Mar 19 10:30:58 crc kubenswrapper[4835]: I0319 10:30:58.405044 4835 scope.go:117] "RemoveContainer" containerID="0356e29dbb5c43cf10a7d9540e7e98579f8ca48662ee3d934af5a4e646ffe0fe"
Mar 19 10:31:36 crc kubenswrapper[4835]: I0319 10:31:36.422757 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 10:31:36 crc kubenswrapper[4835]: I0319 10:31:36.423319 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 10:31:54 crc kubenswrapper[4835]: E0319 10:31:54.048211 4835 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.129.56.116:56154->38.129.56.116:37913: read tcp 38.129.56.116:56154->38.129.56.116:37913: read: connection reset by peer
Mar 19 10:32:00 crc kubenswrapper[4835]: I0319 10:32:00.148913 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565272-vzz5b"]
Mar 19 10:32:00 crc kubenswrapper[4835]: E0319 10:32:00.150145 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03881495-d161-4b6b-b67f-7888f340fbde" containerName="extract-content"
Mar 19 10:32:00 crc kubenswrapper[4835]: I0319 10:32:00.150160 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="03881495-d161-4b6b-b67f-7888f340fbde" containerName="extract-content"
Mar 19 10:32:00 crc kubenswrapper[4835]: E0319 10:32:00.150179 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bffd4aa5-2723-4faa-b13d-03e6ed95d545" containerName="registry-server"
Mar 19 10:32:00 crc kubenswrapper[4835]: I0319 10:32:00.150188 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="bffd4aa5-2723-4faa-b13d-03e6ed95d545" containerName="registry-server"
Mar 19 10:32:00 crc kubenswrapper[4835]: E0319 10:32:00.150217 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bffd4aa5-2723-4faa-b13d-03e6ed95d545" containerName="extract-content"
Mar 19 10:32:00 crc kubenswrapper[4835]: I0319 10:32:00.150229 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="bffd4aa5-2723-4faa-b13d-03e6ed95d545" containerName="extract-content"
Mar 19 10:32:00 crc kubenswrapper[4835]: E0319 10:32:00.150244 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03881495-d161-4b6b-b67f-7888f340fbde" containerName="extract-utilities"
Mar 19 10:32:00 crc kubenswrapper[4835]: I0319 10:32:00.150250 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="03881495-d161-4b6b-b67f-7888f340fbde" containerName="extract-utilities"
Mar 19 10:32:00 crc kubenswrapper[4835]: E0319 10:32:00.150278 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03881495-d161-4b6b-b67f-7888f340fbde" containerName="registry-server"
Mar 19 10:32:00 crc kubenswrapper[4835]: I0319 10:32:00.150286 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="03881495-d161-4b6b-b67f-7888f340fbde" containerName="registry-server"
Mar 19 10:32:00 crc kubenswrapper[4835]: E0319 10:32:00.150300 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48c6d60b-d8fc-459a-a3a8-179644451b1e" containerName="collect-profiles"
Mar 19 10:32:00 crc kubenswrapper[4835]: I0319 10:32:00.150306 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c6d60b-d8fc-459a-a3a8-179644451b1e" containerName="collect-profiles"
Mar 19 10:32:00 crc kubenswrapper[4835]: E0319 10:32:00.150314 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bffd4aa5-2723-4faa-b13d-03e6ed95d545" containerName="extract-utilities"
Mar 19 10:32:00 crc kubenswrapper[4835]: I0319 10:32:00.150320 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="bffd4aa5-2723-4faa-b13d-03e6ed95d545" containerName="extract-utilities"
Mar 19 10:32:00 crc kubenswrapper[4835]: E0319 10:32:00.150336 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34eb65f3-516f-4070-8542-fd87fe4ca728" containerName="oc"
Mar 19 10:32:00 crc kubenswrapper[4835]: I0319 10:32:00.150343 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="34eb65f3-516f-4070-8542-fd87fe4ca728" containerName="oc"
Mar 19 10:32:00 crc kubenswrapper[4835]: I0319 10:32:00.150582 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="34eb65f3-516f-4070-8542-fd87fe4ca728" containerName="oc"
Mar 19 10:32:00 crc kubenswrapper[4835]: I0319 10:32:00.150595 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="48c6d60b-d8fc-459a-a3a8-179644451b1e" containerName="collect-profiles"
Mar 19 10:32:00 crc kubenswrapper[4835]: I0319 10:32:00.150618 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="bffd4aa5-2723-4faa-b13d-03e6ed95d545" containerName="registry-server"
Mar 19 10:32:00 crc kubenswrapper[4835]: I0319 10:32:00.150634 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="03881495-d161-4b6b-b67f-7888f340fbde" containerName="registry-server"
Mar 19 10:32:00 crc kubenswrapper[4835]: I0319 10:32:00.151590 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565272-vzz5b"
Mar 19 10:32:00 crc kubenswrapper[4835]: I0319 10:32:00.154597 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 10:32:00 crc kubenswrapper[4835]: I0319 10:32:00.154597 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 10:32:00 crc kubenswrapper[4835]: I0319 10:32:00.155137 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw"
Mar 19 10:32:00 crc kubenswrapper[4835]: I0319 10:32:00.162321 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565272-vzz5b"]
Mar 19 10:32:00 crc kubenswrapper[4835]: I0319 10:32:00.314248 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72l6v\" (UniqueName: \"kubernetes.io/projected/106b50cd-9332-49e1-8217-678440cb916b-kube-api-access-72l6v\") pod \"auto-csr-approver-29565272-vzz5b\" (UID: \"106b50cd-9332-49e1-8217-678440cb916b\") " pod="openshift-infra/auto-csr-approver-29565272-vzz5b"
Mar 19 10:32:00 crc kubenswrapper[4835]: I0319 10:32:00.417598 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72l6v\" (UniqueName: \"kubernetes.io/projected/106b50cd-9332-49e1-8217-678440cb916b-kube-api-access-72l6v\") pod \"auto-csr-approver-29565272-vzz5b\" (UID: \"106b50cd-9332-49e1-8217-678440cb916b\") " pod="openshift-infra/auto-csr-approver-29565272-vzz5b"
Mar 19 10:32:00 crc kubenswrapper[4835]: I0319 10:32:00.779938 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72l6v\" (UniqueName: \"kubernetes.io/projected/106b50cd-9332-49e1-8217-678440cb916b-kube-api-access-72l6v\") pod \"auto-csr-approver-29565272-vzz5b\" (UID: \"106b50cd-9332-49e1-8217-678440cb916b\") " pod="openshift-infra/auto-csr-approver-29565272-vzz5b"
Mar 19 10:32:01 crc kubenswrapper[4835]: I0319 10:32:01.071195 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565272-vzz5b"
Mar 19 10:32:01 crc kubenswrapper[4835]: I0319 10:32:01.545671 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 19 10:32:01 crc kubenswrapper[4835]: I0319 10:32:01.555365 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565272-vzz5b"]
Mar 19 10:32:01 crc kubenswrapper[4835]: I0319 10:32:01.848634 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565272-vzz5b" event={"ID":"106b50cd-9332-49e1-8217-678440cb916b","Type":"ContainerStarted","Data":"9ca048d8d4e389b4616af8926d11bdb98ede2648b1be0f251818f630a2f0e98e"}
Mar 19 10:32:03 crc kubenswrapper[4835]: I0319 10:32:03.902181 4835 generic.go:334] "Generic (PLEG): container finished" podID="106b50cd-9332-49e1-8217-678440cb916b" containerID="2053b44b5b9170e360b99ee9b9c8410318f6b2d2ccf21c6b4b6eb2e8339fb737" exitCode=0
Mar 19 10:32:03 crc kubenswrapper[4835]: I0319 10:32:03.902894 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565272-vzz5b" event={"ID":"106b50cd-9332-49e1-8217-678440cb916b","Type":"ContainerDied","Data":"2053b44b5b9170e360b99ee9b9c8410318f6b2d2ccf21c6b4b6eb2e8339fb737"}
Mar 19 10:32:05 crc kubenswrapper[4835]: I0319 10:32:05.334194 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565272-vzz5b"
Mar 19 10:32:05 crc kubenswrapper[4835]: I0319 10:32:05.465577 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72l6v\" (UniqueName: \"kubernetes.io/projected/106b50cd-9332-49e1-8217-678440cb916b-kube-api-access-72l6v\") pod \"106b50cd-9332-49e1-8217-678440cb916b\" (UID: \"106b50cd-9332-49e1-8217-678440cb916b\") "
Mar 19 10:32:05 crc kubenswrapper[4835]: I0319 10:32:05.483164 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/106b50cd-9332-49e1-8217-678440cb916b-kube-api-access-72l6v" (OuterVolumeSpecName: "kube-api-access-72l6v") pod "106b50cd-9332-49e1-8217-678440cb916b" (UID: "106b50cd-9332-49e1-8217-678440cb916b"). InnerVolumeSpecName "kube-api-access-72l6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 10:32:05 crc kubenswrapper[4835]: I0319 10:32:05.569107 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72l6v\" (UniqueName: \"kubernetes.io/projected/106b50cd-9332-49e1-8217-678440cb916b-kube-api-access-72l6v\") on node \"crc\" DevicePath \"\""
Mar 19 10:32:05 crc kubenswrapper[4835]: I0319 10:32:05.931112 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565272-vzz5b" event={"ID":"106b50cd-9332-49e1-8217-678440cb916b","Type":"ContainerDied","Data":"9ca048d8d4e389b4616af8926d11bdb98ede2648b1be0f251818f630a2f0e98e"}
Mar 19 10:32:05 crc kubenswrapper[4835]: I0319 10:32:05.931453 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ca048d8d4e389b4616af8926d11bdb98ede2648b1be0f251818f630a2f0e98e"
Mar 19 10:32:05 crc kubenswrapper[4835]: I0319 10:32:05.931176 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565272-vzz5b"
Mar 19 10:32:06 crc kubenswrapper[4835]: I0319 10:32:06.422853 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 10:32:06 crc kubenswrapper[4835]: I0319 10:32:06.422918 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 10:32:06 crc kubenswrapper[4835]: I0319 10:32:06.423465 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565266-z5z6r"]
Mar 19 10:32:06 crc kubenswrapper[4835]: I0319 10:32:06.437833 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565266-z5z6r"]
Mar 19 10:32:08 crc kubenswrapper[4835]: I0319 10:32:08.420401 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b028d50-4e99-426b-9d69-371fa9e45c15" path="/var/lib/kubelet/pods/0b028d50-4e99-426b-9d69-371fa9e45c15/volumes"
Mar 19 10:32:36 crc kubenswrapper[4835]: I0319 10:32:36.422008 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 10:32:36 crc kubenswrapper[4835]: I0319 10:32:36.422537 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 10:32:36 crc kubenswrapper[4835]: I0319 10:32:36.422572 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bk84k"
Mar 19 10:32:36 crc kubenswrapper[4835]: I0319 10:32:36.423117 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9ecf131f16a5e6b0f5547bfdefea409692cba49068da1b192acca5d89a47395b"} pod="openshift-machine-config-operator/machine-config-daemon-bk84k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 19 10:32:36 crc kubenswrapper[4835]: I0319 10:32:36.423167 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" containerID="cri-o://9ecf131f16a5e6b0f5547bfdefea409692cba49068da1b192acca5d89a47395b" gracePeriod=600
Mar 19 10:32:37 crc kubenswrapper[4835]: I0319 10:32:37.286231 4835 generic.go:334] "Generic (PLEG): container finished" podID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerID="9ecf131f16a5e6b0f5547bfdefea409692cba49068da1b192acca5d89a47395b" exitCode=0
Mar 19 10:32:37 crc kubenswrapper[4835]: I0319 10:32:37.286288 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerDied","Data":"9ecf131f16a5e6b0f5547bfdefea409692cba49068da1b192acca5d89a47395b"}
Mar 19 10:32:37 crc kubenswrapper[4835]: I0319 10:32:37.286981 4835 scope.go:117] "RemoveContainer"
containerID="049ab495373838cc6452b89e97abd4272ac73f52f2dcf346477b2cddb7b2086a" Mar 19 10:32:37 crc kubenswrapper[4835]: I0319 10:32:37.288097 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerStarted","Data":"3359a5103ce1a19d3b0272d49b7320107bc1e5fdca65b50c1858632d55b7dee9"} Mar 19 10:32:58 crc kubenswrapper[4835]: I0319 10:32:58.617647 4835 scope.go:117] "RemoveContainer" containerID="e57e3290e79a8759c3ff4f2094d9b0a239dc0631797205fd72ab1b9722fb50db" Mar 19 10:33:55 crc kubenswrapper[4835]: I0319 10:33:55.446262 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8wmwf"] Mar 19 10:33:55 crc kubenswrapper[4835]: E0319 10:33:55.447370 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="106b50cd-9332-49e1-8217-678440cb916b" containerName="oc" Mar 19 10:33:55 crc kubenswrapper[4835]: I0319 10:33:55.447387 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="106b50cd-9332-49e1-8217-678440cb916b" containerName="oc" Mar 19 10:33:55 crc kubenswrapper[4835]: I0319 10:33:55.447777 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="106b50cd-9332-49e1-8217-678440cb916b" containerName="oc" Mar 19 10:33:55 crc kubenswrapper[4835]: I0319 10:33:55.449901 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8wmwf" Mar 19 10:33:55 crc kubenswrapper[4835]: I0319 10:33:55.459646 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8wmwf"] Mar 19 10:33:55 crc kubenswrapper[4835]: I0319 10:33:55.530144 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w5cg\" (UniqueName: \"kubernetes.io/projected/0c087e72-2b20-4331-9a49-70a62b50387e-kube-api-access-4w5cg\") pod \"redhat-marketplace-8wmwf\" (UID: \"0c087e72-2b20-4331-9a49-70a62b50387e\") " pod="openshift-marketplace/redhat-marketplace-8wmwf" Mar 19 10:33:55 crc kubenswrapper[4835]: I0319 10:33:55.530355 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c087e72-2b20-4331-9a49-70a62b50387e-utilities\") pod \"redhat-marketplace-8wmwf\" (UID: \"0c087e72-2b20-4331-9a49-70a62b50387e\") " pod="openshift-marketplace/redhat-marketplace-8wmwf" Mar 19 10:33:55 crc kubenswrapper[4835]: I0319 10:33:55.530584 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c087e72-2b20-4331-9a49-70a62b50387e-catalog-content\") pod \"redhat-marketplace-8wmwf\" (UID: \"0c087e72-2b20-4331-9a49-70a62b50387e\") " pod="openshift-marketplace/redhat-marketplace-8wmwf" Mar 19 10:33:55 crc kubenswrapper[4835]: I0319 10:33:55.633419 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c087e72-2b20-4331-9a49-70a62b50387e-catalog-content\") pod \"redhat-marketplace-8wmwf\" (UID: \"0c087e72-2b20-4331-9a49-70a62b50387e\") " pod="openshift-marketplace/redhat-marketplace-8wmwf" Mar 19 10:33:55 crc kubenswrapper[4835]: I0319 10:33:55.633679 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4w5cg\" (UniqueName: \"kubernetes.io/projected/0c087e72-2b20-4331-9a49-70a62b50387e-kube-api-access-4w5cg\") pod \"redhat-marketplace-8wmwf\" (UID: \"0c087e72-2b20-4331-9a49-70a62b50387e\") " pod="openshift-marketplace/redhat-marketplace-8wmwf" Mar 19 10:33:55 crc kubenswrapper[4835]: I0319 10:33:55.633844 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c087e72-2b20-4331-9a49-70a62b50387e-utilities\") pod \"redhat-marketplace-8wmwf\" (UID: \"0c087e72-2b20-4331-9a49-70a62b50387e\") " pod="openshift-marketplace/redhat-marketplace-8wmwf" Mar 19 10:33:55 crc kubenswrapper[4835]: I0319 10:33:55.633943 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c087e72-2b20-4331-9a49-70a62b50387e-catalog-content\") pod \"redhat-marketplace-8wmwf\" (UID: \"0c087e72-2b20-4331-9a49-70a62b50387e\") " pod="openshift-marketplace/redhat-marketplace-8wmwf" Mar 19 10:33:55 crc kubenswrapper[4835]: I0319 10:33:55.634222 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c087e72-2b20-4331-9a49-70a62b50387e-utilities\") pod \"redhat-marketplace-8wmwf\" (UID: \"0c087e72-2b20-4331-9a49-70a62b50387e\") " pod="openshift-marketplace/redhat-marketplace-8wmwf" Mar 19 10:33:55 crc kubenswrapper[4835]: I0319 10:33:55.983613 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w5cg\" (UniqueName: \"kubernetes.io/projected/0c087e72-2b20-4331-9a49-70a62b50387e-kube-api-access-4w5cg\") pod \"redhat-marketplace-8wmwf\" (UID: \"0c087e72-2b20-4331-9a49-70a62b50387e\") " pod="openshift-marketplace/redhat-marketplace-8wmwf" Mar 19 10:33:56 crc kubenswrapper[4835]: I0319 10:33:56.069281 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8wmwf" Mar 19 10:33:56 crc kubenswrapper[4835]: I0319 10:33:56.633698 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8wmwf"] Mar 19 10:33:57 crc kubenswrapper[4835]: I0319 10:33:57.167020 4835 generic.go:334] "Generic (PLEG): container finished" podID="0c087e72-2b20-4331-9a49-70a62b50387e" containerID="64231767b2bb55c390ac263bba1179032103c91257ec8503650a42b308ba7421" exitCode=0 Mar 19 10:33:57 crc kubenswrapper[4835]: I0319 10:33:57.167296 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8wmwf" event={"ID":"0c087e72-2b20-4331-9a49-70a62b50387e","Type":"ContainerDied","Data":"64231767b2bb55c390ac263bba1179032103c91257ec8503650a42b308ba7421"} Mar 19 10:33:57 crc kubenswrapper[4835]: I0319 10:33:57.167322 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8wmwf" event={"ID":"0c087e72-2b20-4331-9a49-70a62b50387e","Type":"ContainerStarted","Data":"1840c5a598f046efb6b759023050200df1ea03a47014f534f8f4b19510554399"} Mar 19 10:33:59 crc kubenswrapper[4835]: I0319 10:33:59.186652 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8wmwf" event={"ID":"0c087e72-2b20-4331-9a49-70a62b50387e","Type":"ContainerStarted","Data":"6608098a3e414c5c60a28b868776a69e3a4a543339b6e216e5857d2bff9065c8"} Mar 19 10:34:00 crc kubenswrapper[4835]: I0319 10:34:00.181838 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565274-997fb"] Mar 19 10:34:00 crc kubenswrapper[4835]: I0319 10:34:00.186286 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565274-997fb" Mar 19 10:34:00 crc kubenswrapper[4835]: I0319 10:34:00.189411 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 10:34:00 crc kubenswrapper[4835]: I0319 10:34:00.189652 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:34:00 crc kubenswrapper[4835]: I0319 10:34:00.191398 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:34:00 crc kubenswrapper[4835]: I0319 10:34:00.199318 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565274-997fb"] Mar 19 10:34:00 crc kubenswrapper[4835]: I0319 10:34:00.244649 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh979\" (UniqueName: \"kubernetes.io/projected/93c518a4-f4f0-4238-add8-13fb7b7b6b08-kube-api-access-bh979\") pod \"auto-csr-approver-29565274-997fb\" (UID: \"93c518a4-f4f0-4238-add8-13fb7b7b6b08\") " pod="openshift-infra/auto-csr-approver-29565274-997fb" Mar 19 10:34:00 crc kubenswrapper[4835]: I0319 10:34:00.346949 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh979\" (UniqueName: \"kubernetes.io/projected/93c518a4-f4f0-4238-add8-13fb7b7b6b08-kube-api-access-bh979\") pod \"auto-csr-approver-29565274-997fb\" (UID: \"93c518a4-f4f0-4238-add8-13fb7b7b6b08\") " pod="openshift-infra/auto-csr-approver-29565274-997fb" Mar 19 10:34:00 crc kubenswrapper[4835]: I0319 10:34:00.370039 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh979\" (UniqueName: \"kubernetes.io/projected/93c518a4-f4f0-4238-add8-13fb7b7b6b08-kube-api-access-bh979\") pod \"auto-csr-approver-29565274-997fb\" (UID: \"93c518a4-f4f0-4238-add8-13fb7b7b6b08\") " 
pod="openshift-infra/auto-csr-approver-29565274-997fb" Mar 19 10:34:00 crc kubenswrapper[4835]: I0319 10:34:00.521971 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565274-997fb" Mar 19 10:34:01 crc kubenswrapper[4835]: W0319 10:34:01.037857 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93c518a4_f4f0_4238_add8_13fb7b7b6b08.slice/crio-283ee11375855da2cbb2b159dcf5b2d06c53852da9b449826d8fd1b58e71f2eb WatchSource:0}: Error finding container 283ee11375855da2cbb2b159dcf5b2d06c53852da9b449826d8fd1b58e71f2eb: Status 404 returned error can't find the container with id 283ee11375855da2cbb2b159dcf5b2d06c53852da9b449826d8fd1b58e71f2eb Mar 19 10:34:01 crc kubenswrapper[4835]: I0319 10:34:01.039991 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565274-997fb"] Mar 19 10:34:01 crc kubenswrapper[4835]: I0319 10:34:01.216936 4835 generic.go:334] "Generic (PLEG): container finished" podID="0c087e72-2b20-4331-9a49-70a62b50387e" containerID="6608098a3e414c5c60a28b868776a69e3a4a543339b6e216e5857d2bff9065c8" exitCode=0 Mar 19 10:34:01 crc kubenswrapper[4835]: I0319 10:34:01.217084 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8wmwf" event={"ID":"0c087e72-2b20-4331-9a49-70a62b50387e","Type":"ContainerDied","Data":"6608098a3e414c5c60a28b868776a69e3a4a543339b6e216e5857d2bff9065c8"} Mar 19 10:34:01 crc kubenswrapper[4835]: I0319 10:34:01.218480 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565274-997fb" event={"ID":"93c518a4-f4f0-4238-add8-13fb7b7b6b08","Type":"ContainerStarted","Data":"283ee11375855da2cbb2b159dcf5b2d06c53852da9b449826d8fd1b58e71f2eb"} Mar 19 10:34:02 crc kubenswrapper[4835]: I0319 10:34:02.250521 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-8wmwf" event={"ID":"0c087e72-2b20-4331-9a49-70a62b50387e","Type":"ContainerStarted","Data":"1ffebf5a33426282c076f871af7b30a206b21811e9c560ec37a548272d4915bd"} Mar 19 10:34:02 crc kubenswrapper[4835]: I0319 10:34:02.294002 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8wmwf" podStartSLOduration=2.798905188 podStartE2EDuration="7.29397948s" podCreationTimestamp="2026-03-19 10:33:55 +0000 UTC" firstStartedPulling="2026-03-19 10:33:57.169814635 +0000 UTC m=+4292.018413232" lastFinishedPulling="2026-03-19 10:34:01.664888937 +0000 UTC m=+4296.513487524" observedRunningTime="2026-03-19 10:34:02.277989701 +0000 UTC m=+4297.126588298" watchObservedRunningTime="2026-03-19 10:34:02.29397948 +0000 UTC m=+4297.142578067" Mar 19 10:34:03 crc kubenswrapper[4835]: I0319 10:34:03.263442 4835 generic.go:334] "Generic (PLEG): container finished" podID="93c518a4-f4f0-4238-add8-13fb7b7b6b08" containerID="e2472834fbcf2cb21e367e671de40b2cd7c30d7b5caa60d7db218a02d0979136" exitCode=0 Mar 19 10:34:03 crc kubenswrapper[4835]: I0319 10:34:03.264117 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565274-997fb" event={"ID":"93c518a4-f4f0-4238-add8-13fb7b7b6b08","Type":"ContainerDied","Data":"e2472834fbcf2cb21e367e671de40b2cd7c30d7b5caa60d7db218a02d0979136"} Mar 19 10:34:04 crc kubenswrapper[4835]: I0319 10:34:04.828259 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565274-997fb" Mar 19 10:34:04 crc kubenswrapper[4835]: I0319 10:34:04.867372 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bh979\" (UniqueName: \"kubernetes.io/projected/93c518a4-f4f0-4238-add8-13fb7b7b6b08-kube-api-access-bh979\") pod \"93c518a4-f4f0-4238-add8-13fb7b7b6b08\" (UID: \"93c518a4-f4f0-4238-add8-13fb7b7b6b08\") " Mar 19 10:34:04 crc kubenswrapper[4835]: I0319 10:34:04.878978 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93c518a4-f4f0-4238-add8-13fb7b7b6b08-kube-api-access-bh979" (OuterVolumeSpecName: "kube-api-access-bh979") pod "93c518a4-f4f0-4238-add8-13fb7b7b6b08" (UID: "93c518a4-f4f0-4238-add8-13fb7b7b6b08"). InnerVolumeSpecName "kube-api-access-bh979". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:34:04 crc kubenswrapper[4835]: I0319 10:34:04.971318 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bh979\" (UniqueName: \"kubernetes.io/projected/93c518a4-f4f0-4238-add8-13fb7b7b6b08-kube-api-access-bh979\") on node \"crc\" DevicePath \"\"" Mar 19 10:34:05 crc kubenswrapper[4835]: I0319 10:34:05.285894 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565274-997fb" event={"ID":"93c518a4-f4f0-4238-add8-13fb7b7b6b08","Type":"ContainerDied","Data":"283ee11375855da2cbb2b159dcf5b2d06c53852da9b449826d8fd1b58e71f2eb"} Mar 19 10:34:05 crc kubenswrapper[4835]: I0319 10:34:05.286196 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="283ee11375855da2cbb2b159dcf5b2d06c53852da9b449826d8fd1b58e71f2eb" Mar 19 10:34:05 crc kubenswrapper[4835]: I0319 10:34:05.285970 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565274-997fb" Mar 19 10:34:05 crc kubenswrapper[4835]: I0319 10:34:05.905518 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565268-gnmxv"] Mar 19 10:34:05 crc kubenswrapper[4835]: I0319 10:34:05.916058 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565268-gnmxv"] Mar 19 10:34:06 crc kubenswrapper[4835]: I0319 10:34:06.069406 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8wmwf" Mar 19 10:34:06 crc kubenswrapper[4835]: I0319 10:34:06.069463 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8wmwf" Mar 19 10:34:06 crc kubenswrapper[4835]: I0319 10:34:06.114064 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8wmwf" Mar 19 10:34:06 crc kubenswrapper[4835]: I0319 10:34:06.362731 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8wmwf" Mar 19 10:34:06 crc kubenswrapper[4835]: I0319 10:34:06.421948 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e7e82bc-6c5a-4f0c-b426-8090d8919921" path="/var/lib/kubelet/pods/3e7e82bc-6c5a-4f0c-b426-8090d8919921/volumes" Mar 19 10:34:06 crc kubenswrapper[4835]: I0319 10:34:06.422838 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8wmwf"] Mar 19 10:34:08 crc kubenswrapper[4835]: I0319 10:34:08.322655 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8wmwf" podUID="0c087e72-2b20-4331-9a49-70a62b50387e" containerName="registry-server" containerID="cri-o://1ffebf5a33426282c076f871af7b30a206b21811e9c560ec37a548272d4915bd" gracePeriod=2 Mar 19 10:34:09 crc 
kubenswrapper[4835]: I0319 10:34:09.335513 4835 generic.go:334] "Generic (PLEG): container finished" podID="0c087e72-2b20-4331-9a49-70a62b50387e" containerID="1ffebf5a33426282c076f871af7b30a206b21811e9c560ec37a548272d4915bd" exitCode=0 Mar 19 10:34:09 crc kubenswrapper[4835]: I0319 10:34:09.335688 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8wmwf" event={"ID":"0c087e72-2b20-4331-9a49-70a62b50387e","Type":"ContainerDied","Data":"1ffebf5a33426282c076f871af7b30a206b21811e9c560ec37a548272d4915bd"} Mar 19 10:34:09 crc kubenswrapper[4835]: I0319 10:34:09.335925 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8wmwf" event={"ID":"0c087e72-2b20-4331-9a49-70a62b50387e","Type":"ContainerDied","Data":"1840c5a598f046efb6b759023050200df1ea03a47014f534f8f4b19510554399"} Mar 19 10:34:09 crc kubenswrapper[4835]: I0319 10:34:09.335945 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1840c5a598f046efb6b759023050200df1ea03a47014f534f8f4b19510554399" Mar 19 10:34:09 crc kubenswrapper[4835]: I0319 10:34:09.414018 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8wmwf" Mar 19 10:34:09 crc kubenswrapper[4835]: I0319 10:34:09.604060 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4w5cg\" (UniqueName: \"kubernetes.io/projected/0c087e72-2b20-4331-9a49-70a62b50387e-kube-api-access-4w5cg\") pod \"0c087e72-2b20-4331-9a49-70a62b50387e\" (UID: \"0c087e72-2b20-4331-9a49-70a62b50387e\") " Mar 19 10:34:09 crc kubenswrapper[4835]: I0319 10:34:09.605730 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c087e72-2b20-4331-9a49-70a62b50387e-utilities\") pod \"0c087e72-2b20-4331-9a49-70a62b50387e\" (UID: \"0c087e72-2b20-4331-9a49-70a62b50387e\") " Mar 19 10:34:09 crc kubenswrapper[4835]: I0319 10:34:09.606392 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c087e72-2b20-4331-9a49-70a62b50387e-utilities" (OuterVolumeSpecName: "utilities") pod "0c087e72-2b20-4331-9a49-70a62b50387e" (UID: "0c087e72-2b20-4331-9a49-70a62b50387e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:34:09 crc kubenswrapper[4835]: I0319 10:34:09.606842 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c087e72-2b20-4331-9a49-70a62b50387e-catalog-content\") pod \"0c087e72-2b20-4331-9a49-70a62b50387e\" (UID: \"0c087e72-2b20-4331-9a49-70a62b50387e\") " Mar 19 10:34:09 crc kubenswrapper[4835]: I0319 10:34:09.608684 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c087e72-2b20-4331-9a49-70a62b50387e-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 10:34:09 crc kubenswrapper[4835]: I0319 10:34:09.616788 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c087e72-2b20-4331-9a49-70a62b50387e-kube-api-access-4w5cg" (OuterVolumeSpecName: "kube-api-access-4w5cg") pod "0c087e72-2b20-4331-9a49-70a62b50387e" (UID: "0c087e72-2b20-4331-9a49-70a62b50387e"). InnerVolumeSpecName "kube-api-access-4w5cg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:34:09 crc kubenswrapper[4835]: I0319 10:34:09.708998 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c087e72-2b20-4331-9a49-70a62b50387e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c087e72-2b20-4331-9a49-70a62b50387e" (UID: "0c087e72-2b20-4331-9a49-70a62b50387e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:34:09 crc kubenswrapper[4835]: I0319 10:34:09.711178 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c087e72-2b20-4331-9a49-70a62b50387e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 10:34:09 crc kubenswrapper[4835]: I0319 10:34:09.711218 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4w5cg\" (UniqueName: \"kubernetes.io/projected/0c087e72-2b20-4331-9a49-70a62b50387e-kube-api-access-4w5cg\") on node \"crc\" DevicePath \"\"" Mar 19 10:34:10 crc kubenswrapper[4835]: I0319 10:34:10.347216 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8wmwf" Mar 19 10:34:10 crc kubenswrapper[4835]: I0319 10:34:10.417535 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8wmwf"] Mar 19 10:34:10 crc kubenswrapper[4835]: I0319 10:34:10.424218 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8wmwf"] Mar 19 10:34:12 crc kubenswrapper[4835]: I0319 10:34:12.418305 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c087e72-2b20-4331-9a49-70a62b50387e" path="/var/lib/kubelet/pods/0c087e72-2b20-4331-9a49-70a62b50387e/volumes" Mar 19 10:34:36 crc kubenswrapper[4835]: I0319 10:34:36.423053 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:34:36 crc kubenswrapper[4835]: I0319 10:34:36.424642 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:34:58 crc kubenswrapper[4835]: I0319 10:34:58.729298 4835 scope.go:117] "RemoveContainer" containerID="8e6a00c88b669b34e7271046301d05bb438124a0b4f6125ddea11732bd9c2f86" Mar 19 10:35:06 crc kubenswrapper[4835]: I0319 10:35:06.422851 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:35:06 crc kubenswrapper[4835]: I0319 10:35:06.423449 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:35:36 crc kubenswrapper[4835]: I0319 10:35:36.422252 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:35:36 crc kubenswrapper[4835]: I0319 10:35:36.422906 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:35:36 crc kubenswrapper[4835]: I0319 10:35:36.422951 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-bk84k" Mar 19 10:35:36 crc kubenswrapper[4835]: I0319 10:35:36.423471 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3359a5103ce1a19d3b0272d49b7320107bc1e5fdca65b50c1858632d55b7dee9"} pod="openshift-machine-config-operator/machine-config-daemon-bk84k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 10:35:36 crc kubenswrapper[4835]: I0319 10:35:36.423523 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" containerID="cri-o://3359a5103ce1a19d3b0272d49b7320107bc1e5fdca65b50c1858632d55b7dee9" gracePeriod=600 Mar 19 10:35:36 crc kubenswrapper[4835]: E0319 10:35:36.546008 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:35:37 crc kubenswrapper[4835]: I0319 10:35:37.314117 4835 generic.go:334] "Generic (PLEG): container finished" podID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerID="3359a5103ce1a19d3b0272d49b7320107bc1e5fdca65b50c1858632d55b7dee9" exitCode=0 Mar 19 10:35:37 crc kubenswrapper[4835]: I0319 10:35:37.314225 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerDied","Data":"3359a5103ce1a19d3b0272d49b7320107bc1e5fdca65b50c1858632d55b7dee9"} Mar 19 10:35:37 crc 
kubenswrapper[4835]: I0319 10:35:37.314508 4835 scope.go:117] "RemoveContainer" containerID="9ecf131f16a5e6b0f5547bfdefea409692cba49068da1b192acca5d89a47395b" Mar 19 10:35:37 crc kubenswrapper[4835]: I0319 10:35:37.316496 4835 scope.go:117] "RemoveContainer" containerID="3359a5103ce1a19d3b0272d49b7320107bc1e5fdca65b50c1858632d55b7dee9" Mar 19 10:35:37 crc kubenswrapper[4835]: E0319 10:35:37.319806 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:35:49 crc kubenswrapper[4835]: I0319 10:35:49.403116 4835 scope.go:117] "RemoveContainer" containerID="3359a5103ce1a19d3b0272d49b7320107bc1e5fdca65b50c1858632d55b7dee9" Mar 19 10:35:49 crc kubenswrapper[4835]: E0319 10:35:49.404527 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:36:00 crc kubenswrapper[4835]: I0319 10:36:00.143306 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565276-c962s"] Mar 19 10:36:00 crc kubenswrapper[4835]: E0319 10:36:00.144482 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c087e72-2b20-4331-9a49-70a62b50387e" containerName="extract-content" Mar 19 10:36:00 crc kubenswrapper[4835]: I0319 10:36:00.144497 4835 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0c087e72-2b20-4331-9a49-70a62b50387e" containerName="extract-content" Mar 19 10:36:00 crc kubenswrapper[4835]: E0319 10:36:00.144522 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c087e72-2b20-4331-9a49-70a62b50387e" containerName="registry-server" Mar 19 10:36:00 crc kubenswrapper[4835]: I0319 10:36:00.144528 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c087e72-2b20-4331-9a49-70a62b50387e" containerName="registry-server" Mar 19 10:36:00 crc kubenswrapper[4835]: E0319 10:36:00.144555 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93c518a4-f4f0-4238-add8-13fb7b7b6b08" containerName="oc" Mar 19 10:36:00 crc kubenswrapper[4835]: I0319 10:36:00.144563 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="93c518a4-f4f0-4238-add8-13fb7b7b6b08" containerName="oc" Mar 19 10:36:00 crc kubenswrapper[4835]: E0319 10:36:00.144587 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c087e72-2b20-4331-9a49-70a62b50387e" containerName="extract-utilities" Mar 19 10:36:00 crc kubenswrapper[4835]: I0319 10:36:00.144593 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c087e72-2b20-4331-9a49-70a62b50387e" containerName="extract-utilities" Mar 19 10:36:00 crc kubenswrapper[4835]: I0319 10:36:00.144860 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="93c518a4-f4f0-4238-add8-13fb7b7b6b08" containerName="oc" Mar 19 10:36:00 crc kubenswrapper[4835]: I0319 10:36:00.144911 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c087e72-2b20-4331-9a49-70a62b50387e" containerName="registry-server" Mar 19 10:36:00 crc kubenswrapper[4835]: I0319 10:36:00.145840 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565276-c962s" Mar 19 10:36:00 crc kubenswrapper[4835]: I0319 10:36:00.148207 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 10:36:00 crc kubenswrapper[4835]: I0319 10:36:00.150078 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:36:00 crc kubenswrapper[4835]: I0319 10:36:00.153283 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:36:00 crc kubenswrapper[4835]: I0319 10:36:00.157501 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565276-c962s"] Mar 19 10:36:00 crc kubenswrapper[4835]: I0319 10:36:00.263656 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clwxm\" (UniqueName: \"kubernetes.io/projected/c0f42988-08ec-4ebe-941f-c1984a620a40-kube-api-access-clwxm\") pod \"auto-csr-approver-29565276-c962s\" (UID: \"c0f42988-08ec-4ebe-941f-c1984a620a40\") " pod="openshift-infra/auto-csr-approver-29565276-c962s" Mar 19 10:36:00 crc kubenswrapper[4835]: I0319 10:36:00.366098 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clwxm\" (UniqueName: \"kubernetes.io/projected/c0f42988-08ec-4ebe-941f-c1984a620a40-kube-api-access-clwxm\") pod \"auto-csr-approver-29565276-c962s\" (UID: \"c0f42988-08ec-4ebe-941f-c1984a620a40\") " pod="openshift-infra/auto-csr-approver-29565276-c962s" Mar 19 10:36:01 crc kubenswrapper[4835]: I0319 10:36:01.079250 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clwxm\" (UniqueName: \"kubernetes.io/projected/c0f42988-08ec-4ebe-941f-c1984a620a40-kube-api-access-clwxm\") pod \"auto-csr-approver-29565276-c962s\" (UID: \"c0f42988-08ec-4ebe-941f-c1984a620a40\") " 
pod="openshift-infra/auto-csr-approver-29565276-c962s" Mar 19 10:36:01 crc kubenswrapper[4835]: I0319 10:36:01.367924 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565276-c962s" Mar 19 10:36:02 crc kubenswrapper[4835]: I0319 10:36:02.135456 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565276-c962s"] Mar 19 10:36:02 crc kubenswrapper[4835]: I0319 10:36:02.599303 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565276-c962s" event={"ID":"c0f42988-08ec-4ebe-941f-c1984a620a40","Type":"ContainerStarted","Data":"b91a9156f5d1d9a75a44e5f89f065b83f02d4aa8fa158c919405f621a820c38d"} Mar 19 10:36:03 crc kubenswrapper[4835]: I0319 10:36:03.612696 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565276-c962s" event={"ID":"c0f42988-08ec-4ebe-941f-c1984a620a40","Type":"ContainerStarted","Data":"1346ffcb5f9d90af9c6a1c3eed1b17581df6108357cb93f2423f2e8665e87bdc"} Mar 19 10:36:03 crc kubenswrapper[4835]: I0319 10:36:03.640622 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565276-c962s" podStartSLOduration=2.67592302 podStartE2EDuration="3.640598795s" podCreationTimestamp="2026-03-19 10:36:00 +0000 UTC" firstStartedPulling="2026-03-19 10:36:02.139678485 +0000 UTC m=+4416.988277072" lastFinishedPulling="2026-03-19 10:36:03.10435426 +0000 UTC m=+4417.952952847" observedRunningTime="2026-03-19 10:36:03.636065951 +0000 UTC m=+4418.484664538" watchObservedRunningTime="2026-03-19 10:36:03.640598795 +0000 UTC m=+4418.489197392" Mar 19 10:36:04 crc kubenswrapper[4835]: I0319 10:36:04.402727 4835 scope.go:117] "RemoveContainer" containerID="3359a5103ce1a19d3b0272d49b7320107bc1e5fdca65b50c1858632d55b7dee9" Mar 19 10:36:04 crc kubenswrapper[4835]: E0319 10:36:04.403444 4835 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:36:04 crc kubenswrapper[4835]: I0319 10:36:04.624327 4835 generic.go:334] "Generic (PLEG): container finished" podID="c0f42988-08ec-4ebe-941f-c1984a620a40" containerID="1346ffcb5f9d90af9c6a1c3eed1b17581df6108357cb93f2423f2e8665e87bdc" exitCode=0 Mar 19 10:36:04 crc kubenswrapper[4835]: I0319 10:36:04.625120 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565276-c962s" event={"ID":"c0f42988-08ec-4ebe-941f-c1984a620a40","Type":"ContainerDied","Data":"1346ffcb5f9d90af9c6a1c3eed1b17581df6108357cb93f2423f2e8665e87bdc"} Mar 19 10:36:06 crc kubenswrapper[4835]: I0319 10:36:06.066067 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565276-c962s" Mar 19 10:36:06 crc kubenswrapper[4835]: I0319 10:36:06.193313 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clwxm\" (UniqueName: \"kubernetes.io/projected/c0f42988-08ec-4ebe-941f-c1984a620a40-kube-api-access-clwxm\") pod \"c0f42988-08ec-4ebe-941f-c1984a620a40\" (UID: \"c0f42988-08ec-4ebe-941f-c1984a620a40\") " Mar 19 10:36:06 crc kubenswrapper[4835]: I0319 10:36:06.206984 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0f42988-08ec-4ebe-941f-c1984a620a40-kube-api-access-clwxm" (OuterVolumeSpecName: "kube-api-access-clwxm") pod "c0f42988-08ec-4ebe-941f-c1984a620a40" (UID: "c0f42988-08ec-4ebe-941f-c1984a620a40"). InnerVolumeSpecName "kube-api-access-clwxm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:36:06 crc kubenswrapper[4835]: I0319 10:36:06.297352 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clwxm\" (UniqueName: \"kubernetes.io/projected/c0f42988-08ec-4ebe-941f-c1984a620a40-kube-api-access-clwxm\") on node \"crc\" DevicePath \"\"" Mar 19 10:36:06 crc kubenswrapper[4835]: I0319 10:36:06.648950 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565276-c962s" event={"ID":"c0f42988-08ec-4ebe-941f-c1984a620a40","Type":"ContainerDied","Data":"b91a9156f5d1d9a75a44e5f89f065b83f02d4aa8fa158c919405f621a820c38d"} Mar 19 10:36:06 crc kubenswrapper[4835]: I0319 10:36:06.648991 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565276-c962s" Mar 19 10:36:06 crc kubenswrapper[4835]: I0319 10:36:06.649011 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b91a9156f5d1d9a75a44e5f89f065b83f02d4aa8fa158c919405f621a820c38d" Mar 19 10:36:06 crc kubenswrapper[4835]: I0319 10:36:06.714975 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565270-pqf2p"] Mar 19 10:36:06 crc kubenswrapper[4835]: I0319 10:36:06.727400 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565270-pqf2p"] Mar 19 10:36:08 crc kubenswrapper[4835]: I0319 10:36:08.423532 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34eb65f3-516f-4070-8542-fd87fe4ca728" path="/var/lib/kubelet/pods/34eb65f3-516f-4070-8542-fd87fe4ca728/volumes" Mar 19 10:36:17 crc kubenswrapper[4835]: I0319 10:36:17.402351 4835 scope.go:117] "RemoveContainer" containerID="3359a5103ce1a19d3b0272d49b7320107bc1e5fdca65b50c1858632d55b7dee9" Mar 19 10:36:17 crc kubenswrapper[4835]: E0319 10:36:17.403652 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:36:32 crc kubenswrapper[4835]: I0319 10:36:32.408320 4835 scope.go:117] "RemoveContainer" containerID="3359a5103ce1a19d3b0272d49b7320107bc1e5fdca65b50c1858632d55b7dee9" Mar 19 10:36:32 crc kubenswrapper[4835]: E0319 10:36:32.409414 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:36:44 crc kubenswrapper[4835]: I0319 10:36:44.402990 4835 scope.go:117] "RemoveContainer" containerID="3359a5103ce1a19d3b0272d49b7320107bc1e5fdca65b50c1858632d55b7dee9" Mar 19 10:36:44 crc kubenswrapper[4835]: E0319 10:36:44.405055 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:36:58 crc kubenswrapper[4835]: I0319 10:36:58.849428 4835 scope.go:117] "RemoveContainer" containerID="d844383a75e8d5b702ab268ac832bd4b96af94d98933639a0089ac8233b9582b" Mar 19 10:36:59 crc kubenswrapper[4835]: I0319 10:36:59.402819 4835 scope.go:117] "RemoveContainer" 
containerID="3359a5103ce1a19d3b0272d49b7320107bc1e5fdca65b50c1858632d55b7dee9" Mar 19 10:36:59 crc kubenswrapper[4835]: E0319 10:36:59.404985 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:37:13 crc kubenswrapper[4835]: I0319 10:37:13.402557 4835 scope.go:117] "RemoveContainer" containerID="3359a5103ce1a19d3b0272d49b7320107bc1e5fdca65b50c1858632d55b7dee9" Mar 19 10:37:13 crc kubenswrapper[4835]: E0319 10:37:13.403729 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:37:25 crc kubenswrapper[4835]: I0319 10:37:25.402824 4835 scope.go:117] "RemoveContainer" containerID="3359a5103ce1a19d3b0272d49b7320107bc1e5fdca65b50c1858632d55b7dee9" Mar 19 10:37:25 crc kubenswrapper[4835]: E0319 10:37:25.403721 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:37:37 crc kubenswrapper[4835]: I0319 10:37:37.402364 4835 scope.go:117] 
"RemoveContainer" containerID="3359a5103ce1a19d3b0272d49b7320107bc1e5fdca65b50c1858632d55b7dee9" Mar 19 10:37:37 crc kubenswrapper[4835]: E0319 10:37:37.403671 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:37:52 crc kubenswrapper[4835]: I0319 10:37:52.404162 4835 scope.go:117] "RemoveContainer" containerID="3359a5103ce1a19d3b0272d49b7320107bc1e5fdca65b50c1858632d55b7dee9" Mar 19 10:37:52 crc kubenswrapper[4835]: E0319 10:37:52.406668 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:38:00 crc kubenswrapper[4835]: I0319 10:38:00.155140 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565278-v2nnh"] Mar 19 10:38:00 crc kubenswrapper[4835]: E0319 10:38:00.156322 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0f42988-08ec-4ebe-941f-c1984a620a40" containerName="oc" Mar 19 10:38:00 crc kubenswrapper[4835]: I0319 10:38:00.156340 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0f42988-08ec-4ebe-941f-c1984a620a40" containerName="oc" Mar 19 10:38:00 crc kubenswrapper[4835]: I0319 10:38:00.156689 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0f42988-08ec-4ebe-941f-c1984a620a40" containerName="oc" Mar 
19 10:38:00 crc kubenswrapper[4835]: I0319 10:38:00.157932 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565278-v2nnh" Mar 19 10:38:00 crc kubenswrapper[4835]: I0319 10:38:00.160356 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 10:38:00 crc kubenswrapper[4835]: I0319 10:38:00.160637 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:38:00 crc kubenswrapper[4835]: I0319 10:38:00.161583 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:38:00 crc kubenswrapper[4835]: I0319 10:38:00.178395 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565278-v2nnh"] Mar 19 10:38:00 crc kubenswrapper[4835]: I0319 10:38:00.190522 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v87bt\" (UniqueName: \"kubernetes.io/projected/5bd41c8b-8f1f-4707-9ecd-dcac069b8601-kube-api-access-v87bt\") pod \"auto-csr-approver-29565278-v2nnh\" (UID: \"5bd41c8b-8f1f-4707-9ecd-dcac069b8601\") " pod="openshift-infra/auto-csr-approver-29565278-v2nnh" Mar 19 10:38:00 crc kubenswrapper[4835]: I0319 10:38:00.293136 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v87bt\" (UniqueName: \"kubernetes.io/projected/5bd41c8b-8f1f-4707-9ecd-dcac069b8601-kube-api-access-v87bt\") pod \"auto-csr-approver-29565278-v2nnh\" (UID: \"5bd41c8b-8f1f-4707-9ecd-dcac069b8601\") " pod="openshift-infra/auto-csr-approver-29565278-v2nnh" Mar 19 10:38:00 crc kubenswrapper[4835]: I0319 10:38:00.314910 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v87bt\" (UniqueName: 
\"kubernetes.io/projected/5bd41c8b-8f1f-4707-9ecd-dcac069b8601-kube-api-access-v87bt\") pod \"auto-csr-approver-29565278-v2nnh\" (UID: \"5bd41c8b-8f1f-4707-9ecd-dcac069b8601\") " pod="openshift-infra/auto-csr-approver-29565278-v2nnh" Mar 19 10:38:00 crc kubenswrapper[4835]: I0319 10:38:00.480657 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565278-v2nnh" Mar 19 10:38:00 crc kubenswrapper[4835]: I0319 10:38:00.969821 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 10:38:00 crc kubenswrapper[4835]: I0319 10:38:00.974265 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565278-v2nnh"] Mar 19 10:38:01 crc kubenswrapper[4835]: I0319 10:38:01.962306 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565278-v2nnh" event={"ID":"5bd41c8b-8f1f-4707-9ecd-dcac069b8601","Type":"ContainerStarted","Data":"a4d676e6028bd6d939d51a069f5c7023b5dfd10be0e6394bcd0c6097ad643568"} Mar 19 10:38:02 crc kubenswrapper[4835]: I0319 10:38:02.973480 4835 generic.go:334] "Generic (PLEG): container finished" podID="5bd41c8b-8f1f-4707-9ecd-dcac069b8601" containerID="ab9588b05466d098683c827857f9fb54a571c1b570ae5065737b693429dd9731" exitCode=0 Mar 19 10:38:02 crc kubenswrapper[4835]: I0319 10:38:02.973933 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565278-v2nnh" event={"ID":"5bd41c8b-8f1f-4707-9ecd-dcac069b8601","Type":"ContainerDied","Data":"ab9588b05466d098683c827857f9fb54a571c1b570ae5065737b693429dd9731"} Mar 19 10:38:04 crc kubenswrapper[4835]: I0319 10:38:04.714608 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565278-v2nnh" Mar 19 10:38:04 crc kubenswrapper[4835]: I0319 10:38:04.902275 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v87bt\" (UniqueName: \"kubernetes.io/projected/5bd41c8b-8f1f-4707-9ecd-dcac069b8601-kube-api-access-v87bt\") pod \"5bd41c8b-8f1f-4707-9ecd-dcac069b8601\" (UID: \"5bd41c8b-8f1f-4707-9ecd-dcac069b8601\") " Mar 19 10:38:04 crc kubenswrapper[4835]: I0319 10:38:04.909081 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bd41c8b-8f1f-4707-9ecd-dcac069b8601-kube-api-access-v87bt" (OuterVolumeSpecName: "kube-api-access-v87bt") pod "5bd41c8b-8f1f-4707-9ecd-dcac069b8601" (UID: "5bd41c8b-8f1f-4707-9ecd-dcac069b8601"). InnerVolumeSpecName "kube-api-access-v87bt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:38:04 crc kubenswrapper[4835]: I0319 10:38:04.995196 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565278-v2nnh" event={"ID":"5bd41c8b-8f1f-4707-9ecd-dcac069b8601","Type":"ContainerDied","Data":"a4d676e6028bd6d939d51a069f5c7023b5dfd10be0e6394bcd0c6097ad643568"} Mar 19 10:38:04 crc kubenswrapper[4835]: I0319 10:38:04.995238 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4d676e6028bd6d939d51a069f5c7023b5dfd10be0e6394bcd0c6097ad643568" Mar 19 10:38:04 crc kubenswrapper[4835]: I0319 10:38:04.995271 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565278-v2nnh" Mar 19 10:38:05 crc kubenswrapper[4835]: I0319 10:38:05.006516 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v87bt\" (UniqueName: \"kubernetes.io/projected/5bd41c8b-8f1f-4707-9ecd-dcac069b8601-kube-api-access-v87bt\") on node \"crc\" DevicePath \"\"" Mar 19 10:38:05 crc kubenswrapper[4835]: I0319 10:38:05.402616 4835 scope.go:117] "RemoveContainer" containerID="3359a5103ce1a19d3b0272d49b7320107bc1e5fdca65b50c1858632d55b7dee9" Mar 19 10:38:05 crc kubenswrapper[4835]: E0319 10:38:05.403412 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:38:05 crc kubenswrapper[4835]: I0319 10:38:05.816326 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565272-vzz5b"] Mar 19 10:38:05 crc kubenswrapper[4835]: I0319 10:38:05.827070 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565272-vzz5b"] Mar 19 10:38:06 crc kubenswrapper[4835]: I0319 10:38:06.419881 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="106b50cd-9332-49e1-8217-678440cb916b" path="/var/lib/kubelet/pods/106b50cd-9332-49e1-8217-678440cb916b/volumes" Mar 19 10:38:19 crc kubenswrapper[4835]: I0319 10:38:19.401676 4835 scope.go:117] "RemoveContainer" containerID="3359a5103ce1a19d3b0272d49b7320107bc1e5fdca65b50c1858632d55b7dee9" Mar 19 10:38:19 crc kubenswrapper[4835]: E0319 10:38:19.402466 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:38:32 crc kubenswrapper[4835]: I0319 10:38:32.402927 4835 scope.go:117] "RemoveContainer" containerID="3359a5103ce1a19d3b0272d49b7320107bc1e5fdca65b50c1858632d55b7dee9" Mar 19 10:38:32 crc kubenswrapper[4835]: E0319 10:38:32.403797 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:38:45 crc kubenswrapper[4835]: I0319 10:38:45.402058 4835 scope.go:117] "RemoveContainer" containerID="3359a5103ce1a19d3b0272d49b7320107bc1e5fdca65b50c1858632d55b7dee9" Mar 19 10:38:45 crc kubenswrapper[4835]: E0319 10:38:45.402779 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:38:59 crc kubenswrapper[4835]: I0319 10:38:59.000151 4835 scope.go:117] "RemoveContainer" containerID="2053b44b5b9170e360b99ee9b9c8410318f6b2d2ccf21c6b4b6eb2e8339fb737" Mar 19 10:39:00 crc kubenswrapper[4835]: I0319 10:39:00.403058 4835 scope.go:117] "RemoveContainer" 
containerID="3359a5103ce1a19d3b0272d49b7320107bc1e5fdca65b50c1858632d55b7dee9" Mar 19 10:39:00 crc kubenswrapper[4835]: E0319 10:39:00.403962 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:39:14 crc kubenswrapper[4835]: I0319 10:39:14.403348 4835 scope.go:117] "RemoveContainer" containerID="3359a5103ce1a19d3b0272d49b7320107bc1e5fdca65b50c1858632d55b7dee9" Mar 19 10:39:14 crc kubenswrapper[4835]: E0319 10:39:14.404336 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:39:27 crc kubenswrapper[4835]: I0319 10:39:27.401701 4835 scope.go:117] "RemoveContainer" containerID="3359a5103ce1a19d3b0272d49b7320107bc1e5fdca65b50c1858632d55b7dee9" Mar 19 10:39:27 crc kubenswrapper[4835]: E0319 10:39:27.402420 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:39:32 crc kubenswrapper[4835]: I0319 10:39:32.212231 4835 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k8j84"] Mar 19 10:39:32 crc kubenswrapper[4835]: E0319 10:39:32.213368 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bd41c8b-8f1f-4707-9ecd-dcac069b8601" containerName="oc" Mar 19 10:39:32 crc kubenswrapper[4835]: I0319 10:39:32.213383 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bd41c8b-8f1f-4707-9ecd-dcac069b8601" containerName="oc" Mar 19 10:39:32 crc kubenswrapper[4835]: I0319 10:39:32.213701 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bd41c8b-8f1f-4707-9ecd-dcac069b8601" containerName="oc" Mar 19 10:39:32 crc kubenswrapper[4835]: I0319 10:39:32.215857 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k8j84" Mar 19 10:39:32 crc kubenswrapper[4835]: I0319 10:39:32.236645 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k8j84"] Mar 19 10:39:32 crc kubenswrapper[4835]: I0319 10:39:32.269793 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzzjt\" (UniqueName: \"kubernetes.io/projected/37b41589-6db2-46c8-b957-8be6bf5ba11f-kube-api-access-vzzjt\") pod \"community-operators-k8j84\" (UID: \"37b41589-6db2-46c8-b957-8be6bf5ba11f\") " pod="openshift-marketplace/community-operators-k8j84" Mar 19 10:39:32 crc kubenswrapper[4835]: I0319 10:39:32.269881 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37b41589-6db2-46c8-b957-8be6bf5ba11f-utilities\") pod \"community-operators-k8j84\" (UID: \"37b41589-6db2-46c8-b957-8be6bf5ba11f\") " pod="openshift-marketplace/community-operators-k8j84" Mar 19 10:39:32 crc kubenswrapper[4835]: I0319 10:39:32.270042 4835 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37b41589-6db2-46c8-b957-8be6bf5ba11f-catalog-content\") pod \"community-operators-k8j84\" (UID: \"37b41589-6db2-46c8-b957-8be6bf5ba11f\") " pod="openshift-marketplace/community-operators-k8j84" Mar 19 10:39:32 crc kubenswrapper[4835]: I0319 10:39:32.372485 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37b41589-6db2-46c8-b957-8be6bf5ba11f-utilities\") pod \"community-operators-k8j84\" (UID: \"37b41589-6db2-46c8-b957-8be6bf5ba11f\") " pod="openshift-marketplace/community-operators-k8j84" Mar 19 10:39:32 crc kubenswrapper[4835]: I0319 10:39:32.372959 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37b41589-6db2-46c8-b957-8be6bf5ba11f-catalog-content\") pod \"community-operators-k8j84\" (UID: \"37b41589-6db2-46c8-b957-8be6bf5ba11f\") " pod="openshift-marketplace/community-operators-k8j84" Mar 19 10:39:32 crc kubenswrapper[4835]: I0319 10:39:32.373066 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37b41589-6db2-46c8-b957-8be6bf5ba11f-utilities\") pod \"community-operators-k8j84\" (UID: \"37b41589-6db2-46c8-b957-8be6bf5ba11f\") " pod="openshift-marketplace/community-operators-k8j84" Mar 19 10:39:32 crc kubenswrapper[4835]: I0319 10:39:32.373214 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzzjt\" (UniqueName: \"kubernetes.io/projected/37b41589-6db2-46c8-b957-8be6bf5ba11f-kube-api-access-vzzjt\") pod \"community-operators-k8j84\" (UID: \"37b41589-6db2-46c8-b957-8be6bf5ba11f\") " pod="openshift-marketplace/community-operators-k8j84" Mar 19 10:39:32 crc kubenswrapper[4835]: I0319 10:39:32.373432 4835 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37b41589-6db2-46c8-b957-8be6bf5ba11f-catalog-content\") pod \"community-operators-k8j84\" (UID: \"37b41589-6db2-46c8-b957-8be6bf5ba11f\") " pod="openshift-marketplace/community-operators-k8j84" Mar 19 10:39:32 crc kubenswrapper[4835]: I0319 10:39:32.407872 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzzjt\" (UniqueName: \"kubernetes.io/projected/37b41589-6db2-46c8-b957-8be6bf5ba11f-kube-api-access-vzzjt\") pod \"community-operators-k8j84\" (UID: \"37b41589-6db2-46c8-b957-8be6bf5ba11f\") " pod="openshift-marketplace/community-operators-k8j84" Mar 19 10:39:32 crc kubenswrapper[4835]: I0319 10:39:32.541432 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k8j84" Mar 19 10:39:33 crc kubenswrapper[4835]: I0319 10:39:33.155110 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k8j84"] Mar 19 10:39:34 crc kubenswrapper[4835]: I0319 10:39:34.081780 4835 generic.go:334] "Generic (PLEG): container finished" podID="37b41589-6db2-46c8-b957-8be6bf5ba11f" containerID="298455861df274251d288ee3c9c50c931be2f9ed4ee38afa1fe336342ee1514a" exitCode=0 Mar 19 10:39:34 crc kubenswrapper[4835]: I0319 10:39:34.081827 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8j84" event={"ID":"37b41589-6db2-46c8-b957-8be6bf5ba11f","Type":"ContainerDied","Data":"298455861df274251d288ee3c9c50c931be2f9ed4ee38afa1fe336342ee1514a"} Mar 19 10:39:34 crc kubenswrapper[4835]: I0319 10:39:34.082054 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8j84" event={"ID":"37b41589-6db2-46c8-b957-8be6bf5ba11f","Type":"ContainerStarted","Data":"5a49b83334c5f7e8aeec03e3eddde2c61026b1837fc5568d07b445810b121994"} Mar 19 10:39:35 crc 
kubenswrapper[4835]: I0319 10:39:35.099257 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8j84" event={"ID":"37b41589-6db2-46c8-b957-8be6bf5ba11f","Type":"ContainerStarted","Data":"f0b6ba0b7615bd0e0675cac8810f81a031fb2768ffc5135ef0199580b993b6bb"} Mar 19 10:39:35 crc kubenswrapper[4835]: I0319 10:39:35.279913 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 19 10:39:35 crc kubenswrapper[4835]: I0319 10:39:35.283075 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 19 10:39:35 crc kubenswrapper[4835]: I0319 10:39:35.285834 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 19 10:39:35 crc kubenswrapper[4835]: I0319 10:39:35.285976 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 19 10:39:35 crc kubenswrapper[4835]: I0319 10:39:35.285975 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-n8mkp" Mar 19 10:39:35 crc kubenswrapper[4835]: I0319 10:39:35.286896 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 19 10:39:35 crc kubenswrapper[4835]: I0319 10:39:35.299882 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 19 10:39:35 crc kubenswrapper[4835]: I0319 10:39:35.355151 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9262f37d-2193-46c6-9706-5d039fa94926-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") " pod="openstack/tempest-tests-tempest" Mar 19 10:39:35 crc kubenswrapper[4835]: I0319 10:39:35.355285 4835 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9262f37d-2193-46c6-9706-5d039fa94926-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") " pod="openstack/tempest-tests-tempest" Mar 19 10:39:35 crc kubenswrapper[4835]: I0319 10:39:35.355327 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9262f37d-2193-46c6-9706-5d039fa94926-config-data\") pod \"tempest-tests-tempest\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") " pod="openstack/tempest-tests-tempest" Mar 19 10:39:35 crc kubenswrapper[4835]: I0319 10:39:35.355478 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") " pod="openstack/tempest-tests-tempest" Mar 19 10:39:35 crc kubenswrapper[4835]: I0319 10:39:35.355624 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9262f37d-2193-46c6-9706-5d039fa94926-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") " pod="openstack/tempest-tests-tempest" Mar 19 10:39:35 crc kubenswrapper[4835]: I0319 10:39:35.458865 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9262f37d-2193-46c6-9706-5d039fa94926-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") " pod="openstack/tempest-tests-tempest" Mar 19 10:39:35 crc kubenswrapper[4835]: I0319 10:39:35.458946 4835 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") " pod="openstack/tempest-tests-tempest" Mar 19 10:39:35 crc kubenswrapper[4835]: I0319 10:39:35.459022 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nphdq\" (UniqueName: \"kubernetes.io/projected/9262f37d-2193-46c6-9706-5d039fa94926-kube-api-access-nphdq\") pod \"tempest-tests-tempest\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") " pod="openstack/tempest-tests-tempest" Mar 19 10:39:35 crc kubenswrapper[4835]: I0319 10:39:35.459140 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9262f37d-2193-46c6-9706-5d039fa94926-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") " pod="openstack/tempest-tests-tempest" Mar 19 10:39:35 crc kubenswrapper[4835]: I0319 10:39:35.459210 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9262f37d-2193-46c6-9706-5d039fa94926-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") " pod="openstack/tempest-tests-tempest" Mar 19 10:39:35 crc kubenswrapper[4835]: I0319 10:39:35.459239 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9262f37d-2193-46c6-9706-5d039fa94926-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") " pod="openstack/tempest-tests-tempest" Mar 19 10:39:35 crc kubenswrapper[4835]: I0319 10:39:35.459267 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9262f37d-2193-46c6-9706-5d039fa94926-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") " pod="openstack/tempest-tests-tempest" Mar 19 10:39:35 crc kubenswrapper[4835]: I0319 10:39:35.459342 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9262f37d-2193-46c6-9706-5d039fa94926-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") " pod="openstack/tempest-tests-tempest" Mar 19 10:39:35 crc kubenswrapper[4835]: I0319 10:39:35.459369 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9262f37d-2193-46c6-9706-5d039fa94926-config-data\") pod \"tempest-tests-tempest\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") " pod="openstack/tempest-tests-tempest" Mar 19 10:39:35 crc kubenswrapper[4835]: I0319 10:39:35.473292 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/tempest-tests-tempest" Mar 19 10:39:35 crc kubenswrapper[4835]: I0319 10:39:35.480695 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9262f37d-2193-46c6-9706-5d039fa94926-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") " pod="openstack/tempest-tests-tempest" Mar 19 10:39:35 crc kubenswrapper[4835]: I0319 10:39:35.498598 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/9262f37d-2193-46c6-9706-5d039fa94926-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") " pod="openstack/tempest-tests-tempest" Mar 19 10:39:35 crc kubenswrapper[4835]: I0319 10:39:35.519430 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9262f37d-2193-46c6-9706-5d039fa94926-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") " pod="openstack/tempest-tests-tempest" Mar 19 10:39:35 crc kubenswrapper[4835]: I0319 10:39:35.519701 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9262f37d-2193-46c6-9706-5d039fa94926-config-data\") pod \"tempest-tests-tempest\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") " pod="openstack/tempest-tests-tempest" Mar 19 10:39:35 crc kubenswrapper[4835]: I0319 10:39:35.552266 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") " pod="openstack/tempest-tests-tempest" Mar 19 10:39:35 crc kubenswrapper[4835]: I0319 10:39:35.562574 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nphdq\" (UniqueName: \"kubernetes.io/projected/9262f37d-2193-46c6-9706-5d039fa94926-kube-api-access-nphdq\") pod \"tempest-tests-tempest\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") " pod="openstack/tempest-tests-tempest" Mar 19 10:39:35 crc kubenswrapper[4835]: I0319 10:39:35.562707 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9262f37d-2193-46c6-9706-5d039fa94926-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") " 
pod="openstack/tempest-tests-tempest" Mar 19 10:39:35 crc kubenswrapper[4835]: I0319 10:39:35.562824 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9262f37d-2193-46c6-9706-5d039fa94926-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") " pod="openstack/tempest-tests-tempest" Mar 19 10:39:35 crc kubenswrapper[4835]: I0319 10:39:35.563017 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9262f37d-2193-46c6-9706-5d039fa94926-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") " pod="openstack/tempest-tests-tempest" Mar 19 10:39:35 crc kubenswrapper[4835]: I0319 10:39:35.566931 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9262f37d-2193-46c6-9706-5d039fa94926-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") " pod="openstack/tempest-tests-tempest" Mar 19 10:39:35 crc kubenswrapper[4835]: I0319 10:39:35.570213 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9262f37d-2193-46c6-9706-5d039fa94926-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") " pod="openstack/tempest-tests-tempest" Mar 19 10:39:35 crc kubenswrapper[4835]: I0319 10:39:35.570848 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9262f37d-2193-46c6-9706-5d039fa94926-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") " pod="openstack/tempest-tests-tempest" Mar 19 10:39:35 crc kubenswrapper[4835]: I0319 10:39:35.596068 4835 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nphdq\" (UniqueName: \"kubernetes.io/projected/9262f37d-2193-46c6-9706-5d039fa94926-kube-api-access-nphdq\") pod \"tempest-tests-tempest\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") " pod="openstack/tempest-tests-tempest" Mar 19 10:39:35 crc kubenswrapper[4835]: I0319 10:39:35.646596 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 19 10:39:36 crc kubenswrapper[4835]: I0319 10:39:36.238796 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 19 10:39:36 crc kubenswrapper[4835]: W0319 10:39:36.583858 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9262f37d_2193_46c6_9706_5d039fa94926.slice/crio-e6e2202c0af79e048e9fa33710377eda2ce94a0533b5e792fc77222084f71756 WatchSource:0}: Error finding container e6e2202c0af79e048e9fa33710377eda2ce94a0533b5e792fc77222084f71756: Status 404 returned error can't find the container with id e6e2202c0af79e048e9fa33710377eda2ce94a0533b5e792fc77222084f71756 Mar 19 10:39:37 crc kubenswrapper[4835]: I0319 10:39:37.130950 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"9262f37d-2193-46c6-9706-5d039fa94926","Type":"ContainerStarted","Data":"e6e2202c0af79e048e9fa33710377eda2ce94a0533b5e792fc77222084f71756"} Mar 19 10:39:39 crc kubenswrapper[4835]: I0319 10:39:39.168960 4835 generic.go:334] "Generic (PLEG): container finished" podID="37b41589-6db2-46c8-b957-8be6bf5ba11f" containerID="f0b6ba0b7615bd0e0675cac8810f81a031fb2768ffc5135ef0199580b993b6bb" exitCode=0 Mar 19 10:39:39 crc kubenswrapper[4835]: I0319 10:39:39.169049 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8j84" 
event={"ID":"37b41589-6db2-46c8-b957-8be6bf5ba11f","Type":"ContainerDied","Data":"f0b6ba0b7615bd0e0675cac8810f81a031fb2768ffc5135ef0199580b993b6bb"} Mar 19 10:39:40 crc kubenswrapper[4835]: I0319 10:39:40.403173 4835 scope.go:117] "RemoveContainer" containerID="3359a5103ce1a19d3b0272d49b7320107bc1e5fdca65b50c1858632d55b7dee9" Mar 19 10:39:40 crc kubenswrapper[4835]: E0319 10:39:40.403826 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:39:51 crc kubenswrapper[4835]: I0319 10:39:51.401982 4835 scope.go:117] "RemoveContainer" containerID="3359a5103ce1a19d3b0272d49b7320107bc1e5fdca65b50c1858632d55b7dee9" Mar 19 10:39:51 crc kubenswrapper[4835]: E0319 10:39:51.403212 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:39:59 crc kubenswrapper[4835]: I0319 10:39:59.388172 4835 scope.go:117] "RemoveContainer" containerID="64231767b2bb55c390ac263bba1179032103c91257ec8503650a42b308ba7421" Mar 19 10:40:00 crc kubenswrapper[4835]: I0319 10:40:00.160751 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565280-4mxsw"] Mar 19 10:40:00 crc kubenswrapper[4835]: I0319 10:40:00.163301 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565280-4mxsw" Mar 19 10:40:00 crc kubenswrapper[4835]: I0319 10:40:00.166175 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:40:00 crc kubenswrapper[4835]: I0319 10:40:00.166343 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:40:00 crc kubenswrapper[4835]: I0319 10:40:00.166457 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 10:40:00 crc kubenswrapper[4835]: I0319 10:40:00.175926 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565280-4mxsw"] Mar 19 10:40:00 crc kubenswrapper[4835]: I0319 10:40:00.273680 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zvgk\" (UniqueName: \"kubernetes.io/projected/bc08d060-225f-4f8d-bb5e-2de67700397b-kube-api-access-8zvgk\") pod \"auto-csr-approver-29565280-4mxsw\" (UID: \"bc08d060-225f-4f8d-bb5e-2de67700397b\") " pod="openshift-infra/auto-csr-approver-29565280-4mxsw" Mar 19 10:40:00 crc kubenswrapper[4835]: I0319 10:40:00.377825 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zvgk\" (UniqueName: \"kubernetes.io/projected/bc08d060-225f-4f8d-bb5e-2de67700397b-kube-api-access-8zvgk\") pod \"auto-csr-approver-29565280-4mxsw\" (UID: \"bc08d060-225f-4f8d-bb5e-2de67700397b\") " pod="openshift-infra/auto-csr-approver-29565280-4mxsw" Mar 19 10:40:00 crc kubenswrapper[4835]: I0319 10:40:00.396459 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zvgk\" (UniqueName: \"kubernetes.io/projected/bc08d060-225f-4f8d-bb5e-2de67700397b-kube-api-access-8zvgk\") pod \"auto-csr-approver-29565280-4mxsw\" (UID: \"bc08d060-225f-4f8d-bb5e-2de67700397b\") " 
pod="openshift-infra/auto-csr-approver-29565280-4mxsw" Mar 19 10:40:00 crc kubenswrapper[4835]: I0319 10:40:00.490707 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565280-4mxsw" Mar 19 10:40:05 crc kubenswrapper[4835]: I0319 10:40:05.403264 4835 scope.go:117] "RemoveContainer" containerID="3359a5103ce1a19d3b0272d49b7320107bc1e5fdca65b50c1858632d55b7dee9" Mar 19 10:40:05 crc kubenswrapper[4835]: E0319 10:40:05.404199 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:40:15 crc kubenswrapper[4835]: I0319 10:40:15.627333 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5xkx5"] Mar 19 10:40:15 crc kubenswrapper[4835]: I0319 10:40:15.630668 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5xkx5" Mar 19 10:40:15 crc kubenswrapper[4835]: I0319 10:40:15.651459 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5xkx5"] Mar 19 10:40:15 crc kubenswrapper[4835]: I0319 10:40:15.725770 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnwbn\" (UniqueName: \"kubernetes.io/projected/19e77069-d114-4d20-9b43-ad01b683c6eb-kube-api-access-tnwbn\") pod \"certified-operators-5xkx5\" (UID: \"19e77069-d114-4d20-9b43-ad01b683c6eb\") " pod="openshift-marketplace/certified-operators-5xkx5" Mar 19 10:40:15 crc kubenswrapper[4835]: I0319 10:40:15.725920 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19e77069-d114-4d20-9b43-ad01b683c6eb-utilities\") pod \"certified-operators-5xkx5\" (UID: \"19e77069-d114-4d20-9b43-ad01b683c6eb\") " pod="openshift-marketplace/certified-operators-5xkx5" Mar 19 10:40:15 crc kubenswrapper[4835]: I0319 10:40:15.726034 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19e77069-d114-4d20-9b43-ad01b683c6eb-catalog-content\") pod \"certified-operators-5xkx5\" (UID: \"19e77069-d114-4d20-9b43-ad01b683c6eb\") " pod="openshift-marketplace/certified-operators-5xkx5" Mar 19 10:40:15 crc kubenswrapper[4835]: I0319 10:40:15.828168 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnwbn\" (UniqueName: \"kubernetes.io/projected/19e77069-d114-4d20-9b43-ad01b683c6eb-kube-api-access-tnwbn\") pod \"certified-operators-5xkx5\" (UID: \"19e77069-d114-4d20-9b43-ad01b683c6eb\") " pod="openshift-marketplace/certified-operators-5xkx5" Mar 19 10:40:15 crc kubenswrapper[4835]: I0319 10:40:15.828285 4835 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19e77069-d114-4d20-9b43-ad01b683c6eb-utilities\") pod \"certified-operators-5xkx5\" (UID: \"19e77069-d114-4d20-9b43-ad01b683c6eb\") " pod="openshift-marketplace/certified-operators-5xkx5" Mar 19 10:40:15 crc kubenswrapper[4835]: I0319 10:40:15.828356 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19e77069-d114-4d20-9b43-ad01b683c6eb-catalog-content\") pod \"certified-operators-5xkx5\" (UID: \"19e77069-d114-4d20-9b43-ad01b683c6eb\") " pod="openshift-marketplace/certified-operators-5xkx5" Mar 19 10:40:15 crc kubenswrapper[4835]: I0319 10:40:15.829074 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19e77069-d114-4d20-9b43-ad01b683c6eb-catalog-content\") pod \"certified-operators-5xkx5\" (UID: \"19e77069-d114-4d20-9b43-ad01b683c6eb\") " pod="openshift-marketplace/certified-operators-5xkx5" Mar 19 10:40:15 crc kubenswrapper[4835]: I0319 10:40:15.829404 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19e77069-d114-4d20-9b43-ad01b683c6eb-utilities\") pod \"certified-operators-5xkx5\" (UID: \"19e77069-d114-4d20-9b43-ad01b683c6eb\") " pod="openshift-marketplace/certified-operators-5xkx5" Mar 19 10:40:15 crc kubenswrapper[4835]: I0319 10:40:15.851478 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnwbn\" (UniqueName: \"kubernetes.io/projected/19e77069-d114-4d20-9b43-ad01b683c6eb-kube-api-access-tnwbn\") pod \"certified-operators-5xkx5\" (UID: \"19e77069-d114-4d20-9b43-ad01b683c6eb\") " pod="openshift-marketplace/certified-operators-5xkx5" Mar 19 10:40:15 crc kubenswrapper[4835]: I0319 10:40:15.959448 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5xkx5" Mar 19 10:40:18 crc kubenswrapper[4835]: I0319 10:40:18.032028 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n8kcm"] Mar 19 10:40:18 crc kubenswrapper[4835]: I0319 10:40:18.035782 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n8kcm" Mar 19 10:40:18 crc kubenswrapper[4835]: I0319 10:40:18.051388 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n8kcm"] Mar 19 10:40:18 crc kubenswrapper[4835]: I0319 10:40:18.087579 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e9b0984-cab2-4aab-bad4-b4ad7040a40f-utilities\") pod \"redhat-operators-n8kcm\" (UID: \"2e9b0984-cab2-4aab-bad4-b4ad7040a40f\") " pod="openshift-marketplace/redhat-operators-n8kcm" Mar 19 10:40:18 crc kubenswrapper[4835]: I0319 10:40:18.087722 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kv2n\" (UniqueName: \"kubernetes.io/projected/2e9b0984-cab2-4aab-bad4-b4ad7040a40f-kube-api-access-8kv2n\") pod \"redhat-operators-n8kcm\" (UID: \"2e9b0984-cab2-4aab-bad4-b4ad7040a40f\") " pod="openshift-marketplace/redhat-operators-n8kcm" Mar 19 10:40:18 crc kubenswrapper[4835]: I0319 10:40:18.087822 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e9b0984-cab2-4aab-bad4-b4ad7040a40f-catalog-content\") pod \"redhat-operators-n8kcm\" (UID: \"2e9b0984-cab2-4aab-bad4-b4ad7040a40f\") " pod="openshift-marketplace/redhat-operators-n8kcm" Mar 19 10:40:18 crc kubenswrapper[4835]: I0319 10:40:18.190269 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/2e9b0984-cab2-4aab-bad4-b4ad7040a40f-utilities\") pod \"redhat-operators-n8kcm\" (UID: \"2e9b0984-cab2-4aab-bad4-b4ad7040a40f\") " pod="openshift-marketplace/redhat-operators-n8kcm" Mar 19 10:40:18 crc kubenswrapper[4835]: I0319 10:40:18.190420 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kv2n\" (UniqueName: \"kubernetes.io/projected/2e9b0984-cab2-4aab-bad4-b4ad7040a40f-kube-api-access-8kv2n\") pod \"redhat-operators-n8kcm\" (UID: \"2e9b0984-cab2-4aab-bad4-b4ad7040a40f\") " pod="openshift-marketplace/redhat-operators-n8kcm" Mar 19 10:40:18 crc kubenswrapper[4835]: I0319 10:40:18.190518 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e9b0984-cab2-4aab-bad4-b4ad7040a40f-catalog-content\") pod \"redhat-operators-n8kcm\" (UID: \"2e9b0984-cab2-4aab-bad4-b4ad7040a40f\") " pod="openshift-marketplace/redhat-operators-n8kcm" Mar 19 10:40:18 crc kubenswrapper[4835]: I0319 10:40:18.190925 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e9b0984-cab2-4aab-bad4-b4ad7040a40f-utilities\") pod \"redhat-operators-n8kcm\" (UID: \"2e9b0984-cab2-4aab-bad4-b4ad7040a40f\") " pod="openshift-marketplace/redhat-operators-n8kcm" Mar 19 10:40:18 crc kubenswrapper[4835]: I0319 10:40:18.191050 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e9b0984-cab2-4aab-bad4-b4ad7040a40f-catalog-content\") pod \"redhat-operators-n8kcm\" (UID: \"2e9b0984-cab2-4aab-bad4-b4ad7040a40f\") " pod="openshift-marketplace/redhat-operators-n8kcm" Mar 19 10:40:18 crc kubenswrapper[4835]: I0319 10:40:18.210871 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kv2n\" (UniqueName: 
\"kubernetes.io/projected/2e9b0984-cab2-4aab-bad4-b4ad7040a40f-kube-api-access-8kv2n\") pod \"redhat-operators-n8kcm\" (UID: \"2e9b0984-cab2-4aab-bad4-b4ad7040a40f\") " pod="openshift-marketplace/redhat-operators-n8kcm" Mar 19 10:40:18 crc kubenswrapper[4835]: I0319 10:40:18.367901 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n8kcm" Mar 19 10:40:18 crc kubenswrapper[4835]: E0319 10:40:18.809849 4835 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 19 10:40:18 crc kubenswrapper[4835]: E0319 10:40:18.815564 4835 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openst
ack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nphdq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(9262f37d-2193-46c6-9706-5d039fa94926): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 
19 10:40:18 crc kubenswrapper[4835]: E0319 10:40:18.816884 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="9262f37d-2193-46c6-9706-5d039fa94926" Mar 19 10:40:18 crc kubenswrapper[4835]: I0319 10:40:18.823971 4835 scope.go:117] "RemoveContainer" containerID="6608098a3e414c5c60a28b868776a69e3a4a543339b6e216e5857d2bff9065c8" Mar 19 10:40:19 crc kubenswrapper[4835]: I0319 10:40:19.744879 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8j84" event={"ID":"37b41589-6db2-46c8-b957-8be6bf5ba11f","Type":"ContainerStarted","Data":"9f7f4fce486d192008f3b3e360d1a2f525b1e8411ac78c95143dc46e14c3789f"} Mar 19 10:40:19 crc kubenswrapper[4835]: E0319 10:40:19.748113 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="9262f37d-2193-46c6-9706-5d039fa94926" Mar 19 10:40:19 crc kubenswrapper[4835]: I0319 10:40:19.771991 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k8j84" podStartSLOduration=2.9507493609999997 podStartE2EDuration="47.771971806s" podCreationTimestamp="2026-03-19 10:39:32 +0000 UTC" firstStartedPulling="2026-03-19 10:39:34.084306166 +0000 UTC m=+4628.932904753" lastFinishedPulling="2026-03-19 10:40:18.905528611 +0000 UTC m=+4673.754127198" observedRunningTime="2026-03-19 10:40:19.771795051 +0000 UTC m=+4674.620393658" watchObservedRunningTime="2026-03-19 10:40:19.771971806 +0000 UTC m=+4674.620570393" Mar 19 10:40:19 crc kubenswrapper[4835]: I0319 10:40:19.848442 4835 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5xkx5"] Mar 19 10:40:19 crc kubenswrapper[4835]: I0319 10:40:19.967769 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n8kcm"] Mar 19 10:40:19 crc kubenswrapper[4835]: I0319 10:40:19.982908 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565280-4mxsw"] Mar 19 10:40:20 crc kubenswrapper[4835]: W0319 10:40:20.284911 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19e77069_d114_4d20_9b43_ad01b683c6eb.slice/crio-71bb6609b411082ad40200d5b5cdffb2b935bacfdc7c24ed860fcc4dfdc95bac WatchSource:0}: Error finding container 71bb6609b411082ad40200d5b5cdffb2b935bacfdc7c24ed860fcc4dfdc95bac: Status 404 returned error can't find the container with id 71bb6609b411082ad40200d5b5cdffb2b935bacfdc7c24ed860fcc4dfdc95bac Mar 19 10:40:20 crc kubenswrapper[4835]: W0319 10:40:20.287206 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e9b0984_cab2_4aab_bad4_b4ad7040a40f.slice/crio-16fdc12fdd4ad3111cb2a7a1ab9ac427c235b3fbfdb9b4100b69c38ae38c43a5 WatchSource:0}: Error finding container 16fdc12fdd4ad3111cb2a7a1ab9ac427c235b3fbfdb9b4100b69c38ae38c43a5: Status 404 returned error can't find the container with id 16fdc12fdd4ad3111cb2a7a1ab9ac427c235b3fbfdb9b4100b69c38ae38c43a5 Mar 19 10:40:20 crc kubenswrapper[4835]: W0319 10:40:20.291644 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc08d060_225f_4f8d_bb5e_2de67700397b.slice/crio-3f7c0a1afc34c5406c625beb4cd4f1cd0f9cd32cae2b6d62ca79d29c1b03136f WatchSource:0}: Error finding container 3f7c0a1afc34c5406c625beb4cd4f1cd0f9cd32cae2b6d62ca79d29c1b03136f: Status 404 returned error can't find the container with 
id 3f7c0a1afc34c5406c625beb4cd4f1cd0f9cd32cae2b6d62ca79d29c1b03136f Mar 19 10:40:20 crc kubenswrapper[4835]: I0319 10:40:20.403605 4835 scope.go:117] "RemoveContainer" containerID="3359a5103ce1a19d3b0272d49b7320107bc1e5fdca65b50c1858632d55b7dee9" Mar 19 10:40:20 crc kubenswrapper[4835]: E0319 10:40:20.403872 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:40:20 crc kubenswrapper[4835]: I0319 10:40:20.766356 4835 generic.go:334] "Generic (PLEG): container finished" podID="19e77069-d114-4d20-9b43-ad01b683c6eb" containerID="3b629df6edb9045b0fdc296014c2c3b2585258d364475c6331c83517897a5d24" exitCode=0 Mar 19 10:40:20 crc kubenswrapper[4835]: I0319 10:40:20.767858 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5xkx5" event={"ID":"19e77069-d114-4d20-9b43-ad01b683c6eb","Type":"ContainerDied","Data":"3b629df6edb9045b0fdc296014c2c3b2585258d364475c6331c83517897a5d24"} Mar 19 10:40:20 crc kubenswrapper[4835]: I0319 10:40:20.767928 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5xkx5" event={"ID":"19e77069-d114-4d20-9b43-ad01b683c6eb","Type":"ContainerStarted","Data":"71bb6609b411082ad40200d5b5cdffb2b935bacfdc7c24ed860fcc4dfdc95bac"} Mar 19 10:40:20 crc kubenswrapper[4835]: I0319 10:40:20.771794 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565280-4mxsw" event={"ID":"bc08d060-225f-4f8d-bb5e-2de67700397b","Type":"ContainerStarted","Data":"3f7c0a1afc34c5406c625beb4cd4f1cd0f9cd32cae2b6d62ca79d29c1b03136f"} Mar 19 10:40:20 crc 
kubenswrapper[4835]: I0319 10:40:20.776297 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8kcm" event={"ID":"2e9b0984-cab2-4aab-bad4-b4ad7040a40f","Type":"ContainerStarted","Data":"38fe584717fc542a79dc47d60420ce690d676b6d38da1f9dcb06ac245c24e0fc"} Mar 19 10:40:20 crc kubenswrapper[4835]: I0319 10:40:20.776349 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8kcm" event={"ID":"2e9b0984-cab2-4aab-bad4-b4ad7040a40f","Type":"ContainerStarted","Data":"16fdc12fdd4ad3111cb2a7a1ab9ac427c235b3fbfdb9b4100b69c38ae38c43a5"} Mar 19 10:40:21 crc kubenswrapper[4835]: I0319 10:40:21.794483 4835 generic.go:334] "Generic (PLEG): container finished" podID="2e9b0984-cab2-4aab-bad4-b4ad7040a40f" containerID="38fe584717fc542a79dc47d60420ce690d676b6d38da1f9dcb06ac245c24e0fc" exitCode=0 Mar 19 10:40:21 crc kubenswrapper[4835]: I0319 10:40:21.794593 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8kcm" event={"ID":"2e9b0984-cab2-4aab-bad4-b4ad7040a40f","Type":"ContainerDied","Data":"38fe584717fc542a79dc47d60420ce690d676b6d38da1f9dcb06ac245c24e0fc"} Mar 19 10:40:22 crc kubenswrapper[4835]: I0319 10:40:22.542537 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k8j84" Mar 19 10:40:22 crc kubenswrapper[4835]: I0319 10:40:22.542905 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k8j84" Mar 19 10:40:22 crc kubenswrapper[4835]: I0319 10:40:22.811472 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5xkx5" event={"ID":"19e77069-d114-4d20-9b43-ad01b683c6eb","Type":"ContainerStarted","Data":"6bdcf383e0b3e9e84e618392ab4c1fc1b3181532b9c189a8ff8d238760b63ca4"} Mar 19 10:40:23 crc kubenswrapper[4835]: I0319 10:40:23.639832 4835 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/community-operators-k8j84" podUID="37b41589-6db2-46c8-b957-8be6bf5ba11f" containerName="registry-server" probeResult="failure" output=< Mar 19 10:40:23 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:40:23 crc kubenswrapper[4835]: > Mar 19 10:40:23 crc kubenswrapper[4835]: I0319 10:40:23.832220 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565280-4mxsw" event={"ID":"bc08d060-225f-4f8d-bb5e-2de67700397b","Type":"ContainerStarted","Data":"efa3020b05cafbbec5d1c9e7177b5f54c6ed4f29b4789d0c40dd5de521bb6817"} Mar 19 10:40:23 crc kubenswrapper[4835]: I0319 10:40:23.853720 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565280-4mxsw" podStartSLOduration=22.604171451 podStartE2EDuration="23.853690349s" podCreationTimestamp="2026-03-19 10:40:00 +0000 UTC" firstStartedPulling="2026-03-19 10:40:20.334890961 +0000 UTC m=+4675.183489568" lastFinishedPulling="2026-03-19 10:40:21.584409879 +0000 UTC m=+4676.433008466" observedRunningTime="2026-03-19 10:40:23.846812422 +0000 UTC m=+4678.695411019" watchObservedRunningTime="2026-03-19 10:40:23.853690349 +0000 UTC m=+4678.702288936" Mar 19 10:40:24 crc kubenswrapper[4835]: I0319 10:40:24.851019 4835 generic.go:334] "Generic (PLEG): container finished" podID="19e77069-d114-4d20-9b43-ad01b683c6eb" containerID="6bdcf383e0b3e9e84e618392ab4c1fc1b3181532b9c189a8ff8d238760b63ca4" exitCode=0 Mar 19 10:40:24 crc kubenswrapper[4835]: I0319 10:40:24.853282 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5xkx5" event={"ID":"19e77069-d114-4d20-9b43-ad01b683c6eb","Type":"ContainerDied","Data":"6bdcf383e0b3e9e84e618392ab4c1fc1b3181532b9c189a8ff8d238760b63ca4"} Mar 19 10:40:25 crc kubenswrapper[4835]: I0319 10:40:25.867198 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-5xkx5" event={"ID":"19e77069-d114-4d20-9b43-ad01b683c6eb","Type":"ContainerStarted","Data":"4f39b49ae1f93bf2787a7c31df600e0763ba7d5e223bb891d05d2efe35bfa66f"} Mar 19 10:40:25 crc kubenswrapper[4835]: I0319 10:40:25.871661 4835 generic.go:334] "Generic (PLEG): container finished" podID="bc08d060-225f-4f8d-bb5e-2de67700397b" containerID="efa3020b05cafbbec5d1c9e7177b5f54c6ed4f29b4789d0c40dd5de521bb6817" exitCode=0 Mar 19 10:40:25 crc kubenswrapper[4835]: I0319 10:40:25.871717 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565280-4mxsw" event={"ID":"bc08d060-225f-4f8d-bb5e-2de67700397b","Type":"ContainerDied","Data":"efa3020b05cafbbec5d1c9e7177b5f54c6ed4f29b4789d0c40dd5de521bb6817"} Mar 19 10:40:25 crc kubenswrapper[4835]: I0319 10:40:25.900806 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5xkx5" podStartSLOduration=6.393342451 podStartE2EDuration="10.900788132s" podCreationTimestamp="2026-03-19 10:40:15 +0000 UTC" firstStartedPulling="2026-03-19 10:40:20.771817895 +0000 UTC m=+4675.620416482" lastFinishedPulling="2026-03-19 10:40:25.279263576 +0000 UTC m=+4680.127862163" observedRunningTime="2026-03-19 10:40:25.890680235 +0000 UTC m=+4680.739278822" watchObservedRunningTime="2026-03-19 10:40:25.900788132 +0000 UTC m=+4680.749386719" Mar 19 10:40:25 crc kubenswrapper[4835]: I0319 10:40:25.959566 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5xkx5" Mar 19 10:40:25 crc kubenswrapper[4835]: I0319 10:40:25.959783 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5xkx5" Mar 19 10:40:27 crc kubenswrapper[4835]: I0319 10:40:27.137235 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-5xkx5" 
podUID="19e77069-d114-4d20-9b43-ad01b683c6eb" containerName="registry-server" probeResult="failure" output=< Mar 19 10:40:27 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:40:27 crc kubenswrapper[4835]: > Mar 19 10:40:31 crc kubenswrapper[4835]: I0319 10:40:31.934821 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565280-4mxsw" event={"ID":"bc08d060-225f-4f8d-bb5e-2de67700397b","Type":"ContainerDied","Data":"3f7c0a1afc34c5406c625beb4cd4f1cd0f9cd32cae2b6d62ca79d29c1b03136f"} Mar 19 10:40:31 crc kubenswrapper[4835]: I0319 10:40:31.935406 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f7c0a1afc34c5406c625beb4cd4f1cd0f9cd32cae2b6d62ca79d29c1b03136f" Mar 19 10:40:32 crc kubenswrapper[4835]: I0319 10:40:32.768805 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565280-4mxsw" Mar 19 10:40:32 crc kubenswrapper[4835]: I0319 10:40:32.926432 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zvgk\" (UniqueName: \"kubernetes.io/projected/bc08d060-225f-4f8d-bb5e-2de67700397b-kube-api-access-8zvgk\") pod \"bc08d060-225f-4f8d-bb5e-2de67700397b\" (UID: \"bc08d060-225f-4f8d-bb5e-2de67700397b\") " Mar 19 10:40:32 crc kubenswrapper[4835]: I0319 10:40:32.939859 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc08d060-225f-4f8d-bb5e-2de67700397b-kube-api-access-8zvgk" (OuterVolumeSpecName: "kube-api-access-8zvgk") pod "bc08d060-225f-4f8d-bb5e-2de67700397b" (UID: "bc08d060-225f-4f8d-bb5e-2de67700397b"). InnerVolumeSpecName "kube-api-access-8zvgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:40:32 crc kubenswrapper[4835]: I0319 10:40:32.962317 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565280-4mxsw" Mar 19 10:40:32 crc kubenswrapper[4835]: I0319 10:40:32.962308 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8kcm" event={"ID":"2e9b0984-cab2-4aab-bad4-b4ad7040a40f","Type":"ContainerStarted","Data":"5926b549b9bd1674bc62b7fd20ce5ad013b9cddb379b4a0c561aa0a996f8532e"} Mar 19 10:40:33 crc kubenswrapper[4835]: I0319 10:40:33.032213 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zvgk\" (UniqueName: \"kubernetes.io/projected/bc08d060-225f-4f8d-bb5e-2de67700397b-kube-api-access-8zvgk\") on node \"crc\" DevicePath \"\"" Mar 19 10:40:33 crc kubenswrapper[4835]: I0319 10:40:33.663545 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-k8j84" podUID="37b41589-6db2-46c8-b957-8be6bf5ba11f" containerName="registry-server" probeResult="failure" output=< Mar 19 10:40:33 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:40:33 crc kubenswrapper[4835]: > Mar 19 10:40:33 crc kubenswrapper[4835]: I0319 10:40:33.858831 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565274-997fb"] Mar 19 10:40:33 crc kubenswrapper[4835]: I0319 10:40:33.871195 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565274-997fb"] Mar 19 10:40:33 crc kubenswrapper[4835]: I0319 10:40:33.991404 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 19 10:40:34 crc kubenswrapper[4835]: I0319 10:40:34.401949 4835 scope.go:117] "RemoveContainer" containerID="3359a5103ce1a19d3b0272d49b7320107bc1e5fdca65b50c1858632d55b7dee9" Mar 19 10:40:34 crc kubenswrapper[4835]: E0319 10:40:34.402724 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:40:34 crc kubenswrapper[4835]: I0319 10:40:34.414857 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93c518a4-f4f0-4238-add8-13fb7b7b6b08" path="/var/lib/kubelet/pods/93c518a4-f4f0-4238-add8-13fb7b7b6b08/volumes" Mar 19 10:40:34 crc kubenswrapper[4835]: E0319 10:40:34.736260 4835 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e9b0984_cab2_4aab_bad4_b4ad7040a40f.slice/crio-conmon-5926b549b9bd1674bc62b7fd20ce5ad013b9cddb379b4a0c561aa0a996f8532e.scope\": RecentStats: unable to find data in memory cache]" Mar 19 10:40:34 crc kubenswrapper[4835]: I0319 10:40:34.993643 4835 generic.go:334] "Generic (PLEG): container finished" podID="2e9b0984-cab2-4aab-bad4-b4ad7040a40f" containerID="5926b549b9bd1674bc62b7fd20ce5ad013b9cddb379b4a0c561aa0a996f8532e" exitCode=0 Mar 19 10:40:34 crc kubenswrapper[4835]: I0319 10:40:34.993690 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8kcm" event={"ID":"2e9b0984-cab2-4aab-bad4-b4ad7040a40f","Type":"ContainerDied","Data":"5926b549b9bd1674bc62b7fd20ce5ad013b9cddb379b4a0c561aa0a996f8532e"} Mar 19 10:40:37 crc kubenswrapper[4835]: I0319 10:40:37.016063 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8kcm" event={"ID":"2e9b0984-cab2-4aab-bad4-b4ad7040a40f","Type":"ContainerStarted","Data":"38d05bf91884467227e16e4996c2fbf8aa3db4c6a3dc043670424affb0c51420"} Mar 19 10:40:37 crc kubenswrapper[4835]: I0319 10:40:37.018346 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/tempest-tests-tempest" event={"ID":"9262f37d-2193-46c6-9706-5d039fa94926","Type":"ContainerStarted","Data":"6a27cf424f6e34c5a4241e15982f732fa6b27d5b479b621269327434fe56f4f6"} Mar 19 10:40:37 crc kubenswrapper[4835]: I0319 10:40:37.029591 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-5xkx5" podUID="19e77069-d114-4d20-9b43-ad01b683c6eb" containerName="registry-server" probeResult="failure" output=< Mar 19 10:40:37 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:40:37 crc kubenswrapper[4835]: > Mar 19 10:40:37 crc kubenswrapper[4835]: I0319 10:40:37.041316 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n8kcm" podStartSLOduration=5.083474391 podStartE2EDuration="20.041296531s" podCreationTimestamp="2026-03-19 10:40:17 +0000 UTC" firstStartedPulling="2026-03-19 10:40:20.77931111 +0000 UTC m=+4675.627909697" lastFinishedPulling="2026-03-19 10:40:35.73713325 +0000 UTC m=+4690.585731837" observedRunningTime="2026-03-19 10:40:37.036079839 +0000 UTC m=+4691.884678446" watchObservedRunningTime="2026-03-19 10:40:37.041296531 +0000 UTC m=+4691.889895118" Mar 19 10:40:37 crc kubenswrapper[4835]: I0319 10:40:37.061471 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=5.658830748 podStartE2EDuration="1m3.061438751s" podCreationTimestamp="2026-03-19 10:39:34 +0000 UTC" firstStartedPulling="2026-03-19 10:39:36.585725953 +0000 UTC m=+4631.434324530" lastFinishedPulling="2026-03-19 10:40:33.988333946 +0000 UTC m=+4688.836932533" observedRunningTime="2026-03-19 10:40:37.057095042 +0000 UTC m=+4691.905693649" watchObservedRunningTime="2026-03-19 10:40:37.061438751 +0000 UTC m=+4691.910037338" Mar 19 10:40:38 crc kubenswrapper[4835]: I0319 10:40:38.368982 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-n8kcm" Mar 19 10:40:38 crc kubenswrapper[4835]: I0319 10:40:38.369623 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n8kcm" Mar 19 10:40:39 crc kubenswrapper[4835]: I0319 10:40:39.438849 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n8kcm" podUID="2e9b0984-cab2-4aab-bad4-b4ad7040a40f" containerName="registry-server" probeResult="failure" output=< Mar 19 10:40:39 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:40:39 crc kubenswrapper[4835]: > Mar 19 10:40:42 crc kubenswrapper[4835]: I0319 10:40:42.623167 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k8j84" Mar 19 10:40:42 crc kubenswrapper[4835]: I0319 10:40:42.674188 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k8j84" Mar 19 10:40:42 crc kubenswrapper[4835]: I0319 10:40:42.865018 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k8j84"] Mar 19 10:40:44 crc kubenswrapper[4835]: I0319 10:40:44.095296 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k8j84" podUID="37b41589-6db2-46c8-b957-8be6bf5ba11f" containerName="registry-server" containerID="cri-o://9f7f4fce486d192008f3b3e360d1a2f525b1e8411ac78c95143dc46e14c3789f" gracePeriod=2 Mar 19 10:40:44 crc kubenswrapper[4835]: I0319 10:40:44.710857 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k8j84" Mar 19 10:40:44 crc kubenswrapper[4835]: I0319 10:40:44.860363 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37b41589-6db2-46c8-b957-8be6bf5ba11f-utilities\") pod \"37b41589-6db2-46c8-b957-8be6bf5ba11f\" (UID: \"37b41589-6db2-46c8-b957-8be6bf5ba11f\") " Mar 19 10:40:44 crc kubenswrapper[4835]: I0319 10:40:44.860417 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzzjt\" (UniqueName: \"kubernetes.io/projected/37b41589-6db2-46c8-b957-8be6bf5ba11f-kube-api-access-vzzjt\") pod \"37b41589-6db2-46c8-b957-8be6bf5ba11f\" (UID: \"37b41589-6db2-46c8-b957-8be6bf5ba11f\") " Mar 19 10:40:44 crc kubenswrapper[4835]: I0319 10:40:44.860465 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37b41589-6db2-46c8-b957-8be6bf5ba11f-catalog-content\") pod \"37b41589-6db2-46c8-b957-8be6bf5ba11f\" (UID: \"37b41589-6db2-46c8-b957-8be6bf5ba11f\") " Mar 19 10:40:44 crc kubenswrapper[4835]: I0319 10:40:44.861474 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37b41589-6db2-46c8-b957-8be6bf5ba11f-utilities" (OuterVolumeSpecName: "utilities") pod "37b41589-6db2-46c8-b957-8be6bf5ba11f" (UID: "37b41589-6db2-46c8-b957-8be6bf5ba11f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:40:44 crc kubenswrapper[4835]: I0319 10:40:44.862373 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37b41589-6db2-46c8-b957-8be6bf5ba11f-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 10:40:44 crc kubenswrapper[4835]: I0319 10:40:44.868978 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37b41589-6db2-46c8-b957-8be6bf5ba11f-kube-api-access-vzzjt" (OuterVolumeSpecName: "kube-api-access-vzzjt") pod "37b41589-6db2-46c8-b957-8be6bf5ba11f" (UID: "37b41589-6db2-46c8-b957-8be6bf5ba11f"). InnerVolumeSpecName "kube-api-access-vzzjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:40:44 crc kubenswrapper[4835]: I0319 10:40:44.939806 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37b41589-6db2-46c8-b957-8be6bf5ba11f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "37b41589-6db2-46c8-b957-8be6bf5ba11f" (UID: "37b41589-6db2-46c8-b957-8be6bf5ba11f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:40:44 crc kubenswrapper[4835]: I0319 10:40:44.964313 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzzjt\" (UniqueName: \"kubernetes.io/projected/37b41589-6db2-46c8-b957-8be6bf5ba11f-kube-api-access-vzzjt\") on node \"crc\" DevicePath \"\"" Mar 19 10:40:44 crc kubenswrapper[4835]: I0319 10:40:44.964361 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37b41589-6db2-46c8-b957-8be6bf5ba11f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 10:40:45 crc kubenswrapper[4835]: I0319 10:40:45.112277 4835 generic.go:334] "Generic (PLEG): container finished" podID="37b41589-6db2-46c8-b957-8be6bf5ba11f" containerID="9f7f4fce486d192008f3b3e360d1a2f525b1e8411ac78c95143dc46e14c3789f" exitCode=0 Mar 19 10:40:45 crc kubenswrapper[4835]: I0319 10:40:45.112355 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8j84" event={"ID":"37b41589-6db2-46c8-b957-8be6bf5ba11f","Type":"ContainerDied","Data":"9f7f4fce486d192008f3b3e360d1a2f525b1e8411ac78c95143dc46e14c3789f"} Mar 19 10:40:45 crc kubenswrapper[4835]: I0319 10:40:45.112363 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k8j84" Mar 19 10:40:45 crc kubenswrapper[4835]: I0319 10:40:45.112643 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k8j84" event={"ID":"37b41589-6db2-46c8-b957-8be6bf5ba11f","Type":"ContainerDied","Data":"5a49b83334c5f7e8aeec03e3eddde2c61026b1837fc5568d07b445810b121994"} Mar 19 10:40:45 crc kubenswrapper[4835]: I0319 10:40:45.112670 4835 scope.go:117] "RemoveContainer" containerID="9f7f4fce486d192008f3b3e360d1a2f525b1e8411ac78c95143dc46e14c3789f" Mar 19 10:40:45 crc kubenswrapper[4835]: I0319 10:40:45.149940 4835 scope.go:117] "RemoveContainer" containerID="f0b6ba0b7615bd0e0675cac8810f81a031fb2768ffc5135ef0199580b993b6bb" Mar 19 10:40:45 crc kubenswrapper[4835]: I0319 10:40:45.158545 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k8j84"] Mar 19 10:40:45 crc kubenswrapper[4835]: I0319 10:40:45.169244 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k8j84"] Mar 19 10:40:45 crc kubenswrapper[4835]: I0319 10:40:45.178676 4835 scope.go:117] "RemoveContainer" containerID="298455861df274251d288ee3c9c50c931be2f9ed4ee38afa1fe336342ee1514a" Mar 19 10:40:45 crc kubenswrapper[4835]: I0319 10:40:45.242029 4835 scope.go:117] "RemoveContainer" containerID="9f7f4fce486d192008f3b3e360d1a2f525b1e8411ac78c95143dc46e14c3789f" Mar 19 10:40:45 crc kubenswrapper[4835]: E0319 10:40:45.242465 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f7f4fce486d192008f3b3e360d1a2f525b1e8411ac78c95143dc46e14c3789f\": container with ID starting with 9f7f4fce486d192008f3b3e360d1a2f525b1e8411ac78c95143dc46e14c3789f not found: ID does not exist" containerID="9f7f4fce486d192008f3b3e360d1a2f525b1e8411ac78c95143dc46e14c3789f" Mar 19 10:40:45 crc kubenswrapper[4835]: I0319 10:40:45.242531 4835 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f7f4fce486d192008f3b3e360d1a2f525b1e8411ac78c95143dc46e14c3789f"} err="failed to get container status \"9f7f4fce486d192008f3b3e360d1a2f525b1e8411ac78c95143dc46e14c3789f\": rpc error: code = NotFound desc = could not find container \"9f7f4fce486d192008f3b3e360d1a2f525b1e8411ac78c95143dc46e14c3789f\": container with ID starting with 9f7f4fce486d192008f3b3e360d1a2f525b1e8411ac78c95143dc46e14c3789f not found: ID does not exist" Mar 19 10:40:45 crc kubenswrapper[4835]: I0319 10:40:45.242560 4835 scope.go:117] "RemoveContainer" containerID="f0b6ba0b7615bd0e0675cac8810f81a031fb2768ffc5135ef0199580b993b6bb" Mar 19 10:40:45 crc kubenswrapper[4835]: E0319 10:40:45.242908 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0b6ba0b7615bd0e0675cac8810f81a031fb2768ffc5135ef0199580b993b6bb\": container with ID starting with f0b6ba0b7615bd0e0675cac8810f81a031fb2768ffc5135ef0199580b993b6bb not found: ID does not exist" containerID="f0b6ba0b7615bd0e0675cac8810f81a031fb2768ffc5135ef0199580b993b6bb" Mar 19 10:40:45 crc kubenswrapper[4835]: I0319 10:40:45.242939 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0b6ba0b7615bd0e0675cac8810f81a031fb2768ffc5135ef0199580b993b6bb"} err="failed to get container status \"f0b6ba0b7615bd0e0675cac8810f81a031fb2768ffc5135ef0199580b993b6bb\": rpc error: code = NotFound desc = could not find container \"f0b6ba0b7615bd0e0675cac8810f81a031fb2768ffc5135ef0199580b993b6bb\": container with ID starting with f0b6ba0b7615bd0e0675cac8810f81a031fb2768ffc5135ef0199580b993b6bb not found: ID does not exist" Mar 19 10:40:45 crc kubenswrapper[4835]: I0319 10:40:45.242960 4835 scope.go:117] "RemoveContainer" containerID="298455861df274251d288ee3c9c50c931be2f9ed4ee38afa1fe336342ee1514a" Mar 19 10:40:45 crc kubenswrapper[4835]: E0319 
10:40:45.243588 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"298455861df274251d288ee3c9c50c931be2f9ed4ee38afa1fe336342ee1514a\": container with ID starting with 298455861df274251d288ee3c9c50c931be2f9ed4ee38afa1fe336342ee1514a not found: ID does not exist" containerID="298455861df274251d288ee3c9c50c931be2f9ed4ee38afa1fe336342ee1514a" Mar 19 10:40:45 crc kubenswrapper[4835]: I0319 10:40:45.243616 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"298455861df274251d288ee3c9c50c931be2f9ed4ee38afa1fe336342ee1514a"} err="failed to get container status \"298455861df274251d288ee3c9c50c931be2f9ed4ee38afa1fe336342ee1514a\": rpc error: code = NotFound desc = could not find container \"298455861df274251d288ee3c9c50c931be2f9ed4ee38afa1fe336342ee1514a\": container with ID starting with 298455861df274251d288ee3c9c50c931be2f9ed4ee38afa1fe336342ee1514a not found: ID does not exist" Mar 19 10:40:46 crc kubenswrapper[4835]: I0319 10:40:46.018266 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5xkx5" Mar 19 10:40:46 crc kubenswrapper[4835]: I0319 10:40:46.079949 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5xkx5" Mar 19 10:40:46 crc kubenswrapper[4835]: I0319 10:40:46.417935 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37b41589-6db2-46c8-b957-8be6bf5ba11f" path="/var/lib/kubelet/pods/37b41589-6db2-46c8-b957-8be6bf5ba11f/volumes" Mar 19 10:40:47 crc kubenswrapper[4835]: I0319 10:40:47.401969 4835 scope.go:117] "RemoveContainer" containerID="3359a5103ce1a19d3b0272d49b7320107bc1e5fdca65b50c1858632d55b7dee9" Mar 19 10:40:48 crc kubenswrapper[4835]: I0319 10:40:48.158115 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerStarted","Data":"60e0f97ba1cabbaebe6f78c7e3d6884cdd63695a581c1d99b4549f052fbd5350"} Mar 19 10:40:49 crc kubenswrapper[4835]: I0319 10:40:49.071576 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5xkx5"] Mar 19 10:40:49 crc kubenswrapper[4835]: I0319 10:40:49.073115 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5xkx5" podUID="19e77069-d114-4d20-9b43-ad01b683c6eb" containerName="registry-server" containerID="cri-o://4f39b49ae1f93bf2787a7c31df600e0763ba7d5e223bb891d05d2efe35bfa66f" gracePeriod=2 Mar 19 10:40:49 crc kubenswrapper[4835]: I0319 10:40:49.688765 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5xkx5" Mar 19 10:40:49 crc kubenswrapper[4835]: I0319 10:40:49.789860 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19e77069-d114-4d20-9b43-ad01b683c6eb-utilities\") pod \"19e77069-d114-4d20-9b43-ad01b683c6eb\" (UID: \"19e77069-d114-4d20-9b43-ad01b683c6eb\") " Mar 19 10:40:49 crc kubenswrapper[4835]: I0319 10:40:49.789998 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19e77069-d114-4d20-9b43-ad01b683c6eb-catalog-content\") pod \"19e77069-d114-4d20-9b43-ad01b683c6eb\" (UID: \"19e77069-d114-4d20-9b43-ad01b683c6eb\") " Mar 19 10:40:49 crc kubenswrapper[4835]: I0319 10:40:49.790025 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnwbn\" (UniqueName: \"kubernetes.io/projected/19e77069-d114-4d20-9b43-ad01b683c6eb-kube-api-access-tnwbn\") pod \"19e77069-d114-4d20-9b43-ad01b683c6eb\" (UID: 
\"19e77069-d114-4d20-9b43-ad01b683c6eb\") " Mar 19 10:40:49 crc kubenswrapper[4835]: I0319 10:40:49.790507 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19e77069-d114-4d20-9b43-ad01b683c6eb-utilities" (OuterVolumeSpecName: "utilities") pod "19e77069-d114-4d20-9b43-ad01b683c6eb" (UID: "19e77069-d114-4d20-9b43-ad01b683c6eb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:40:49 crc kubenswrapper[4835]: I0319 10:40:49.791076 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19e77069-d114-4d20-9b43-ad01b683c6eb-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 10:40:49 crc kubenswrapper[4835]: I0319 10:40:49.795471 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19e77069-d114-4d20-9b43-ad01b683c6eb-kube-api-access-tnwbn" (OuterVolumeSpecName: "kube-api-access-tnwbn") pod "19e77069-d114-4d20-9b43-ad01b683c6eb" (UID: "19e77069-d114-4d20-9b43-ad01b683c6eb"). InnerVolumeSpecName "kube-api-access-tnwbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:40:49 crc kubenswrapper[4835]: I0319 10:40:49.831641 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n8kcm" podUID="2e9b0984-cab2-4aab-bad4-b4ad7040a40f" containerName="registry-server" probeResult="failure" output=< Mar 19 10:40:49 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:40:49 crc kubenswrapper[4835]: > Mar 19 10:40:49 crc kubenswrapper[4835]: I0319 10:40:49.850656 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19e77069-d114-4d20-9b43-ad01b683c6eb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "19e77069-d114-4d20-9b43-ad01b683c6eb" (UID: "19e77069-d114-4d20-9b43-ad01b683c6eb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:40:49 crc kubenswrapper[4835]: I0319 10:40:49.904308 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19e77069-d114-4d20-9b43-ad01b683c6eb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 10:40:49 crc kubenswrapper[4835]: I0319 10:40:49.904355 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnwbn\" (UniqueName: \"kubernetes.io/projected/19e77069-d114-4d20-9b43-ad01b683c6eb-kube-api-access-tnwbn\") on node \"crc\" DevicePath \"\"" Mar 19 10:40:50 crc kubenswrapper[4835]: I0319 10:40:50.190732 4835 generic.go:334] "Generic (PLEG): container finished" podID="19e77069-d114-4d20-9b43-ad01b683c6eb" containerID="4f39b49ae1f93bf2787a7c31df600e0763ba7d5e223bb891d05d2efe35bfa66f" exitCode=0 Mar 19 10:40:50 crc kubenswrapper[4835]: I0319 10:40:50.191160 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5xkx5" event={"ID":"19e77069-d114-4d20-9b43-ad01b683c6eb","Type":"ContainerDied","Data":"4f39b49ae1f93bf2787a7c31df600e0763ba7d5e223bb891d05d2efe35bfa66f"} Mar 19 10:40:50 crc kubenswrapper[4835]: I0319 10:40:50.191192 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5xkx5" event={"ID":"19e77069-d114-4d20-9b43-ad01b683c6eb","Type":"ContainerDied","Data":"71bb6609b411082ad40200d5b5cdffb2b935bacfdc7c24ed860fcc4dfdc95bac"} Mar 19 10:40:50 crc kubenswrapper[4835]: I0319 10:40:50.191221 4835 scope.go:117] "RemoveContainer" containerID="4f39b49ae1f93bf2787a7c31df600e0763ba7d5e223bb891d05d2efe35bfa66f" Mar 19 10:40:50 crc kubenswrapper[4835]: I0319 10:40:50.191448 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5xkx5" Mar 19 10:40:50 crc kubenswrapper[4835]: I0319 10:40:50.248079 4835 scope.go:117] "RemoveContainer" containerID="6bdcf383e0b3e9e84e618392ab4c1fc1b3181532b9c189a8ff8d238760b63ca4" Mar 19 10:40:50 crc kubenswrapper[4835]: I0319 10:40:50.253150 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5xkx5"] Mar 19 10:40:50 crc kubenswrapper[4835]: I0319 10:40:50.269299 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5xkx5"] Mar 19 10:40:50 crc kubenswrapper[4835]: I0319 10:40:50.415825 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19e77069-d114-4d20-9b43-ad01b683c6eb" path="/var/lib/kubelet/pods/19e77069-d114-4d20-9b43-ad01b683c6eb/volumes" Mar 19 10:40:50 crc kubenswrapper[4835]: I0319 10:40:50.792532 4835 scope.go:117] "RemoveContainer" containerID="3b629df6edb9045b0fdc296014c2c3b2585258d364475c6331c83517897a5d24" Mar 19 10:40:50 crc kubenswrapper[4835]: I0319 10:40:50.857297 4835 scope.go:117] "RemoveContainer" containerID="4f39b49ae1f93bf2787a7c31df600e0763ba7d5e223bb891d05d2efe35bfa66f" Mar 19 10:40:50 crc kubenswrapper[4835]: E0319 10:40:50.858344 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f39b49ae1f93bf2787a7c31df600e0763ba7d5e223bb891d05d2efe35bfa66f\": container with ID starting with 4f39b49ae1f93bf2787a7c31df600e0763ba7d5e223bb891d05d2efe35bfa66f not found: ID does not exist" containerID="4f39b49ae1f93bf2787a7c31df600e0763ba7d5e223bb891d05d2efe35bfa66f" Mar 19 10:40:50 crc kubenswrapper[4835]: I0319 10:40:50.858396 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f39b49ae1f93bf2787a7c31df600e0763ba7d5e223bb891d05d2efe35bfa66f"} err="failed to get container status 
\"4f39b49ae1f93bf2787a7c31df600e0763ba7d5e223bb891d05d2efe35bfa66f\": rpc error: code = NotFound desc = could not find container \"4f39b49ae1f93bf2787a7c31df600e0763ba7d5e223bb891d05d2efe35bfa66f\": container with ID starting with 4f39b49ae1f93bf2787a7c31df600e0763ba7d5e223bb891d05d2efe35bfa66f not found: ID does not exist" Mar 19 10:40:50 crc kubenswrapper[4835]: I0319 10:40:50.858427 4835 scope.go:117] "RemoveContainer" containerID="6bdcf383e0b3e9e84e618392ab4c1fc1b3181532b9c189a8ff8d238760b63ca4" Mar 19 10:40:50 crc kubenswrapper[4835]: E0319 10:40:50.859964 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bdcf383e0b3e9e84e618392ab4c1fc1b3181532b9c189a8ff8d238760b63ca4\": container with ID starting with 6bdcf383e0b3e9e84e618392ab4c1fc1b3181532b9c189a8ff8d238760b63ca4 not found: ID does not exist" containerID="6bdcf383e0b3e9e84e618392ab4c1fc1b3181532b9c189a8ff8d238760b63ca4" Mar 19 10:40:50 crc kubenswrapper[4835]: I0319 10:40:50.860009 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bdcf383e0b3e9e84e618392ab4c1fc1b3181532b9c189a8ff8d238760b63ca4"} err="failed to get container status \"6bdcf383e0b3e9e84e618392ab4c1fc1b3181532b9c189a8ff8d238760b63ca4\": rpc error: code = NotFound desc = could not find container \"6bdcf383e0b3e9e84e618392ab4c1fc1b3181532b9c189a8ff8d238760b63ca4\": container with ID starting with 6bdcf383e0b3e9e84e618392ab4c1fc1b3181532b9c189a8ff8d238760b63ca4 not found: ID does not exist" Mar 19 10:40:50 crc kubenswrapper[4835]: I0319 10:40:50.860036 4835 scope.go:117] "RemoveContainer" containerID="3b629df6edb9045b0fdc296014c2c3b2585258d364475c6331c83517897a5d24" Mar 19 10:40:50 crc kubenswrapper[4835]: E0319 10:40:50.860549 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3b629df6edb9045b0fdc296014c2c3b2585258d364475c6331c83517897a5d24\": container with ID starting with 3b629df6edb9045b0fdc296014c2c3b2585258d364475c6331c83517897a5d24 not found: ID does not exist" containerID="3b629df6edb9045b0fdc296014c2c3b2585258d364475c6331c83517897a5d24" Mar 19 10:40:50 crc kubenswrapper[4835]: I0319 10:40:50.860572 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b629df6edb9045b0fdc296014c2c3b2585258d364475c6331c83517897a5d24"} err="failed to get container status \"3b629df6edb9045b0fdc296014c2c3b2585258d364475c6331c83517897a5d24\": rpc error: code = NotFound desc = could not find container \"3b629df6edb9045b0fdc296014c2c3b2585258d364475c6331c83517897a5d24\": container with ID starting with 3b629df6edb9045b0fdc296014c2c3b2585258d364475c6331c83517897a5d24 not found: ID does not exist" Mar 19 10:40:59 crc kubenswrapper[4835]: I0319 10:40:59.424666 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n8kcm" podUID="2e9b0984-cab2-4aab-bad4-b4ad7040a40f" containerName="registry-server" probeResult="failure" output=< Mar 19 10:40:59 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:40:59 crc kubenswrapper[4835]: > Mar 19 10:41:08 crc kubenswrapper[4835]: I0319 10:41:08.451286 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n8kcm" Mar 19 10:41:08 crc kubenswrapper[4835]: I0319 10:41:08.506721 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n8kcm" Mar 19 10:41:08 crc kubenswrapper[4835]: I0319 10:41:08.603068 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n8kcm"] Mar 19 10:41:08 crc kubenswrapper[4835]: I0319 10:41:08.694203 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-ppf8q"] Mar 19 10:41:08 crc kubenswrapper[4835]: I0319 10:41:08.694479 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ppf8q" podUID="55cc73b7-e015-4881-81d5-075426bff93c" containerName="registry-server" containerID="cri-o://989c26fc9120435ba010b6032af3a5230268c89e4501e0eb8a69f986220c5539" gracePeriod=2 Mar 19 10:41:09 crc kubenswrapper[4835]: I0319 10:41:09.326957 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ppf8q" Mar 19 10:41:09 crc kubenswrapper[4835]: I0319 10:41:09.414139 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-966dh\" (UniqueName: \"kubernetes.io/projected/55cc73b7-e015-4881-81d5-075426bff93c-kube-api-access-966dh\") pod \"55cc73b7-e015-4881-81d5-075426bff93c\" (UID: \"55cc73b7-e015-4881-81d5-075426bff93c\") " Mar 19 10:41:09 crc kubenswrapper[4835]: I0319 10:41:09.414521 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55cc73b7-e015-4881-81d5-075426bff93c-utilities\") pod \"55cc73b7-e015-4881-81d5-075426bff93c\" (UID: \"55cc73b7-e015-4881-81d5-075426bff93c\") " Mar 19 10:41:09 crc kubenswrapper[4835]: I0319 10:41:09.414775 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55cc73b7-e015-4881-81d5-075426bff93c-catalog-content\") pod \"55cc73b7-e015-4881-81d5-075426bff93c\" (UID: \"55cc73b7-e015-4881-81d5-075426bff93c\") " Mar 19 10:41:09 crc kubenswrapper[4835]: I0319 10:41:09.421804 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55cc73b7-e015-4881-81d5-075426bff93c-utilities" (OuterVolumeSpecName: "utilities") pod "55cc73b7-e015-4881-81d5-075426bff93c" (UID: 
"55cc73b7-e015-4881-81d5-075426bff93c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:41:09 crc kubenswrapper[4835]: I0319 10:41:09.426362 4835 generic.go:334] "Generic (PLEG): container finished" podID="55cc73b7-e015-4881-81d5-075426bff93c" containerID="989c26fc9120435ba010b6032af3a5230268c89e4501e0eb8a69f986220c5539" exitCode=0 Mar 19 10:41:09 crc kubenswrapper[4835]: I0319 10:41:09.427530 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ppf8q" Mar 19 10:41:09 crc kubenswrapper[4835]: I0319 10:41:09.428151 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppf8q" event={"ID":"55cc73b7-e015-4881-81d5-075426bff93c","Type":"ContainerDied","Data":"989c26fc9120435ba010b6032af3a5230268c89e4501e0eb8a69f986220c5539"} Mar 19 10:41:09 crc kubenswrapper[4835]: I0319 10:41:09.428179 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppf8q" event={"ID":"55cc73b7-e015-4881-81d5-075426bff93c","Type":"ContainerDied","Data":"399ca121cf93a0c958e61f834604e7ec6d1c4989940d2fdb7d934c632bb0e879"} Mar 19 10:41:09 crc kubenswrapper[4835]: I0319 10:41:09.428194 4835 scope.go:117] "RemoveContainer" containerID="989c26fc9120435ba010b6032af3a5230268c89e4501e0eb8a69f986220c5539" Mar 19 10:41:09 crc kubenswrapper[4835]: I0319 10:41:09.467955 4835 scope.go:117] "RemoveContainer" containerID="7bf5d95bdcbed49c39fa38bd773725bcc4ae8485ea078ff4a1ef3716ff003de7" Mar 19 10:41:09 crc kubenswrapper[4835]: I0319 10:41:09.519550 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55cc73b7-e015-4881-81d5-075426bff93c-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:09 crc kubenswrapper[4835]: I0319 10:41:09.589908 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/55cc73b7-e015-4881-81d5-075426bff93c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55cc73b7-e015-4881-81d5-075426bff93c" (UID: "55cc73b7-e015-4881-81d5-075426bff93c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:41:09 crc kubenswrapper[4835]: I0319 10:41:09.623459 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55cc73b7-e015-4881-81d5-075426bff93c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:10 crc kubenswrapper[4835]: I0319 10:41:10.087509 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55cc73b7-e015-4881-81d5-075426bff93c-kube-api-access-966dh" (OuterVolumeSpecName: "kube-api-access-966dh") pod "55cc73b7-e015-4881-81d5-075426bff93c" (UID: "55cc73b7-e015-4881-81d5-075426bff93c"). InnerVolumeSpecName "kube-api-access-966dh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:41:10 crc kubenswrapper[4835]: I0319 10:41:10.120542 4835 scope.go:117] "RemoveContainer" containerID="1290a5f0fc30aa2150895371660b1417fdf1c53bd9293697880523528bb37224" Mar 19 10:41:10 crc kubenswrapper[4835]: I0319 10:41:10.134145 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-966dh\" (UniqueName: \"kubernetes.io/projected/55cc73b7-e015-4881-81d5-075426bff93c-kube-api-access-966dh\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:10 crc kubenswrapper[4835]: I0319 10:41:10.267891 4835 scope.go:117] "RemoveContainer" containerID="989c26fc9120435ba010b6032af3a5230268c89e4501e0eb8a69f986220c5539" Mar 19 10:41:10 crc kubenswrapper[4835]: E0319 10:41:10.269057 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"989c26fc9120435ba010b6032af3a5230268c89e4501e0eb8a69f986220c5539\": container with ID starting with 
989c26fc9120435ba010b6032af3a5230268c89e4501e0eb8a69f986220c5539 not found: ID does not exist" containerID="989c26fc9120435ba010b6032af3a5230268c89e4501e0eb8a69f986220c5539" Mar 19 10:41:10 crc kubenswrapper[4835]: I0319 10:41:10.269110 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"989c26fc9120435ba010b6032af3a5230268c89e4501e0eb8a69f986220c5539"} err="failed to get container status \"989c26fc9120435ba010b6032af3a5230268c89e4501e0eb8a69f986220c5539\": rpc error: code = NotFound desc = could not find container \"989c26fc9120435ba010b6032af3a5230268c89e4501e0eb8a69f986220c5539\": container with ID starting with 989c26fc9120435ba010b6032af3a5230268c89e4501e0eb8a69f986220c5539 not found: ID does not exist" Mar 19 10:41:10 crc kubenswrapper[4835]: I0319 10:41:10.269137 4835 scope.go:117] "RemoveContainer" containerID="7bf5d95bdcbed49c39fa38bd773725bcc4ae8485ea078ff4a1ef3716ff003de7" Mar 19 10:41:10 crc kubenswrapper[4835]: E0319 10:41:10.269534 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bf5d95bdcbed49c39fa38bd773725bcc4ae8485ea078ff4a1ef3716ff003de7\": container with ID starting with 7bf5d95bdcbed49c39fa38bd773725bcc4ae8485ea078ff4a1ef3716ff003de7 not found: ID does not exist" containerID="7bf5d95bdcbed49c39fa38bd773725bcc4ae8485ea078ff4a1ef3716ff003de7" Mar 19 10:41:10 crc kubenswrapper[4835]: I0319 10:41:10.269578 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bf5d95bdcbed49c39fa38bd773725bcc4ae8485ea078ff4a1ef3716ff003de7"} err="failed to get container status \"7bf5d95bdcbed49c39fa38bd773725bcc4ae8485ea078ff4a1ef3716ff003de7\": rpc error: code = NotFound desc = could not find container \"7bf5d95bdcbed49c39fa38bd773725bcc4ae8485ea078ff4a1ef3716ff003de7\": container with ID starting with 7bf5d95bdcbed49c39fa38bd773725bcc4ae8485ea078ff4a1ef3716ff003de7 not found: ID does not 
exist" Mar 19 10:41:10 crc kubenswrapper[4835]: I0319 10:41:10.269606 4835 scope.go:117] "RemoveContainer" containerID="1290a5f0fc30aa2150895371660b1417fdf1c53bd9293697880523528bb37224" Mar 19 10:41:10 crc kubenswrapper[4835]: E0319 10:41:10.272327 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1290a5f0fc30aa2150895371660b1417fdf1c53bd9293697880523528bb37224\": container with ID starting with 1290a5f0fc30aa2150895371660b1417fdf1c53bd9293697880523528bb37224 not found: ID does not exist" containerID="1290a5f0fc30aa2150895371660b1417fdf1c53bd9293697880523528bb37224" Mar 19 10:41:10 crc kubenswrapper[4835]: I0319 10:41:10.272363 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1290a5f0fc30aa2150895371660b1417fdf1c53bd9293697880523528bb37224"} err="failed to get container status \"1290a5f0fc30aa2150895371660b1417fdf1c53bd9293697880523528bb37224\": rpc error: code = NotFound desc = could not find container \"1290a5f0fc30aa2150895371660b1417fdf1c53bd9293697880523528bb37224\": container with ID starting with 1290a5f0fc30aa2150895371660b1417fdf1c53bd9293697880523528bb37224 not found: ID does not exist" Mar 19 10:41:10 crc kubenswrapper[4835]: I0319 10:41:10.373522 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ppf8q"] Mar 19 10:41:10 crc kubenswrapper[4835]: I0319 10:41:10.387490 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ppf8q"] Mar 19 10:41:10 crc kubenswrapper[4835]: I0319 10:41:10.414755 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55cc73b7-e015-4881-81d5-075426bff93c" path="/var/lib/kubelet/pods/55cc73b7-e015-4881-81d5-075426bff93c/volumes" Mar 19 10:41:19 crc kubenswrapper[4835]: I0319 10:41:19.046301 4835 scope.go:117] "RemoveContainer" 
containerID="1ffebf5a33426282c076f871af7b30a206b21811e9c560ec37a548272d4915bd" Mar 19 10:41:19 crc kubenswrapper[4835]: I0319 10:41:19.103465 4835 scope.go:117] "RemoveContainer" containerID="e2472834fbcf2cb21e367e671de40b2cd7c30d7b5caa60d7db218a02d0979136" Mar 19 10:42:00 crc kubenswrapper[4835]: I0319 10:42:00.540295 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565282-llqff"] Mar 19 10:42:00 crc kubenswrapper[4835]: E0319 10:42:00.550727 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19e77069-d114-4d20-9b43-ad01b683c6eb" containerName="extract-utilities" Mar 19 10:42:00 crc kubenswrapper[4835]: I0319 10:42:00.550790 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e77069-d114-4d20-9b43-ad01b683c6eb" containerName="extract-utilities" Mar 19 10:42:00 crc kubenswrapper[4835]: E0319 10:42:00.551370 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37b41589-6db2-46c8-b957-8be6bf5ba11f" containerName="extract-content" Mar 19 10:42:00 crc kubenswrapper[4835]: I0319 10:42:00.551382 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="37b41589-6db2-46c8-b957-8be6bf5ba11f" containerName="extract-content" Mar 19 10:42:00 crc kubenswrapper[4835]: E0319 10:42:00.551402 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19e77069-d114-4d20-9b43-ad01b683c6eb" containerName="extract-content" Mar 19 10:42:00 crc kubenswrapper[4835]: I0319 10:42:00.551408 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e77069-d114-4d20-9b43-ad01b683c6eb" containerName="extract-content" Mar 19 10:42:00 crc kubenswrapper[4835]: E0319 10:42:00.551423 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55cc73b7-e015-4881-81d5-075426bff93c" containerName="extract-utilities" Mar 19 10:42:00 crc kubenswrapper[4835]: I0319 10:42:00.551429 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="55cc73b7-e015-4881-81d5-075426bff93c" 
containerName="extract-utilities" Mar 19 10:42:00 crc kubenswrapper[4835]: E0319 10:42:00.551442 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37b41589-6db2-46c8-b957-8be6bf5ba11f" containerName="extract-utilities" Mar 19 10:42:00 crc kubenswrapper[4835]: I0319 10:42:00.551448 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="37b41589-6db2-46c8-b957-8be6bf5ba11f" containerName="extract-utilities" Mar 19 10:42:00 crc kubenswrapper[4835]: E0319 10:42:00.551476 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19e77069-d114-4d20-9b43-ad01b683c6eb" containerName="registry-server" Mar 19 10:42:00 crc kubenswrapper[4835]: I0319 10:42:00.551482 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e77069-d114-4d20-9b43-ad01b683c6eb" containerName="registry-server" Mar 19 10:42:00 crc kubenswrapper[4835]: E0319 10:42:00.551501 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55cc73b7-e015-4881-81d5-075426bff93c" containerName="extract-content" Mar 19 10:42:00 crc kubenswrapper[4835]: I0319 10:42:00.551506 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="55cc73b7-e015-4881-81d5-075426bff93c" containerName="extract-content" Mar 19 10:42:00 crc kubenswrapper[4835]: E0319 10:42:00.551517 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc08d060-225f-4f8d-bb5e-2de67700397b" containerName="oc" Mar 19 10:42:00 crc kubenswrapper[4835]: I0319 10:42:00.551531 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc08d060-225f-4f8d-bb5e-2de67700397b" containerName="oc" Mar 19 10:42:00 crc kubenswrapper[4835]: E0319 10:42:00.551544 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37b41589-6db2-46c8-b957-8be6bf5ba11f" containerName="registry-server" Mar 19 10:42:00 crc kubenswrapper[4835]: I0319 10:42:00.551550 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="37b41589-6db2-46c8-b957-8be6bf5ba11f" containerName="registry-server" 
Mar 19 10:42:00 crc kubenswrapper[4835]: E0319 10:42:00.551571 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55cc73b7-e015-4881-81d5-075426bff93c" containerName="registry-server" Mar 19 10:42:00 crc kubenswrapper[4835]: I0319 10:42:00.551577 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="55cc73b7-e015-4881-81d5-075426bff93c" containerName="registry-server" Mar 19 10:42:00 crc kubenswrapper[4835]: I0319 10:42:00.553202 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="19e77069-d114-4d20-9b43-ad01b683c6eb" containerName="registry-server" Mar 19 10:42:00 crc kubenswrapper[4835]: I0319 10:42:00.553234 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc08d060-225f-4f8d-bb5e-2de67700397b" containerName="oc" Mar 19 10:42:00 crc kubenswrapper[4835]: I0319 10:42:00.553245 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="55cc73b7-e015-4881-81d5-075426bff93c" containerName="registry-server" Mar 19 10:42:00 crc kubenswrapper[4835]: I0319 10:42:00.553264 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="37b41589-6db2-46c8-b957-8be6bf5ba11f" containerName="registry-server" Mar 19 10:42:00 crc kubenswrapper[4835]: I0319 10:42:00.569094 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565282-llqff" Mar 19 10:42:00 crc kubenswrapper[4835]: I0319 10:42:00.587561 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:42:00 crc kubenswrapper[4835]: I0319 10:42:00.587563 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:42:00 crc kubenswrapper[4835]: I0319 10:42:00.587576 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 10:42:00 crc kubenswrapper[4835]: I0319 10:42:00.709255 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565282-llqff"] Mar 19 10:42:00 crc kubenswrapper[4835]: I0319 10:42:00.742087 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wns2q\" (UniqueName: \"kubernetes.io/projected/7551a433-d892-46ce-addb-cbeb079c9ad3-kube-api-access-wns2q\") pod \"auto-csr-approver-29565282-llqff\" (UID: \"7551a433-d892-46ce-addb-cbeb079c9ad3\") " pod="openshift-infra/auto-csr-approver-29565282-llqff" Mar 19 10:42:00 crc kubenswrapper[4835]: I0319 10:42:00.843835 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wns2q\" (UniqueName: \"kubernetes.io/projected/7551a433-d892-46ce-addb-cbeb079c9ad3-kube-api-access-wns2q\") pod \"auto-csr-approver-29565282-llqff\" (UID: \"7551a433-d892-46ce-addb-cbeb079c9ad3\") " pod="openshift-infra/auto-csr-approver-29565282-llqff" Mar 19 10:42:00 crc kubenswrapper[4835]: I0319 10:42:00.902149 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wns2q\" (UniqueName: \"kubernetes.io/projected/7551a433-d892-46ce-addb-cbeb079c9ad3-kube-api-access-wns2q\") pod \"auto-csr-approver-29565282-llqff\" (UID: \"7551a433-d892-46ce-addb-cbeb079c9ad3\") " 
pod="openshift-infra/auto-csr-approver-29565282-llqff" Mar 19 10:42:00 crc kubenswrapper[4835]: I0319 10:42:00.951987 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565282-llqff" Mar 19 10:42:03 crc kubenswrapper[4835]: I0319 10:42:03.019935 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565282-llqff"] Mar 19 10:42:04 crc kubenswrapper[4835]: I0319 10:42:04.040371 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565282-llqff" event={"ID":"7551a433-d892-46ce-addb-cbeb079c9ad3","Type":"ContainerStarted","Data":"1ac6cf2d376ba4b6283790e96cce341c39883898bd0d1bd6dd897b5bbee1a4fd"} Mar 19 10:42:07 crc kubenswrapper[4835]: I0319 10:42:07.075209 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565282-llqff" event={"ID":"7551a433-d892-46ce-addb-cbeb079c9ad3","Type":"ContainerStarted","Data":"0dbd4ad611789d462a3c4ce7e7ea46855d6928423d6f8d7ef03bbd47852ba1c3"} Mar 19 10:42:07 crc kubenswrapper[4835]: I0319 10:42:07.110441 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565282-llqff" podStartSLOduration=4.765466702 podStartE2EDuration="7.10641581s" podCreationTimestamp="2026-03-19 10:42:00 +0000 UTC" firstStartedPulling="2026-03-19 10:42:03.111810865 +0000 UTC m=+4777.960409442" lastFinishedPulling="2026-03-19 10:42:05.452759953 +0000 UTC m=+4780.301358550" observedRunningTime="2026-03-19 10:42:07.102824952 +0000 UTC m=+4781.951423559" watchObservedRunningTime="2026-03-19 10:42:07.10641581 +0000 UTC m=+4781.955014397" Mar 19 10:42:09 crc kubenswrapper[4835]: I0319 10:42:09.099093 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565282-llqff" 
event={"ID":"7551a433-d892-46ce-addb-cbeb079c9ad3","Type":"ContainerDied","Data":"0dbd4ad611789d462a3c4ce7e7ea46855d6928423d6f8d7ef03bbd47852ba1c3"} Mar 19 10:42:09 crc kubenswrapper[4835]: I0319 10:42:09.100075 4835 generic.go:334] "Generic (PLEG): container finished" podID="7551a433-d892-46ce-addb-cbeb079c9ad3" containerID="0dbd4ad611789d462a3c4ce7e7ea46855d6928423d6f8d7ef03bbd47852ba1c3" exitCode=0 Mar 19 10:42:11 crc kubenswrapper[4835]: I0319 10:42:11.412583 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565282-llqff" Mar 19 10:42:11 crc kubenswrapper[4835]: I0319 10:42:11.543223 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wns2q\" (UniqueName: \"kubernetes.io/projected/7551a433-d892-46ce-addb-cbeb079c9ad3-kube-api-access-wns2q\") pod \"7551a433-d892-46ce-addb-cbeb079c9ad3\" (UID: \"7551a433-d892-46ce-addb-cbeb079c9ad3\") " Mar 19 10:42:11 crc kubenswrapper[4835]: I0319 10:42:11.584874 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7551a433-d892-46ce-addb-cbeb079c9ad3-kube-api-access-wns2q" (OuterVolumeSpecName: "kube-api-access-wns2q") pod "7551a433-d892-46ce-addb-cbeb079c9ad3" (UID: "7551a433-d892-46ce-addb-cbeb079c9ad3"). InnerVolumeSpecName "kube-api-access-wns2q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:42:11 crc kubenswrapper[4835]: I0319 10:42:11.645886 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wns2q\" (UniqueName: \"kubernetes.io/projected/7551a433-d892-46ce-addb-cbeb079c9ad3-kube-api-access-wns2q\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:12 crc kubenswrapper[4835]: I0319 10:42:12.132848 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565282-llqff" event={"ID":"7551a433-d892-46ce-addb-cbeb079c9ad3","Type":"ContainerDied","Data":"1ac6cf2d376ba4b6283790e96cce341c39883898bd0d1bd6dd897b5bbee1a4fd"} Mar 19 10:42:12 crc kubenswrapper[4835]: I0319 10:42:12.133111 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ac6cf2d376ba4b6283790e96cce341c39883898bd0d1bd6dd897b5bbee1a4fd" Mar 19 10:42:12 crc kubenswrapper[4835]: I0319 10:42:12.132937 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565282-llqff" Mar 19 10:42:12 crc kubenswrapper[4835]: I0319 10:42:12.687295 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565276-c962s"] Mar 19 10:42:12 crc kubenswrapper[4835]: I0319 10:42:12.698708 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565276-c962s"] Mar 19 10:42:14 crc kubenswrapper[4835]: I0319 10:42:14.426079 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0f42988-08ec-4ebe-941f-c1984a620a40" path="/var/lib/kubelet/pods/c0f42988-08ec-4ebe-941f-c1984a620a40/volumes" Mar 19 10:42:19 crc kubenswrapper[4835]: I0319 10:42:19.367517 4835 scope.go:117] "RemoveContainer" containerID="1346ffcb5f9d90af9c6a1c3eed1b17581df6108357cb93f2423f2e8665e87bdc" Mar 19 10:42:47 crc kubenswrapper[4835]: I0319 10:42:47.641607 4835 patch_prober.go:28] interesting pod/oauth-openshift-766799bf97-q8249 
container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.66:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:42:47 crc kubenswrapper[4835]: I0319 10:42:47.641608 4835 patch_prober.go:28] interesting pod/oauth-openshift-766799bf97-q8249 container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.66:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:42:47 crc kubenswrapper[4835]: I0319 10:42:47.645476 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-766799bf97-q8249" podUID="a3421bac-5e2c-494d-ba71-500a3ead9076" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.66:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:42:47 crc kubenswrapper[4835]: I0319 10:42:47.645475 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-766799bf97-q8249" podUID="a3421bac-5e2c-494d-ba71-500a3ead9076" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.66:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:00 crc kubenswrapper[4835]: I0319 10:43:00.798220 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-575wd" podUID="017e5e75-5952-4240-8349-d16c367d4bed" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:01 crc kubenswrapper[4835]: I0319 10:43:01.797186 4835 patch_prober.go:28] interesting pod/nmstate-webhook-5f558f5558-zpbgp 
container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.90:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:01 crc kubenswrapper[4835]: I0319 10:43:01.803329 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-5f558f5558-zpbgp" podUID="a5ae9a03-b125-4072-9ec8-fddd19b7002e" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.90:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:02 crc kubenswrapper[4835]: I0319 10:43:02.075858 4835 trace.go:236] Trace[696132409]: "Calculate volume metrics of ovndbcluster-sb-etc-ovn for pod openstack/ovsdbserver-sb-0" (19-Mar-2026 10:43:00.098) (total time: 1973ms): Mar 19 10:43:02 crc kubenswrapper[4835]: Trace[696132409]: [1.973232794s] [1.973232794s] END Mar 19 10:43:02 crc kubenswrapper[4835]: I0319 10:43:02.134869 4835 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6lp28 container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:02 crc kubenswrapper[4835]: I0319 10:43:02.135148 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" podUID="0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:02 crc kubenswrapper[4835]: I0319 10:43:02.134871 4835 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6lp28 container/packageserver 
namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:02 crc kubenswrapper[4835]: I0319 10:43:02.135321 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" podUID="0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:04 crc kubenswrapper[4835]: I0319 10:43:04.128250 4835 patch_prober.go:28] interesting pod/metrics-server-8646b978bb-zprxl container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.85:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:04 crc kubenswrapper[4835]: I0319 10:43:04.128652 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" podUID="da878929-ea5e-40f0-8eaf-7f6b6e86f62c" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.85:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:05 crc kubenswrapper[4835]: I0319 10:43:05.515960 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-44rxf" podUID="8173c77e-48ec-44fc-9be7-67381528f78a" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:06 crc kubenswrapper[4835]: I0319 10:43:06.422507 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:43:06 crc kubenswrapper[4835]: I0319 10:43:06.422931 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:43:06 crc kubenswrapper[4835]: I0319 10:43:06.573833 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-zmwm8" podUID="5ea1b1ba-826f-4abe-9c56-caedc3a178f9" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:06 crc kubenswrapper[4835]: I0319 10:43:06.574210 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-zmwm8" podUID="5ea1b1ba-826f-4abe-9c56-caedc3a178f9" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:07 crc kubenswrapper[4835]: I0319 10:43:07.553325 4835 patch_prober.go:28] interesting pod/oauth-openshift-766799bf97-q8249 container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.66:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:07 crc kubenswrapper[4835]: I0319 10:43:07.553325 4835 patch_prober.go:28] interesting pod/oauth-openshift-766799bf97-q8249 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get 
\"https://10.217.0.66:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:07 crc kubenswrapper[4835]: I0319 10:43:07.553707 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-766799bf97-q8249" podUID="a3421bac-5e2c-494d-ba71-500a3ead9076" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.66:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:07 crc kubenswrapper[4835]: I0319 10:43:07.553762 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-766799bf97-q8249" podUID="a3421bac-5e2c-494d-ba71-500a3ead9076" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.66:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:07 crc kubenswrapper[4835]: I0319 10:43:07.641929 4835 patch_prober.go:28] interesting pod/route-controller-manager-5f9b46c45c-4586h container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:07 crc kubenswrapper[4835]: I0319 10:43:07.641974 4835 patch_prober.go:28] interesting pod/route-controller-manager-5f9b46c45c-4586h container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:07 crc kubenswrapper[4835]: I0319 10:43:07.641996 4835 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h" podUID="ef156d19-8841-4be5-a739-bd07a7789ea3" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:07 crc kubenswrapper[4835]: I0319 10:43:07.642018 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h" podUID="ef156d19-8841-4be5-a739-bd07a7789ea3" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:08 crc kubenswrapper[4835]: I0319 10:43:08.536306 4835 patch_prober.go:28] interesting pod/logging-loki-gateway-5d45f4dcf6-4f49c container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:08 crc kubenswrapper[4835]: I0319 10:43:08.536387 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" podUID="2ff291e5-8364-4627-be0f-51c9532e46ee" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:08 crc kubenswrapper[4835]: I0319 10:43:08.606348 4835 patch_prober.go:28] interesting pod/logging-loki-gateway-5d45f4dcf6-hkx57 container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:08 crc kubenswrapper[4835]: I0319 10:43:08.606420 4835 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" podUID="75ce358f-5f03-401f-bdf8-27a7e5309227" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:09 crc kubenswrapper[4835]: I0319 10:43:09.381156 4835 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-qcvsc container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:09 crc kubenswrapper[4835]: I0319 10:43:09.381650 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-qcvsc" podUID="83538569-9d57-4fdb-83b2-03dbd62bad4d" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:15 crc kubenswrapper[4835]: I0319 10:43:15.035982 4835 trace.go:236] Trace[1329509346]: "Calculate volume metrics of ovndbcluster-nb-etc-ovn for pod openstack/ovsdbserver-nb-0" (19-Mar-2026 10:43:12.671) (total time: 2346ms): Mar 19 10:43:15 crc kubenswrapper[4835]: Trace[1329509346]: [2.346203532s] [2.346203532s] END Mar 19 10:43:15 crc kubenswrapper[4835]: I0319 10:43:15.035976 4835 trace.go:236] Trace[1522368780]: "Calculate volume metrics of ovnkube-identity-cm for pod openshift-network-node-identity/network-node-identity-vrzqb" (19-Mar-2026 10:43:13.974) (total time: 1045ms): Mar 19 10:43:15 crc kubenswrapper[4835]: Trace[1522368780]: [1.045827264s] [1.045827264s] END Mar 19 10:43:15 crc kubenswrapper[4835]: I0319 10:43:15.088136 4835 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-v6bxs" podUID="02720ef5-c00c-48a4-af96-4d82f28bf051" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:15 crc kubenswrapper[4835]: I0319 10:43:15.118204 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-mgjjt" podUID="aea377c2-d271-45c9-a574-0d1fe89caac5" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:15 crc kubenswrapper[4835]: I0319 10:43:15.169512 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-d79pg" podUID="e86ab409-7643-4e6f-9129-6f90e5b6bf1c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.115:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:15 crc kubenswrapper[4835]: I0319 10:43:15.211024 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-v6bxs" podUID="02720ef5-c00c-48a4-af96-4d82f28bf051" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:15 crc kubenswrapper[4835]: I0319 10:43:15.211479 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-4nqvq" podUID="2e429fe7-b053-43e9-a640-f14af7094e62" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:15 crc kubenswrapper[4835]: 
I0319 10:43:15.296383 4835 patch_prober.go:28] interesting pod/monitoring-plugin-7b6f7975cf-xd4n6 container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.86:9443/health\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:15 crc kubenswrapper[4835]: I0319 10:43:15.296794 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-7b6f7975cf-xd4n6" podUID="5573de0d-e8de-4c32-b778-1cf95556c219" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.86:9443/health\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:15 crc kubenswrapper[4835]: I0319 10:43:15.296453 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-mgjjt" podUID="aea377c2-d271-45c9-a574-0d1fe89caac5" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:15 crc kubenswrapper[4835]: I0319 10:43:15.297418 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-wvb5n" podUID="2173f655-b411-4dfe-8247-59ac02942c25" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:15 crc kubenswrapper[4835]: I0319 10:43:15.380085 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-7b7zt" podUID="74161dd4-df25-4f1b-8924-c9086688463d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:15 crc 
kubenswrapper[4835]: I0319 10:43:15.380507 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-4nqvq" podUID="2e429fe7-b053-43e9-a640-f14af7094e62" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:15 crc kubenswrapper[4835]: I0319 10:43:15.380927 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-wvb5n" podUID="2173f655-b411-4dfe-8247-59ac02942c25" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:15 crc kubenswrapper[4835]: I0319 10:43:15.381092 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-7b7zt" podUID="74161dd4-df25-4f1b-8924-c9086688463d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:15 crc kubenswrapper[4835]: I0319 10:43:15.381015 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-sgjfn" podUID="637282e5-c56d-49ea-ac96-f299fb3661f2" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:16 crc kubenswrapper[4835]: I0319 10:43:16.239914 4835 patch_prober.go:28] interesting pod/controller-manager-5675974fc9-xqhf6 container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" 
start-of-body= Mar 19 10:43:16 crc kubenswrapper[4835]: I0319 10:43:16.239972 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" podUID="d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:16 crc kubenswrapper[4835]: I0319 10:43:16.239967 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-g4bvm" podUID="9b388096-29e9-4547-b43a-ec3a7935572b" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.98:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:16 crc kubenswrapper[4835]: I0319 10:43:16.280927 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-44rxf" podUID="8173c77e-48ec-44fc-9be7-67381528f78a" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:16 crc kubenswrapper[4835]: I0319 10:43:16.283551 4835 patch_prober.go:28] interesting pod/controller-manager-5675974fc9-xqhf6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:16 crc kubenswrapper[4835]: I0319 10:43:16.283589 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" podUID="d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled 
while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:16 crc kubenswrapper[4835]: I0319 10:43:16.364293 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-s94g8" podUID="91c2a119-b77e-4cf1-9be3-779c47d4643b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:16 crc kubenswrapper[4835]: I0319 10:43:16.364328 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-44rxf" podUID="8173c77e-48ec-44fc-9be7-67381528f78a" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:16 crc kubenswrapper[4835]: I0319 10:43:16.364416 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-g4bvm" podUID="9b388096-29e9-4547-b43a-ec3a7935572b" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.98:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:16 crc kubenswrapper[4835]: I0319 10:43:16.364496 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-64vf7" podUID="a59c67ae-bcf7-404f-ad32-233f38450f65" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:16 crc kubenswrapper[4835]: I0319 10:43:16.364542 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-64vf7" podUID="a59c67ae-bcf7-404f-ad32-233f38450f65" containerName="manager" probeResult="failure" output="Get 
\"http://10.217.0.124:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:16 crc kubenswrapper[4835]: I0319 10:43:16.364894 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-44rxf" podUID="8173c77e-48ec-44fc-9be7-67381528f78a" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:16 crc kubenswrapper[4835]: I0319 10:43:16.365511 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-s94g8" podUID="91c2a119-b77e-4cf1-9be3-779c47d4643b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:17 crc kubenswrapper[4835]: I0319 10:43:17.286397 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-6mgg7" podUID="e8f17f81-d3ac-4c40-b346-c3eac9cc70d2" containerName="registry-server" probeResult="failure" output=< Mar 19 10:43:17 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:43:17 crc kubenswrapper[4835]: > Mar 19 10:43:17 crc kubenswrapper[4835]: I0319 10:43:17.386041 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/certified-operators-bmh95" podUID="8b78cdf3-88ba-4ab2-9966-492863d9206c" containerName="registry-server" probeResult="failure" output=< Mar 19 10:43:17 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:43:17 crc kubenswrapper[4835]: > Mar 19 10:43:17 crc kubenswrapper[4835]: I0319 10:43:17.386574 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-l2vsx" podUID="5d586c5b-694e-4ac9-aa09-0d973cdad7e0" containerName="registry-server" 
probeResult="failure" output=< Mar 19 10:43:17 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:43:17 crc kubenswrapper[4835]: > Mar 19 10:43:17 crc kubenswrapper[4835]: I0319 10:43:17.387116 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-6mgg7" podUID="e8f17f81-d3ac-4c40-b346-c3eac9cc70d2" containerName="registry-server" probeResult="failure" output=< Mar 19 10:43:17 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:43:17 crc kubenswrapper[4835]: > Mar 19 10:43:17 crc kubenswrapper[4835]: I0319 10:43:17.387191 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-marketplace-l2vsx" podUID="5d586c5b-694e-4ac9-aa09-0d973cdad7e0" containerName="registry-server" probeResult="failure" output=< Mar 19 10:43:17 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:43:17 crc kubenswrapper[4835]: > Mar 19 10:43:17 crc kubenswrapper[4835]: I0319 10:43:17.387545 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/certified-operators-bmh95" podUID="8b78cdf3-88ba-4ab2-9966-492863d9206c" containerName="registry-server" probeResult="failure" output=< Mar 19 10:43:17 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:43:17 crc kubenswrapper[4835]: > Mar 19 10:43:17 crc kubenswrapper[4835]: I0319 10:43:17.481114 4835 patch_prober.go:28] interesting pod/console-5d9fcbf4f8-5btdf container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.142:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:17 crc kubenswrapper[4835]: I0319 10:43:17.481172 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-5d9fcbf4f8-5btdf" 
podUID="d524b5e2-97d1-47cc-8474-113fa8e6016a" containerName="console" probeResult="failure" output="Get \"https://10.217.0.142:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:17 crc kubenswrapper[4835]: I0319 10:43:17.634257 4835 patch_prober.go:28] interesting pod/oauth-openshift-766799bf97-q8249 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.66:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:17 crc kubenswrapper[4835]: I0319 10:43:17.634335 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-766799bf97-q8249" podUID="a3421bac-5e2c-494d-ba71-500a3ead9076" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.66:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:17 crc kubenswrapper[4835]: I0319 10:43:17.634261 4835 patch_prober.go:28] interesting pod/oauth-openshift-766799bf97-q8249 container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.66:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:17 crc kubenswrapper[4835]: I0319 10:43:17.634416 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-766799bf97-q8249" podUID="a3421bac-5e2c-494d-ba71-500a3ead9076" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.66:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:17 crc kubenswrapper[4835]: I0319 10:43:17.716157 4835 patch_prober.go:28] interesting pod/route-controller-manager-5f9b46c45c-4586h container/route-controller-manager 
namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:17 crc kubenswrapper[4835]: I0319 10:43:17.716221 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h" podUID="ef156d19-8841-4be5-a739-bd07a7789ea3" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:17 crc kubenswrapper[4835]: I0319 10:43:17.716497 4835 patch_prober.go:28] interesting pod/route-controller-manager-5f9b46c45c-4586h container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:17 crc kubenswrapper[4835]: I0319 10:43:17.716565 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h" podUID="ef156d19-8841-4be5-a739-bd07a7789ea3" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:18 crc kubenswrapper[4835]: I0319 10:43:18.066185 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-7pjhg" podUID="d95e8e04-5d17-45e5-aeb8-d6857bfc9d01" containerName="registry-server" probeResult="failure" output=< Mar 19 10:43:18 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:43:18 crc kubenswrapper[4835]: > Mar 19 10:43:18 crc kubenswrapper[4835]: I0319 10:43:18.070702 4835 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-7pjhg" podUID="d95e8e04-5d17-45e5-aeb8-d6857bfc9d01" containerName="registry-server" probeResult="failure" output=< Mar 19 10:43:18 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:43:18 crc kubenswrapper[4835]: > Mar 19 10:43:18 crc kubenswrapper[4835]: I0319 10:43:18.305201 4835 patch_prober.go:28] interesting pod/loki-operator-controller-manager-668b645cb5-fhzgr container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.49:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:18 crc kubenswrapper[4835]: I0319 10:43:18.305271 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-668b645cb5-fhzgr" podUID="3a2991b5-2e25-4afa-9941-d955aad0dc37" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.49:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:18 crc kubenswrapper[4835]: I0319 10:43:18.305701 4835 patch_prober.go:28] interesting pod/loki-operator-controller-manager-668b645cb5-fhzgr container/manager namespace/openshift-operators-redhat: Liveness probe status=failure output="Get \"http://10.217.0.49:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:18 crc kubenswrapper[4835]: I0319 10:43:18.307645 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators-redhat/loki-operator-controller-manager-668b645cb5-fhzgr" podUID="3a2991b5-2e25-4afa-9941-d955aad0dc37" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.49:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:18 crc kubenswrapper[4835]: I0319 
10:43:18.637930 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/speaker-zfhzl" podUID="3f751b70-2b62-41da-b71c-ce7039840e3e" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:18 crc kubenswrapper[4835]: I0319 10:43:18.637947 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/speaker-zfhzl" podUID="3f751b70-2b62-41da-b71c-ce7039840e3e" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:19 crc kubenswrapper[4835]: I0319 10:43:19.845592 4835 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-vrd5p container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:19 crc kubenswrapper[4835]: I0319 10:43:19.845662 4835 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-vrd5p container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:19 crc kubenswrapper[4835]: I0319 10:43:19.846828 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrd5p" podUID="b9eb63f6-7fbc-4f82-946b-8844751bb402" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" 
Mar 19 10:43:19 crc kubenswrapper[4835]: I0319 10:43:19.846894 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrd5p" podUID="b9eb63f6-7fbc-4f82-946b-8844751bb402" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:20 crc kubenswrapper[4835]: I0319 10:43:20.091896 4835 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-qsz5n container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.29:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:20 crc kubenswrapper[4835]: I0319 10:43:20.091907 4835 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-qsz5n container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.29:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:20 crc kubenswrapper[4835]: I0319 10:43:20.092209 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n" podUID="95fe9c35-69f6-4b60-a725-c2f0d8a34c99" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.29:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:20 crc kubenswrapper[4835]: I0319 10:43:20.092260 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n" podUID="95fe9c35-69f6-4b60-a725-c2f0d8a34c99" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.29:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 
19 10:43:20 crc kubenswrapper[4835]: I0319 10:43:20.384525 4835 patch_prober.go:28] interesting pod/perses-operator-54476d58cc-x6mtx container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.30:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:20 crc kubenswrapper[4835]: I0319 10:43:20.384549 4835 patch_prober.go:28] interesting pod/perses-operator-54476d58cc-x6mtx container/perses-operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.30:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:20 crc kubenswrapper[4835]: I0319 10:43:20.384622 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-54476d58cc-x6mtx" podUID="41d1090f-7ab4-4820-a742-dca791692d0f" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.30:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:20 crc kubenswrapper[4835]: I0319 10:43:20.384632 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/perses-operator-54476d58cc-x6mtx" podUID="41d1090f-7ab4-4820-a742-dca791692d0f" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.30:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:20 crc kubenswrapper[4835]: I0319 10:43:20.722288 4835 patch_prober.go:28] interesting pod/thanos-querier-5b688b744b-2d86m container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.83:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:20 crc kubenswrapper[4835]: I0319 10:43:20.722382 4835 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" podUID="c02bf820-ae15-4787-964c-de647e0e6ebd" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.83:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:20 crc kubenswrapper[4835]: I0319 10:43:20.786617 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="f2687a33-8ba7-4a21-9906-84de49945433" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Mar 19 10:43:20 crc kubenswrapper[4835]: I0319 10:43:20.791990 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-575wd" podUID="017e5e75-5952-4240-8349-d16c367d4bed" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:21 crc kubenswrapper[4835]: I0319 10:43:21.084819 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-operators-n8kcm" podUID="2e9b0984-cab2-4aab-bad4-b4ad7040a40f" containerName="registry-server" probeResult="failure" output=< Mar 19 10:43:21 crc kubenswrapper[4835]: timeout: health rpc did not complete within 1s Mar 19 10:43:21 crc kubenswrapper[4835]: > Mar 19 10:43:21 crc kubenswrapper[4835]: I0319 10:43:21.085222 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-operators-n8kcm" podUID="2e9b0984-cab2-4aab-bad4-b4ad7040a40f" containerName="registry-server" probeResult="failure" output=< Mar 19 10:43:21 crc kubenswrapper[4835]: timeout: health rpc did not complete within 1s Mar 19 10:43:21 crc kubenswrapper[4835]: > Mar 19 10:43:21 crc kubenswrapper[4835]: I0319 10:43:21.090269 4835 trace.go:236] Trace[447232934]: "Calculate volume metrics of storage for pod 
openshift-logging/logging-loki-compactor-0" (19-Mar-2026 10:43:18.212) (total time: 2870ms): Mar 19 10:43:21 crc kubenswrapper[4835]: Trace[447232934]: [2.870888143s] [2.870888143s] END Mar 19 10:43:21 crc kubenswrapper[4835]: I0319 10:43:21.585044 4835 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4bzw9 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.71:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:21 crc kubenswrapper[4835]: I0319 10:43:21.585446 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4bzw9" podUID="dd4d4255-8149-4b0e-b3f2-dc0951a043a5" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.71:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:21 crc kubenswrapper[4835]: I0319 10:43:21.585512 4835 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4bzw9 container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.71:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:21 crc kubenswrapper[4835]: I0319 10:43:21.585527 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-4bzw9" podUID="dd4d4255-8149-4b0e-b3f2-dc0951a043a5" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.71:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:21 crc kubenswrapper[4835]: I0319 10:43:21.791148 4835 patch_prober.go:28] interesting pod/nmstate-webhook-5f558f5558-zpbgp container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure 
output="Get \"https://10.217.0.90:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:21 crc kubenswrapper[4835]: I0319 10:43:21.791229 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-5f558f5558-zpbgp" podUID="a5ae9a03-b125-4072-9ec8-fddd19b7002e" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.90:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:22 crc kubenswrapper[4835]: I0319 10:43:22.012029 4835 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-djtzl container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:22 crc kubenswrapper[4835]: I0319 10:43:22.012463 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-djtzl" podUID="7490a09e-a8be-4931-a282-38989ba640b3" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:22 crc kubenswrapper[4835]: I0319 10:43:22.133842 4835 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6lp28 container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:22 crc kubenswrapper[4835]: I0319 10:43:22.133923 4835 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6lp28 container/packageserver 
namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:22 crc kubenswrapper[4835]: I0319 10:43:22.133961 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" podUID="0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:22 crc kubenswrapper[4835]: I0319 10:43:22.133914 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" podUID="0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:22 crc kubenswrapper[4835]: I0319 10:43:22.288058 4835 patch_prober.go:28] interesting pod/router-default-5444994796-mljh4 container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:22 crc kubenswrapper[4835]: I0319 10:43:22.288122 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-mljh4" podUID="540880cb-fdb4-4672-81e3-60adfd584bdf" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:22 crc kubenswrapper[4835]: I0319 10:43:22.288165 4835 patch_prober.go:28] interesting 
pod/catalog-operator-68c6474976-9qxb7 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:22 crc kubenswrapper[4835]: I0319 10:43:22.288215 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7" podUID="4cd6d8ad-9a41-4796-9684-2b6e09675bd9" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:22 crc kubenswrapper[4835]: I0319 10:43:22.288228 4835 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-9qxb7 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:22 crc kubenswrapper[4835]: I0319 10:43:22.288277 4835 patch_prober.go:28] interesting pod/router-default-5444994796-mljh4 container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:22 crc kubenswrapper[4835]: I0319 10:43:22.288300 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-mljh4" podUID="540880cb-fdb4-4672-81e3-60adfd584bdf" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:22 crc kubenswrapper[4835]: I0319 10:43:22.288293 4835 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7" podUID="4cd6d8ad-9a41-4796-9684-2b6e09675bd9" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:22 crc kubenswrapper[4835]: I0319 10:43:22.575918 4835 patch_prober.go:28] interesting pod/console-operator-58897d9998-4tpxz container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.14:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:22 crc kubenswrapper[4835]: I0319 10:43:22.575978 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-4tpxz" podUID="abdeec51-5ca9-442c-9937-67dd8f50d88d" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:22 crc kubenswrapper[4835]: I0319 10:43:22.575923 4835 patch_prober.go:28] interesting pod/console-operator-58897d9998-4tpxz container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:22 crc kubenswrapper[4835]: I0319 10:43:22.576084 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-4tpxz" podUID="abdeec51-5ca9-442c-9937-67dd8f50d88d" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:22 crc kubenswrapper[4835]: I0319 10:43:22.576102 4835 patch_prober.go:28] 
interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:22 crc kubenswrapper[4835]: I0319 10:43:22.576128 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:22 crc kubenswrapper[4835]: I0319 10:43:22.786698 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="8fd594ac-051b-4142-ba53-e974d9c5daa5" containerName="galera" probeResult="failure" output="command timed out" Mar 19 10:43:22 crc kubenswrapper[4835]: I0319 10:43:22.788256 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="8fd594ac-051b-4142-ba53-e974d9c5daa5" containerName="galera" probeResult="failure" output="command timed out" Mar 19 10:43:22 crc kubenswrapper[4835]: I0319 10:43:22.837931 4835 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-6p9l9 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"http://10.217.0.44:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:22 crc kubenswrapper[4835]: I0319 10:43:22.837973 4835 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-6p9l9 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.217.0.44:8080/healthz\": context 
deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:22 crc kubenswrapper[4835]: I0319 10:43:22.837992 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p9l9" podUID="04d6a038-d68f-4f5a-ad4d-95cd686d1d24" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.44:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:22 crc kubenswrapper[4835]: I0319 10:43:22.838044 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p9l9" podUID="04d6a038-d68f-4f5a-ad4d-95cd686d1d24" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.44:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:22 crc kubenswrapper[4835]: I0319 10:43:22.928977 4835 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-vrd5p container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:22 crc kubenswrapper[4835]: I0319 10:43:22.929045 4835 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-vrd5p container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:22 crc kubenswrapper[4835]: I0319 10:43:22.929064 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrd5p" podUID="b9eb63f6-7fbc-4f82-946b-8844751bb402" 
containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:22 crc kubenswrapper[4835]: I0319 10:43:22.929111 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrd5p" podUID="b9eb63f6-7fbc-4f82-946b-8844751bb402" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:23 crc kubenswrapper[4835]: I0319 10:43:23.217647 4835 patch_prober.go:28] interesting pod/logging-loki-distributor-9c6b6d984-rznlv container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:23 crc kubenswrapper[4835]: I0319 10:43:23.217712 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-9c6b6d984-rznlv" podUID="82e92071-6291-4ff2-971a-c658d2e001ed" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:23 crc kubenswrapper[4835]: I0319 10:43:23.372086 4835 patch_prober.go:28] interesting pod/logging-loki-querier-6dcbdf8bb8-cm5jv container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:23 crc kubenswrapper[4835]: I0319 10:43:23.372161 4835 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-cm5jv" podUID="9b90ec5a-5d56-4267-bb4e-1fbcdecff021" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.54:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:23 crc kubenswrapper[4835]: I0319 10:43:23.436465 4835 patch_prober.go:28] interesting pod/logging-loki-query-frontend-ff66c4dc9-5qfh9 container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:23 crc kubenswrapper[4835]: I0319 10:43:23.436523 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5qfh9" podUID="e0bc0a8b-4a2d-4460-b984-7a2279e1a424" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:23 crc kubenswrapper[4835]: I0319 10:43:23.535685 4835 patch_prober.go:28] interesting pod/logging-loki-gateway-5d45f4dcf6-4f49c container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:23 crc kubenswrapper[4835]: I0319 10:43:23.535710 4835 patch_prober.go:28] interesting pod/logging-loki-gateway-5d45f4dcf6-4f49c container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:23 crc kubenswrapper[4835]: 
I0319 10:43:23.535780 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" podUID="2ff291e5-8364-4627-be0f-51c9532e46ee" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:23 crc kubenswrapper[4835]: I0319 10:43:23.535845 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" podUID="2ff291e5-8364-4627-be0f-51c9532e46ee" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:23 crc kubenswrapper[4835]: I0319 10:43:23.607059 4835 patch_prober.go:28] interesting pod/logging-loki-gateway-5d45f4dcf6-hkx57 container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:23 crc kubenswrapper[4835]: I0319 10:43:23.607113 4835 patch_prober.go:28] interesting pod/logging-loki-gateway-5d45f4dcf6-hkx57 container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:23 crc kubenswrapper[4835]: I0319 10:43:23.607172 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" podUID="75ce358f-5f03-401f-bdf8-27a7e5309227" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:23 crc kubenswrapper[4835]: I0319 10:43:23.607117 4835 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" podUID="75ce358f-5f03-401f-bdf8-27a7e5309227" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:23 crc kubenswrapper[4835]: I0319 10:43:23.782517 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="fdee895b-c681-4d66-bd0d-08622d97cbea" containerName="galera" probeResult="failure" output="command timed out" Mar 19 10:43:23 crc kubenswrapper[4835]: I0319 10:43:23.782829 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="fdee895b-c681-4d66-bd0d-08622d97cbea" containerName="galera" probeResult="failure" output="command timed out" Mar 19 10:43:23 crc kubenswrapper[4835]: I0319 10:43:23.993937 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-init-6674476dff-54wn4" podUID="2021ffd4-3c7f-477a-8816-c9b382d640c7" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.103:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:24 crc kubenswrapper[4835]: I0319 10:43:24.076978 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-6889f84cf4-bbfgf" podUID="817b88ed-bcd4-4702-bd72-7d04de779c86" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.96:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:24 crc kubenswrapper[4835]: I0319 10:43:24.077559 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-controller-init-6674476dff-54wn4" podUID="2021ffd4-3c7f-477a-8816-c9b382d640c7" 
containerName="operator" probeResult="failure" output="Get \"http://10.217.0.103:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:24 crc kubenswrapper[4835]: I0319 10:43:24.077766 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-6889f84cf4-bbfgf" podUID="817b88ed-bcd4-4702-bd72-7d04de779c86" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.96:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:24 crc kubenswrapper[4835]: I0319 10:43:24.129129 4835 patch_prober.go:28] interesting pod/metrics-server-8646b978bb-zprxl container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.85:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:24 crc kubenswrapper[4835]: I0319 10:43:24.129174 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" podUID="da878929-ea5e-40f0-8eaf-7f6b6e86f62c" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.85:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:24 crc kubenswrapper[4835]: I0319 10:43:24.345975 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-dhr9f" podUID="0b55921b-5660-44b9-abe9-09639766068c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.105:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:24 crc kubenswrapper[4835]: I0319 10:43:24.369087 4835 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="Get 
\"https://10.217.0.58:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:24 crc kubenswrapper[4835]: I0319 10:43:24.369170 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57" containerName="loki-ingester" probeResult="failure" output="Get \"https://10.217.0.58:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:24 crc kubenswrapper[4835]: I0319 10:43:24.437392 4835 patch_prober.go:28] interesting pod/logging-loki-compactor-0 container/loki-compactor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.60:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:24 crc kubenswrapper[4835]: I0319 10:43:24.437453 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-compactor-0" podUID="a0a07ea9-b7e5-4679-9b63-c6027e42f279" containerName="loki-compactor" probeResult="failure" output="Get \"https://10.217.0.60:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:24 crc kubenswrapper[4835]: I0319 10:43:24.502030 4835 patch_prober.go:28] interesting pod/logging-loki-index-gateway-0 container/loki-index-gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.61:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:24 crc kubenswrapper[4835]: I0319 10:43:24.502105 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-index-gateway-0" podUID="b54655c1-accb-4df5-98b0-f01dbcfe83f7" containerName="loki-index-gateway" probeResult="failure" output="Get \"https://10.217.0.61:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:24 crc kubenswrapper[4835]: I0319 10:43:24.548620 4835 patch_prober.go:28] interesting pod/monitoring-plugin-7b6f7975cf-xd4n6 container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.86:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:24 crc kubenswrapper[4835]: I0319 10:43:24.548673 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-7b6f7975cf-xd4n6" podUID="5573de0d-e8de-4c32-b778-1cf95556c219" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.86:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:24 crc kubenswrapper[4835]: I0319 10:43:24.662956 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-bjnww" podUID="47fe8883-3cea-4985-8327-4ad721d4e128" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.109:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:24 crc kubenswrapper[4835]: I0319 10:43:24.715926 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-mgjjt" podUID="aea377c2-d271-45c9-a574-0d1fe89caac5" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:24 crc kubenswrapper[4835]: I0319 10:43:24.808101 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-v6mc6" podUID="a73c4956-ad63-40d9-907e-599172ac3771" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:24 crc kubenswrapper[4835]: I0319 10:43:24.861989 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-4nqvq" podUID="2e429fe7-b053-43e9-a640-f14af7094e62" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:24 crc kubenswrapper[4835]: I0319 10:43:24.952225 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-wvb5n" podUID="2173f655-b411-4dfe-8247-59ac02942c25" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:25 crc kubenswrapper[4835]: I0319 10:43:25.013966 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-v6bxs" podUID="02720ef5-c00c-48a4-af96-4d82f28bf051" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:25 crc kubenswrapper[4835]: I0319 10:43:25.127081 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-7b7zt" podUID="74161dd4-df25-4f1b-8924-c9086688463d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:25 crc kubenswrapper[4835]: I0319 10:43:25.168029 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-vq45h" podUID="dd4067de-94b9-4d91-bdf4-e3e3af91b76f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:25 crc kubenswrapper[4835]: I0319 10:43:25.292013 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-s94g8" podUID="91c2a119-b77e-4cf1-9be3-779c47d4643b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:25 crc kubenswrapper[4835]: I0319 10:43:25.373964 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-64vf7" podUID="a59c67ae-bcf7-404f-ad32-233f38450f65" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:25 crc kubenswrapper[4835]: I0319 10:43:25.377137 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/telemetry-operator-controller-manager-57b8dbd499-gz2nz" podUID="65b9e1e9-a183-4d37-a281-593752da6125" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:25 crc kubenswrapper[4835]: I0319 10:43:25.449667 4835 patch_prober.go:28] interesting pod/controller-manager-5675974fc9-xqhf6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:25 crc kubenswrapper[4835]: I0319 10:43:25.449793 4835 patch_prober.go:28] interesting pod/controller-manager-5675974fc9-xqhf6 container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:25 crc kubenswrapper[4835]: I0319 10:43:25.450035 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" podUID="d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:25 crc kubenswrapper[4835]: I0319 10:43:25.450119 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" podUID="d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:25 crc kubenswrapper[4835]: I0319 10:43:25.514924 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-44rxf" podUID="8173c77e-48ec-44fc-9be7-67381528f78a" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:25 crc kubenswrapper[4835]: I0319 10:43:25.680994 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-g4bvm" podUID="9b388096-29e9-4547-b43a-ec3a7935572b" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.98:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:25 crc kubenswrapper[4835]: I0319 10:43:25.681038 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-44rxf" podUID="8173c77e-48ec-44fc-9be7-67381528f78a" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:25 crc kubenswrapper[4835]: I0319 10:43:25.681459 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-44rxf" podUID="8173c77e-48ec-44fc-9be7-67381528f78a" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:25 crc kubenswrapper[4835]: I0319 10:43:25.681474 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-g4bvm" podUID="9b388096-29e9-4547-b43a-ec3a7935572b" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.98:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:25 crc kubenswrapper[4835]: I0319 10:43:25.683334 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/frr-k8s-44rxf"
Mar 19 10:43:25 crc kubenswrapper[4835]: I0319 10:43:25.687203 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="frr" containerStatusID={"Type":"cri-o","ID":"ae03816819220c6edb9f8e7928abc6ed18f2726bdbc02970897abc5dcd4c3ba3"} pod="metallb-system/frr-k8s-44rxf" containerMessage="Container frr failed liveness probe, will be restarted"
Mar 19 10:43:25 crc kubenswrapper[4835]: I0319 10:43:25.688799 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/frr-k8s-44rxf" podUID="8173c77e-48ec-44fc-9be7-67381528f78a" containerName="frr" containerID="cri-o://ae03816819220c6edb9f8e7928abc6ed18f2726bdbc02970897abc5dcd4c3ba3" gracePeriod=2
Mar 19 10:43:25 crc kubenswrapper[4835]: I0319 10:43:25.722320 4835 patch_prober.go:28] interesting pod/thanos-querier-5b688b744b-2d86m container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.83:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:25 crc kubenswrapper[4835]: I0319 10:43:25.722583 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" podUID="c02bf820-ae15-4787-964c-de647e0e6ebd" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.83:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:25 crc kubenswrapper[4835]: I0319 10:43:25.782735 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="f2687a33-8ba7-4a21-9906-84de49945433" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out"
Mar 19 10:43:26 crc kubenswrapper[4835]: I0319 10:43:26.661884 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-zmwm8" podUID="5ea1b1ba-826f-4abe-9c56-caedc3a178f9" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:26 crc kubenswrapper[4835]: I0319 10:43:26.783455 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-handler-69n76" podUID="86182cb8-0ad6-40f7-ac1c-7ccf8cf5bc13" containerName="nmstate-handler" probeResult="failure" output="command timed out"
Mar 19 10:43:27 crc kubenswrapper[4835]: I0319 10:43:27.196650 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-44rxf" event={"ID":"8173c77e-48ec-44fc-9be7-67381528f78a","Type":"ContainerDied","Data":"ae03816819220c6edb9f8e7928abc6ed18f2726bdbc02970897abc5dcd4c3ba3"}
Mar 19 10:43:27 crc kubenswrapper[4835]: I0319 10:43:27.196992 4835 generic.go:334] "Generic (PLEG): container finished" podID="8173c77e-48ec-44fc-9be7-67381528f78a" containerID="ae03816819220c6edb9f8e7928abc6ed18f2726bdbc02970897abc5dcd4c3ba3" exitCode=143
Mar 19 10:43:27 crc kubenswrapper[4835]: I0319 10:43:27.483782 4835 patch_prober.go:28] interesting pod/console-5d9fcbf4f8-5btdf container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.142:8443/health\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:27 crc kubenswrapper[4835]: I0319 10:43:27.483853 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-5d9fcbf4f8-5btdf" podUID="d524b5e2-97d1-47cc-8474-113fa8e6016a" containerName="console" probeResult="failure" output="Get \"https://10.217.0.142:8443/health\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:27 crc kubenswrapper[4835]: I0319 10:43:27.573498 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-hzjn2" podUID="981b2eee-0205-4314-b31a-d01123049b7b" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 10:43:27 crc kubenswrapper[4835]: I0319 10:43:27.643415 4835 patch_prober.go:28] interesting pod/oauth-openshift-766799bf97-q8249 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.66:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:27 crc kubenswrapper[4835]: I0319 10:43:27.643526 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-766799bf97-q8249" podUID="a3421bac-5e2c-494d-ba71-500a3ead9076" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.66:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:27 crc kubenswrapper[4835]: I0319 10:43:27.643686 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-766799bf97-q8249"
Mar 19 10:43:27 crc kubenswrapper[4835]: I0319 10:43:27.651943 4835 patch_prober.go:28] interesting pod/oauth-openshift-766799bf97-q8249 container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.66:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:27 crc kubenswrapper[4835]: I0319 10:43:27.652018 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-766799bf97-q8249" podUID="a3421bac-5e2c-494d-ba71-500a3ead9076" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.66:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:27 crc kubenswrapper[4835]: I0319 10:43:27.652079 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication/oauth-openshift-766799bf97-q8249"
Mar 19 10:43:27 crc kubenswrapper[4835]: I0319 10:43:27.654898 4835 patch_prober.go:28] interesting pod/route-controller-manager-5f9b46c45c-4586h container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:27 crc kubenswrapper[4835]: I0319 10:43:27.654957 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h" podUID="ef156d19-8841-4be5-a739-bd07a7789ea3" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:27 crc kubenswrapper[4835]: I0319 10:43:27.655006 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h"
Mar 19 10:43:27 crc kubenswrapper[4835]: I0319 10:43:27.654907 4835 patch_prober.go:28] interesting pod/route-controller-manager-5f9b46c45c-4586h container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:27 crc kubenswrapper[4835]: I0319 10:43:27.655071 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h" podUID="ef156d19-8841-4be5-a739-bd07a7789ea3" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:27 crc kubenswrapper[4835]: I0319 10:43:27.661979 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="route-controller-manager" containerStatusID={"Type":"cri-o","ID":"d45f6fbc2f56800ba2896675356be0ba819dfd329f545ce7adcf9208ce4bacd8"} pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h" containerMessage="Container route-controller-manager failed liveness probe, will be restarted"
Mar 19 10:43:27 crc kubenswrapper[4835]: I0319 10:43:27.662043 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h" podUID="ef156d19-8841-4be5-a739-bd07a7789ea3" containerName="route-controller-manager" containerID="cri-o://d45f6fbc2f56800ba2896675356be0ba819dfd329f545ce7adcf9208ce4bacd8" gracePeriod=30
Mar 19 10:43:27 crc kubenswrapper[4835]: I0319 10:43:27.778856 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="30460e8f-6c1c-4bdb-a1d7-18427b62896b" containerName="prometheus" probeResult="failure" output="command timed out"
Mar 19 10:43:27 crc kubenswrapper[4835]: I0319 10:43:27.781910 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="30460e8f-6c1c-4bdb-a1d7-18427b62896b" containerName="prometheus" probeResult="failure" output="command timed out"
Mar 19 10:43:28 crc kubenswrapper[4835]: I0319 10:43:28.216083 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="oauth-openshift" containerStatusID={"Type":"cri-o","ID":"4835d12e1d2bb983d920ee435b8065501df1842902efa2c1dcccd8b0e3ab8339"} pod="openshift-authentication/oauth-openshift-766799bf97-q8249" containerMessage="Container oauth-openshift failed liveness probe, will be restarted"
Mar 19 10:43:28 crc kubenswrapper[4835]: I0319 10:43:28.216603 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-44rxf" event={"ID":"8173c77e-48ec-44fc-9be7-67381528f78a","Type":"ContainerStarted","Data":"b4dafe9c8dd6871fe8cf5aac227eb085228e1a81cb66b83eb25f60171efa3258"}
Mar 19 10:43:28 crc kubenswrapper[4835]: I0319 10:43:28.261133 4835 patch_prober.go:28] interesting pod/loki-operator-controller-manager-668b645cb5-fhzgr container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.49:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:28 crc kubenswrapper[4835]: I0319 10:43:28.261250 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-668b645cb5-fhzgr" podUID="3a2991b5-2e25-4afa-9941-d955aad0dc37" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.49:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:28 crc kubenswrapper[4835]: I0319 10:43:28.479368 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-6mgg7" podUID="e8f17f81-d3ac-4c40-b346-c3eac9cc70d2" containerName="registry-server" probeResult="failure" output=<
Mar 19 10:43:28 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s
Mar 19 10:43:28 crc kubenswrapper[4835]: >
Mar 19 10:43:28 crc kubenswrapper[4835]: I0319 10:43:28.480676 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-6mgg7" podUID="e8f17f81-d3ac-4c40-b346-c3eac9cc70d2" containerName="registry-server" probeResult="failure" output=<
Mar 19 10:43:28 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s
Mar 19 10:43:28 crc kubenswrapper[4835]: >
Mar 19 10:43:28 crc kubenswrapper[4835]: I0319 10:43:28.481856 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/certified-operators-bmh95" podUID="8b78cdf3-88ba-4ab2-9966-492863d9206c" containerName="registry-server" probeResult="failure" output=<
Mar 19 10:43:28 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s
Mar 19 10:43:28 crc kubenswrapper[4835]: >
Mar 19 10:43:28 crc kubenswrapper[4835]: I0319 10:43:28.482337 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/certified-operators-bmh95" podUID="8b78cdf3-88ba-4ab2-9966-492863d9206c" containerName="registry-server" probeResult="failure" output=<
Mar 19 10:43:28 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s
Mar 19 10:43:28 crc kubenswrapper[4835]: >
Mar 19 10:43:28 crc kubenswrapper[4835]: I0319 10:43:28.482802 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-marketplace-l2vsx" podUID="5d586c5b-694e-4ac9-aa09-0d973cdad7e0" containerName="registry-server" probeResult="failure" output=<
Mar 19 10:43:28 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s
Mar 19 10:43:28 crc kubenswrapper[4835]: >
Mar 19 10:43:28 crc kubenswrapper[4835]: I0319 10:43:28.484468 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-l2vsx" podUID="5d586c5b-694e-4ac9-aa09-0d973cdad7e0" containerName="registry-server" probeResult="failure" output=<
Mar 19 10:43:28 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s
Mar 19 10:43:28 crc kubenswrapper[4835]: >
Mar 19 10:43:28 crc kubenswrapper[4835]: I0319 10:43:28.536398 4835 patch_prober.go:28] interesting pod/logging-loki-gateway-5d45f4dcf6-4f49c container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:28 crc kubenswrapper[4835]: I0319 10:43:28.536447 4835 patch_prober.go:28] interesting pod/logging-loki-gateway-5d45f4dcf6-4f49c container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:28 crc kubenswrapper[4835]: I0319 10:43:28.536488 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" podUID="2ff291e5-8364-4627-be0f-51c9532e46ee" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:28 crc kubenswrapper[4835]: I0319 10:43:28.536508 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" podUID="2ff291e5-8364-4627-be0f-51c9532e46ee" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:28 crc kubenswrapper[4835]: I0319 10:43:28.606188 4835 patch_prober.go:28] interesting pod/logging-loki-gateway-5d45f4dcf6-hkx57 container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:28 crc kubenswrapper[4835]: I0319 10:43:28.606263 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" podUID="75ce358f-5f03-401f-bdf8-27a7e5309227" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:28 crc kubenswrapper[4835]: I0319 10:43:28.607318 4835 patch_prober.go:28] interesting pod/logging-loki-gateway-5d45f4dcf6-hkx57 container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:28 crc kubenswrapper[4835]: I0319 10:43:28.607377 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" podUID="75ce358f-5f03-401f-bdf8-27a7e5309227" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:28 crc kubenswrapper[4835]: I0319 10:43:28.636996 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/speaker-zfhzl" podUID="3f751b70-2b62-41da-b71c-ce7039840e3e" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:28 crc kubenswrapper[4835]: I0319 10:43:28.637152 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/speaker-zfhzl" podUID="3f751b70-2b62-41da-b71c-ce7039840e3e" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:28 crc kubenswrapper[4835]: I0319 10:43:28.684933 4835 patch_prober.go:28] interesting pod/oauth-openshift-766799bf97-q8249 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.66:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:28 crc kubenswrapper[4835]: I0319 10:43:28.684996 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-766799bf97-q8249" podUID="a3421bac-5e2c-494d-ba71-500a3ead9076" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.66:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:28 crc kubenswrapper[4835]: I0319 10:43:28.783325 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-7pjhg" podUID="d95e8e04-5d17-45e5-aeb8-d6857bfc9d01" containerName="registry-server" probeResult="failure" output="command timed out"
Mar 19 10:43:28 crc kubenswrapper[4835]: I0319 10:43:28.783686 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-7pjhg" podUID="d95e8e04-5d17-45e5-aeb8-d6857bfc9d01" containerName="registry-server" probeResult="failure" output="command timed out"
Mar 19 10:43:28 crc kubenswrapper[4835]: I0319 10:43:28.845837 4835 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-vrd5p container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:28 crc kubenswrapper[4835]: I0319 10:43:28.845954 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrd5p" podUID="b9eb63f6-7fbc-4f82-946b-8844751bb402" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:28 crc kubenswrapper[4835]: I0319 10:43:28.845889 4835 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-vrd5p container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:28 crc kubenswrapper[4835]: I0319 10:43:28.846052 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrd5p" podUID="b9eb63f6-7fbc-4f82-946b-8844751bb402" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:28 crc kubenswrapper[4835]: I0319 10:43:28.850174 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="5614bb9c-3907-4c29-b148-fbca6c6642ad" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.171:9090/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:28 crc kubenswrapper[4835]: I0319 10:43:28.850295 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="5614bb9c-3907-4c29-b148-fbca6c6642ad" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.171:9090/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:28 crc kubenswrapper[4835]: I0319 10:43:28.967058 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-rd6zz" podUID="3cb1a290-5e83-4b84-80ab-a7d86e85ce98" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.35:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:29 crc kubenswrapper[4835]: I0319 10:43:29.344205 4835 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-qcvsc container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:29 crc kubenswrapper[4835]: I0319 10:43:29.344233 4835 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-qcvsc container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:29 crc kubenswrapper[4835]: I0319 10:43:29.344279 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-qcvsc" podUID="83538569-9d57-4fdb-83b2-03dbd62bad4d" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:29 crc kubenswrapper[4835]: I0319 10:43:29.344290 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-qcvsc" podUID="83538569-9d57-4fdb-83b2-03dbd62bad4d" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:29 crc kubenswrapper[4835]: I0319 10:43:29.475211 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-44rxf"
Mar 19 10:43:29 crc kubenswrapper[4835]: I0319 10:43:29.686516 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-operators-n8kcm" podUID="2e9b0984-cab2-4aab-bad4-b4ad7040a40f" containerName="registry-server" probeResult="failure" output=<
Mar 19 10:43:29 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s
Mar 19 10:43:29 crc kubenswrapper[4835]: >
Mar 19 10:43:29 crc kubenswrapper[4835]: I0319 10:43:29.686627 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-operators-n8kcm" podUID="2e9b0984-cab2-4aab-bad4-b4ad7040a40f" containerName="registry-server" probeResult="failure" output=<
Mar 19 10:43:29 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s
Mar 19 10:43:29 crc kubenswrapper[4835]: >
Mar 19 10:43:29 crc kubenswrapper[4835]: I0319 10:43:29.690082 4835 patch_prober.go:28] interesting pod/oauth-openshift-766799bf97-q8249 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.66:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:29 crc kubenswrapper[4835]: I0319 10:43:29.690140 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-766799bf97-q8249" podUID="a3421bac-5e2c-494d-ba71-500a3ead9076" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.66:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:30 crc kubenswrapper[4835]: I0319 10:43:30.092957 4835 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-qsz5n container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.29:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:30 crc kubenswrapper[4835]: I0319 10:43:30.093038 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n" podUID="95fe9c35-69f6-4b60-a725-c2f0d8a34c99" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.29:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:30 crc kubenswrapper[4835]: I0319 10:43:30.093226 4835 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-qsz5n container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.29:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:30 crc kubenswrapper[4835]: I0319 10:43:30.093442 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n" podUID="95fe9c35-69f6-4b60-a725-c2f0d8a34c99" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.29:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:30 crc kubenswrapper[4835]: I0319 10:43:30.394994 4835 patch_prober.go:28] interesting pod/perses-operator-54476d58cc-x6mtx container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.30:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:30 crc kubenswrapper[4835]: I0319 10:43:30.399222 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-54476d58cc-x6mtx" podUID="41d1090f-7ab4-4820-a742-dca791692d0f" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.30:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:30 crc kubenswrapper[4835]: I0319 10:43:30.517167 4835 prober.go:107] "Probe failed" probeType="Startup" pod="metallb-system/frr-k8s-44rxf" podUID="8173c77e-48ec-44fc-9be7-67381528f78a" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:30 crc kubenswrapper[4835]: I0319 10:43:30.722200 4835 patch_prober.go:28] interesting pod/thanos-querier-5b688b744b-2d86m container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.83:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:30 crc kubenswrapper[4835]: I0319 10:43:30.722253 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" podUID="c02bf820-ae15-4787-964c-de647e0e6ebd" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.83:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:30 crc kubenswrapper[4835]: I0319 10:43:30.722425 4835 patch_prober.go:28] interesting pod/thanos-querier-5b688b744b-2d86m container/kube-rbac-proxy-web namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.83:9091/-/healthy\": context deadline exceeded" start-of-body=
Mar 19 10:43:30 crc kubenswrapper[4835]: I0319 10:43:30.722503 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" podUID="c02bf820-ae15-4787-964c-de647e0e6ebd" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.83:9091/-/healthy\": context deadline exceeded"
Mar 19 10:43:30 crc kubenswrapper[4835]: I0319 10:43:30.784571 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="f2687a33-8ba7-4a21-9906-84de49945433" containerName="ceilometer-notification-agent" probeResult="failure" output="command timed out"
Mar 19 10:43:30 crc kubenswrapper[4835]: I0319 10:43:30.833970 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-575wd" podUID="017e5e75-5952-4240-8349-d16c367d4bed" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:30 crc kubenswrapper[4835]: I0319 10:43:30.834548 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-575wd" podUID="017e5e75-5952-4240-8349-d16c367d4bed" containerName="manager"
probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:31 crc kubenswrapper[4835]: I0319 10:43:31.248709 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h" event={"ID":"ef156d19-8841-4be5-a739-bd07a7789ea3","Type":"ContainerDied","Data":"d45f6fbc2f56800ba2896675356be0ba819dfd329f545ce7adcf9208ce4bacd8"} Mar 19 10:43:31 crc kubenswrapper[4835]: I0319 10:43:31.249425 4835 generic.go:334] "Generic (PLEG): container finished" podID="ef156d19-8841-4be5-a739-bd07a7789ea3" containerID="d45f6fbc2f56800ba2896675356be0ba819dfd329f545ce7adcf9208ce4bacd8" exitCode=0 Mar 19 10:43:31 crc kubenswrapper[4835]: I0319 10:43:31.289171 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-controller-manager-68f9d5b675-fbgr4" podUID="8a05d0dc-7092-45aa-b869-34a23cb0a1f9" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:31 crc kubenswrapper[4835]: I0319 10:43:31.289399 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-manager-68f9d5b675-fbgr4" podUID="8a05d0dc-7092-45aa-b869-34a23cb0a1f9" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:31 crc kubenswrapper[4835]: I0319 10:43:31.583944 4835 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4bzw9 container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.71:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:31 crc 
kubenswrapper[4835]: I0319 10:43:31.584290 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-4bzw9" podUID="dd4d4255-8149-4b0e-b3f2-dc0951a043a5" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.71:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:31 crc kubenswrapper[4835]: I0319 10:43:31.584002 4835 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4bzw9 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.71:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:31 crc kubenswrapper[4835]: I0319 10:43:31.584435 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4bzw9" podUID="dd4d4255-8149-4b0e-b3f2-dc0951a043a5" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.71:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:31 crc kubenswrapper[4835]: I0319 10:43:31.783302 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="f2687a33-8ba7-4a21-9906-84de49945433" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Mar 19 10:43:31 crc kubenswrapper[4835]: I0319 10:43:31.783453 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ceilometer-0" Mar 19 10:43:31 crc kubenswrapper[4835]: I0319 10:43:31.789071 4835 patch_prober.go:28] interesting pod/nmstate-webhook-5f558f5558-zpbgp container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.90:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while 
awaiting headers)" start-of-body= Mar 19 10:43:31 crc kubenswrapper[4835]: I0319 10:43:31.789137 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-5f558f5558-zpbgp" podUID="a5ae9a03-b125-4072-9ec8-fddd19b7002e" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.90:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:31 crc kubenswrapper[4835]: I0319 10:43:31.790440 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="ceilometer-central-agent" containerStatusID={"Type":"cri-o","ID":"110aa42509437eb4825da61f2dedc7f9dc1d657aa826faf30fd91d9e8f4a59d1"} pod="openstack/ceilometer-0" containerMessage="Container ceilometer-central-agent failed liveness probe, will be restarted" Mar 19 10:43:31 crc kubenswrapper[4835]: I0319 10:43:31.790547 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f2687a33-8ba7-4a21-9906-84de49945433" containerName="ceilometer-central-agent" containerID="cri-o://110aa42509437eb4825da61f2dedc7f9dc1d657aa826faf30fd91d9e8f4a59d1" gracePeriod=30 Mar 19 10:43:31 crc kubenswrapper[4835]: I0319 10:43:31.927968 4835 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-vrd5p container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:31 crc kubenswrapper[4835]: I0319 10:43:31.928340 4835 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-vrd5p container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" 
start-of-body= Mar 19 10:43:31 crc kubenswrapper[4835]: I0319 10:43:31.928391 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrd5p" podUID="b9eb63f6-7fbc-4f82-946b-8844751bb402" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:31 crc kubenswrapper[4835]: I0319 10:43:31.928526 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrd5p" podUID="b9eb63f6-7fbc-4f82-946b-8844751bb402" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:32 crc kubenswrapper[4835]: I0319 10:43:32.011988 4835 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-djtzl container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:32 crc kubenswrapper[4835]: I0319 10:43:32.012048 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-djtzl" podUID="7490a09e-a8be-4931-a282-38989ba640b3" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:32 crc kubenswrapper[4835]: I0319 10:43:32.127937 4835 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-q886g container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": 
net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:32 crc kubenswrapper[4835]: I0319 10:43:32.127991 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q886g" podUID="6d8b5f04-3b9c-4656-bf70-203d508949a0" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:32 crc kubenswrapper[4835]: I0319 10:43:32.128162 4835 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-q886g container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:32 crc kubenswrapper[4835]: I0319 10:43:32.128233 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q886g" podUID="6d8b5f04-3b9c-4656-bf70-203d508949a0" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:32 crc kubenswrapper[4835]: I0319 10:43:32.134883 4835 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6lp28 container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:32 crc kubenswrapper[4835]: I0319 10:43:32.134949 4835 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" podUID="0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:32 crc kubenswrapper[4835]: I0319 10:43:32.134883 4835 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6lp28 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:32 crc kubenswrapper[4835]: I0319 10:43:32.135002 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" podUID="0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:32 crc kubenswrapper[4835]: I0319 10:43:32.223765 4835 patch_prober.go:28] interesting pod/downloads-7954f5f757-vzxqb container/download-server namespace/openshift-console: Liveness probe status=failure output="" start-of-body= Mar 19 10:43:32 crc kubenswrapper[4835]: I0319 10:43:32.257193 4835 patch_prober.go:28] interesting pod/downloads-7954f5f757-vzxqb container/download-server namespace/openshift-console: Readiness probe status=failure output="" start-of-body= Mar 19 10:43:32 crc kubenswrapper[4835]: I0319 10:43:32.339169 4835 patch_prober.go:28] interesting pod/router-default-5444994796-mljh4 container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" 
start-of-body= Mar 19 10:43:32 crc kubenswrapper[4835]: I0319 10:43:32.339174 4835 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-9qxb7 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:32 crc kubenswrapper[4835]: I0319 10:43:32.339245 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-mljh4" podUID="540880cb-fdb4-4672-81e3-60adfd584bdf" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:32 crc kubenswrapper[4835]: I0319 10:43:32.339299 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7" podUID="4cd6d8ad-9a41-4796-9684-2b6e09675bd9" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:32 crc kubenswrapper[4835]: I0319 10:43:32.339194 4835 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-9qxb7 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:32 crc kubenswrapper[4835]: I0319 10:43:32.339355 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7" podUID="4cd6d8ad-9a41-4796-9684-2b6e09675bd9" containerName="catalog-operator" probeResult="failure" output="Get 
\"https://10.217.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:32 crc kubenswrapper[4835]: I0319 10:43:32.340185 4835 patch_prober.go:28] interesting pod/router-default-5444994796-mljh4 container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:32 crc kubenswrapper[4835]: I0319 10:43:32.340213 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-mljh4" podUID="540880cb-fdb4-4672-81e3-60adfd584bdf" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:32 crc kubenswrapper[4835]: I0319 10:43:32.494701 4835 patch_prober.go:28] interesting pod/console-operator-58897d9998-4tpxz container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.14:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:32 crc kubenswrapper[4835]: I0319 10:43:32.494964 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-4tpxz" podUID="abdeec51-5ca9-442c-9937-67dd8f50d88d" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:32 crc kubenswrapper[4835]: I0319 10:43:32.495096 4835 patch_prober.go:28] interesting pod/console-operator-58897d9998-4tpxz container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get 
\"https://10.217.0.14:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:32 crc kubenswrapper[4835]: I0319 10:43:32.495115 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-4tpxz" podUID="abdeec51-5ca9-442c-9937-67dd8f50d88d" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:32 crc kubenswrapper[4835]: I0319 10:43:32.572227 4835 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:32 crc kubenswrapper[4835]: I0319 10:43:32.572301 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:32 crc kubenswrapper[4835]: I0319 10:43:32.779259 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="30460e8f-6c1c-4bdb-a1d7-18427b62896b" containerName="prometheus" probeResult="failure" output="command timed out" Mar 19 10:43:32 crc kubenswrapper[4835]: I0319 10:43:32.781950 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="8fd594ac-051b-4142-ba53-e974d9c5daa5" containerName="galera" probeResult="failure" output="command timed out" Mar 
19 10:43:32 crc kubenswrapper[4835]: I0319 10:43:32.781994 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="30460e8f-6c1c-4bdb-a1d7-18427b62896b" containerName="prometheus" probeResult="failure" output="command timed out" Mar 19 10:43:32 crc kubenswrapper[4835]: I0319 10:43:32.782051 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="8fd594ac-051b-4142-ba53-e974d9c5daa5" containerName="galera" probeResult="failure" output="command timed out" Mar 19 10:43:33 crc kubenswrapper[4835]: I0319 10:43:33.216637 4835 patch_prober.go:28] interesting pod/logging-loki-distributor-9c6b6d984-rznlv container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:33 crc kubenswrapper[4835]: I0319 10:43:33.217001 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-9c6b6d984-rznlv" podUID="82e92071-6291-4ff2-971a-c658d2e001ed" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:33 crc kubenswrapper[4835]: I0319 10:43:33.293501 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h" event={"ID":"ef156d19-8841-4be5-a739-bd07a7789ea3","Type":"ContainerStarted","Data":"ff96ff367002ac74485d333f41d51b7cdd9ee7f9873ca34652d63d914844aa4b"} Mar 19 10:43:33 crc kubenswrapper[4835]: I0319 10:43:33.294243 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h" Mar 19 10:43:33 crc 
kubenswrapper[4835]: I0319 10:43:33.294504 4835 patch_prober.go:28] interesting pod/route-controller-manager-5f9b46c45c-4586h container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" start-of-body= Mar 19 10:43:33 crc kubenswrapper[4835]: I0319 10:43:33.294551 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h" podUID="ef156d19-8841-4be5-a739-bd07a7789ea3" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" Mar 19 10:43:33 crc kubenswrapper[4835]: I0319 10:43:33.375402 4835 patch_prober.go:28] interesting pod/logging-loki-querier-6dcbdf8bb8-cm5jv container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:33 crc kubenswrapper[4835]: I0319 10:43:33.375453 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-cm5jv" podUID="9b90ec5a-5d56-4267-bb4e-1fbcdecff021" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.54:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:33 crc kubenswrapper[4835]: I0319 10:43:33.436551 4835 patch_prober.go:28] interesting pod/logging-loki-query-frontend-ff66c4dc9-5qfh9 container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:33 crc kubenswrapper[4835]: I0319 
10:43:33.436629 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5qfh9" podUID="e0bc0a8b-4a2d-4460-b984-7a2279e1a424" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:33 crc kubenswrapper[4835]: I0319 10:43:33.536154 4835 patch_prober.go:28] interesting pod/logging-loki-gateway-5d45f4dcf6-4f49c container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:33 crc kubenswrapper[4835]: I0319 10:43:33.536207 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" podUID="2ff291e5-8364-4627-be0f-51c9532e46ee" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:33 crc kubenswrapper[4835]: I0319 10:43:33.605468 4835 patch_prober.go:28] interesting pod/logging-loki-gateway-5d45f4dcf6-hkx57 container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:33 crc kubenswrapper[4835]: I0319 10:43:33.605528 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" podUID="75ce358f-5f03-401f-bdf8-27a7e5309227" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while 
awaiting headers)" Mar 19 10:43:33 crc kubenswrapper[4835]: I0319 10:43:33.681026 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-85c8677745-k75vc" podUID="2a19bff3-1be0-44d7-b625-df3e46afa290" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.95:8080/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:33 crc kubenswrapper[4835]: I0319 10:43:33.778628 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="fdee895b-c681-4d66-bd0d-08622d97cbea" containerName="galera" probeResult="failure" output="command timed out" Mar 19 10:43:33 crc kubenswrapper[4835]: I0319 10:43:33.779244 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="fdee895b-c681-4d66-bd0d-08622d97cbea" containerName="galera" probeResult="failure" output="command timed out" Mar 19 10:43:33 crc kubenswrapper[4835]: I0319 10:43:33.952049 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-init-6674476dff-54wn4" podUID="2021ffd4-3c7f-477a-8816-c9b382d640c7" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.103:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:33 crc kubenswrapper[4835]: I0319 10:43:33.993001 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-6889f84cf4-bbfgf" podUID="817b88ed-bcd4-4702-bd72-7d04de779c86" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.96:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:34 crc kubenswrapper[4835]: I0319 10:43:34.033975 4835 prober.go:107] "Probe failed" probeType="Readiness" 
pod="metallb-system/metallb-operator-webhook-server-6889f84cf4-bbfgf" podUID="817b88ed-bcd4-4702-bd72-7d04de779c86" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.96:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:34 crc kubenswrapper[4835]: I0319 10:43:34.087078 4835 patch_prober.go:28] interesting pod/metrics-server-8646b978bb-zprxl container/metrics-server namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.85:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:34 crc kubenswrapper[4835]: I0319 10:43:34.087140 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" podUID="da878929-ea5e-40f0-8eaf-7f6b6e86f62c" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.85:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:34 crc kubenswrapper[4835]: I0319 10:43:34.087247 4835 patch_prober.go:28] interesting pod/metrics-server-8646b978bb-zprxl container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.85:10250/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:34 crc kubenswrapper[4835]: I0319 10:43:34.087334 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" podUID="da878929-ea5e-40f0-8eaf-7f6b6e86f62c" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.85:10250/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:34 crc kubenswrapper[4835]: I0319 10:43:34.166116 4835 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-t9dbc" podUID="e01500c9-ffdf-42df-9017-26b813efed0e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.104:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:34 crc kubenswrapper[4835]: I0319 10:43:34.344067 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-dhr9f" podUID="0b55921b-5660-44b9-abe9-09639766068c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.105:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:34 crc kubenswrapper[4835]: I0319 10:43:34.384906 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-dhr9f" podUID="0b55921b-5660-44b9-abe9-09639766068c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.105:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:34 crc kubenswrapper[4835]: I0319 10:43:34.385314 4835 patch_prober.go:28] interesting pod/route-controller-manager-5f9b46c45c-4586h container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" start-of-body= Mar 19 10:43:34 crc kubenswrapper[4835]: I0319 10:43:34.385378 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h" podUID="ef156d19-8841-4be5-a739-bd07a7789ea3" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" Mar 19 10:43:34 crc kubenswrapper[4835]: I0319 
10:43:34.473913 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-msxkb" podUID="6dabbeda-38a8-4289-92b6-075d7c20f958" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:34 crc kubenswrapper[4835]: I0319 10:43:34.473919 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-msxkb" podUID="6dabbeda-38a8-4289-92b6-075d7c20f958" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:34 crc kubenswrapper[4835]: I0319 10:43:34.587055 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-bzhnq" podUID="5db66af1-4929-4163-b788-0320b0323a79" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:34 crc kubenswrapper[4835]: I0319 10:43:34.587108 4835 patch_prober.go:28] interesting pod/monitoring-plugin-7b6f7975cf-xd4n6 container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.86:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:34 crc kubenswrapper[4835]: I0319 10:43:34.587164 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-7b6f7975cf-xd4n6" podUID="5573de0d-e8de-4c32-b778-1cf95556c219" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.86:9443/health\": net/http: request canceled while waiting for connection 
(Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:34 crc kubenswrapper[4835]: I0319 10:43:34.587241 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-7b6f7975cf-xd4n6" Mar 19 10:43:34 crc kubenswrapper[4835]: I0319 10:43:34.587283 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-bzhnq" podUID="5db66af1-4929-4163-b788-0320b0323a79" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:34 crc kubenswrapper[4835]: I0319 10:43:34.704954 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-bjnww" podUID="47fe8883-3cea-4985-8327-4ad721d4e128" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.109:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:34 crc kubenswrapper[4835]: I0319 10:43:34.704997 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-bjnww" podUID="47fe8883-3cea-4985-8327-4ad721d4e128" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.109:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:34 crc kubenswrapper[4835]: I0319 10:43:34.870006 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-79r7z" podUID="0f2be57b-1212-4dc2-b772-3317bcd69aac" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:34 crc kubenswrapper[4835]: I0319 10:43:34.870033 4835 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-mgjjt" podUID="aea377c2-d271-45c9-a574-0d1fe89caac5" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:34 crc kubenswrapper[4835]: I0319 10:43:34.952931 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-mgjjt" podUID="aea377c2-d271-45c9-a574-0d1fe89caac5" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:34 crc kubenswrapper[4835]: I0319 10:43:34.953019 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-v6mc6" podUID="a73c4956-ad63-40d9-907e-599172ac3771" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:34 crc kubenswrapper[4835]: I0319 10:43:34.953312 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-mgjjt" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.034908 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-79r7z" podUID="0f2be57b-1212-4dc2-b772-3317bcd69aac" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.034917 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-4nqvq" 
podUID="2e429fe7-b053-43e9-a640-f14af7094e62" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.035114 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-4nqvq" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.117962 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-d79pg" podUID="e86ab409-7643-4e6f-9129-6f90e5b6bf1c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.115:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.117966 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-v6mc6" podUID="a73c4956-ad63-40d9-907e-599172ac3771" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.200997 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-wvb5n" podUID="2173f655-b411-4dfe-8247-59ac02942c25" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.283930 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-v6bxs" podUID="02720ef5-c00c-48a4-af96-4d82f28bf051" containerName="manager" probeResult="failure" output="Get 
\"http://10.217.0.117:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.284066 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-v6bxs" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.284107 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-d79pg" podUID="e86ab409-7643-4e6f-9129-6f90e5b6bf1c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.115:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.284195 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-4nqvq" podUID="2e429fe7-b053-43e9-a640-f14af7094e62" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.366066 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-vq45h" podUID="dd4067de-94b9-4d91-bdf4-e3e3af91b76f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.366054 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-wvb5n" podUID="2173f655-b411-4dfe-8247-59ac02942c25" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 
10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.366358 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-wvb5n" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.366383 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-v6bxs" podUID="02720ef5-c00c-48a4-af96-4d82f28bf051" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.449043 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-s94g8" podUID="91c2a119-b77e-4cf1-9be3-779c47d4643b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.449150 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-s94g8" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.530941 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-vq45h" podUID="dd4067de-94b9-4d91-bdf4-e3e3af91b76f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.530941 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-64vf7" podUID="a59c67ae-bcf7-404f-ad32-233f38450f65" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/healthz\": 
context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.612927 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/telemetry-operator-controller-manager-57b8dbd499-gz2nz" podUID="65b9e1e9-a183-4d37-a281-593752da6125" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.613124 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-s94g8" podUID="91c2a119-b77e-4cf1-9be3-779c47d4643b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.613261 4835 patch_prober.go:28] interesting pod/controller-manager-5675974fc9-xqhf6 container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.613288 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" podUID="d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.613324 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" Mar 19 10:43:35 crc 
kubenswrapper[4835]: I0319 10:43:35.613421 4835 patch_prober.go:28] interesting pod/controller-manager-5675974fc9-xqhf6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.613448 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" podUID="d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.614825 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="controller-manager" containerStatusID={"Type":"cri-o","ID":"9d6b5b62d82043260aad7652065797450dc874fcd88254e57a1b198017ae6817"} pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" containerMessage="Container controller-manager failed liveness probe, will be restarted" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.614877 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" podUID="d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46" containerName="controller-manager" containerID="cri-o://9d6b5b62d82043260aad7652065797450dc874fcd88254e57a1b198017ae6817" gracePeriod=30 Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.713427 4835 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled (Client.Timeout exceeded while 
awaiting headers)" start-of-body= Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.713517 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.722311 4835 patch_prober.go:28] interesting pod/thanos-querier-5b688b744b-2d86m container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.83:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.722365 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" podUID="c02bf820-ae15-4787-964c-de647e0e6ebd" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.83:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.817943 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-64vf7" podUID="a59c67ae-bcf7-404f-ad32-233f38450f65" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.817971 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-g4bvm" podUID="9b388096-29e9-4547-b43a-ec3a7935572b" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.98:7572/metrics\": context deadline exceeded 
(Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.818066 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-64vf7" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.818108 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-g4bvm" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.818469 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/telemetry-operator-controller-manager-57b8dbd499-gz2nz" podUID="65b9e1e9-a183-4d37-a281-593752da6125" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.818520 4835 patch_prober.go:28] interesting pod/monitoring-plugin-7b6f7975cf-xd4n6 container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.86:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.818542 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-7b6f7975cf-xd4n6" podUID="5573de0d-e8de-4c32-b778-1cf95556c219" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.86:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.818685 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-mgjjt" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.818918 4835 
prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-44rxf" podUID="8173c77e-48ec-44fc-9be7-67381528f78a" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.819107 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-44rxf" podUID="8173c77e-48ec-44fc-9be7-67381528f78a" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.819399 4835 prober.go:107] "Probe failed" probeType="Startup" pod="metallb-system/frr-k8s-44rxf" podUID="8173c77e-48ec-44fc-9be7-67381528f78a" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.819683 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-g4bvm" podUID="9b388096-29e9-4547-b43a-ec3a7935572b" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.98:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.819737 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/controller-7bb4cc7c98-47sfq" podUID="3d3ab7f2-b252-4ef7-8a11-5af68bf23fa6" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.99:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.819838 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/frr-k8s-44rxf" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.819863 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/frr-k8s-44rxf" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.819953 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-wvb5n" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.819976 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-g4bvm" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.821440 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="controller" containerStatusID={"Type":"cri-o","ID":"950062dd692baeb4800174478a0712b6b269d28daa10f23a95333aa34f7da980"} pod="metallb-system/frr-k8s-44rxf" containerMessage="Container controller failed liveness probe, will be restarted" Mar 19 10:43:35 crc kubenswrapper[4835]: I0319 10:43:35.821612 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/frr-k8s-44rxf" podUID="8173c77e-48ec-44fc-9be7-67381528f78a" containerName="controller" containerID="cri-o://950062dd692baeb4800174478a0712b6b269d28daa10f23a95333aa34f7da980" gracePeriod=2 Mar 19 10:43:36 crc kubenswrapper[4835]: I0319 10:43:36.077014 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-4nqvq" podUID="2e429fe7-b053-43e9-a640-f14af7094e62" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:36 crc kubenswrapper[4835]: I0319 10:43:36.163086 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-s94g8" Mar 19 10:43:36 crc 
kubenswrapper[4835]: I0319 10:43:36.261970 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-v6bxs" Mar 19 10:43:36 crc kubenswrapper[4835]: I0319 10:43:36.322596 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="frr-k8s-webhook-server" containerStatusID={"Type":"cri-o","ID":"1e0844d87c0ed3709d5ed5382474a85031288904d9517376a1f701a37ef59c93"} pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-g4bvm" containerMessage="Container frr-k8s-webhook-server failed liveness probe, will be restarted" Mar 19 10:43:36 crc kubenswrapper[4835]: I0319 10:43:36.322655 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-g4bvm" podUID="9b388096-29e9-4547-b43a-ec3a7935572b" containerName="frr-k8s-webhook-server" containerID="cri-o://1e0844d87c0ed3709d5ed5382474a85031288904d9517376a1f701a37ef59c93" gracePeriod=10 Mar 19 10:43:36 crc kubenswrapper[4835]: I0319 10:43:36.424207 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:43:36 crc kubenswrapper[4835]: I0319 10:43:36.424272 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:43:36 crc kubenswrapper[4835]: I0319 10:43:36.573987 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-zmwm8" podUID="5ea1b1ba-826f-4abe-9c56-caedc3a178f9" 
containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:36 crc kubenswrapper[4835]: I0319 10:43:36.574347 4835 patch_prober.go:28] interesting pod/route-controller-manager-5f9b46c45c-4586h container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" start-of-body= Mar 19 10:43:36 crc kubenswrapper[4835]: I0319 10:43:36.574384 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h" podUID="ef156d19-8841-4be5-a739-bd07a7789ea3" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" Mar 19 10:43:36 crc kubenswrapper[4835]: I0319 10:43:36.778521 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-handler-69n76" podUID="86182cb8-0ad6-40f7-ac1c-7ccf8cf5bc13" containerName="nmstate-handler" probeResult="failure" output="command timed out" Mar 19 10:43:36 crc kubenswrapper[4835]: I0319 10:43:36.782123 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" podUID="f7e81e94-e1a9-47df-86b2-91cdfd01ae1e" containerName="sbdb" probeResult="failure" output="command timed out" Mar 19 10:43:36 crc kubenswrapper[4835]: I0319 10:43:36.782508 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-4jp9w" podUID="f7e81e94-e1a9-47df-86b2-91cdfd01ae1e" containerName="nbdb" probeResult="failure" output="command timed out" Mar 19 10:43:36 crc kubenswrapper[4835]: I0319 10:43:36.956960 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-44rxf" 
podUID="8173c77e-48ec-44fc-9be7-67381528f78a" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:36 crc kubenswrapper[4835]: I0319 10:43:36.957042 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-g4bvm" podUID="9b388096-29e9-4547-b43a-ec3a7935572b" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.98:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:36 crc kubenswrapper[4835]: I0319 10:43:36.957147 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-64vf7" podUID="a59c67ae-bcf7-404f-ad32-233f38450f65" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:37 crc kubenswrapper[4835]: I0319 10:43:37.406961 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="84a60e76-5b98-4cdd-b2ea-849d4fcbc215" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.1.17:8081/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:37 crc kubenswrapper[4835]: I0319 10:43:37.406986 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/kube-state-metrics-0" podUID="84a60e76-5b98-4cdd-b2ea-849d4fcbc215" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.1.17:8080/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:37 crc kubenswrapper[4835]: I0319 10:43:37.481078 4835 patch_prober.go:28] interesting 
pod/console-5d9fcbf4f8-5btdf container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.142:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:37 crc kubenswrapper[4835]: I0319 10:43:37.481223 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-5d9fcbf4f8-5btdf" podUID="d524b5e2-97d1-47cc-8474-113fa8e6016a" containerName="console" probeResult="failure" output="Get \"https://10.217.0.142:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:37 crc kubenswrapper[4835]: I0319 10:43:37.481363 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5d9fcbf4f8-5btdf"
Mar 19 10:43:37 crc kubenswrapper[4835]: I0319 10:43:37.552311 4835 patch_prober.go:28] interesting pod/oauth-openshift-766799bf97-q8249 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.66:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:37 crc kubenswrapper[4835]: I0319 10:43:37.552367 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-766799bf97-q8249" podUID="a3421bac-5e2c-494d-ba71-500a3ead9076" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.66:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:37 crc kubenswrapper[4835]: I0319 10:43:37.779995 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-g4bvm"
Mar 19 10:43:37 crc kubenswrapper[4835]: I0319 10:43:37.784168 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="30460e8f-6c1c-4bdb-a1d7-18427b62896b" containerName="prometheus" probeResult="failure" output="command timed out"
Mar 19 10:43:37 crc kubenswrapper[4835]: I0319 10:43:37.784370 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 10:43:37 crc kubenswrapper[4835]: I0319 10:43:37.785931 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="30460e8f-6c1c-4bdb-a1d7-18427b62896b" containerName="prometheus" probeResult="failure" output="command timed out"
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.306331 4835 patch_prober.go:28] interesting pod/loki-operator-controller-manager-668b645cb5-fhzgr container/manager namespace/openshift-operators-redhat: Liveness probe status=failure output="Get \"http://10.217.0.49:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.306872 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators-redhat/loki-operator-controller-manager-668b645cb5-fhzgr" podUID="3a2991b5-2e25-4afa-9941-d955aad0dc37" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.49:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.306949 4835 patch_prober.go:28] interesting pod/loki-operator-controller-manager-668b645cb5-fhzgr container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.49:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.306972 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-668b645cb5-fhzgr" podUID="3a2991b5-2e25-4afa-9941-d955aad0dc37" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.49:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.307094 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-668b645cb5-fhzgr"
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.363684 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-hzjn2" podUID="981b2eee-0205-4314-b31a-d01123049b7b" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.482325 4835 patch_prober.go:28] interesting pod/console-5d9fcbf4f8-5btdf container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.142:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.482376 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-5d9fcbf4f8-5btdf" podUID="d524b5e2-97d1-47cc-8474-113fa8e6016a" containerName="console" probeResult="failure" output="Get \"https://10.217.0.142:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.536160 4835 patch_prober.go:28] interesting pod/logging-loki-gateway-5d45f4dcf6-4f49c container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.536197 4835 patch_prober.go:28] interesting pod/logging-loki-gateway-5d45f4dcf6-4f49c container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.536248 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" podUID="2ff291e5-8364-4627-be0f-51c9532e46ee" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.536229 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" podUID="2ff291e5-8364-4627-be0f-51c9532e46ee" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.638043 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/speaker-zfhzl" podUID="3f751b70-2b62-41da-b71c-ce7039840e3e" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.638144 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/speaker-zfhzl" podUID="3f751b70-2b62-41da-b71c-ce7039840e3e" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.638207 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-zfhzl"
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.638227 4835 patch_prober.go:28] interesting pod/logging-loki-gateway-5d45f4dcf6-hkx57 container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.638243 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/speaker-zfhzl"
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.638247 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" podUID="75ce358f-5f03-401f-bdf8-27a7e5309227" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.638262 4835 patch_prober.go:28] interesting pod/logging-loki-gateway-5d45f4dcf6-hkx57 container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8083/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.638319 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" podUID="75ce358f-5f03-401f-bdf8-27a7e5309227" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.57:8083/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.639766 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="speaker" containerStatusID={"Type":"cri-o","ID":"02b71afa104a61ebdb0ad57a9c5673d7d6ca7bc8dfe732ff6362578808eb3fdc"} pod="metallb-system/speaker-zfhzl" containerMessage="Container speaker failed liveness probe, will be restarted"
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.639827 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/speaker-zfhzl" podUID="3f751b70-2b62-41da-b71c-ce7039840e3e" containerName="speaker" containerID="cri-o://02b71afa104a61ebdb0ad57a9c5673d7d6ca7bc8dfe732ff6362578808eb3fdc" gracePeriod=2
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.779216 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-marketplace-l2vsx" podUID="5d586c5b-694e-4ac9-aa09-0d973cdad7e0" containerName="registry-server" probeResult="failure" output="command timed out"
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.779309 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l2vsx"
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.791574 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="registry-server" containerStatusID={"Type":"cri-o","ID":"1e040f261e4ac81fc3f8d977a239f87bc08614949d71bf77a60e889bb22a6975"} pod="openshift-marketplace/redhat-marketplace-l2vsx" containerMessage="Container registry-server failed liveness probe, will be restarted"
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.791678 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l2vsx" podUID="5d586c5b-694e-4ac9-aa09-0d973cdad7e0" containerName="registry-server" containerID="cri-o://1e040f261e4ac81fc3f8d977a239f87bc08614949d71bf77a60e889bb22a6975" gracePeriod=30
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.791786 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-7pjhg" podUID="d95e8e04-5d17-45e5-aeb8-d6857bfc9d01" containerName="registry-server" probeResult="failure" output="command timed out"
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.791860 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/openstack-operator-index-7pjhg"
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.793410 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="registry-server" containerStatusID={"Type":"cri-o","ID":"d7893d110d6e01606a62e0ec1eb6f02d747b64208ade1d58054cc369010d4d85"} pod="openstack-operators/openstack-operator-index-7pjhg" containerMessage="Container registry-server failed liveness probe, will be restarted"
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.793464 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-7pjhg" podUID="d95e8e04-5d17-45e5-aeb8-d6857bfc9d01" containerName="registry-server" containerID="cri-o://d7893d110d6e01606a62e0ec1eb6f02d747b64208ade1d58054cc369010d4d85" gracePeriod=30
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.794588 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-7pjhg" podUID="d95e8e04-5d17-45e5-aeb8-d6857bfc9d01" containerName="registry-server" probeResult="failure" output="command timed out"
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.794689 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-7pjhg"
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.796307 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-l2vsx" podUID="5d586c5b-694e-4ac9-aa09-0d973cdad7e0" containerName="registry-server" probeResult="failure" output="command timed out"
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.796362 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l2vsx"
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.850335 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="5614bb9c-3907-4c29-b148-fbca6c6642ad" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.171:9090/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.850532 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="5614bb9c-3907-4c29-b148-fbca6c6642ad" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.171:9090/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:38 crc kubenswrapper[4835]: I0319 10:43:38.967171 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-rd6zz" podUID="3cb1a290-5e83-4b84-80ab-a7d86e85ce98" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.35:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:39 crc kubenswrapper[4835]: I0319 10:43:39.352930 4835 patch_prober.go:28] interesting pod/loki-operator-controller-manager-668b645cb5-fhzgr container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.49:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:39 crc kubenswrapper[4835]: I0319 10:43:39.352959 4835 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-qcvsc container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:39 crc kubenswrapper[4835]: I0319 10:43:39.352987 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-668b645cb5-fhzgr" podUID="3a2991b5-2e25-4afa-9941-d955aad0dc37" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.49:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:39 crc kubenswrapper[4835]: I0319 10:43:39.353000 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-qcvsc" podUID="83538569-9d57-4fdb-83b2-03dbd62bad4d" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:39 crc kubenswrapper[4835]: I0319 10:43:39.352938 4835 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-qcvsc container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:39 crc kubenswrapper[4835]: I0319 10:43:39.353046 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-qcvsc" podUID="83538569-9d57-4fdb-83b2-03dbd62bad4d" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:39 crc kubenswrapper[4835]: I0319 10:43:39.388108 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-44rxf" event={"ID":"8173c77e-48ec-44fc-9be7-67381528f78a","Type":"ContainerDied","Data":"950062dd692baeb4800174478a0712b6b269d28daa10f23a95333aa34f7da980"}
Mar 19 10:43:39 crc kubenswrapper[4835]: I0319 10:43:39.391196 4835 generic.go:334] "Generic (PLEG): container finished" podID="8173c77e-48ec-44fc-9be7-67381528f78a" containerID="950062dd692baeb4800174478a0712b6b269d28daa10f23a95333aa34f7da980" exitCode=0
Mar 19 10:43:39 crc kubenswrapper[4835]: I0319 10:43:39.397329 4835 generic.go:334] "Generic (PLEG): container finished" podID="9b388096-29e9-4547-b43a-ec3a7935572b" containerID="1e0844d87c0ed3709d5ed5382474a85031288904d9517376a1f701a37ef59c93" exitCode=0
Mar 19 10:43:39 crc kubenswrapper[4835]: I0319 10:43:39.397386 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-g4bvm" event={"ID":"9b388096-29e9-4547-b43a-ec3a7935572b","Type":"ContainerDied","Data":"1e0844d87c0ed3709d5ed5382474a85031288904d9517376a1f701a37ef59c93"}
Mar 19 10:43:39 crc kubenswrapper[4835]: I0319 10:43:39.680952 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/speaker-zfhzl" podUID="3f751b70-2b62-41da-b71c-ce7039840e3e" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:39 crc kubenswrapper[4835]: I0319 10:43:39.783594 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-6mgg7" podUID="e8f17f81-d3ac-4c40-b346-c3eac9cc70d2" containerName="registry-server" probeResult="failure" output="command timed out"
Mar 19 10:43:39 crc kubenswrapper[4835]: I0319 10:43:39.784125 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6mgg7"
Mar 19 10:43:39 crc kubenswrapper[4835]: I0319 10:43:39.788304 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-6mgg7" podUID="e8f17f81-d3ac-4c40-b346-c3eac9cc70d2" containerName="registry-server" probeResult="failure" output="command timed out"
Mar 19 10:43:39 crc kubenswrapper[4835]: I0319 10:43:39.788380 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/community-operators-6mgg7"
Mar 19 10:43:40 crc kubenswrapper[4835]: I0319 10:43:40.092881 4835 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-qsz5n container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.29:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:40 crc kubenswrapper[4835]: I0319 10:43:40.092932 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n" podUID="95fe9c35-69f6-4b60-a725-c2f0d8a34c99" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.29:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:40 crc kubenswrapper[4835]: I0319 10:43:40.092975 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n"
Mar 19 10:43:40 crc kubenswrapper[4835]: I0319 10:43:40.093005 4835 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-qsz5n container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.29:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:40 crc kubenswrapper[4835]: I0319 10:43:40.093069 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n" podUID="95fe9c35-69f6-4b60-a725-c2f0d8a34c99" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.29:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:40 crc kubenswrapper[4835]: I0319 10:43:40.093158 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n"
Mar 19 10:43:40 crc kubenswrapper[4835]: I0319 10:43:40.094473 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="operator" containerStatusID={"Type":"cri-o","ID":"18e2ca242ad34c1140edf13b90a427133c3ebb9b0d73b6c4d2b51b8adcc39204"} pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n" containerMessage="Container operator failed liveness probe, will be restarted"
Mar 19 10:43:40 crc kubenswrapper[4835]: I0319 10:43:40.094518 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n" podUID="95fe9c35-69f6-4b60-a725-c2f0d8a34c99" containerName="operator" containerID="cri-o://18e2ca242ad34c1140edf13b90a427133c3ebb9b0d73b6c4d2b51b8adcc39204" gracePeriod=30
Mar 19 10:43:40 crc kubenswrapper[4835]: I0319 10:43:40.407672 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="registry-server" containerStatusID={"Type":"cri-o","ID":"b4f76f5deedf47e1bcf6b97488562a22b55c6553015dcacaa009a0e6e1097dfc"} pod="openshift-marketplace/community-operators-6mgg7" containerMessage="Container registry-server failed liveness probe, will be restarted"
Mar 19 10:43:40 crc kubenswrapper[4835]: I0319 10:43:40.407757 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6mgg7" podUID="e8f17f81-d3ac-4c40-b346-c3eac9cc70d2" containerName="registry-server" containerID="cri-o://b4f76f5deedf47e1bcf6b97488562a22b55c6553015dcacaa009a0e6e1097dfc" gracePeriod=30
Mar 19 10:43:40 crc kubenswrapper[4835]: I0319 10:43:40.434893 4835 patch_prober.go:28] interesting pod/perses-operator-54476d58cc-x6mtx container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.30:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:40 crc kubenswrapper[4835]: I0319 10:43:40.434953 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-54476d58cc-x6mtx" podUID="41d1090f-7ab4-4820-a742-dca791692d0f" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.30:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:40 crc kubenswrapper[4835]: I0319 10:43:40.435033 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54476d58cc-x6mtx"
Mar 19 10:43:40 crc kubenswrapper[4835]: I0319 10:43:40.434907 4835 patch_prober.go:28] interesting pod/perses-operator-54476d58cc-x6mtx container/perses-operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.30:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:40 crc kubenswrapper[4835]: I0319 10:43:40.435271 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/perses-operator-54476d58cc-x6mtx" podUID="41d1090f-7ab4-4820-a742-dca791692d0f" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.30:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:40 crc kubenswrapper[4835]: I0319 10:43:40.515028 4835 prober.go:107] "Probe failed" probeType="Startup" pod="metallb-system/frr-k8s-44rxf" podUID="8173c77e-48ec-44fc-9be7-67381528f78a" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:40 crc kubenswrapper[4835]: I0319 10:43:40.722434 4835 patch_prober.go:28] interesting pod/thanos-querier-5b688b744b-2d86m container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.83:9091/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:40 crc kubenswrapper[4835]: I0319 10:43:40.722790 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" podUID="c02bf820-ae15-4787-964c-de647e0e6ebd" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.83:9091/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:40 crc kubenswrapper[4835]: I0319 10:43:40.793461 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-575wd" podUID="017e5e75-5952-4240-8349-d16c367d4bed" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:40 crc kubenswrapper[4835]: I0319 10:43:40.793581 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-575wd"
Mar 19 10:43:41 crc kubenswrapper[4835]: I0319 10:43:41.247936 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-manager-68f9d5b675-fbgr4" podUID="8a05d0dc-7092-45aa-b869-34a23cb0a1f9" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:41 crc kubenswrapper[4835]: I0319 10:43:41.477419 4835 patch_prober.go:28] interesting pod/perses-operator-54476d58cc-x6mtx container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.30:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:41 crc kubenswrapper[4835]: I0319 10:43:41.477804 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-54476d58cc-x6mtx" podUID="41d1090f-7ab4-4820-a742-dca791692d0f" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.30:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:41 crc kubenswrapper[4835]: I0319 10:43:41.584912 4835 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4bzw9 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.71:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:41 crc kubenswrapper[4835]: I0319 10:43:41.584939 4835 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4bzw9 container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.71:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:41 crc kubenswrapper[4835]: I0319 10:43:41.584991 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4bzw9" podUID="dd4d4255-8149-4b0e-b3f2-dc0951a043a5" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.71:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:41 crc kubenswrapper[4835]: I0319 10:43:41.585071 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-4bzw9" podUID="dd4d4255-8149-4b0e-b3f2-dc0951a043a5" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.71:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:41 crc kubenswrapper[4835]: I0319 10:43:41.585217 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-4bzw9"
Mar 19 10:43:41 crc kubenswrapper[4835]: I0319 10:43:41.585270 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-79b997595-4bzw9"
Mar 19 10:43:41 crc kubenswrapper[4835]: I0319 10:43:41.587047 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="marketplace-operator" containerStatusID={"Type":"cri-o","ID":"e14965fe103b6b5367faa035bf9e70accc869275771525030bb50caff305ced2"} pod="openshift-marketplace/marketplace-operator-79b997595-4bzw9" containerMessage="Container marketplace-operator failed liveness probe, will be restarted"
Mar 19 10:43:41 crc kubenswrapper[4835]: I0319 10:43:41.587114 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-4bzw9" podUID="dd4d4255-8149-4b0e-b3f2-dc0951a043a5" containerName="marketplace-operator" containerID="cri-o://e14965fe103b6b5367faa035bf9e70accc869275771525030bb50caff305ced2" gracePeriod=30
Mar 19 10:43:41 crc kubenswrapper[4835]: I0319 10:43:41.783316 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/certified-operators-bmh95" podUID="8b78cdf3-88ba-4ab2-9966-492863d9206c" containerName="registry-server" probeResult="failure" output="command timed out"
Mar 19 10:43:41 crc kubenswrapper[4835]: I0319 10:43:41.783506 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bmh95"
Mar 19 10:43:41 crc kubenswrapper[4835]: I0319 10:43:41.783760 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/certified-operators-bmh95" podUID="8b78cdf3-88ba-4ab2-9966-492863d9206c" containerName="registry-server" probeResult="failure" output="command timed out"
Mar 19 10:43:41 crc kubenswrapper[4835]: I0319 10:43:41.783888 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/certified-operators-bmh95"
Mar 19 10:43:41 crc kubenswrapper[4835]: I0319 10:43:41.786557 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="registry-server" containerStatusID={"Type":"cri-o","ID":"c2019edc4e22b6a962539a0a6e1c736cef59878b6ba66be09c70a0d6d132b3cd"} pod="openshift-marketplace/certified-operators-bmh95" containerMessage="Container registry-server failed liveness probe, will be restarted"
Mar 19 10:43:41 crc kubenswrapper[4835]: I0319 10:43:41.786653 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bmh95" podUID="8b78cdf3-88ba-4ab2-9966-492863d9206c" containerName="registry-server" containerID="cri-o://c2019edc4e22b6a962539a0a6e1c736cef59878b6ba66be09c70a0d6d132b3cd" gracePeriod=30
Mar 19 10:43:41 crc kubenswrapper[4835]: I0319 10:43:41.790645 4835 patch_prober.go:28] interesting pod/nmstate-webhook-5f558f5558-zpbgp container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.90:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:41 crc kubenswrapper[4835]: I0319 10:43:41.790705 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-5f558f5558-zpbgp" podUID="a5ae9a03-b125-4072-9ec8-fddd19b7002e" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.90:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:41 crc kubenswrapper[4835]: I0319 10:43:41.790729 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="30460e8f-6c1c-4bdb-a1d7-18427b62896b" containerName="prometheus" probeResult="failure" output="command timed out"
Mar 19 10:43:41 crc kubenswrapper[4835]: I0319 10:43:41.790857 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-zpbgp"
Mar 19 10:43:41 crc kubenswrapper[4835]: I0319 10:43:41.835997 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-575wd" podUID="017e5e75-5952-4240-8349-d16c367d4bed" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:41 crc kubenswrapper[4835]: I0319 10:43:41.971842 4835 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-djtzl container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:41 crc kubenswrapper[4835]: I0319 10:43:41.972247 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-djtzl" podUID="7490a09e-a8be-4931-a282-38989ba640b3" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:41 crc kubenswrapper[4835]: I0319 10:43:41.972308 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-69f744f599-djtzl"
Mar 19 10:43:41 crc kubenswrapper[4835]: I0319 10:43:41.988393 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="authentication-operator" containerStatusID={"Type":"cri-o","ID":"b8708c7ef18e9cd9095e10195cbb1a442053e10631e321cc169d1b7760cf53f5"} pod="openshift-authentication-operator/authentication-operator-69f744f599-djtzl" containerMessage="Container authentication-operator failed liveness probe, will be restarted"
Mar 19 10:43:41 crc kubenswrapper[4835]: I0319 10:43:41.988477 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication-operator/authentication-operator-69f744f599-djtzl" podUID="7490a09e-a8be-4931-a282-38989ba640b3" containerName="authentication-operator" containerID="cri-o://b8708c7ef18e9cd9095e10195cbb1a442053e10631e321cc169d1b7760cf53f5" gracePeriod=30
Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.128587 4835 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-q886g container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.128649 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q886g" podUID="6d8b5f04-3b9c-4656-bf70-203d508949a0" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.128711 4835 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-q886g container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.128725 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q886g" podUID="6d8b5f04-3b9c-4656-bf70-203d508949a0" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.135021 4835 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6lp28 container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.135021 4835 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6lp28 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.135130 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" podUID="0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.135226 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28"
Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.135211 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" podUID="0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.135321 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28"
Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.137351 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="packageserver" containerStatusID={"Type":"cri-o","ID":"cea0cde6bb3738605e0cfad5c42f31c29bae177ecdce6e83d43a882ffd6dc8b3"} pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" containerMessage="Container packageserver failed liveness probe, will be restarted"
Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.138343 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" podUID="0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f" containerName="packageserver" containerID="cri-o://cea0cde6bb3738605e0cfad5c42f31c29bae177ecdce6e83d43a882ffd6dc8b3" gracePeriod=30
Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.348988 4835 patch_prober.go:28] interesting 
pod/catalog-operator-68c6474976-9qxb7 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.349036 4835 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-9qxb7 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.349056 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7" podUID="4cd6d8ad-9a41-4796-9684-2b6e09675bd9" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.349105 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7" Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.349130 4835 patch_prober.go:28] interesting pod/downloads-7954f5f757-vzxqb container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.349167 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vzxqb" podUID="80ebd3c0-282e-427f-bedd-f06c26bce30a" containerName="download-server" 
probeResult="failure" output="Get \"http://10.217.0.13:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.348968 4835 patch_prober.go:28] interesting pod/router-default-5444994796-mljh4 container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.349212 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-mljh4" podUID="540880cb-fdb4-4672-81e3-60adfd584bdf" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.349217 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7" podUID="4cd6d8ad-9a41-4796-9684-2b6e09675bd9" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.349236 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-ingress/router-default-5444994796-mljh4" Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.349284 4835 patch_prober.go:28] interesting pod/downloads-7954f5f757-vzxqb container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.349305 4835 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-console/downloads-7954f5f757-vzxqb" podUID="80ebd3c0-282e-427f-bedd-f06c26bce30a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.349349 4835 patch_prober.go:28] interesting pod/router-default-5444994796-mljh4 container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.349400 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-mljh4" podUID="540880cb-fdb4-4672-81e3-60adfd584bdf" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.349567 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7" Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.349636 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-mljh4" Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.350024 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="catalog-operator" containerStatusID={"Type":"cri-o","ID":"e727cea677ded79327ba3e32f642a9ea78fa299c657b29426c5ee7f852148d47"} pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7" containerMessage="Container catalog-operator failed liveness probe, will be restarted" Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.350063 4835 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7" podUID="4cd6d8ad-9a41-4796-9684-2b6e09675bd9" containerName="catalog-operator" containerID="cri-o://e727cea677ded79327ba3e32f642a9ea78fa299c657b29426c5ee7f852148d47" gracePeriod=30 Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.350441 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="router" containerStatusID={"Type":"cri-o","ID":"22dcfa0277296c303069569d71e94534ce59e5a90251808424b76582f392c86c"} pod="openshift-ingress/router-default-5444994796-mljh4" containerMessage="Container router failed liveness probe, will be restarted" Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.350473 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ingress/router-default-5444994796-mljh4" podUID="540880cb-fdb4-4672-81e3-60adfd584bdf" containerName="router" containerID="cri-o://22dcfa0277296c303069569d71e94534ce59e5a90251808424b76582f392c86c" gracePeriod=10 Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.446700 4835 generic.go:334] "Generic (PLEG): container finished" podID="3f751b70-2b62-41da-b71c-ce7039840e3e" containerID="02b71afa104a61ebdb0ad57a9c5673d7d6ca7bc8dfe732ff6362578808eb3fdc" exitCode=137 Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.446791 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zfhzl" event={"ID":"3f751b70-2b62-41da-b71c-ce7039840e3e","Type":"ContainerDied","Data":"02b71afa104a61ebdb0ad57a9c5673d7d6ca7bc8dfe732ff6362578808eb3fdc"} Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.495256 4835 patch_prober.go:28] interesting pod/console-operator-58897d9998-4tpxz container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:42 crc 
kubenswrapper[4835]: I0319 10:43:42.495620 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-4tpxz" podUID="abdeec51-5ca9-442c-9937-67dd8f50d88d" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.495315 4835 patch_prober.go:28] interesting pod/console-operator-58897d9998-4tpxz container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.14:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.495818 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-4tpxz" Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.495870 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-4tpxz" podUID="abdeec51-5ca9-442c-9937-67dd8f50d88d" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.495933 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-58897d9998-4tpxz" Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.497482 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="console-operator" containerStatusID={"Type":"cri-o","ID":"8fdb826a4067ff1d6a6e33d4a4477be92713e2992ee5b89df4d524cc35f35842"} 
pod="openshift-console-operator/console-operator-58897d9998-4tpxz" containerMessage="Container console-operator failed liveness probe, will be restarted" Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.497525 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console-operator/console-operator-58897d9998-4tpxz" podUID="abdeec51-5ca9-442c-9937-67dd8f50d88d" containerName="console-operator" containerID="cri-o://8fdb826a4067ff1d6a6e33d4a4477be92713e2992ee5b89df4d524cc35f35842" gracePeriod=30 Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.572475 4835 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.572540 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.572649 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.627002 4835 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4bzw9 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.71:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 
10:43:42.627064 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4bzw9" podUID="dd4d4255-8149-4b0e-b3f2-dc0951a043a5" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.71:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.779165 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="8fd594ac-051b-4142-ba53-e974d9c5daa5" containerName="galera" probeResult="failure" output="command timed out" Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.779945 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-handler-69n76" podUID="86182cb8-0ad6-40f7-ac1c-7ccf8cf5bc13" containerName="nmstate-handler" probeResult="failure" output="command timed out" Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.780033 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="8fd594ac-051b-4142-ba53-e974d9c5daa5" containerName="galera" probeResult="failure" output="command timed out" Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.780762 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="30460e8f-6c1c-4bdb-a1d7-18427b62896b" containerName="prometheus" probeResult="failure" output="command timed out" Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.787051 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.787098 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/openstack-galera-0" Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.788677 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="galera" 
containerStatusID={"Type":"cri-o","ID":"7670c4f6d27fd821be9f60aff8a356cdffc7bd7e3680b9c24c400da2eb40f0e7"} pod="openstack/openstack-galera-0" containerMessage="Container galera failed liveness probe, will be restarted" Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.837943 4835 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-6p9l9 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"http://10.217.0.44:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.838005 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p9l9" podUID="04d6a038-d68f-4f5a-ad4d-95cd686d1d24" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.44:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.838103 4835 patch_prober.go:28] interesting pod/nmstate-webhook-5f558f5558-zpbgp container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.90:9443/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.838165 4835 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-6p9l9 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.217.0.44:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.838155 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-5f558f5558-zpbgp" 
podUID="a5ae9a03-b125-4072-9ec8-fddd19b7002e" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.90:9443/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:42 crc kubenswrapper[4835]: I0319 10:43:42.838184 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-6p9l9" podUID="04d6a038-d68f-4f5a-ad4d-95cd686d1d24" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.44:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.136259 4835 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6lp28 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.136331 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" podUID="0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.216792 4835 patch_prober.go:28] interesting pod/logging-loki-distributor-9c6b6d984-rznlv container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.216866 4835 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-9c6b6d984-rznlv" podUID="82e92071-6291-4ff2-971a-c658d2e001ed" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.216961 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-9c6b6d984-rznlv" Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.225556 4835 patch_prober.go:28] interesting pod/image-registry-66df7c8f76-7mrmh container/registry namespace/openshift-image-registry: Liveness probe status=failure output="Get \"https://10.217.0.77:5000/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.225615 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-66df7c8f76-7mrmh" podUID="c5e8ffa0-9ad4-4092-8b48-44edf2cb9514" containerName="registry" probeResult="failure" output="Get \"https://10.217.0.77:5000/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.225622 4835 patch_prober.go:28] interesting pod/image-registry-66df7c8f76-7mrmh container/registry namespace/openshift-image-registry: Readiness probe status=failure output="Get \"https://10.217.0.77:5000/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.225696 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-66df7c8f76-7mrmh" podUID="c5e8ffa0-9ad4-4092-8b48-44edf2cb9514" containerName="registry" probeResult="failure" output="Get 
\"https://10.217.0.77:5000/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.293467 4835 patch_prober.go:28] interesting pod/console-operator-58897d9998-4tpxz container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": EOF" start-of-body= Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.293531 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-4tpxz" podUID="abdeec51-5ca9-442c-9937-67dd8f50d88d" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": EOF" Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.390961 4835 patch_prober.go:28] interesting pod/router-default-5444994796-mljh4 container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.391298 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-mljh4" podUID="540880cb-fdb4-4672-81e3-60adfd584bdf" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.391347 4835 patch_prober.go:28] interesting pod/logging-loki-querier-6dcbdf8bb8-cm5jv container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.391415 4835 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-cm5jv" podUID="9b90ec5a-5d56-4267-bb4e-1fbcdecff021" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.54:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.391522 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-cm5jv" Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.436006 4835 patch_prober.go:28] interesting pod/logging-loki-query-frontend-ff66c4dc9-5qfh9 container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.436068 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5qfh9" podUID="e0bc0a8b-4a2d-4460-b984-7a2279e1a424" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.436146 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5qfh9" Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.481459 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-44rxf" event={"ID":"8173c77e-48ec-44fc-9be7-67381528f78a","Type":"ContainerStarted","Data":"fedeea19f7e8ceb4bafcb82012e3d667ae636153c2b7ced9fe7a1583df850880"} Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 
10:43:43.481924 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-44rxf" Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.484995 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-g4bvm" event={"ID":"9b388096-29e9-4547-b43a-ec3a7935572b","Type":"ContainerStarted","Data":"5b1eb9eb4b3635bfc2ae123150d72a3ef5be3895bca527d1270b9935b7cb3e39"} Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.485053 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-g4bvm" Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.488232 4835 generic.go:334] "Generic (PLEG): container finished" podID="f2687a33-8ba7-4a21-9906-84de49945433" containerID="110aa42509437eb4825da61f2dedc7f9dc1d657aa826faf30fd91d9e8f4a59d1" exitCode=0 Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.488266 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f2687a33-8ba7-4a21-9906-84de49945433","Type":"ContainerDied","Data":"110aa42509437eb4825da61f2dedc7f9dc1d657aa826faf30fd91d9e8f4a59d1"} Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.488692 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.535772 4835 patch_prober.go:28] interesting pod/logging-loki-gateway-5d45f4dcf6-4f49c container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.535855 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" podUID="2ff291e5-8364-4627-be0f-51c9532e46ee" containerName="gateway" probeResult="failure" 
output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.535993 4835 patch_prober.go:28] interesting pod/logging-loki-gateway-5d45f4dcf6-4f49c container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.536083 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" podUID="2ff291e5-8364-4627-be0f-51c9532e46ee" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.573568 4835 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.573653 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.606531 4835 patch_prober.go:28] interesting pod/logging-loki-gateway-5d45f4dcf6-hkx57 container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8083/ready\": net/http: request 
canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.606610 4835 patch_prober.go:28] interesting pod/logging-loki-gateway-5d45f4dcf6-hkx57 container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.606604 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" podUID="75ce358f-5f03-401f-bdf8-27a7e5309227" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.606688 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" podUID="75ce358f-5f03-401f-bdf8-27a7e5309227" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:43 crc kubenswrapper[4835]: E0319 10:43:43.620903 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1e040f261e4ac81fc3f8d977a239f87bc08614949d71bf77a60e889bb22a6975" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 10:43:43 crc kubenswrapper[4835]: E0319 10:43:43.623121 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1e040f261e4ac81fc3f8d977a239f87bc08614949d71bf77a60e889bb22a6975" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 
10:43:43 crc kubenswrapper[4835]: E0319 10:43:43.625071 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1e040f261e4ac81fc3f8d977a239f87bc08614949d71bf77a60e889bb22a6975" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 10:43:43 crc kubenswrapper[4835]: E0319 10:43:43.625136 4835 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-l2vsx" podUID="5d586c5b-694e-4ac9-aa09-0d973cdad7e0" containerName="registry-server" Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.680056 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-85c8677745-k75vc" podUID="2a19bff3-1be0-44d7-b625-df3e46afa290" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.95:8080/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.680189 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-hzjn2" podUID="981b2eee-0205-4314-b31a-d01123049b7b" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.778521 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-operators-n8kcm" podUID="2e9b0984-cab2-4aab-bad4-b4ad7040a40f" containerName="registry-server" probeResult="failure" output="command timed out" Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.778645 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/redhat-operators-n8kcm" Mar 19 
10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.779315 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="fdee895b-c681-4d66-bd0d-08622d97cbea" containerName="galera" probeResult="failure" output="command timed out" Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.779426 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.784255 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="registry-server" containerStatusID={"Type":"cri-o","ID":"38d05bf91884467227e16e4996c2fbf8aa3db4c6a3dc043670424affb0c51420"} pod="openshift-marketplace/redhat-operators-n8kcm" containerMessage="Container registry-server failed liveness probe, will be restarted" Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.784337 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n8kcm" podUID="2e9b0984-cab2-4aab-bad4-b4ad7040a40f" containerName="registry-server" containerID="cri-o://38d05bf91884467227e16e4996c2fbf8aa3db4c6a3dc043670424affb0c51420" gracePeriod=30 Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.786006 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-operators-n8kcm" podUID="2e9b0984-cab2-4aab-bad4-b4ad7040a40f" containerName="registry-server" probeResult="failure" output="command timed out" Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.786111 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n8kcm" Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.786126 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="8fd594ac-051b-4142-ba53-e974d9c5daa5" containerName="galera" probeResult="failure" output="command timed out" Mar 19 10:43:43 crc 
kubenswrapper[4835]: I0319 10:43:43.786511 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="fdee895b-c681-4d66-bd0d-08622d97cbea" containerName="galera" probeResult="failure" output="command timed out" Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.786538 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 19 10:43:43 crc kubenswrapper[4835]: E0319 10:43:43.832840 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d7893d110d6e01606a62e0ec1eb6f02d747b64208ade1d58054cc369010d4d85" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 10:43:43 crc kubenswrapper[4835]: E0319 10:43:43.834071 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d7893d110d6e01606a62e0ec1eb6f02d747b64208ade1d58054cc369010d4d85" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 10:43:43 crc kubenswrapper[4835]: E0319 10:43:43.835756 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d7893d110d6e01606a62e0ec1eb6f02d747b64208ade1d58054cc369010d4d85" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 10:43:43 crc kubenswrapper[4835]: E0319 10:43:43.835804 4835 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-operators/openstack-operator-index-7pjhg" podUID="d95e8e04-5d17-45e5-aeb8-d6857bfc9d01" containerName="registry-server" Mar 19 10:43:43 crc 
kubenswrapper[4835]: I0319 10:43:43.850356 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="5614bb9c-3907-4c29-b148-fbca6c6642ad" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.171:9090/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.850477 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="5614bb9c-3907-4c29-b148-fbca6c6642ad" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.171:9090/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:43 crc kubenswrapper[4835]: I0319 10:43:43.994933 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-controller-init-6674476dff-54wn4" podUID="2021ffd4-3c7f-477a-8816-c9b382d640c7" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.103:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.076954 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-6889f84cf4-bbfgf" podUID="817b88ed-bcd4-4702-bd72-7d04de779c86" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.96:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.077019 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-6889f84cf4-bbfgf" podUID="817b88ed-bcd4-4702-bd72-7d04de779c86" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.96:7472/metrics\": context deadline exceeded (Client.Timeout 
exceeded while awaiting headers)" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.077034 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/metallb-operator-webhook-server-6889f84cf4-bbfgf" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.076968 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-init-6674476dff-54wn4" podUID="2021ffd4-3c7f-477a-8816-c9b382d640c7" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.103:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.077160 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6889f84cf4-bbfgf" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.077193 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6674476dff-54wn4" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.079547 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="webhook-server" containerStatusID={"Type":"cri-o","ID":"fe4f6e08b22834e89ad25898df6beb0fbd7488400ffad6d20ec190b0bc72562d"} pod="metallb-system/metallb-operator-webhook-server-6889f84cf4-bbfgf" containerMessage="Container webhook-server failed liveness probe, will be restarted" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.079629 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/metallb-operator-webhook-server-6889f84cf4-bbfgf" podUID="817b88ed-bcd4-4702-bd72-7d04de779c86" containerName="webhook-server" containerID="cri-o://fe4f6e08b22834e89ad25898df6beb0fbd7488400ffad6d20ec190b0bc72562d" gracePeriod=2 Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.086229 4835 patch_prober.go:28] interesting 
pod/metrics-server-8646b978bb-zprxl container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.85:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.086291 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" podUID="da878929-ea5e-40f0-8eaf-7f6b6e86f62c" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.85:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.086346 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.087779 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="metrics-server" containerStatusID={"Type":"cri-o","ID":"4bde12fa97e46cd393be8e222a95cd64b25ba816714628be50c94aa41aae9643"} pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" containerMessage="Container metrics-server failed liveness probe, will be restarted" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.087831 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" podUID="da878929-ea5e-40f0-8eaf-7f6b6e86f62c" containerName="metrics-server" containerID="cri-o://4bde12fa97e46cd393be8e222a95cd64b25ba816714628be50c94aa41aae9643" gracePeriod=170 Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.192105 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/barbican-operator-controller-manager-5cfd84c587-t9dbc" podUID="e01500c9-ffdf-42df-9017-26b813efed0e" containerName="manager" 
probeResult="failure" output="Get \"http://10.217.0.104:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.217076 4835 patch_prober.go:28] interesting pod/logging-loki-distributor-9c6b6d984-rznlv container/loki-distributor namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.53:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.217138 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-distributor-9c6b6d984-rznlv" podUID="82e92071-6291-4ff2-971a-c658d2e001ed" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.53:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.218297 4835 patch_prober.go:28] interesting pod/logging-loki-distributor-9c6b6d984-rznlv container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.218325 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-9c6b6d984-rznlv" podUID="82e92071-6291-4ff2-971a-c658d2e001ed" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.273453 4835 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/certified-operators-bmh95" podUID="8b78cdf3-88ba-4ab2-9966-492863d9206c" containerName="registry-server" probeResult="failure" output=< Mar 19 10:43:44 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:43:44 crc kubenswrapper[4835]: > Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.273537 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-6mgg7" podUID="e8f17f81-d3ac-4c40-b346-c3eac9cc70d2" containerName="registry-server" probeResult="failure" output=< Mar 19 10:43:44 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:43:44 crc kubenswrapper[4835]: > Mar 19 10:43:44 crc kubenswrapper[4835]: E0319 10:43:44.275974 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b4f76f5deedf47e1bcf6b97488562a22b55c6553015dcacaa009a0e6e1097dfc" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 10:43:44 crc kubenswrapper[4835]: E0319 10:43:44.278506 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b4f76f5deedf47e1bcf6b97488562a22b55c6553015dcacaa009a0e6e1097dfc" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 10:43:44 crc kubenswrapper[4835]: E0319 10:43:44.281102 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b4f76f5deedf47e1bcf6b97488562a22b55c6553015dcacaa009a0e6e1097dfc" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 10:43:44 crc kubenswrapper[4835]: E0319 10:43:44.281176 4835 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc 
= command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-marketplace/community-operators-6mgg7" podUID="e8f17f81-d3ac-4c40-b346-c3eac9cc70d2" containerName="registry-server" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.335363 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/glance-operator-controller-manager-7d559dcdbd-9rtgc" podUID="9eeda748-4895-4cbc-be03-d1edabf69758" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.335354 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-dhr9f" podUID="0b55921b-5660-44b9-abe9-09639766068c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.105:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.335507 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-dhr9f" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.369053 4835 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.58:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.369110 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57" containerName="loki-ingester" probeResult="failure" output="Get \"https://10.217.0.58:3101/ready\": 
net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.373355 4835 patch_prober.go:28] interesting pod/logging-loki-querier-6dcbdf8bb8-cm5jv container/loki-querier namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.373386 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-cm5jv" podUID="9b90ec5a-5d56-4267-bb4e-1fbcdecff021" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.430960 4835 patch_prober.go:28] interesting pod/logging-loki-querier-6dcbdf8bb8-cm5jv container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:3101/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.431274 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-cm5jv" podUID="9b90ec5a-5d56-4267-bb4e-1fbcdecff021" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.54:3101/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.431040 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/horizon-operator-controller-manager-64dc66d669-msxkb" podUID="6dabbeda-38a8-4289-92b6-075d7c20f958" 
containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.436077 4835 patch_prober.go:28] interesting pod/logging-loki-query-frontend-ff66c4dc9-5qfh9 container/loki-query-frontend namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.436138 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5qfh9" podUID="e0bc0a8b-4a2d-4460-b984-7a2279e1a424" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.436201 4835 patch_prober.go:28] interesting pod/logging-loki-compactor-0 container/loki-compactor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.60:3101/ready\": context deadline exceeded" start-of-body= Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.436221 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-compactor-0" podUID="a0a07ea9-b7e5-4679-9b63-c6027e42f279" containerName="loki-compactor" probeResult="failure" output="Get \"https://10.217.0.60:3101/ready\": context deadline exceeded" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.436485 4835 patch_prober.go:28] interesting pod/logging-loki-query-frontend-ff66c4dc9-5qfh9 container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get 
\"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.436615 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5qfh9" podUID="e0bc0a8b-4a2d-4460-b984-7a2279e1a424" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.489443 4835 trace.go:236] Trace[372398106]: "Calculate volume metrics of prometheus-metric-storage-db for pod openstack/prometheus-metric-storage-0" (19-Mar-2026 10:43:37.088) (total time: 7397ms): Mar 19 10:43:44 crc kubenswrapper[4835]: Trace[372398106]: [7.39797343s] [7.39797343s] END Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.489465 4835 trace.go:236] Trace[1074532598]: "Calculate volume metrics of storage for pod openshift-logging/logging-loki-index-gateway-0" (19-Mar-2026 10:43:38.291) (total time: 6194ms): Mar 19 10:43:44 crc kubenswrapper[4835]: Trace[1074532598]: [6.194740396s] [6.194740396s] END Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.503341 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-58897d9998-4tpxz_abdeec51-5ca9-442c-9937-67dd8f50d88d/console-operator/0.log" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.503528 4835 patch_prober.go:28] interesting pod/logging-loki-index-gateway-0 container/loki-index-gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.61:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:44 crc 
kubenswrapper[4835]: I0319 10:43:44.503591 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-index-gateway-0" podUID="b54655c1-accb-4df5-98b0-f01dbcfe83f7" containerName="loki-index-gateway" probeResult="failure" output="Get \"https://10.217.0.61:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.503691 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-4tpxz" event={"ID":"abdeec51-5ca9-442c-9937-67dd8f50d88d","Type":"ContainerDied","Data":"8fdb826a4067ff1d6a6e33d4a4477be92713e2992ee5b89df4d524cc35f35842"} Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.503710 4835 generic.go:334] "Generic (PLEG): container finished" podID="abdeec51-5ca9-442c-9937-67dd8f50d88d" containerID="8fdb826a4067ff1d6a6e33d4a4477be92713e2992ee5b89df4d524cc35f35842" exitCode=1 Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.504799 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="galera" containerStatusID={"Type":"cri-o","ID":"5d0ed211b308917edaba5288103874e3bf63598f271c81636f1b1490839e51ec"} pod="openstack/openstack-cell1-galera-0" containerMessage="Container galera failed liveness probe, will be restarted" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.536174 4835 patch_prober.go:28] interesting pod/logging-loki-gateway-5d45f4dcf6-4f49c container/opa namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.56:8083/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.536436 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" podUID="2ff291e5-8364-4627-be0f-51c9532e46ee" containerName="opa" probeResult="failure" 
output="Get \"https://10.217.0.56:8083/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.536221 4835 patch_prober.go:28] interesting pod/logging-loki-gateway-5d45f4dcf6-4f49c container/gateway namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.56:8081/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.536791 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" podUID="2ff291e5-8364-4627-be0f-51c9532e46ee" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.545876 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/designate-operator-controller-manager-6cc65c69fc-bzhnq" podUID="5db66af1-4929-4163-b788-0320b0323a79" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.549284 4835 patch_prober.go:28] interesting pod/monitoring-plugin-7b6f7975cf-xd4n6 container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.86:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.549332 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-7b6f7975cf-xd4n6" podUID="5573de0d-e8de-4c32-b778-1cf95556c219" containerName="monitoring-plugin" probeResult="failure" output="Get 
\"https://10.217.0.86:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.608881 4835 patch_prober.go:28] interesting pod/logging-loki-gateway-5d45f4dcf6-hkx57 container/gateway namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.57:8081/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.608989 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" podUID="75ce358f-5f03-401f-bdf8-27a7e5309227" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.57:8081/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.609006 4835 patch_prober.go:28] interesting pod/logging-loki-gateway-5d45f4dcf6-hkx57 container/opa namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.57:8083/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.609068 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" podUID="75ce358f-5f03-401f-bdf8-27a7e5309227" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.57:8083/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.664854 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-bjnww" podUID="47fe8883-3cea-4985-8327-4ad721d4e128" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.109:8081/readyz\": context deadline 
exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.664960 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-bjnww" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.758971 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/keystone-operator-controller-manager-76b87776c9-79r7z" podUID="0f2be57b-1212-4dc2-b772-3317bcd69aac" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.759062 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-6b77b7676d-mgjjt" podUID="aea377c2-d271-45c9-a574-0d1fe89caac5" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.779255 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="fdee895b-c681-4d66-bd0d-08622d97cbea" containerName="galera" probeResult="failure" output="command timed out" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.781957 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="30460e8f-6c1c-4bdb-a1d7-18427b62896b" containerName="prometheus" probeResult="failure" output="command timed out" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.807109 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-v6mc6" podUID="a73c4956-ad63-40d9-907e-599172ac3771" containerName="manager" probeResult="failure" output="Get 
\"http://10.217.0.113:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.807240 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-v6mc6" Mar 19 10:43:44 crc kubenswrapper[4835]: E0319 10:43:44.827551 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b4f76f5deedf47e1bcf6b97488562a22b55c6553015dcacaa009a0e6e1097dfc" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 10:43:44 crc kubenswrapper[4835]: E0319 10:43:44.829730 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b4f76f5deedf47e1bcf6b97488562a22b55c6553015dcacaa009a0e6e1097dfc" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 10:43:44 crc kubenswrapper[4835]: E0319 10:43:44.831113 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b4f76f5deedf47e1bcf6b97488562a22b55c6553015dcacaa009a0e6e1097dfc" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 10:43:44 crc kubenswrapper[4835]: E0319 10:43:44.831158 4835 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-marketplace/community-operators-6mgg7" podUID="e8f17f81-d3ac-4c40-b346-c3eac9cc70d2" containerName="registry-server" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.862566 4835 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-4nqvq" podUID="2e429fe7-b053-43e9-a640-f14af7094e62" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.914975 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/mariadb-operator-controller-manager-6f5b7bcd4-d79pg" podUID="e86ab409-7643-4e6f-9129-6f90e5b6bf1c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.115:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:44 crc kubenswrapper[4835]: I0319 10:43:44.955978 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/nova-operator-controller-manager-bc5c78db9-wvb5n" podUID="2173f655-b411-4dfe-8247-59ac02942c25" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:45 crc kubenswrapper[4835]: I0319 10:43:45.015979 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-56f74467c6-v6bxs" podUID="02720ef5-c00c-48a4-af96-4d82f28bf051" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:45 crc kubenswrapper[4835]: I0319 10:43:45.200921 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-init-6674476dff-54wn4" podUID="2021ffd4-3c7f-477a-8816-c9b382d640c7" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.103:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:45 crc kubenswrapper[4835]: 
I0319 10:43:45.201214 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ovn-operator-controller-manager-846c4cdcb7-7b7zt" podUID="74161dd4-df25-4f1b-8924-c9086688463d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:45 crc kubenswrapper[4835]: I0319 10:43:45.207125 4835 trace.go:236] Trace[1417665997]: "Calculate volume metrics of persistence for pod openstack/rabbitmq-cell1-server-0" (19-Mar-2026 10:43:43.480) (total time: 1726ms): Mar 19 10:43:45 crc kubenswrapper[4835]: Trace[1417665997]: [1.726504006s] [1.726504006s] END Mar 19 10:43:45 crc kubenswrapper[4835]: I0319 10:43:45.243007 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-vq45h" podUID="dd4067de-94b9-4d91-bdf4-e3e3af91b76f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:45 crc kubenswrapper[4835]: I0319 10:43:45.243477 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-vq45h" Mar 19 10:43:45 crc kubenswrapper[4835]: I0319 10:43:45.244073 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-6889f84cf4-bbfgf" podUID="817b88ed-bcd4-4702-bd72-7d04de779c86" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.96:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:45 crc kubenswrapper[4835]: I0319 10:43:45.284183 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/swift-operator-controller-manager-867f54bc44-sgjfn" podUID="637282e5-c56d-49ea-ac96-f299fb3661f2" 
containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:45 crc kubenswrapper[4835]: I0319 10:43:45.324996 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-s94g8" podUID="91c2a119-b77e-4cf1-9be3-779c47d4643b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:45 crc kubenswrapper[4835]: I0319 10:43:45.369191 4835 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.58:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:45 crc kubenswrapper[4835]: I0319 10:43:45.369266 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-ingester-0" podUID="b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57" containerName="loki-ingester" probeResult="failure" output="Get \"https://10.217.0.58:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:45 crc kubenswrapper[4835]: I0319 10:43:45.436146 4835 patch_prober.go:28] interesting pod/logging-loki-compactor-0 container/loki-compactor namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.60:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:45 crc kubenswrapper[4835]: I0319 10:43:45.436494 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-compactor-0" podUID="a0a07ea9-b7e5-4679-9b63-c6027e42f279" containerName="loki-compactor" 
probeResult="failure" output="Get \"https://10.217.0.60:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:45 crc kubenswrapper[4835]: I0319 10:43:45.448944 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-dhr9f" podUID="0b55921b-5660-44b9-abe9-09639766068c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.105:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:45 crc kubenswrapper[4835]: I0319 10:43:45.448944 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-64vf7" podUID="a59c67ae-bcf7-404f-ad32-233f38450f65" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:45 crc kubenswrapper[4835]: I0319 10:43:45.449516 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/telemetry-operator-controller-manager-57b8dbd499-gz2nz" podUID="65b9e1e9-a183-4d37-a281-593752da6125" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:45 crc kubenswrapper[4835]: I0319 10:43:45.449604 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-57b8dbd499-gz2nz" Mar 19 10:43:45 crc kubenswrapper[4835]: I0319 10:43:45.449911 4835 patch_prober.go:28] interesting pod/controller-manager-5675974fc9-xqhf6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout 
exceeded while awaiting headers)" start-of-body= Mar 19 10:43:45 crc kubenswrapper[4835]: I0319 10:43:45.449941 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" podUID="d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:45 crc kubenswrapper[4835]: I0319 10:43:45.450448 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-66dd9d474d-bjnww" Mar 19 10:43:45 crc kubenswrapper[4835]: I0319 10:43:45.502477 4835 patch_prober.go:28] interesting pod/logging-loki-index-gateway-0 container/loki-index-gateway namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.61:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:45 crc kubenswrapper[4835]: I0319 10:43:45.502535 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-index-gateway-0" podUID="b54655c1-accb-4df5-98b0-f01dbcfe83f7" containerName="loki-index-gateway" probeResult="failure" output="Get \"https://10.217.0.61:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:45 crc kubenswrapper[4835]: I0319 10:43:45.514581 4835 generic.go:334] "Generic (PLEG): container finished" podID="4cd6d8ad-9a41-4796-9684-2b6e09675bd9" containerID="e727cea677ded79327ba3e32f642a9ea78fa299c657b29426c5ee7f852148d47" exitCode=0 Mar 19 10:43:45 crc kubenswrapper[4835]: I0319 10:43:45.514637 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7" 
event={"ID":"4cd6d8ad-9a41-4796-9684-2b6e09675bd9","Type":"ContainerDied","Data":"e727cea677ded79327ba3e32f642a9ea78fa299c657b29426c5ee7f852148d47"} Mar 19 10:43:45 crc kubenswrapper[4835]: I0319 10:43:45.516907 4835 prober.go:107] "Probe failed" probeType="Startup" pod="metallb-system/frr-k8s-44rxf" podUID="8173c77e-48ec-44fc-9be7-67381528f78a" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:45 crc kubenswrapper[4835]: I0319 10:43:45.519240 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zfhzl" event={"ID":"3f751b70-2b62-41da-b71c-ce7039840e3e","Type":"ContainerStarted","Data":"af07c59ab5b180b6f5b101fd84aae35ea9e5cd004f9bfb928567a36544e853db"} Mar 19 10:43:45 crc kubenswrapper[4835]: I0319 10:43:45.519383 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-zfhzl" Mar 19 10:43:45 crc kubenswrapper[4835]: I0319 10:43:45.677002 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/controller-7bb4cc7c98-47sfq" podUID="3d3ab7f2-b252-4ef7-8a11-5af68bf23fa6" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.99:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:45 crc kubenswrapper[4835]: I0319 10:43:45.677086 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/controller-7bb4cc7c98-47sfq" podUID="3d3ab7f2-b252-4ef7-8a11-5af68bf23fa6" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.99:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:45 crc kubenswrapper[4835]: I0319 10:43:45.712300 4835 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get 
\"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:45 crc kubenswrapper[4835]: I0319 10:43:45.712367 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:45 crc kubenswrapper[4835]: I0319 10:43:45.722818 4835 patch_prober.go:28] interesting pod/thanos-querier-5b688b744b-2d86m container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.83:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:45 crc kubenswrapper[4835]: I0319 10:43:45.722880 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-5b688b744b-2d86m" podUID="c02bf820-ae15-4787-964c-de647e0e6ebd" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.83:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:45 crc kubenswrapper[4835]: I0319 10:43:45.780854 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="fdee895b-c681-4d66-bd0d-08622d97cbea" containerName="galera" probeResult="failure" output="command timed out" Mar 19 10:43:45 crc kubenswrapper[4835]: I0319 10:43:45.853201 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-v6mc6" podUID="a73c4956-ad63-40d9-907e-599172ac3771" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/readyz\": 
context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:46 crc kubenswrapper[4835]: E0319 10:43:46.148412 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c2019edc4e22b6a962539a0a6e1c736cef59878b6ba66be09c70a0d6d132b3cd" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 10:43:46 crc kubenswrapper[4835]: E0319 10:43:46.150020 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c2019edc4e22b6a962539a0a6e1c736cef59878b6ba66be09c70a0d6d132b3cd" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 10:43:46 crc kubenswrapper[4835]: E0319 10:43:46.151218 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c2019edc4e22b6a962539a0a6e1c736cef59878b6ba66be09c70a0d6d132b3cd" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 10:43:46 crc kubenswrapper[4835]: E0319 10:43:46.151254 4835 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-marketplace/certified-operators-bmh95" podUID="8b78cdf3-88ba-4ab2-9966-492863d9206c" containerName="registry-server" Mar 19 10:43:46 crc kubenswrapper[4835]: I0319 10:43:46.286937 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-vq45h" podUID="dd4067de-94b9-4d91-bdf4-e3e3af91b76f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": context deadline exceeded (Client.Timeout 
exceeded while awaiting headers)" Mar 19 10:43:46 crc kubenswrapper[4835]: I0319 10:43:46.371985 4835 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:46 crc kubenswrapper[4835]: I0319 10:43:46.372054 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:46 crc kubenswrapper[4835]: I0319 10:43:46.491990 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/telemetry-operator-controller-manager-57b8dbd499-gz2nz" podUID="65b9e1e9-a183-4d37-a281-593752da6125" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:46 crc kubenswrapper[4835]: I0319 10:43:46.530693 4835 generic.go:334] "Generic (PLEG): container finished" podID="7490a09e-a8be-4931-a282-38989ba640b3" containerID="b8708c7ef18e9cd9095e10195cbb1a442053e10631e321cc169d1b7760cf53f5" exitCode=0 Mar 19 10:43:46 crc kubenswrapper[4835]: I0319 10:43:46.530845 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-djtzl" event={"ID":"7490a09e-a8be-4931-a282-38989ba640b3","Type":"ContainerDied","Data":"b8708c7ef18e9cd9095e10195cbb1a442053e10631e321cc169d1b7760cf53f5"} Mar 19 10:43:46 crc kubenswrapper[4835]: I0319 10:43:46.535440 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"f2687a33-8ba7-4a21-9906-84de49945433","Type":"ContainerStarted","Data":"a78a0bd3efe2007a327eacdec684c43c3305a3ac389eff4f72b53b40403b2bbd"} Mar 19 10:43:46 crc kubenswrapper[4835]: I0319 10:43:46.537919 4835 generic.go:334] "Generic (PLEG): container finished" podID="817b88ed-bcd4-4702-bd72-7d04de779c86" containerID="fe4f6e08b22834e89ad25898df6beb0fbd7488400ffad6d20ec190b0bc72562d" exitCode=137 Mar 19 10:43:46 crc kubenswrapper[4835]: I0319 10:43:46.538013 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6889f84cf4-bbfgf" event={"ID":"817b88ed-bcd4-4702-bd72-7d04de779c86","Type":"ContainerDied","Data":"fe4f6e08b22834e89ad25898df6beb0fbd7488400ffad6d20ec190b0bc72562d"} Mar 19 10:43:46 crc kubenswrapper[4835]: I0319 10:43:46.613954 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-zmwm8" podUID="5ea1b1ba-826f-4abe-9c56-caedc3a178f9" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:46 crc kubenswrapper[4835]: I0319 10:43:46.613978 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-zmwm8" podUID="5ea1b1ba-826f-4abe-9c56-caedc3a178f9" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:46 crc kubenswrapper[4835]: I0319 10:43:46.614061 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-zmwm8" Mar 19 10:43:46 crc kubenswrapper[4835]: I0319 10:43:46.615698 4835 patch_prober.go:28] interesting pod/route-controller-manager-5f9b46c45c-4586h container/route-controller-manager 
namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" start-of-body= Mar 19 10:43:46 crc kubenswrapper[4835]: I0319 10:43:46.615761 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h" podUID="ef156d19-8841-4be5-a739-bd07a7789ea3" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" Mar 19 10:43:46 crc kubenswrapper[4835]: I0319 10:43:46.779549 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-handler-69n76" podUID="86182cb8-0ad6-40f7-ac1c-7ccf8cf5bc13" containerName="nmstate-handler" probeResult="failure" output="command timed out" Mar 19 10:43:46 crc kubenswrapper[4835]: I0319 10:43:46.779722 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-69n76" Mar 19 10:43:46 crc kubenswrapper[4835]: I0319 10:43:46.997329 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-69n76" Mar 19 10:43:47 crc kubenswrapper[4835]: I0319 10:43:47.085887 4835 status_manager.go:875] "Failed to update status for pod" pod="openshift-monitoring/monitoring-plugin-7b6f7975cf-xd4n6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5573de0d-e8de-4c32-b778-1cf95556c219\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:43:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[monitoring-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:43:34Z\\\",\\\"message\\\":\\\"containers with unready status: [monitoring-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9d523e82f5fcc16a85097ba9569ef6169314c8f813893acbf645a1db88e59df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b5e205b553835b386518450db730ab666280a63d52f6df4e582571fc1d44fd5c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b5e205b553835b386518450db730ab666280a63d52f6df4e582571fc1d44fd5c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"monitoring-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T09:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"monitoring-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-monitoring\"/\"monitoring-plugin-7b6f7975cf-xd4n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": context deadline exceeded" Mar 19 10:43:47 crc kubenswrapper[4835]: I0319 10:43:47.281406 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 10:43:47 crc kubenswrapper[4835]: I0319 10:43:47.484618 4835 patch_prober.go:28] interesting pod/console-5d9fcbf4f8-5btdf container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.142:8443/health\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:43:47 crc 
kubenswrapper[4835]: I0319 10:43:47.485004 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-5d9fcbf4f8-5btdf" podUID="d524b5e2-97d1-47cc-8474-113fa8e6016a" containerName="console" probeResult="failure" output="Get \"https://10.217.0.142:8443/health\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:47 crc kubenswrapper[4835]: I0319 10:43:47.570833 4835 generic.go:334] "Generic (PLEG): container finished" podID="d95e8e04-5d17-45e5-aeb8-d6857bfc9d01" containerID="d7893d110d6e01606a62e0ec1eb6f02d747b64208ade1d58054cc369010d4d85" exitCode=0 Mar 19 10:43:47 crc kubenswrapper[4835]: I0319 10:43:47.570958 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7pjhg" event={"ID":"d95e8e04-5d17-45e5-aeb8-d6857bfc9d01","Type":"ContainerDied","Data":"d7893d110d6e01606a62e0ec1eb6f02d747b64208ade1d58054cc369010d4d85"} Mar 19 10:43:47 crc kubenswrapper[4835]: I0319 10:43:47.595383 4835 generic.go:334] "Generic (PLEG): container finished" podID="5d586c5b-694e-4ac9-aa09-0d973cdad7e0" containerID="1e040f261e4ac81fc3f8d977a239f87bc08614949d71bf77a60e889bb22a6975" exitCode=0 Mar 19 10:43:47 crc kubenswrapper[4835]: I0319 10:43:47.595500 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2vsx" event={"ID":"5d586c5b-694e-4ac9-aa09-0d973cdad7e0","Type":"ContainerDied","Data":"1e040f261e4ac81fc3f8d977a239f87bc08614949d71bf77a60e889bb22a6975"} Mar 19 10:43:47 crc kubenswrapper[4835]: I0319 10:43:47.600091 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-58897d9998-4tpxz_abdeec51-5ca9-442c-9937-67dd8f50d88d/console-operator/0.log" Mar 19 10:43:47 crc kubenswrapper[4835]: I0319 10:43:47.600438 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-4tpxz" 
event={"ID":"abdeec51-5ca9-442c-9937-67dd8f50d88d","Type":"ContainerStarted","Data":"eaf0c228bc54a233be06a6dca2302d63ef4ac7ad7ad991a6d2eaa5aecea496d9"} Mar 19 10:43:47 crc kubenswrapper[4835]: I0319 10:43:47.601084 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-4tpxz" Mar 19 10:43:47 crc kubenswrapper[4835]: I0319 10:43:47.601646 4835 patch_prober.go:28] interesting pod/console-operator-58897d9998-4tpxz container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Mar 19 10:43:47 crc kubenswrapper[4835]: I0319 10:43:47.601685 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-4tpxz" podUID="abdeec51-5ca9-442c-9937-67dd8f50d88d" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" Mar 19 10:43:47 crc kubenswrapper[4835]: I0319 10:43:47.604807 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7" event={"ID":"4cd6d8ad-9a41-4796-9684-2b6e09675bd9","Type":"ContainerStarted","Data":"3b1002a2992157d43fa99cd3c2ce3aa7226c352e3e3119042fec503b88e20c06"} Mar 19 10:43:47 crc kubenswrapper[4835]: I0319 10:43:47.605390 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7" Mar 19 10:43:47 crc kubenswrapper[4835]: I0319 10:43:47.605528 4835 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-9qxb7 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection 
refused" start-of-body= Mar 19 10:43:47 crc kubenswrapper[4835]: I0319 10:43:47.605591 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7" podUID="4cd6d8ad-9a41-4796-9684-2b6e09675bd9" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" Mar 19 10:43:47 crc kubenswrapper[4835]: I0319 10:43:47.613914 4835 generic.go:334] "Generic (PLEG): container finished" podID="2e9b0984-cab2-4aab-bad4-b4ad7040a40f" containerID="38d05bf91884467227e16e4996c2fbf8aa3db4c6a3dc043670424affb0c51420" exitCode=0 Mar 19 10:43:47 crc kubenswrapper[4835]: I0319 10:43:47.613952 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8kcm" event={"ID":"2e9b0984-cab2-4aab-bad4-b4ad7040a40f","Type":"ContainerDied","Data":"38d05bf91884467227e16e4996c2fbf8aa3db4c6a3dc043670424affb0c51420"} Mar 19 10:43:47 crc kubenswrapper[4835]: I0319 10:43:47.622687 4835 generic.go:334] "Generic (PLEG): container finished" podID="8b78cdf3-88ba-4ab2-9966-492863d9206c" containerID="c2019edc4e22b6a962539a0a6e1c736cef59878b6ba66be09c70a0d6d132b3cd" exitCode=0 Mar 19 10:43:47 crc kubenswrapper[4835]: I0319 10:43:47.622786 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmh95" event={"ID":"8b78cdf3-88ba-4ab2-9966-492863d9206c","Type":"ContainerDied","Data":"c2019edc4e22b6a962539a0a6e1c736cef59878b6ba66be09c70a0d6d132b3cd"} Mar 19 10:43:47 crc kubenswrapper[4835]: I0319 10:43:47.626960 4835 generic.go:334] "Generic (PLEG): container finished" podID="dd4d4255-8149-4b0e-b3f2-dc0951a043a5" containerID="e14965fe103b6b5367faa035bf9e70accc869275771525030bb50caff305ced2" exitCode=0 Mar 19 10:43:47 crc kubenswrapper[4835]: I0319 10:43:47.627046 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-4bzw9" event={"ID":"dd4d4255-8149-4b0e-b3f2-dc0951a043a5","Type":"ContainerDied","Data":"e14965fe103b6b5367faa035bf9e70accc869275771525030bb50caff305ced2"}
Mar 19 10:43:47 crc kubenswrapper[4835]: I0319 10:43:47.629043 4835 generic.go:334] "Generic (PLEG): container finished" podID="0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f" containerID="cea0cde6bb3738605e0cfad5c42f31c29bae177ecdce6e83d43a882ffd6dc8b3" exitCode=0
Mar 19 10:43:47 crc kubenswrapper[4835]: I0319 10:43:47.630258 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" event={"ID":"0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f","Type":"ContainerDied","Data":"cea0cde6bb3738605e0cfad5c42f31c29bae177ecdce6e83d43a882ffd6dc8b3"}
Mar 19 10:43:47 crc kubenswrapper[4835]: I0319 10:43:47.655902 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-zmwm8" podUID="5ea1b1ba-826f-4abe-9c56-caedc3a178f9" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:47 crc kubenswrapper[4835]: I0319 10:43:47.780304 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="30460e8f-6c1c-4bdb-a1d7-18427b62896b" containerName="prometheus" probeResult="failure" output="command timed out"
Mar 19 10:43:47 crc kubenswrapper[4835]: I0319 10:43:47.788189 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="30460e8f-6c1c-4bdb-a1d7-18427b62896b" containerName="prometheus" probeResult="failure" output="command timed out"
Mar 19 10:43:48 crc kubenswrapper[4835]: I0319 10:43:48.265786 4835 patch_prober.go:28] interesting pod/loki-operator-controller-manager-668b645cb5-fhzgr container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.49:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:48 crc kubenswrapper[4835]: I0319 10:43:48.266534 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-668b645cb5-fhzgr" podUID="3a2991b5-2e25-4afa-9941-d955aad0dc37" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.49:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:48 crc kubenswrapper[4835]: E0319 10:43:48.378249 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 38d05bf91884467227e16e4996c2fbf8aa3db4c6a3dc043670424affb0c51420 is running failed: container process not found" containerID="38d05bf91884467227e16e4996c2fbf8aa3db4c6a3dc043670424affb0c51420" cmd=["grpc_health_probe","-addr=:50051"]
Mar 19 10:43:48 crc kubenswrapper[4835]: E0319 10:43:48.384881 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 38d05bf91884467227e16e4996c2fbf8aa3db4c6a3dc043670424affb0c51420 is running failed: container process not found" containerID="38d05bf91884467227e16e4996c2fbf8aa3db4c6a3dc043670424affb0c51420" cmd=["grpc_health_probe","-addr=:50051"]
Mar 19 10:43:48 crc kubenswrapper[4835]: E0319 10:43:48.385699 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 38d05bf91884467227e16e4996c2fbf8aa3db4c6a3dc043670424affb0c51420 is running failed: container process not found" containerID="38d05bf91884467227e16e4996c2fbf8aa3db4c6a3dc043670424affb0c51420" cmd=["grpc_health_probe","-addr=:50051"]
Mar 19 10:43:48 crc kubenswrapper[4835]: E0319 10:43:48.385908 4835 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 38d05bf91884467227e16e4996c2fbf8aa3db4c6a3dc043670424affb0c51420 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-n8kcm" podUID="2e9b0984-cab2-4aab-bad4-b4ad7040a40f" containerName="registry-server"
Mar 19 10:43:48 crc kubenswrapper[4835]: I0319 10:43:48.443121 4835 trace.go:236] Trace[152036336]: "Calculate volume metrics of mysql-db for pod openstack/openstack-cell1-galera-0" (19-Mar-2026 10:43:47.049) (total time: 1393ms):
Mar 19 10:43:48 crc kubenswrapper[4835]: Trace[152036336]: [1.393529252s] [1.393529252s] END
Mar 19 10:43:48 crc kubenswrapper[4835]: I0319 10:43:48.660418 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7pjhg" event={"ID":"d95e8e04-5d17-45e5-aeb8-d6857bfc9d01","Type":"ContainerStarted","Data":"12beed8a0c7aa4fa91d3b62ea8944ac2fc10794ca27353f966a263c4c879558c"}
Mar 19 10:43:48 crc kubenswrapper[4835]: I0319 10:43:48.713650 4835 generic.go:334] "Generic (PLEG): container finished" podID="95fe9c35-69f6-4b60-a725-c2f0d8a34c99" containerID="18e2ca242ad34c1140edf13b90a427133c3ebb9b0d73b6c4d2b51b8adcc39204" exitCode=0
Mar 19 10:43:48 crc kubenswrapper[4835]: I0319 10:43:48.715001 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n" event={"ID":"95fe9c35-69f6-4b60-a725-c2f0d8a34c99","Type":"ContainerDied","Data":"18e2ca242ad34c1140edf13b90a427133c3ebb9b0d73b6c4d2b51b8adcc39204"}
Mar 19 10:43:48 crc kubenswrapper[4835]: I0319 10:43:48.721563 4835 generic.go:334] "Generic (PLEG): container finished" podID="e8f17f81-d3ac-4c40-b346-c3eac9cc70d2" containerID="b4f76f5deedf47e1bcf6b97488562a22b55c6553015dcacaa009a0e6e1097dfc" exitCode=0
Mar 19 10:43:48 crc kubenswrapper[4835]: I0319 10:43:48.721637 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6mgg7" event={"ID":"e8f17f81-d3ac-4c40-b346-c3eac9cc70d2","Type":"ContainerDied","Data":"b4f76f5deedf47e1bcf6b97488562a22b55c6553015dcacaa009a0e6e1097dfc"}
Mar 19 10:43:48 crc kubenswrapper[4835]: I0319 10:43:48.737817 4835 generic.go:334] "Generic (PLEG): container finished" podID="d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46" containerID="9d6b5b62d82043260aad7652065797450dc874fcd88254e57a1b198017ae6817" exitCode=0
Mar 19 10:43:48 crc kubenswrapper[4835]: I0319 10:43:48.737916 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" event={"ID":"d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46","Type":"ContainerDied","Data":"9d6b5b62d82043260aad7652065797450dc874fcd88254e57a1b198017ae6817"}
Mar 19 10:43:48 crc kubenswrapper[4835]: I0319 10:43:48.745036 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6889f84cf4-bbfgf" event={"ID":"817b88ed-bcd4-4702-bd72-7d04de779c86","Type":"ContainerStarted","Data":"9e04c9c62cfc4a0d698739e96ce0c82e48c0d46b4f86f246f46e4bd403ea870e"}
Mar 19 10:43:48 crc kubenswrapper[4835]: I0319 10:43:48.746174 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6889f84cf4-bbfgf"
Mar 19 10:43:48 crc kubenswrapper[4835]: I0319 10:43:48.759734 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-djtzl" event={"ID":"7490a09e-a8be-4931-a282-38989ba640b3","Type":"ContainerStarted","Data":"ae143f90a7b0fa0ccaad5c085740062e9324c02a47e438570e1a0a14f5e20eb6"}
Mar 19 10:43:48 crc kubenswrapper[4835]: I0319 10:43:48.760312 4835 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-9qxb7 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body=
Mar 19 10:43:48 crc kubenswrapper[4835]: I0319 10:43:48.760342 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7" podUID="4cd6d8ad-9a41-4796-9684-2b6e09675bd9" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused"
Mar 19 10:43:48 crc kubenswrapper[4835]: I0319 10:43:48.766892 4835 patch_prober.go:28] interesting pod/console-operator-58897d9998-4tpxz container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body=
Mar 19 10:43:48 crc kubenswrapper[4835]: I0319 10:43:48.767108 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-4tpxz" podUID="abdeec51-5ca9-442c-9937-67dd8f50d88d" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused"
Mar 19 10:43:49 crc kubenswrapper[4835]: I0319 10:43:49.009326 4835 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-qsz5n container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.29:8081/healthz\": dial tcp 10.217.0.29:8081: connect: connection refused" start-of-body=
Mar 19 10:43:49 crc kubenswrapper[4835]: I0319 10:43:49.009371 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n" podUID="95fe9c35-69f6-4b60-a725-c2f0d8a34c99" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.29:8081/healthz\": dial tcp 10.217.0.29:8081: connect: connection refused"
Mar 19 10:43:49 crc kubenswrapper[4835]: I0319 10:43:49.354015 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54476d58cc-x6mtx"
Mar 19 10:43:49 crc kubenswrapper[4835]: I0319 10:43:49.678841 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-44rxf"
Mar 19 10:43:49 crc kubenswrapper[4835]: I0319 10:43:49.700283 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Mar 19 10:43:49 crc kubenswrapper[4835]: I0319 10:43:49.791485 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-575wd"
Mar 19 10:43:49 crc kubenswrapper[4835]: I0319 10:43:49.819278 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4bzw9" event={"ID":"dd4d4255-8149-4b0e-b3f2-dc0951a043a5","Type":"ContainerStarted","Data":"5fc26958424624a730facb0b97bbfc5720ceebddfb00abcc0ad0606d8eaaefe9"}
Mar 19 10:43:49 crc kubenswrapper[4835]: I0319 10:43:49.820339 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-4bzw9"
Mar 19 10:43:49 crc kubenswrapper[4835]: I0319 10:43:49.832408 4835 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4bzw9 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.71:8080/healthz\": dial tcp 10.217.0.71:8080: connect: connection refused" start-of-body=
Mar 19 10:43:49 crc kubenswrapper[4835]: I0319 10:43:49.832476 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4bzw9" podUID="dd4d4255-8149-4b0e-b3f2-dc0951a043a5" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.71:8080/healthz\": dial tcp 10.217.0.71:8080: connect: connection refused"
Mar 19 10:43:49 crc kubenswrapper[4835]: I0319 10:43:49.862393 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" event={"ID":"0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f","Type":"ContainerStarted","Data":"126724b59bef64aa7eea804dfe907aa16ae3610dec52cb0ce76641fc34720487"}
Mar 19 10:43:49 crc kubenswrapper[4835]: I0319 10:43:49.862791 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28"
Mar 19 10:43:49 crc kubenswrapper[4835]: I0319 10:43:49.863795 4835 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6lp28 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body=
Mar 19 10:43:49 crc kubenswrapper[4835]: I0319 10:43:49.863836 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" podUID="0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused"
Mar 19 10:43:50 crc kubenswrapper[4835]: I0319 10:43:50.502633 4835 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4bzw9 container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.71:8080/healthz\": dial tcp 10.217.0.71:8080: connect: connection refused" start-of-body=
Mar 19 10:43:50 crc kubenswrapper[4835]: I0319 10:43:50.503260 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-4bzw9" podUID="dd4d4255-8149-4b0e-b3f2-dc0951a043a5" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.71:8080/healthz\": dial tcp 10.217.0.71:8080: connect: connection refused"
Mar 19 10:43:50 crc kubenswrapper[4835]: I0319 10:43:50.502815 4835 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4bzw9 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.71:8080/healthz\": dial tcp 10.217.0.71:8080: connect: connection refused" start-of-body=
Mar 19 10:43:50 crc kubenswrapper[4835]: I0319 10:43:50.503532 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4bzw9" podUID="dd4d4255-8149-4b0e-b3f2-dc0951a043a5" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.71:8080/healthz\": dial tcp 10.217.0.71:8080: connect: connection refused"
Mar 19 10:43:50 crc kubenswrapper[4835]: I0319 10:43:50.929215 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" event={"ID":"d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46","Type":"ContainerStarted","Data":"588fc357bbbded3b825e10bc8a6c243a39e4c2af919a13d6fc53e5ab99402995"}
Mar 19 10:43:50 crc kubenswrapper[4835]: I0319 10:43:50.929515 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6"
Mar 19 10:43:50 crc kubenswrapper[4835]: I0319 10:43:50.929943 4835 patch_prober.go:28] interesting pod/controller-manager-5675974fc9-xqhf6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused" start-of-body=
Mar 19 10:43:50 crc kubenswrapper[4835]: I0319 10:43:50.929990 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" podUID="d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused"
Mar 19 10:43:50 crc kubenswrapper[4835]: I0319 10:43:50.939279 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8kcm" event={"ID":"2e9b0984-cab2-4aab-bad4-b4ad7040a40f","Type":"ContainerStarted","Data":"60bab837a524e1a829c1a94ae0a8b0f03a1c0c4cf02dc233940f57fc5e67623b"}
Mar 19 10:43:50 crc kubenswrapper[4835]: I0319 10:43:50.971080 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bmh95" event={"ID":"8b78cdf3-88ba-4ab2-9966-492863d9206c","Type":"ContainerStarted","Data":"a23a14cf8c6dfce1f6cf859630e3eb1e13ac82422ef9c358a9fccf44417893a9"}
Mar 19 10:43:51 crc kubenswrapper[4835]: I0319 10:43:50.999451 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l2vsx" event={"ID":"5d586c5b-694e-4ac9-aa09-0d973cdad7e0","Type":"ContainerStarted","Data":"b6eebb849fca360ebc3fac320515b287b59a06c8f95c0cba95dbd8143b1c187b"}
Mar 19 10:43:51 crc kubenswrapper[4835]: I0319 10:43:51.016911 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n" event={"ID":"95fe9c35-69f6-4b60-a725-c2f0d8a34c99","Type":"ContainerStarted","Data":"428cd51011402045b5f2802c91e261119b49b3ecf4454a06b0e9c964424446c6"}
Mar 19 10:43:51 crc kubenswrapper[4835]: I0319 10:43:51.019257 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n"
Mar 19 10:43:51 crc kubenswrapper[4835]: I0319 10:43:51.021027 4835 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-qsz5n container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.29:8081/healthz\": dial tcp 10.217.0.29:8081: connect: connection refused" start-of-body=
Mar 19 10:43:51 crc kubenswrapper[4835]: I0319 10:43:51.021060 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n" podUID="95fe9c35-69f6-4b60-a725-c2f0d8a34c99" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.29:8081/healthz\": dial tcp 10.217.0.29:8081: connect: connection refused"
Mar 19 10:43:51 crc kubenswrapper[4835]: I0319 10:43:51.031044 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6mgg7" event={"ID":"e8f17f81-d3ac-4c40-b346-c3eac9cc70d2","Type":"ContainerStarted","Data":"4bb58927864510b900dca416111182e2e2963a42d7cb985455bed85351b2a238"}
Mar 19 10:43:51 crc kubenswrapper[4835]: I0319 10:43:51.033287 4835 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4bzw9 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.71:8080/healthz\": dial tcp 10.217.0.71:8080: connect: connection refused" start-of-body=
Mar 19 10:43:51 crc kubenswrapper[4835]: I0319 10:43:51.033320 4835 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6lp28 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body=
Mar 19 10:43:51 crc kubenswrapper[4835]: I0319 10:43:51.033354 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" podUID="0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused"
Mar 19 10:43:51 crc kubenswrapper[4835]: I0319 10:43:51.033326 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4bzw9" podUID="dd4d4255-8149-4b0e-b3f2-dc0951a043a5" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.71:8080/healthz\": dial tcp 10.217.0.71:8080: connect: connection refused"
Mar 19 10:43:51 crc kubenswrapper[4835]: I0319 10:43:51.139699 4835 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6lp28 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body=
Mar 19 10:43:51 crc kubenswrapper[4835]: I0319 10:43:51.139768 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" podUID="0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused"
Mar 19 10:43:51 crc kubenswrapper[4835]: I0319 10:43:51.140441 4835 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6lp28 container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body=
Mar 19 10:43:51 crc kubenswrapper[4835]: I0319 10:43:51.140475 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" podUID="0c03a84d-d3a7-46e0-b8cb-f0f3689e6a1f" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused"
Mar 19 10:43:51 crc kubenswrapper[4835]: I0319 10:43:51.224459 4835 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-9qxb7 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body=
Mar 19 10:43:51 crc kubenswrapper[4835]: I0319 10:43:51.224868 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7" podUID="4cd6d8ad-9a41-4796-9684-2b6e09675bd9" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused"
Mar 19 10:43:51 crc kubenswrapper[4835]: I0319 10:43:51.224551 4835 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-9qxb7 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body=
Mar 19 10:43:51 crc kubenswrapper[4835]: I0319 10:43:51.225245 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7" podUID="4cd6d8ad-9a41-4796-9684-2b6e09675bd9" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused"
Mar 19 10:43:51 crc kubenswrapper[4835]: I0319 10:43:51.399176 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-zpbgp"
Mar 19 10:43:51 crc kubenswrapper[4835]: I0319 10:43:51.494997 4835 patch_prober.go:28] interesting pod/console-operator-58897d9998-4tpxz container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body=
Mar 19 10:43:51 crc kubenswrapper[4835]: I0319 10:43:51.495052 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-4tpxz" podUID="abdeec51-5ca9-442c-9937-67dd8f50d88d" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused"
Mar 19 10:43:51 crc kubenswrapper[4835]: I0319 10:43:51.495292 4835 patch_prober.go:28] interesting pod/console-operator-58897d9998-4tpxz container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body=
Mar 19 10:43:51 crc kubenswrapper[4835]: I0319 10:43:51.495311 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-4tpxz" podUID="abdeec51-5ca9-442c-9937-67dd8f50d88d" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused"
Mar 19 10:43:51 crc kubenswrapper[4835]: I0319 10:43:51.654721 4835 patch_prober.go:28] interesting pod/router-default-5444994796-mljh4 container/router namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]backend-http ok
Mar 19 10:43:51 crc kubenswrapper[4835]: [+]has-synced ok
Mar 19 10:43:51 crc kubenswrapper[4835]: [-]process-running failed: reason withheld
Mar 19 10:43:51 crc kubenswrapper[4835]: healthz check failed
Mar 19 10:43:51 crc kubenswrapper[4835]: I0319 10:43:51.654807 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-mljh4" podUID="540880cb-fdb4-4672-81e3-60adfd584bdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 10:43:51 crc kubenswrapper[4835]: I0319 10:43:51.777064 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 19 10:43:52 crc kubenswrapper[4835]: I0319 10:43:52.044870 4835 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-qsz5n container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.29:8081/healthz\": dial tcp 10.217.0.29:8081: connect: connection refused" start-of-body=
Mar 19 10:43:52 crc kubenswrapper[4835]: I0319 10:43:52.045162 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n" podUID="95fe9c35-69f6-4b60-a725-c2f0d8a34c99" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.29:8081/healthz\": dial tcp 10.217.0.29:8081: connect: connection refused"
Mar 19 10:43:52 crc kubenswrapper[4835]: I0319 10:43:52.045377 4835 patch_prober.go:28] interesting pod/controller-manager-5675974fc9-xqhf6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused" start-of-body=
Mar 19 10:43:52 crc kubenswrapper[4835]: I0319 10:43:52.045418 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" podUID="d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused"
Mar 19 10:43:52 crc kubenswrapper[4835]: I0319 10:43:52.568152 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-5qfh9"
Mar 19 10:43:52 crc kubenswrapper[4835]: I0319 10:43:52.569995 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-9c6b6d984-rznlv"
Mar 19 10:43:52 crc kubenswrapper[4835]: I0319 10:43:52.576694 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-cm5jv"
Mar 19 10:43:52 crc kubenswrapper[4835]: I0319 10:43:52.737162 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="fdee895b-c681-4d66-bd0d-08622d97cbea" containerName="galera" containerID="cri-o://5d0ed211b308917edaba5288103874e3bf63598f271c81636f1b1490839e51ec" gracePeriod=22
Mar 19 10:43:52 crc kubenswrapper[4835]: I0319 10:43:52.782079 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="8fd594ac-051b-4142-ba53-e974d9c5daa5" containerName="galera" probeResult="failure" output="command timed out"
Mar 19 10:43:52 crc kubenswrapper[4835]: I0319 10:43:52.914495 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="8fd594ac-051b-4142-ba53-e974d9c5daa5" containerName="galera" containerID="cri-o://7670c4f6d27fd821be9f60aff8a356cdffc7bd7e3680b9c24c400da2eb40f0e7" gracePeriod=20
Mar 19 10:43:52 crc kubenswrapper[4835]: I0319 10:43:52.986124 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6674476dff-54wn4"
Mar 19 10:43:53 crc kubenswrapper[4835]: I0319 10:43:53.121955 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5444994796-mljh4_540880cb-fdb4-4672-81e3-60adfd584bdf/router/0.log"
Mar 19 10:43:53 crc kubenswrapper[4835]: I0319 10:43:53.122024 4835 generic.go:334] "Generic (PLEG): container finished" podID="540880cb-fdb4-4672-81e3-60adfd584bdf" containerID="22dcfa0277296c303069569d71e94534ce59e5a90251808424b76582f392c86c" exitCode=137
Mar 19 10:43:53 crc kubenswrapper[4835]: I0319 10:43:53.122297 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-mljh4" event={"ID":"540880cb-fdb4-4672-81e3-60adfd584bdf","Type":"ContainerDied","Data":"22dcfa0277296c303069569d71e94534ce59e5a90251808424b76582f392c86c"}
Mar 19 10:43:53 crc kubenswrapper[4835]: I0319 10:43:53.122795 4835 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-qsz5n container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.29:8081/healthz\": dial tcp 10.217.0.29:8081: connect: connection refused" start-of-body=
Mar 19 10:43:53 crc kubenswrapper[4835]: I0319 10:43:53.122828 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n" podUID="95fe9c35-69f6-4b60-a725-c2f0d8a34c99" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.29:8081/healthz\": dial tcp 10.217.0.29:8081: connect: connection refused"
Mar 19 10:43:53 crc kubenswrapper[4835]: I0319 10:43:53.535921 4835 patch_prober.go:28] interesting pod/logging-loki-gateway-5d45f4dcf6-4f49c container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:53 crc kubenswrapper[4835]: I0319 10:43:53.536174 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-4f49c" podUID="2ff291e5-8364-4627-be0f-51c9532e46ee" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:53 crc kubenswrapper[4835]: I0319 10:43:53.565397 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-6d77645966-dhr9f"
Mar 19 10:43:53 crc kubenswrapper[4835]: I0319 10:43:53.605845 4835 patch_prober.go:28] interesting pod/logging-loki-gateway-5d45f4dcf6-hkx57 container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:53 crc kubenswrapper[4835]: I0319 10:43:53.606157 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-5d45f4dcf6-hkx57" podUID="75ce358f-5f03-401f-bdf8-27a7e5309227" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:53 crc kubenswrapper[4835]: I0319 10:43:53.613145 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l2vsx"
Mar 19 10:43:53 crc kubenswrapper[4835]: I0319 10:43:53.613506 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l2vsx"
Mar 19 10:43:53 crc kubenswrapper[4835]: I0319 10:43:53.780027 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="fdee895b-c681-4d66-bd0d-08622d97cbea" containerName="galera" probeResult="failure" output="command timed out"
Mar 19 10:43:53 crc kubenswrapper[4835]: I0319 10:43:53.831821 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-7pjhg"
Mar 19 10:43:53 crc kubenswrapper[4835]: I0319 10:43:53.832167 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-7pjhg"
Mar 19 10:43:53 crc kubenswrapper[4835]: I0319 10:43:53.890703 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6744dd545c-4nqvq"
Mar 19 10:43:54 crc kubenswrapper[4835]: I0319 10:43:54.135212 4835 patch_prober.go:28] interesting pod/metrics-server-8646b978bb-zprxl container/metrics-server namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.85:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 10:43:54 crc kubenswrapper[4835]: I0319 10:43:54.135309 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" podUID="da878929-ea5e-40f0-8eaf-7f6b6e86f62c" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.85:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:54 crc kubenswrapper[4835]: I0319 10:43:54.449437 4835 patch_prober.go:28] interesting pod/controller-manager-5675974fc9-xqhf6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused" start-of-body=
Mar 19 10:43:54 crc kubenswrapper[4835]: I0319 10:43:54.449787 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" podUID="d2ac78bf-be7d-4b9f-92ed-a75a4bc9cc46" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused"
Mar 19 10:43:54 crc kubenswrapper[4835]: I0319 10:43:54.456638 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-fbf7bbb96-v6mc6"
Mar 19 10:43:54 crc kubenswrapper[4835]: I0319 10:43:54.459357 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7b6f7975cf-xd4n6"
Mar 19 10:43:54 crc kubenswrapper[4835]: I0319 10:43:54.464252 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-659fb58c6b-vq45h"
Mar 19 10:43:54 crc kubenswrapper[4835]: I0319 10:43:54.471513 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-74d6f7b5c-64vf7"
Mar 19 10:43:54 crc kubenswrapper[4835]: I0319 10:43:54.567003 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-44rxf"
Mar 19 10:43:54 crc kubenswrapper[4835]: I0319 10:43:54.582048 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-57b8dbd499-gz2nz"
Mar 19 10:43:54 crc kubenswrapper[4835]: I0319 10:43:54.612509 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-g4bvm"
Mar 19 10:43:54 crc kubenswrapper[4835]: I0319 10:43:54.651975 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 10:43:54 crc kubenswrapper[4835]: I0319 10:43:54.816849 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6mgg7"
Mar 19 10:43:54 crc kubenswrapper[4835]: I0319 10:43:54.817059 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6mgg7"
Mar 19 10:43:55 crc kubenswrapper[4835]: I0319 10:43:55.070924 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-766799bf97-q8249" podUID="a3421bac-5e2c-494d-ba71-500a3ead9076" containerName="oauth-openshift" containerID="cri-o://4835d12e1d2bb983d920ee435b8065501df1842902efa2c1dcccd8b0e3ab8339" gracePeriod=14
Mar 19 10:43:55 crc kubenswrapper[4835]: I0319 10:43:55.261234 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5444994796-mljh4_540880cb-fdb4-4672-81e3-60adfd584bdf/router/0.log"
Mar 19 10:43:55 crc kubenswrapper[4835]: I0319 10:43:55.261407 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-mljh4" event={"ID":"540880cb-fdb4-4672-81e3-60adfd584bdf","Type":"ContainerStarted","Data":"c15bb71edd31e0658fc2593538fb7435f468fecf897a558c8b4e963da7b14589"}
Mar 19 10:43:55 crc kubenswrapper[4835]: I0319 10:43:55.518928 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-44rxf" podUID="8173c77e-48ec-44fc-9be7-67381528f78a" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 19 10:43:55 crc kubenswrapper[4835]: I0319 10:43:55.772839 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5595c7d6ff-zmwm8"
Mar 19 10:43:56 crc kubenswrapper[4835]: I0319 10:43:56.145451 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bmh95"
Mar 19 10:43:56 crc kubenswrapper[4835]: I0319 10:43:56.146961 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bmh95"
Mar 19 10:43:56 crc kubenswrapper[4835]: I0319 10:43:56.206601 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-mljh4"
Mar 19 10:43:56 crc kubenswrapper[4835]: I0319 10:43:56.209200 4835 patch_prober.go:28] interesting pod/router-default-5444994796-mljh4 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Mar 19 10:43:56 crc kubenswrapper[4835]: I0319 10:43:56.209267 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-mljh4" podUID="540880cb-fdb4-4672-81e3-60adfd584bdf" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Mar 19 10:43:56 crc kubenswrapper[4835]: I0319 10:43:56.494022 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-l2vsx" podUID="5d586c5b-694e-4ac9-aa09-0d973cdad7e0" containerName="registry-server" probeResult="failure" output=<
Mar 19 10:43:56 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s
Mar 19 10:43:56 crc kubenswrapper[4835]: >
Mar 19 10:43:56 crc kubenswrapper[4835]: I0319 10:43:56.498510 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-6mgg7" podUID="e8f17f81-d3ac-4c40-b346-c3eac9cc70d2" containerName="registry-server" probeResult="failure" output=<
Mar 19 10:43:56 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s
Mar 19 10:43:56 crc kubenswrapper[4835]: >
Mar 19 10:43:56 crc kubenswrapper[4835]: I0319 10:43:56.552098 4835 patch_prober.go:28] interesting pod/oauth-openshift-766799bf97-q8249 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.66:6443/healthz\": dial tcp 10.217.0.66:6443: connect: connection refused" start-of-body=
Mar 19 10:43:56 crc kubenswrapper[4835]: I0319 10:43:56.552424 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-766799bf97-q8249" podUID="a3421bac-5e2c-494d-ba71-500a3ead9076" containerName="oauth-openshift" probeResult="failure" output="Get 
\"https://10.217.0.66:6443/healthz\": dial tcp 10.217.0.66:6443: connect: connection refused" Mar 19 10:43:56 crc kubenswrapper[4835]: I0319 10:43:56.571351 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5f9b46c45c-4586h" Mar 19 10:43:56 crc kubenswrapper[4835]: I0319 10:43:56.585169 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack-operators/openstack-operator-index-7pjhg" podUID="d95e8e04-5d17-45e5-aeb8-d6857bfc9d01" containerName="registry-server" probeResult="failure" output=< Mar 19 10:43:56 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:43:56 crc kubenswrapper[4835]: > Mar 19 10:43:56 crc kubenswrapper[4835]: I0319 10:43:56.783022 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 10:43:56 crc kubenswrapper[4835]: I0319 10:43:56.907504 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5d9fcbf4f8-5btdf" Mar 19 10:43:56 crc kubenswrapper[4835]: I0319 10:43:56.978241 4835 trace.go:236] Trace[374282512]: "Calculate volume metrics of glance for pod openstack/glance-default-external-api-0" (19-Mar-2026 10:43:53.796) (total time: 3173ms): Mar 19 10:43:56 crc kubenswrapper[4835]: Trace[374282512]: [3.173852347s] [3.173852347s] END Mar 19 10:43:57 crc kubenswrapper[4835]: I0319 10:43:57.207507 4835 patch_prober.go:28] interesting pod/router-default-5444994796-mljh4 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 19 10:43:57 crc kubenswrapper[4835]: I0319 10:43:57.207855 4835 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-mljh4" podUID="540880cb-fdb4-4672-81e3-60adfd584bdf" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 19 10:43:57 crc kubenswrapper[4835]: I0319 10:43:57.256010 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-668b645cb5-fhzgr" Mar 19 10:43:57 crc kubenswrapper[4835]: I0319 10:43:57.308496 4835 generic.go:334] "Generic (PLEG): container finished" podID="a3421bac-5e2c-494d-ba71-500a3ead9076" containerID="4835d12e1d2bb983d920ee435b8065501df1842902efa2c1dcccd8b0e3ab8339" exitCode=0 Mar 19 10:43:57 crc kubenswrapper[4835]: I0319 10:43:57.308721 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-766799bf97-q8249" event={"ID":"a3421bac-5e2c-494d-ba71-500a3ead9076","Type":"ContainerDied","Data":"4835d12e1d2bb983d920ee435b8065501df1842902efa2c1dcccd8b0e3ab8339"} Mar 19 10:43:57 crc kubenswrapper[4835]: I0319 10:43:57.319562 4835 generic.go:334] "Generic (PLEG): container finished" podID="fdee895b-c681-4d66-bd0d-08622d97cbea" containerID="5d0ed211b308917edaba5288103874e3bf63598f271c81636f1b1490839e51ec" exitCode=0 Mar 19 10:43:57 crc kubenswrapper[4835]: I0319 10:43:57.319616 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"fdee895b-c681-4d66-bd0d-08622d97cbea","Type":"ContainerDied","Data":"5d0ed211b308917edaba5288103874e3bf63598f271c81636f1b1490839e51ec"} Mar 19 10:43:57 crc kubenswrapper[4835]: I0319 10:43:57.589272 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-zfhzl" Mar 19 10:43:57 crc kubenswrapper[4835]: I0319 10:43:57.716642 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-bmh95" 
podUID="8b78cdf3-88ba-4ab2-9966-492863d9206c" containerName="registry-server" probeResult="failure" output=< Mar 19 10:43:57 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:43:57 crc kubenswrapper[4835]: > Mar 19 10:43:58 crc kubenswrapper[4835]: I0319 10:43:58.220973 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-mljh4" Mar 19 10:43:58 crc kubenswrapper[4835]: I0319 10:43:58.368082 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n8kcm" Mar 19 10:43:58 crc kubenswrapper[4835]: I0319 10:43:58.368680 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n8kcm" Mar 19 10:43:58 crc kubenswrapper[4835]: I0319 10:43:58.373069 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"fdee895b-c681-4d66-bd0d-08622d97cbea","Type":"ContainerStarted","Data":"3222e5973803066e960d4ed42393f2da0ab81ddb74b74fa09cd7f3bec8aecee2"} Mar 19 10:43:58 crc kubenswrapper[4835]: I0319 10:43:58.373120 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-mljh4" Mar 19 10:43:58 crc kubenswrapper[4835]: I0319 10:43:58.380017 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-mljh4" Mar 19 10:43:59 crc kubenswrapper[4835]: I0319 10:43:59.011132 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-6dd7dd855f-qsz5n" Mar 19 10:43:59 crc kubenswrapper[4835]: I0319 10:43:59.390828 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-766799bf97-q8249" 
event={"ID":"a3421bac-5e2c-494d-ba71-500a3ead9076","Type":"ContainerStarted","Data":"1f9579e85db253e8161bfda464eec898401a0b84478d4ede668dac8eaeacc664"} Mar 19 10:43:59 crc kubenswrapper[4835]: I0319 10:43:59.393314 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 10:43:59 crc kubenswrapper[4835]: I0319 10:43:59.396890 4835 patch_prober.go:28] interesting pod/oauth-openshift-766799bf97-q8249 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.66:6443/healthz\": dial tcp 10.217.0.66:6443: connect: connection refused" start-of-body= Mar 19 10:43:59 crc kubenswrapper[4835]: I0319 10:43:59.396946 4835 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-766799bf97-q8249" podUID="a3421bac-5e2c-494d-ba71-500a3ead9076" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.66:6443/healthz\": dial tcp 10.217.0.66:6443: connect: connection refused" Mar 19 10:43:59 crc kubenswrapper[4835]: I0319 10:43:59.800338 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 10:43:59 crc kubenswrapper[4835]: I0319 10:43:59.800656 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 19 10:43:59 crc kubenswrapper[4835]: I0319 10:43:59.802195 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cinder-scheduler" containerStatusID={"Type":"cri-o","ID":"68fb76fa4e06c2d826d44492db5604412b6ed8bafa74a0b0bb5679b189c1d3a3"} pod="openstack/cinder-scheduler-0" containerMessage="Container cinder-scheduler failed liveness probe, will be restarted" Mar 19 10:43:59 crc kubenswrapper[4835]: 
I0319 10:43:59.802285 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc" containerName="cinder-scheduler" containerID="cri-o://68fb76fa4e06c2d826d44492db5604412b6ed8bafa74a0b0bb5679b189c1d3a3" gracePeriod=30 Mar 19 10:44:00 crc kubenswrapper[4835]: I0319 10:44:00.016921 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n8kcm" podUID="2e9b0984-cab2-4aab-bad4-b4ad7040a40f" containerName="registry-server" probeResult="failure" output=< Mar 19 10:44:00 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:44:00 crc kubenswrapper[4835]: > Mar 19 10:44:00 crc kubenswrapper[4835]: I0319 10:44:00.267070 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565284-7pfb8"] Mar 19 10:44:00 crc kubenswrapper[4835]: E0319 10:44:00.281358 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7551a433-d892-46ce-addb-cbeb079c9ad3" containerName="oc" Mar 19 10:44:00 crc kubenswrapper[4835]: I0319 10:44:00.281409 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7551a433-d892-46ce-addb-cbeb079c9ad3" containerName="oc" Mar 19 10:44:00 crc kubenswrapper[4835]: I0319 10:44:00.281904 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="7551a433-d892-46ce-addb-cbeb079c9ad3" containerName="oc" Mar 19 10:44:00 crc kubenswrapper[4835]: I0319 10:44:00.296327 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565284-7pfb8" Mar 19 10:44:00 crc kubenswrapper[4835]: I0319 10:44:00.309772 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:44:00 crc kubenswrapper[4835]: I0319 10:44:00.309771 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:44:00 crc kubenswrapper[4835]: I0319 10:44:00.310382 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 10:44:00 crc kubenswrapper[4835]: I0319 10:44:00.358356 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565284-7pfb8"] Mar 19 10:44:00 crc kubenswrapper[4835]: I0319 10:44:00.407286 4835 generic.go:334] "Generic (PLEG): container finished" podID="8fd594ac-051b-4142-ba53-e974d9c5daa5" containerID="7670c4f6d27fd821be9f60aff8a356cdffc7bd7e3680b9c24c400da2eb40f0e7" exitCode=0 Mar 19 10:44:00 crc kubenswrapper[4835]: I0319 10:44:00.414764 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hscdn\" (UniqueName: \"kubernetes.io/projected/f9ea73bf-ce42-4431-a66c-867f50ddb0c5-kube-api-access-hscdn\") pod \"auto-csr-approver-29565284-7pfb8\" (UID: \"f9ea73bf-ce42-4431-a66c-867f50ddb0c5\") " pod="openshift-infra/auto-csr-approver-29565284-7pfb8" Mar 19 10:44:00 crc kubenswrapper[4835]: I0319 10:44:00.437194 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-766799bf97-q8249" Mar 19 10:44:00 crc kubenswrapper[4835]: I0319 10:44:00.437778 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8fd594ac-051b-4142-ba53-e974d9c5daa5","Type":"ContainerDied","Data":"7670c4f6d27fd821be9f60aff8a356cdffc7bd7e3680b9c24c400da2eb40f0e7"} Mar 19 10:44:00 crc 
kubenswrapper[4835]: I0319 10:44:00.519000 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hscdn\" (UniqueName: \"kubernetes.io/projected/f9ea73bf-ce42-4431-a66c-867f50ddb0c5-kube-api-access-hscdn\") pod \"auto-csr-approver-29565284-7pfb8\" (UID: \"f9ea73bf-ce42-4431-a66c-867f50ddb0c5\") " pod="openshift-infra/auto-csr-approver-29565284-7pfb8" Mar 19 10:44:00 crc kubenswrapper[4835]: I0319 10:44:00.525494 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-4bzw9" Mar 19 10:44:00 crc kubenswrapper[4835]: I0319 10:44:00.583657 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hscdn\" (UniqueName: \"kubernetes.io/projected/f9ea73bf-ce42-4431-a66c-867f50ddb0c5-kube-api-access-hscdn\") pod \"auto-csr-approver-29565284-7pfb8\" (UID: \"f9ea73bf-ce42-4431-a66c-867f50ddb0c5\") " pod="openshift-infra/auto-csr-approver-29565284-7pfb8" Mar 19 10:44:00 crc kubenswrapper[4835]: I0319 10:44:00.666983 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565284-7pfb8" Mar 19 10:44:01 crc kubenswrapper[4835]: E0319 10:44:01.050151 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7670c4f6d27fd821be9f60aff8a356cdffc7bd7e3680b9c24c400da2eb40f0e7 is running failed: container process not found" containerID="7670c4f6d27fd821be9f60aff8a356cdffc7bd7e3680b9c24c400da2eb40f0e7" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 19 10:44:01 crc kubenswrapper[4835]: E0319 10:44:01.050904 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7670c4f6d27fd821be9f60aff8a356cdffc7bd7e3680b9c24c400da2eb40f0e7 is running failed: container process not found" containerID="7670c4f6d27fd821be9f60aff8a356cdffc7bd7e3680b9c24c400da2eb40f0e7" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 19 10:44:01 crc kubenswrapper[4835]: E0319 10:44:01.051187 4835 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7670c4f6d27fd821be9f60aff8a356cdffc7bd7e3680b9c24c400da2eb40f0e7 is running failed: container process not found" containerID="7670c4f6d27fd821be9f60aff8a356cdffc7bd7e3680b9c24c400da2eb40f0e7" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 19 10:44:01 crc kubenswrapper[4835]: E0319 10:44:01.051214 4835 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7670c4f6d27fd821be9f60aff8a356cdffc7bd7e3680b9c24c400da2eb40f0e7 is running failed: container process not found" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="8fd594ac-051b-4142-ba53-e974d9c5daa5" containerName="galera" Mar 19 10:44:01 crc kubenswrapper[4835]: 
I0319 10:44:01.141242 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6lp28" Mar 19 10:44:01 crc kubenswrapper[4835]: I0319 10:44:01.228338 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9qxb7" Mar 19 10:44:01 crc kubenswrapper[4835]: I0319 10:44:01.425448 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8fd594ac-051b-4142-ba53-e974d9c5daa5","Type":"ContainerStarted","Data":"df150af2b9cff748fb39e5abf09e59123a71275ac55b7eb3dc9b90304196cce2"} Mar 19 10:44:01 crc kubenswrapper[4835]: I0319 10:44:01.512210 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-4tpxz" Mar 19 10:44:02 crc kubenswrapper[4835]: I0319 10:44:02.256700 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565284-7pfb8"] Mar 19 10:44:02 crc kubenswrapper[4835]: I0319 10:44:02.496151 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 19 10:44:02 crc kubenswrapper[4835]: I0319 10:44:02.496399 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 19 10:44:02 crc kubenswrapper[4835]: I0319 10:44:02.508313 4835 generic.go:334] "Generic (PLEG): container finished" podID="a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc" containerID="68fb76fa4e06c2d826d44492db5604412b6ed8bafa74a0b0bb5679b189c1d3a3" exitCode=0 Mar 19 10:44:02 crc kubenswrapper[4835]: I0319 10:44:02.508931 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc","Type":"ContainerDied","Data":"68fb76fa4e06c2d826d44492db5604412b6ed8bafa74a0b0bb5679b189c1d3a3"} Mar 19 10:44:02 crc 
kubenswrapper[4835]: I0319 10:44:02.519900 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565284-7pfb8" event={"ID":"f9ea73bf-ce42-4431-a66c-867f50ddb0c5","Type":"ContainerStarted","Data":"e9b57e70862e06366a6171477661d044c3a48690121511e252bd53b91b9a4801"} Mar 19 10:44:02 crc kubenswrapper[4835]: I0319 10:44:02.942774 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6889f84cf4-bbfgf" Mar 19 10:44:03 crc kubenswrapper[4835]: I0319 10:44:03.313669 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 19 10:44:03 crc kubenswrapper[4835]: I0319 10:44:03.684291 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 19 10:44:03 crc kubenswrapper[4835]: I0319 10:44:03.884459 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-7pjhg" Mar 19 10:44:03 crc kubenswrapper[4835]: I0319 10:44:03.934417 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-7pjhg" Mar 19 10:44:04 crc kubenswrapper[4835]: I0319 10:44:04.454959 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5675974fc9-xqhf6" Mar 19 10:44:04 crc kubenswrapper[4835]: I0319 10:44:04.676354 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-l2vsx" podUID="5d586c5b-694e-4ac9-aa09-0d973cdad7e0" containerName="registry-server" probeResult="failure" output=< Mar 19 10:44:04 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:44:04 crc kubenswrapper[4835]: > Mar 19 10:44:05 crc kubenswrapper[4835]: I0319 10:44:05.197816 4835 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-kqpxn"] Mar 19 10:44:05 crc kubenswrapper[4835]: I0319 10:44:05.200469 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kqpxn" Mar 19 10:44:05 crc kubenswrapper[4835]: I0319 10:44:05.237570 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqpxn"] Mar 19 10:44:05 crc kubenswrapper[4835]: I0319 10:44:05.324731 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nslh\" (UniqueName: \"kubernetes.io/projected/7dc32758-f369-4e39-95a4-406e4433ada9-kube-api-access-8nslh\") pod \"redhat-marketplace-kqpxn\" (UID: \"7dc32758-f369-4e39-95a4-406e4433ada9\") " pod="openshift-marketplace/redhat-marketplace-kqpxn" Mar 19 10:44:05 crc kubenswrapper[4835]: I0319 10:44:05.324929 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dc32758-f369-4e39-95a4-406e4433ada9-utilities\") pod \"redhat-marketplace-kqpxn\" (UID: \"7dc32758-f369-4e39-95a4-406e4433ada9\") " pod="openshift-marketplace/redhat-marketplace-kqpxn" Mar 19 10:44:05 crc kubenswrapper[4835]: I0319 10:44:05.324980 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dc32758-f369-4e39-95a4-406e4433ada9-catalog-content\") pod \"redhat-marketplace-kqpxn\" (UID: \"7dc32758-f369-4e39-95a4-406e4433ada9\") " pod="openshift-marketplace/redhat-marketplace-kqpxn" Mar 19 10:44:05 crc kubenswrapper[4835]: I0319 10:44:05.427203 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dc32758-f369-4e39-95a4-406e4433ada9-utilities\") pod \"redhat-marketplace-kqpxn\" (UID: \"7dc32758-f369-4e39-95a4-406e4433ada9\") " 
pod="openshift-marketplace/redhat-marketplace-kqpxn" Mar 19 10:44:05 crc kubenswrapper[4835]: I0319 10:44:05.427278 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dc32758-f369-4e39-95a4-406e4433ada9-catalog-content\") pod \"redhat-marketplace-kqpxn\" (UID: \"7dc32758-f369-4e39-95a4-406e4433ada9\") " pod="openshift-marketplace/redhat-marketplace-kqpxn" Mar 19 10:44:05 crc kubenswrapper[4835]: I0319 10:44:05.427470 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nslh\" (UniqueName: \"kubernetes.io/projected/7dc32758-f369-4e39-95a4-406e4433ada9-kube-api-access-8nslh\") pod \"redhat-marketplace-kqpxn\" (UID: \"7dc32758-f369-4e39-95a4-406e4433ada9\") " pod="openshift-marketplace/redhat-marketplace-kqpxn" Mar 19 10:44:05 crc kubenswrapper[4835]: I0319 10:44:05.428594 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dc32758-f369-4e39-95a4-406e4433ada9-utilities\") pod \"redhat-marketplace-kqpxn\" (UID: \"7dc32758-f369-4e39-95a4-406e4433ada9\") " pod="openshift-marketplace/redhat-marketplace-kqpxn" Mar 19 10:44:05 crc kubenswrapper[4835]: I0319 10:44:05.429237 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dc32758-f369-4e39-95a4-406e4433ada9-catalog-content\") pod \"redhat-marketplace-kqpxn\" (UID: \"7dc32758-f369-4e39-95a4-406e4433ada9\") " pod="openshift-marketplace/redhat-marketplace-kqpxn" Mar 19 10:44:05 crc kubenswrapper[4835]: I0319 10:44:05.467885 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nslh\" (UniqueName: \"kubernetes.io/projected/7dc32758-f369-4e39-95a4-406e4433ada9-kube-api-access-8nslh\") pod \"redhat-marketplace-kqpxn\" (UID: \"7dc32758-f369-4e39-95a4-406e4433ada9\") " 
pod="openshift-marketplace/redhat-marketplace-kqpxn" Mar 19 10:44:05 crc kubenswrapper[4835]: I0319 10:44:05.553535 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kqpxn" Mar 19 10:44:05 crc kubenswrapper[4835]: I0319 10:44:05.575854 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565284-7pfb8" event={"ID":"f9ea73bf-ce42-4431-a66c-867f50ddb0c5","Type":"ContainerStarted","Data":"fbbec1a85a8a16ad79debf3dba8b048c55c0f329af54936da178aec9f1f5e04b"} Mar 19 10:44:05 crc kubenswrapper[4835]: I0319 10:44:05.578531 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc","Type":"ContainerStarted","Data":"949b69a2f98d8ddaf0f64335e345dd7ad85fd3f2c0b6f4a7a88ccadf4185f348"} Mar 19 10:44:05 crc kubenswrapper[4835]: I0319 10:44:05.641570 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565284-7pfb8" podStartSLOduration=4.412133017 podStartE2EDuration="5.637581347s" podCreationTimestamp="2026-03-19 10:44:00 +0000 UTC" firstStartedPulling="2026-03-19 10:44:02.282495541 +0000 UTC m=+4897.131094128" lastFinishedPulling="2026-03-19 10:44:03.507943871 +0000 UTC m=+4898.356542458" observedRunningTime="2026-03-19 10:44:05.636027295 +0000 UTC m=+4900.484625882" watchObservedRunningTime="2026-03-19 10:44:05.637581347 +0000 UTC m=+4900.486179934" Mar 19 10:44:05 crc kubenswrapper[4835]: I0319 10:44:05.878666 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-6mgg7" podUID="e8f17f81-d3ac-4c40-b346-c3eac9cc70d2" containerName="registry-server" probeResult="failure" output=< Mar 19 10:44:05 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:44:05 crc kubenswrapper[4835]: > Mar 19 10:44:06 crc kubenswrapper[4835]: I0319 10:44:06.300481 4835 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqpxn"] Mar 19 10:44:06 crc kubenswrapper[4835]: I0319 10:44:06.434191 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:44:06 crc kubenswrapper[4835]: I0319 10:44:06.434252 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:44:06 crc kubenswrapper[4835]: I0319 10:44:06.445396 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" Mar 19 10:44:06 crc kubenswrapper[4835]: I0319 10:44:06.446252 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"60e0f97ba1cabbaebe6f78c7e3d6884cdd63695a581c1d99b4549f052fbd5350"} pod="openshift-machine-config-operator/machine-config-daemon-bk84k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 10:44:06 crc kubenswrapper[4835]: I0319 10:44:06.446306 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" containerID="cri-o://60e0f97ba1cabbaebe6f78c7e3d6884cdd63695a581c1d99b4549f052fbd5350" gracePeriod=600 Mar 19 10:44:06 crc kubenswrapper[4835]: I0319 10:44:06.594496 4835 generic.go:334] "Generic (PLEG): container 
finished" podID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerID="60e0f97ba1cabbaebe6f78c7e3d6884cdd63695a581c1d99b4549f052fbd5350" exitCode=0 Mar 19 10:44:06 crc kubenswrapper[4835]: I0319 10:44:06.594583 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerDied","Data":"60e0f97ba1cabbaebe6f78c7e3d6884cdd63695a581c1d99b4549f052fbd5350"} Mar 19 10:44:06 crc kubenswrapper[4835]: I0319 10:44:06.596570 4835 scope.go:117] "RemoveContainer" containerID="3359a5103ce1a19d3b0272d49b7320107bc1e5fdca65b50c1858632d55b7dee9" Mar 19 10:44:06 crc kubenswrapper[4835]: I0319 10:44:06.599419 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqpxn" event={"ID":"7dc32758-f369-4e39-95a4-406e4433ada9","Type":"ContainerStarted","Data":"e65ba073f334e55c1a8b189f292d8234509a4250b89a71197b344b3710013ea1"} Mar 19 10:44:06 crc kubenswrapper[4835]: I0319 10:44:06.732548 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 19 10:44:07 crc kubenswrapper[4835]: I0319 10:44:07.208776 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-bmh95" podUID="8b78cdf3-88ba-4ab2-9966-492863d9206c" containerName="registry-server" probeResult="failure" output=< Mar 19 10:44:07 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:44:07 crc kubenswrapper[4835]: > Mar 19 10:44:07 crc kubenswrapper[4835]: I0319 10:44:07.612454 4835 generic.go:334] "Generic (PLEG): container finished" podID="7dc32758-f369-4e39-95a4-406e4433ada9" containerID="05a8be5ca13b699532ed9d07dc54dbc395c2f4b8fc5d166dc594e9602eb66738" exitCode=0 Mar 19 10:44:07 crc kubenswrapper[4835]: I0319 10:44:07.612563 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-kqpxn" event={"ID":"7dc32758-f369-4e39-95a4-406e4433ada9","Type":"ContainerDied","Data":"05a8be5ca13b699532ed9d07dc54dbc395c2f4b8fc5d166dc594e9602eb66738"} Mar 19 10:44:07 crc kubenswrapper[4835]: I0319 10:44:07.616262 4835 generic.go:334] "Generic (PLEG): container finished" podID="f9ea73bf-ce42-4431-a66c-867f50ddb0c5" containerID="fbbec1a85a8a16ad79debf3dba8b048c55c0f329af54936da178aec9f1f5e04b" exitCode=0 Mar 19 10:44:07 crc kubenswrapper[4835]: I0319 10:44:07.616305 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565284-7pfb8" event={"ID":"f9ea73bf-ce42-4431-a66c-867f50ddb0c5","Type":"ContainerDied","Data":"fbbec1a85a8a16ad79debf3dba8b048c55c0f329af54936da178aec9f1f5e04b"} Mar 19 10:44:07 crc kubenswrapper[4835]: I0319 10:44:07.620257 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerStarted","Data":"2999600d8da067b17effdd03f999dbcb19e52bd95cdcf24fbdbbde5c1f05fc02"} Mar 19 10:44:09 crc kubenswrapper[4835]: I0319 10:44:09.583522 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565284-7pfb8" Mar 19 10:44:09 crc kubenswrapper[4835]: I0319 10:44:09.651035 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565284-7pfb8" event={"ID":"f9ea73bf-ce42-4431-a66c-867f50ddb0c5","Type":"ContainerDied","Data":"e9b57e70862e06366a6171477661d044c3a48690121511e252bd53b91b9a4801"} Mar 19 10:44:09 crc kubenswrapper[4835]: I0319 10:44:09.651045 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565284-7pfb8" Mar 19 10:44:09 crc kubenswrapper[4835]: I0319 10:44:09.651946 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9b57e70862e06366a6171477661d044c3a48690121511e252bd53b91b9a4801" Mar 19 10:44:09 crc kubenswrapper[4835]: I0319 10:44:09.652194 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hscdn\" (UniqueName: \"kubernetes.io/projected/f9ea73bf-ce42-4431-a66c-867f50ddb0c5-kube-api-access-hscdn\") pod \"f9ea73bf-ce42-4431-a66c-867f50ddb0c5\" (UID: \"f9ea73bf-ce42-4431-a66c-867f50ddb0c5\") " Mar 19 10:44:09 crc kubenswrapper[4835]: I0319 10:44:09.653075 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqpxn" event={"ID":"7dc32758-f369-4e39-95a4-406e4433ada9","Type":"ContainerStarted","Data":"2eea2bafbc6b981257fd158cf6a808c9fcc481621d305a796c6670fa3e29b291"} Mar 19 10:44:09 crc kubenswrapper[4835]: I0319 10:44:09.660417 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n8kcm" podUID="2e9b0984-cab2-4aab-bad4-b4ad7040a40f" containerName="registry-server" probeResult="failure" output=< Mar 19 10:44:09 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:44:09 crc kubenswrapper[4835]: > Mar 19 10:44:09 crc kubenswrapper[4835]: I0319 10:44:09.665679 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9ea73bf-ce42-4431-a66c-867f50ddb0c5-kube-api-access-hscdn" (OuterVolumeSpecName: "kube-api-access-hscdn") pod "f9ea73bf-ce42-4431-a66c-867f50ddb0c5" (UID: "f9ea73bf-ce42-4431-a66c-867f50ddb0c5"). InnerVolumeSpecName "kube-api-access-hscdn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:44:09 crc kubenswrapper[4835]: I0319 10:44:09.756932 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hscdn\" (UniqueName: \"kubernetes.io/projected/f9ea73bf-ce42-4431-a66c-867f50ddb0c5-kube-api-access-hscdn\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:09 crc kubenswrapper[4835]: I0319 10:44:09.765872 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565278-v2nnh"] Mar 19 10:44:09 crc kubenswrapper[4835]: I0319 10:44:09.781151 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565278-v2nnh"] Mar 19 10:44:10 crc kubenswrapper[4835]: I0319 10:44:10.417919 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bd41c8b-8f1f-4707-9ecd-dcac069b8601" path="/var/lib/kubelet/pods/5bd41c8b-8f1f-4707-9ecd-dcac069b8601/volumes" Mar 19 10:44:10 crc kubenswrapper[4835]: I0319 10:44:10.668255 4835 generic.go:334] "Generic (PLEG): container finished" podID="7dc32758-f369-4e39-95a4-406e4433ada9" containerID="2eea2bafbc6b981257fd158cf6a808c9fcc481621d305a796c6670fa3e29b291" exitCode=0 Mar 19 10:44:10 crc kubenswrapper[4835]: I0319 10:44:10.668318 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqpxn" event={"ID":"7dc32758-f369-4e39-95a4-406e4433ada9","Type":"ContainerDied","Data":"2eea2bafbc6b981257fd158cf6a808c9fcc481621d305a796c6670fa3e29b291"} Mar 19 10:44:11 crc kubenswrapper[4835]: I0319 10:44:11.048955 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 19 10:44:11 crc kubenswrapper[4835]: I0319 10:44:11.049574 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 19 10:44:11 crc kubenswrapper[4835]: I0319 10:44:11.246691 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/openstack-galera-0" Mar 19 10:44:11 crc kubenswrapper[4835]: I0319 10:44:11.683532 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqpxn" event={"ID":"7dc32758-f369-4e39-95a4-406e4433ada9","Type":"ContainerStarted","Data":"43d4e213d02165803a798811e2d24a2055e579fa42c1aa01b4b3b25250558f96"} Mar 19 10:44:11 crc kubenswrapper[4835]: I0319 10:44:11.713382 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kqpxn" podStartSLOduration=3.295355349 podStartE2EDuration="6.713358285s" podCreationTimestamp="2026-03-19 10:44:05 +0000 UTC" firstStartedPulling="2026-03-19 10:44:07.614941165 +0000 UTC m=+4902.463539752" lastFinishedPulling="2026-03-19 10:44:11.032944081 +0000 UTC m=+4905.881542688" observedRunningTime="2026-03-19 10:44:11.708083211 +0000 UTC m=+4906.556681808" watchObservedRunningTime="2026-03-19 10:44:11.713358285 +0000 UTC m=+4906.561956872" Mar 19 10:44:11 crc kubenswrapper[4835]: I0319 10:44:11.757312 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 10:44:11 crc kubenswrapper[4835]: I0319 10:44:11.943334 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 19 10:44:12 crc kubenswrapper[4835]: I0319 10:44:12.695565 4835 generic.go:334] "Generic (PLEG): container finished" podID="9262f37d-2193-46c6-9706-5d039fa94926" containerID="6a27cf424f6e34c5a4241e15982f732fa6b27d5b479b621269327434fe56f4f6" exitCode=1 Mar 19 10:44:12 crc kubenswrapper[4835]: I0319 10:44:12.695644 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" 
event={"ID":"9262f37d-2193-46c6-9706-5d039fa94926","Type":"ContainerDied","Data":"6a27cf424f6e34c5a4241e15982f732fa6b27d5b479b621269327434fe56f4f6"} Mar 19 10:44:13 crc kubenswrapper[4835]: I0319 10:44:13.824829 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l2vsx" Mar 19 10:44:13 crc kubenswrapper[4835]: I0319 10:44:13.927860 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l2vsx" Mar 19 10:44:14 crc kubenswrapper[4835]: I0319 10:44:14.591248 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 19 10:44:14 crc kubenswrapper[4835]: I0319 10:44:14.687488 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9262f37d-2193-46c6-9706-5d039fa94926-test-operator-ephemeral-temporary\") pod \"9262f37d-2193-46c6-9706-5d039fa94926\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") " Mar 19 10:44:14 crc kubenswrapper[4835]: I0319 10:44:14.687597 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9262f37d-2193-46c6-9706-5d039fa94926-config-data\") pod \"9262f37d-2193-46c6-9706-5d039fa94926\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") " Mar 19 10:44:14 crc kubenswrapper[4835]: I0319 10:44:14.687636 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9262f37d-2193-46c6-9706-5d039fa94926-openstack-config-secret\") pod \"9262f37d-2193-46c6-9706-5d039fa94926\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") " Mar 19 10:44:14 crc kubenswrapper[4835]: I0319 10:44:14.687805 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/9262f37d-2193-46c6-9706-5d039fa94926-ca-certs\") pod \"9262f37d-2193-46c6-9706-5d039fa94926\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") " Mar 19 10:44:14 crc kubenswrapper[4835]: I0319 10:44:14.687840 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"9262f37d-2193-46c6-9706-5d039fa94926\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") " Mar 19 10:44:14 crc kubenswrapper[4835]: I0319 10:44:14.687925 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9262f37d-2193-46c6-9706-5d039fa94926-ssh-key\") pod \"9262f37d-2193-46c6-9706-5d039fa94926\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") " Mar 19 10:44:14 crc kubenswrapper[4835]: I0319 10:44:14.688514 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9262f37d-2193-46c6-9706-5d039fa94926-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "9262f37d-2193-46c6-9706-5d039fa94926" (UID: "9262f37d-2193-46c6-9706-5d039fa94926"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:44:14 crc kubenswrapper[4835]: I0319 10:44:14.688877 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9262f37d-2193-46c6-9706-5d039fa94926-config-data" (OuterVolumeSpecName: "config-data") pod "9262f37d-2193-46c6-9706-5d039fa94926" (UID: "9262f37d-2193-46c6-9706-5d039fa94926"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:44:14 crc kubenswrapper[4835]: I0319 10:44:14.689027 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nphdq\" (UniqueName: \"kubernetes.io/projected/9262f37d-2193-46c6-9706-5d039fa94926-kube-api-access-nphdq\") pod \"9262f37d-2193-46c6-9706-5d039fa94926\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") " Mar 19 10:44:14 crc kubenswrapper[4835]: I0319 10:44:14.689085 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9262f37d-2193-46c6-9706-5d039fa94926-openstack-config\") pod \"9262f37d-2193-46c6-9706-5d039fa94926\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") " Mar 19 10:44:14 crc kubenswrapper[4835]: I0319 10:44:14.689129 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9262f37d-2193-46c6-9706-5d039fa94926-test-operator-ephemeral-workdir\") pod \"9262f37d-2193-46c6-9706-5d039fa94926\" (UID: \"9262f37d-2193-46c6-9706-5d039fa94926\") " Mar 19 10:44:14 crc kubenswrapper[4835]: I0319 10:44:14.690269 4835 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/9262f37d-2193-46c6-9706-5d039fa94926-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:14 crc kubenswrapper[4835]: I0319 10:44:14.690302 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9262f37d-2193-46c6-9706-5d039fa94926-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:14 crc kubenswrapper[4835]: I0319 10:44:14.697953 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9262f37d-2193-46c6-9706-5d039fa94926-test-operator-ephemeral-workdir" (OuterVolumeSpecName: 
"test-operator-ephemeral-workdir") pod "9262f37d-2193-46c6-9706-5d039fa94926" (UID: "9262f37d-2193-46c6-9706-5d039fa94926"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:44:14 crc kubenswrapper[4835]: I0319 10:44:14.698656 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "test-operator-logs") pod "9262f37d-2193-46c6-9706-5d039fa94926" (UID: "9262f37d-2193-46c6-9706-5d039fa94926"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 19 10:44:14 crc kubenswrapper[4835]: I0319 10:44:14.699189 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9262f37d-2193-46c6-9706-5d039fa94926-kube-api-access-nphdq" (OuterVolumeSpecName: "kube-api-access-nphdq") pod "9262f37d-2193-46c6-9706-5d039fa94926" (UID: "9262f37d-2193-46c6-9706-5d039fa94926"). InnerVolumeSpecName "kube-api-access-nphdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:44:14 crc kubenswrapper[4835]: I0319 10:44:14.741022 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9262f37d-2193-46c6-9706-5d039fa94926-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "9262f37d-2193-46c6-9706-5d039fa94926" (UID: "9262f37d-2193-46c6-9706-5d039fa94926"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:44:14 crc kubenswrapper[4835]: I0319 10:44:14.745208 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"9262f37d-2193-46c6-9706-5d039fa94926","Type":"ContainerDied","Data":"e6e2202c0af79e048e9fa33710377eda2ce94a0533b5e792fc77222084f71756"} Mar 19 10:44:14 crc kubenswrapper[4835]: I0319 10:44:14.745366 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6e2202c0af79e048e9fa33710377eda2ce94a0533b5e792fc77222084f71756" Mar 19 10:44:14 crc kubenswrapper[4835]: I0319 10:44:14.745244 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 19 10:44:14 crc kubenswrapper[4835]: I0319 10:44:14.759053 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9262f37d-2193-46c6-9706-5d039fa94926-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "9262f37d-2193-46c6-9706-5d039fa94926" (UID: "9262f37d-2193-46c6-9706-5d039fa94926"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:44:14 crc kubenswrapper[4835]: I0319 10:44:14.785504 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9262f37d-2193-46c6-9706-5d039fa94926-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9262f37d-2193-46c6-9706-5d039fa94926" (UID: "9262f37d-2193-46c6-9706-5d039fa94926"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:44:14 crc kubenswrapper[4835]: I0319 10:44:14.793235 4835 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9262f37d-2193-46c6-9706-5d039fa94926-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:14 crc kubenswrapper[4835]: I0319 10:44:14.793421 4835 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/9262f37d-2193-46c6-9706-5d039fa94926-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:14 crc kubenswrapper[4835]: I0319 10:44:14.793515 4835 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 19 10:44:14 crc kubenswrapper[4835]: I0319 10:44:14.794239 4835 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9262f37d-2193-46c6-9706-5d039fa94926-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:14 crc kubenswrapper[4835]: I0319 10:44:14.795070 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nphdq\" (UniqueName: \"kubernetes.io/projected/9262f37d-2193-46c6-9706-5d039fa94926-kube-api-access-nphdq\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:14 crc kubenswrapper[4835]: I0319 10:44:14.795105 4835 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/9262f37d-2193-46c6-9706-5d039fa94926-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:14 crc kubenswrapper[4835]: I0319 10:44:14.797383 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9262f37d-2193-46c6-9706-5d039fa94926-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "9262f37d-2193-46c6-9706-5d039fa94926" (UID: 
"9262f37d-2193-46c6-9706-5d039fa94926"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:44:14 crc kubenswrapper[4835]: I0319 10:44:14.851913 4835 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 19 10:44:14 crc kubenswrapper[4835]: I0319 10:44:14.897162 4835 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:14 crc kubenswrapper[4835]: I0319 10:44:14.897202 4835 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9262f37d-2193-46c6-9706-5d039fa94926-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:15 crc kubenswrapper[4835]: I0319 10:44:15.555373 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kqpxn" Mar 19 10:44:15 crc kubenswrapper[4835]: I0319 10:44:15.555810 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kqpxn" Mar 19 10:44:15 crc kubenswrapper[4835]: I0319 10:44:15.608171 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kqpxn" Mar 19 10:44:15 crc kubenswrapper[4835]: I0319 10:44:15.869192 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-6mgg7" podUID="e8f17f81-d3ac-4c40-b346-c3eac9cc70d2" containerName="registry-server" probeResult="failure" output=< Mar 19 10:44:15 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:44:15 crc kubenswrapper[4835]: > Mar 19 10:44:16 crc kubenswrapper[4835]: I0319 10:44:16.763339 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/cinder-scheduler-0" Mar 19 10:44:16 crc kubenswrapper[4835]: I0319 10:44:16.820558 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kqpxn" Mar 19 10:44:17 crc kubenswrapper[4835]: I0319 10:44:17.203669 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-bmh95" podUID="8b78cdf3-88ba-4ab2-9966-492863d9206c" containerName="registry-server" probeResult="failure" output=< Mar 19 10:44:17 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:44:17 crc kubenswrapper[4835]: > Mar 19 10:44:19 crc kubenswrapper[4835]: I0319 10:44:19.430629 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n8kcm" podUID="2e9b0984-cab2-4aab-bad4-b4ad7040a40f" containerName="registry-server" probeResult="failure" output=< Mar 19 10:44:19 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:44:19 crc kubenswrapper[4835]: > Mar 19 10:44:19 crc kubenswrapper[4835]: I0319 10:44:19.994930 4835 scope.go:117] "RemoveContainer" containerID="ab9588b05466d098683c827857f9fb54a571c1b570ae5065737b693429dd9731" Mar 19 10:44:23 crc kubenswrapper[4835]: I0319 10:44:23.010709 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 19 10:44:23 crc kubenswrapper[4835]: E0319 10:44:23.011973 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9ea73bf-ce42-4431-a66c-867f50ddb0c5" containerName="oc" Mar 19 10:44:23 crc kubenswrapper[4835]: I0319 10:44:23.011990 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9ea73bf-ce42-4431-a66c-867f50ddb0c5" containerName="oc" Mar 19 10:44:23 crc kubenswrapper[4835]: E0319 10:44:23.012021 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9262f37d-2193-46c6-9706-5d039fa94926" 
containerName="tempest-tests-tempest-tests-runner" Mar 19 10:44:23 crc kubenswrapper[4835]: I0319 10:44:23.012028 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="9262f37d-2193-46c6-9706-5d039fa94926" containerName="tempest-tests-tempest-tests-runner" Mar 19 10:44:23 crc kubenswrapper[4835]: I0319 10:44:23.012317 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="9262f37d-2193-46c6-9706-5d039fa94926" containerName="tempest-tests-tempest-tests-runner" Mar 19 10:44:23 crc kubenswrapper[4835]: I0319 10:44:23.012344 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9ea73bf-ce42-4431-a66c-867f50ddb0c5" containerName="oc" Mar 19 10:44:23 crc kubenswrapper[4835]: I0319 10:44:23.015240 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 10:44:23 crc kubenswrapper[4835]: I0319 10:44:23.017592 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-n8mkp" Mar 19 10:44:23 crc kubenswrapper[4835]: I0319 10:44:23.031884 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 19 10:44:23 crc kubenswrapper[4835]: I0319 10:44:23.199682 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhn2d\" (UniqueName: \"kubernetes.io/projected/9ab97db4-5ea3-43de-9f19-b16ad72b24ce-kube-api-access-dhn2d\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9ab97db4-5ea3-43de-9f19-b16ad72b24ce\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 10:44:23 crc kubenswrapper[4835]: I0319 10:44:23.200231 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod 
\"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9ab97db4-5ea3-43de-9f19-b16ad72b24ce\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 10:44:23 crc kubenswrapper[4835]: I0319 10:44:23.302567 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhn2d\" (UniqueName: \"kubernetes.io/projected/9ab97db4-5ea3-43de-9f19-b16ad72b24ce-kube-api-access-dhn2d\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9ab97db4-5ea3-43de-9f19-b16ad72b24ce\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 10:44:23 crc kubenswrapper[4835]: I0319 10:44:23.302708 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9ab97db4-5ea3-43de-9f19-b16ad72b24ce\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 10:44:23 crc kubenswrapper[4835]: I0319 10:44:23.303809 4835 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9ab97db4-5ea3-43de-9f19-b16ad72b24ce\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 10:44:23 crc kubenswrapper[4835]: I0319 10:44:23.343810 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhn2d\" (UniqueName: \"kubernetes.io/projected/9ab97db4-5ea3-43de-9f19-b16ad72b24ce-kube-api-access-dhn2d\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9ab97db4-5ea3-43de-9f19-b16ad72b24ce\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 10:44:23 crc kubenswrapper[4835]: I0319 10:44:23.397738 
4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9ab97db4-5ea3-43de-9f19-b16ad72b24ce\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 10:44:23 crc kubenswrapper[4835]: I0319 10:44:23.644539 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 10:44:24 crc kubenswrapper[4835]: I0319 10:44:24.137368 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 19 10:44:24 crc kubenswrapper[4835]: W0319 10:44:24.144401 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ab97db4_5ea3_43de_9f19_b16ad72b24ce.slice/crio-7f918cfd2f8bffe9ca5fd827c61673a3a9c8ad3c699d48997d06257d06f3a79d WatchSource:0}: Error finding container 7f918cfd2f8bffe9ca5fd827c61673a3a9c8ad3c699d48997d06257d06f3a79d: Status 404 returned error can't find the container with id 7f918cfd2f8bffe9ca5fd827c61673a3a9c8ad3c699d48997d06257d06f3a79d Mar 19 10:44:24 crc kubenswrapper[4835]: I0319 10:44:24.862154 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"9ab97db4-5ea3-43de-9f19-b16ad72b24ce","Type":"ContainerStarted","Data":"7f918cfd2f8bffe9ca5fd827c61673a3a9c8ad3c699d48997d06257d06f3a79d"} Mar 19 10:44:24 crc kubenswrapper[4835]: I0319 10:44:24.879623 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6mgg7" Mar 19 10:44:24 crc kubenswrapper[4835]: I0319 10:44:24.933913 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6mgg7" 
Mar 19 10:44:26 crc kubenswrapper[4835]: I0319 10:44:26.201338 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bmh95" Mar 19 10:44:26 crc kubenswrapper[4835]: I0319 10:44:26.253363 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bmh95" Mar 19 10:44:26 crc kubenswrapper[4835]: I0319 10:44:26.896200 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"9ab97db4-5ea3-43de-9f19-b16ad72b24ce","Type":"ContainerStarted","Data":"23149592b1f584054e000b493065526b43ec2d8cd740d4b4c93b52ea475232a0"} Mar 19 10:44:26 crc kubenswrapper[4835]: I0319 10:44:26.927775 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=3.375608461 podStartE2EDuration="4.927732391s" podCreationTimestamp="2026-03-19 10:44:22 +0000 UTC" firstStartedPulling="2026-03-19 10:44:24.148264268 +0000 UTC m=+4918.996862855" lastFinishedPulling="2026-03-19 10:44:25.700388198 +0000 UTC m=+4920.548986785" observedRunningTime="2026-03-19 10:44:26.916244657 +0000 UTC m=+4921.764843254" watchObservedRunningTime="2026-03-19 10:44:26.927732391 +0000 UTC m=+4921.776330978" Mar 19 10:44:27 crc kubenswrapper[4835]: I0319 10:44:27.382228 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqpxn"] Mar 19 10:44:27 crc kubenswrapper[4835]: I0319 10:44:27.397404 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kqpxn" podUID="7dc32758-f369-4e39-95a4-406e4433ada9" containerName="registry-server" containerID="cri-o://43d4e213d02165803a798811e2d24a2055e579fa42c1aa01b4b3b25250558f96" gracePeriod=2 Mar 19 10:44:27 crc kubenswrapper[4835]: I0319 10:44:27.964463 4835 generic.go:334] 
"Generic (PLEG): container finished" podID="7dc32758-f369-4e39-95a4-406e4433ada9" containerID="43d4e213d02165803a798811e2d24a2055e579fa42c1aa01b4b3b25250558f96" exitCode=0 Mar 19 10:44:27 crc kubenswrapper[4835]: I0319 10:44:27.964514 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqpxn" event={"ID":"7dc32758-f369-4e39-95a4-406e4433ada9","Type":"ContainerDied","Data":"43d4e213d02165803a798811e2d24a2055e579fa42c1aa01b4b3b25250558f96"} Mar 19 10:44:28 crc kubenswrapper[4835]: I0319 10:44:28.503859 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kqpxn" Mar 19 10:44:28 crc kubenswrapper[4835]: I0319 10:44:28.671816 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nslh\" (UniqueName: \"kubernetes.io/projected/7dc32758-f369-4e39-95a4-406e4433ada9-kube-api-access-8nslh\") pod \"7dc32758-f369-4e39-95a4-406e4433ada9\" (UID: \"7dc32758-f369-4e39-95a4-406e4433ada9\") " Mar 19 10:44:28 crc kubenswrapper[4835]: I0319 10:44:28.671929 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dc32758-f369-4e39-95a4-406e4433ada9-utilities\") pod \"7dc32758-f369-4e39-95a4-406e4433ada9\" (UID: \"7dc32758-f369-4e39-95a4-406e4433ada9\") " Mar 19 10:44:28 crc kubenswrapper[4835]: I0319 10:44:28.672000 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dc32758-f369-4e39-95a4-406e4433ada9-catalog-content\") pod \"7dc32758-f369-4e39-95a4-406e4433ada9\" (UID: \"7dc32758-f369-4e39-95a4-406e4433ada9\") " Mar 19 10:44:28 crc kubenswrapper[4835]: I0319 10:44:28.676146 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dc32758-f369-4e39-95a4-406e4433ada9-utilities" (OuterVolumeSpecName: 
"utilities") pod "7dc32758-f369-4e39-95a4-406e4433ada9" (UID: "7dc32758-f369-4e39-95a4-406e4433ada9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:44:28 crc kubenswrapper[4835]: I0319 10:44:28.683430 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dc32758-f369-4e39-95a4-406e4433ada9-kube-api-access-8nslh" (OuterVolumeSpecName: "kube-api-access-8nslh") pod "7dc32758-f369-4e39-95a4-406e4433ada9" (UID: "7dc32758-f369-4e39-95a4-406e4433ada9"). InnerVolumeSpecName "kube-api-access-8nslh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:44:28 crc kubenswrapper[4835]: I0319 10:44:28.701034 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dc32758-f369-4e39-95a4-406e4433ada9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7dc32758-f369-4e39-95a4-406e4433ada9" (UID: "7dc32758-f369-4e39-95a4-406e4433ada9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:44:28 crc kubenswrapper[4835]: I0319 10:44:28.774775 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nslh\" (UniqueName: \"kubernetes.io/projected/7dc32758-f369-4e39-95a4-406e4433ada9-kube-api-access-8nslh\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:28 crc kubenswrapper[4835]: I0319 10:44:28.774810 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dc32758-f369-4e39-95a4-406e4433ada9-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:28 crc kubenswrapper[4835]: I0319 10:44:28.774823 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dc32758-f369-4e39-95a4-406e4433ada9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:28 crc kubenswrapper[4835]: I0319 10:44:28.977896 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kqpxn" event={"ID":"7dc32758-f369-4e39-95a4-406e4433ada9","Type":"ContainerDied","Data":"e65ba073f334e55c1a8b189f292d8234509a4250b89a71197b344b3710013ea1"} Mar 19 10:44:28 crc kubenswrapper[4835]: I0319 10:44:28.977948 4835 scope.go:117] "RemoveContainer" containerID="43d4e213d02165803a798811e2d24a2055e579fa42c1aa01b4b3b25250558f96" Mar 19 10:44:28 crc kubenswrapper[4835]: I0319 10:44:28.977984 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kqpxn" Mar 19 10:44:29 crc kubenswrapper[4835]: I0319 10:44:29.005826 4835 scope.go:117] "RemoveContainer" containerID="2eea2bafbc6b981257fd158cf6a808c9fcc481621d305a796c6670fa3e29b291" Mar 19 10:44:29 crc kubenswrapper[4835]: I0319 10:44:29.046874 4835 scope.go:117] "RemoveContainer" containerID="05a8be5ca13b699532ed9d07dc54dbc395c2f4b8fc5d166dc594e9602eb66738" Mar 19 10:44:29 crc kubenswrapper[4835]: I0319 10:44:29.047807 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqpxn"] Mar 19 10:44:29 crc kubenswrapper[4835]: I0319 10:44:29.085680 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kqpxn"] Mar 19 10:44:29 crc kubenswrapper[4835]: I0319 10:44:29.445052 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n8kcm" podUID="2e9b0984-cab2-4aab-bad4-b4ad7040a40f" containerName="registry-server" probeResult="failure" output=< Mar 19 10:44:29 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:44:29 crc kubenswrapper[4835]: > Mar 19 10:44:30 crc kubenswrapper[4835]: I0319 10:44:30.415908 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dc32758-f369-4e39-95a4-406e4433ada9" path="/var/lib/kubelet/pods/7dc32758-f369-4e39-95a4-406e4433ada9/volumes" Mar 19 10:44:39 crc kubenswrapper[4835]: I0319 10:44:39.423647 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n8kcm" podUID="2e9b0984-cab2-4aab-bad4-b4ad7040a40f" containerName="registry-server" probeResult="failure" output=< Mar 19 10:44:39 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:44:39 crc kubenswrapper[4835]: > Mar 19 10:44:48 crc kubenswrapper[4835]: I0319 10:44:48.476317 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-n8kcm" Mar 19 10:44:48 crc kubenswrapper[4835]: I0319 10:44:48.541045 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n8kcm" Mar 19 10:45:00 crc kubenswrapper[4835]: I0319 10:45:00.188530 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565285-dn9h4"] Mar 19 10:45:00 crc kubenswrapper[4835]: E0319 10:45:00.189629 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dc32758-f369-4e39-95a4-406e4433ada9" containerName="registry-server" Mar 19 10:45:00 crc kubenswrapper[4835]: I0319 10:45:00.189645 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dc32758-f369-4e39-95a4-406e4433ada9" containerName="registry-server" Mar 19 10:45:00 crc kubenswrapper[4835]: E0319 10:45:00.189667 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dc32758-f369-4e39-95a4-406e4433ada9" containerName="extract-utilities" Mar 19 10:45:00 crc kubenswrapper[4835]: I0319 10:45:00.189673 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dc32758-f369-4e39-95a4-406e4433ada9" containerName="extract-utilities" Mar 19 10:45:00 crc kubenswrapper[4835]: E0319 10:45:00.189724 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dc32758-f369-4e39-95a4-406e4433ada9" containerName="extract-content" Mar 19 10:45:00 crc kubenswrapper[4835]: I0319 10:45:00.189731 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dc32758-f369-4e39-95a4-406e4433ada9" containerName="extract-content" Mar 19 10:45:00 crc kubenswrapper[4835]: I0319 10:45:00.190080 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dc32758-f369-4e39-95a4-406e4433ada9" containerName="registry-server" Mar 19 10:45:00 crc kubenswrapper[4835]: I0319 10:45:00.191077 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565285-dn9h4" Mar 19 10:45:00 crc kubenswrapper[4835]: I0319 10:45:00.193197 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 10:45:00 crc kubenswrapper[4835]: I0319 10:45:00.207302 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565285-dn9h4"] Mar 19 10:45:00 crc kubenswrapper[4835]: I0319 10:45:00.209031 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 10:45:00 crc kubenswrapper[4835]: I0319 10:45:00.246986 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1665b9df-2570-406e-a3d6-97255948b678-config-volume\") pod \"collect-profiles-29565285-dn9h4\" (UID: \"1665b9df-2570-406e-a3d6-97255948b678\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565285-dn9h4" Mar 19 10:45:00 crc kubenswrapper[4835]: I0319 10:45:00.247389 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvkc7\" (UniqueName: \"kubernetes.io/projected/1665b9df-2570-406e-a3d6-97255948b678-kube-api-access-rvkc7\") pod \"collect-profiles-29565285-dn9h4\" (UID: \"1665b9df-2570-406e-a3d6-97255948b678\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565285-dn9h4" Mar 19 10:45:00 crc kubenswrapper[4835]: I0319 10:45:00.247467 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1665b9df-2570-406e-a3d6-97255948b678-secret-volume\") pod \"collect-profiles-29565285-dn9h4\" (UID: \"1665b9df-2570-406e-a3d6-97255948b678\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29565285-dn9h4" Mar 19 10:45:00 crc kubenswrapper[4835]: I0319 10:45:00.350012 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1665b9df-2570-406e-a3d6-97255948b678-config-volume\") pod \"collect-profiles-29565285-dn9h4\" (UID: \"1665b9df-2570-406e-a3d6-97255948b678\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565285-dn9h4" Mar 19 10:45:00 crc kubenswrapper[4835]: I0319 10:45:00.350154 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvkc7\" (UniqueName: \"kubernetes.io/projected/1665b9df-2570-406e-a3d6-97255948b678-kube-api-access-rvkc7\") pod \"collect-profiles-29565285-dn9h4\" (UID: \"1665b9df-2570-406e-a3d6-97255948b678\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565285-dn9h4" Mar 19 10:45:00 crc kubenswrapper[4835]: I0319 10:45:00.350680 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1665b9df-2570-406e-a3d6-97255948b678-secret-volume\") pod \"collect-profiles-29565285-dn9h4\" (UID: \"1665b9df-2570-406e-a3d6-97255948b678\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565285-dn9h4" Mar 19 10:45:00 crc kubenswrapper[4835]: I0319 10:45:00.350939 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1665b9df-2570-406e-a3d6-97255948b678-config-volume\") pod \"collect-profiles-29565285-dn9h4\" (UID: \"1665b9df-2570-406e-a3d6-97255948b678\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565285-dn9h4" Mar 19 10:45:00 crc kubenswrapper[4835]: I0319 10:45:00.387465 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/1665b9df-2570-406e-a3d6-97255948b678-secret-volume\") pod \"collect-profiles-29565285-dn9h4\" (UID: \"1665b9df-2570-406e-a3d6-97255948b678\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565285-dn9h4" Mar 19 10:45:00 crc kubenswrapper[4835]: I0319 10:45:00.389925 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvkc7\" (UniqueName: \"kubernetes.io/projected/1665b9df-2570-406e-a3d6-97255948b678-kube-api-access-rvkc7\") pod \"collect-profiles-29565285-dn9h4\" (UID: \"1665b9df-2570-406e-a3d6-97255948b678\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565285-dn9h4" Mar 19 10:45:00 crc kubenswrapper[4835]: I0319 10:45:00.586037 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565285-dn9h4" Mar 19 10:45:02 crc kubenswrapper[4835]: I0319 10:45:02.059728 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565285-dn9h4"] Mar 19 10:45:02 crc kubenswrapper[4835]: I0319 10:45:02.358517 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565285-dn9h4" event={"ID":"1665b9df-2570-406e-a3d6-97255948b678","Type":"ContainerStarted","Data":"22f9bbce25443fd10a8b4a661f38301490aae0ed5e1dec81e06b4fd2e7ce8169"} Mar 19 10:45:02 crc kubenswrapper[4835]: I0319 10:45:02.358851 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565285-dn9h4" event={"ID":"1665b9df-2570-406e-a3d6-97255948b678","Type":"ContainerStarted","Data":"b96a73a2715b5a2d8a449c98b5ea9c4ae4ac2752e2306ff2a00769d6378b0902"} Mar 19 10:45:02 crc kubenswrapper[4835]: I0319 10:45:02.382887 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29565285-dn9h4" 
podStartSLOduration=2.38286949 podStartE2EDuration="2.38286949s" podCreationTimestamp="2026-03-19 10:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:45:02.372582748 +0000 UTC m=+4957.221181355" watchObservedRunningTime="2026-03-19 10:45:02.38286949 +0000 UTC m=+4957.231468077" Mar 19 10:45:03 crc kubenswrapper[4835]: I0319 10:45:03.373568 4835 generic.go:334] "Generic (PLEG): container finished" podID="1665b9df-2570-406e-a3d6-97255948b678" containerID="22f9bbce25443fd10a8b4a661f38301490aae0ed5e1dec81e06b4fd2e7ce8169" exitCode=0 Mar 19 10:45:03 crc kubenswrapper[4835]: I0319 10:45:03.373860 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565285-dn9h4" event={"ID":"1665b9df-2570-406e-a3d6-97255948b678","Type":"ContainerDied","Data":"22f9bbce25443fd10a8b4a661f38301490aae0ed5e1dec81e06b4fd2e7ce8169"} Mar 19 10:45:05 crc kubenswrapper[4835]: I0319 10:45:05.079631 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565285-dn9h4" Mar 19 10:45:05 crc kubenswrapper[4835]: I0319 10:45:05.192838 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1665b9df-2570-406e-a3d6-97255948b678-secret-volume\") pod \"1665b9df-2570-406e-a3d6-97255948b678\" (UID: \"1665b9df-2570-406e-a3d6-97255948b678\") " Mar 19 10:45:05 crc kubenswrapper[4835]: I0319 10:45:05.193254 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1665b9df-2570-406e-a3d6-97255948b678-config-volume\") pod \"1665b9df-2570-406e-a3d6-97255948b678\" (UID: \"1665b9df-2570-406e-a3d6-97255948b678\") " Mar 19 10:45:05 crc kubenswrapper[4835]: I0319 10:45:05.193333 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvkc7\" (UniqueName: \"kubernetes.io/projected/1665b9df-2570-406e-a3d6-97255948b678-kube-api-access-rvkc7\") pod \"1665b9df-2570-406e-a3d6-97255948b678\" (UID: \"1665b9df-2570-406e-a3d6-97255948b678\") " Mar 19 10:45:05 crc kubenswrapper[4835]: I0319 10:45:05.198580 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1665b9df-2570-406e-a3d6-97255948b678-config-volume" (OuterVolumeSpecName: "config-volume") pod "1665b9df-2570-406e-a3d6-97255948b678" (UID: "1665b9df-2570-406e-a3d6-97255948b678"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:45:05 crc kubenswrapper[4835]: I0319 10:45:05.214408 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1665b9df-2570-406e-a3d6-97255948b678-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1665b9df-2570-406e-a3d6-97255948b678" (UID: "1665b9df-2570-406e-a3d6-97255948b678"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:45:05 crc kubenswrapper[4835]: I0319 10:45:05.215361 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1665b9df-2570-406e-a3d6-97255948b678-kube-api-access-rvkc7" (OuterVolumeSpecName: "kube-api-access-rvkc7") pod "1665b9df-2570-406e-a3d6-97255948b678" (UID: "1665b9df-2570-406e-a3d6-97255948b678"). InnerVolumeSpecName "kube-api-access-rvkc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:45:05 crc kubenswrapper[4835]: I0319 10:45:05.297100 4835 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1665b9df-2570-406e-a3d6-97255948b678-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:05 crc kubenswrapper[4835]: I0319 10:45:05.297164 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvkc7\" (UniqueName: \"kubernetes.io/projected/1665b9df-2570-406e-a3d6-97255948b678-kube-api-access-rvkc7\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:05 crc kubenswrapper[4835]: I0319 10:45:05.297189 4835 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1665b9df-2570-406e-a3d6-97255948b678-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:05 crc kubenswrapper[4835]: I0319 10:45:05.394114 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565285-dn9h4" event={"ID":"1665b9df-2570-406e-a3d6-97255948b678","Type":"ContainerDied","Data":"b96a73a2715b5a2d8a449c98b5ea9c4ae4ac2752e2306ff2a00769d6378b0902"} Mar 19 10:45:05 crc kubenswrapper[4835]: I0319 10:45:05.394982 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565285-dn9h4" Mar 19 10:45:05 crc kubenswrapper[4835]: I0319 10:45:05.395119 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b96a73a2715b5a2d8a449c98b5ea9c4ae4ac2752e2306ff2a00769d6378b0902" Mar 19 10:45:06 crc kubenswrapper[4835]: I0319 10:45:06.197832 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565240-88tjf"] Mar 19 10:45:06 crc kubenswrapper[4835]: I0319 10:45:06.221670 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565240-88tjf"] Mar 19 10:45:06 crc kubenswrapper[4835]: I0319 10:45:06.435638 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9cb21a9-c134-49a3-b8b1-c9289e9a52a2" path="/var/lib/kubelet/pods/c9cb21a9-c134-49a3-b8b1-c9289e9a52a2/volumes" Mar 19 10:45:10 crc kubenswrapper[4835]: I0319 10:45:10.416056 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bjtgn/must-gather-27lsm"] Mar 19 10:45:10 crc kubenswrapper[4835]: E0319 10:45:10.417489 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1665b9df-2570-406e-a3d6-97255948b678" containerName="collect-profiles" Mar 19 10:45:10 crc kubenswrapper[4835]: I0319 10:45:10.417505 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1665b9df-2570-406e-a3d6-97255948b678" containerName="collect-profiles" Mar 19 10:45:10 crc kubenswrapper[4835]: I0319 10:45:10.417720 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="1665b9df-2570-406e-a3d6-97255948b678" containerName="collect-profiles" Mar 19 10:45:10 crc kubenswrapper[4835]: I0319 10:45:10.419028 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bjtgn/must-gather-27lsm" Mar 19 10:45:10 crc kubenswrapper[4835]: I0319 10:45:10.421029 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-bjtgn"/"default-dockercfg-vv95m" Mar 19 10:45:10 crc kubenswrapper[4835]: I0319 10:45:10.421981 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-bjtgn"/"openshift-service-ca.crt" Mar 19 10:45:10 crc kubenswrapper[4835]: I0319 10:45:10.423367 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-bjtgn"/"kube-root-ca.crt" Mar 19 10:45:10 crc kubenswrapper[4835]: I0319 10:45:10.459364 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bjtgn/must-gather-27lsm"] Mar 19 10:45:10 crc kubenswrapper[4835]: I0319 10:45:10.562566 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/33211b2a-7ffa-44b3-b5ac-66891d2e56c1-must-gather-output\") pod \"must-gather-27lsm\" (UID: \"33211b2a-7ffa-44b3-b5ac-66891d2e56c1\") " pod="openshift-must-gather-bjtgn/must-gather-27lsm" Mar 19 10:45:10 crc kubenswrapper[4835]: I0319 10:45:10.562624 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z48h\" (UniqueName: \"kubernetes.io/projected/33211b2a-7ffa-44b3-b5ac-66891d2e56c1-kube-api-access-9z48h\") pod \"must-gather-27lsm\" (UID: \"33211b2a-7ffa-44b3-b5ac-66891d2e56c1\") " pod="openshift-must-gather-bjtgn/must-gather-27lsm" Mar 19 10:45:10 crc kubenswrapper[4835]: I0319 10:45:10.664997 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/33211b2a-7ffa-44b3-b5ac-66891d2e56c1-must-gather-output\") pod \"must-gather-27lsm\" (UID: \"33211b2a-7ffa-44b3-b5ac-66891d2e56c1\") " 
pod="openshift-must-gather-bjtgn/must-gather-27lsm" Mar 19 10:45:10 crc kubenswrapper[4835]: I0319 10:45:10.665061 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z48h\" (UniqueName: \"kubernetes.io/projected/33211b2a-7ffa-44b3-b5ac-66891d2e56c1-kube-api-access-9z48h\") pod \"must-gather-27lsm\" (UID: \"33211b2a-7ffa-44b3-b5ac-66891d2e56c1\") " pod="openshift-must-gather-bjtgn/must-gather-27lsm" Mar 19 10:45:10 crc kubenswrapper[4835]: I0319 10:45:10.665920 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/33211b2a-7ffa-44b3-b5ac-66891d2e56c1-must-gather-output\") pod \"must-gather-27lsm\" (UID: \"33211b2a-7ffa-44b3-b5ac-66891d2e56c1\") " pod="openshift-must-gather-bjtgn/must-gather-27lsm" Mar 19 10:45:10 crc kubenswrapper[4835]: I0319 10:45:10.718944 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z48h\" (UniqueName: \"kubernetes.io/projected/33211b2a-7ffa-44b3-b5ac-66891d2e56c1-kube-api-access-9z48h\") pod \"must-gather-27lsm\" (UID: \"33211b2a-7ffa-44b3-b5ac-66891d2e56c1\") " pod="openshift-must-gather-bjtgn/must-gather-27lsm" Mar 19 10:45:10 crc kubenswrapper[4835]: I0319 10:45:10.737210 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bjtgn/must-gather-27lsm" Mar 19 10:45:11 crc kubenswrapper[4835]: I0319 10:45:11.453937 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bjtgn/must-gather-27lsm"] Mar 19 10:45:11 crc kubenswrapper[4835]: I0319 10:45:11.472222 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bjtgn/must-gather-27lsm" event={"ID":"33211b2a-7ffa-44b3-b5ac-66891d2e56c1","Type":"ContainerStarted","Data":"030e65aefad5d2e7e16d414736de4dcc5a0b53150915ea4327a7423752cd9ea0"} Mar 19 10:45:20 crc kubenswrapper[4835]: I0319 10:45:20.214708 4835 scope.go:117] "RemoveContainer" containerID="a1c8561e2c3cc7d94620750aac17b05cfdad1945bd793b6ea0f0dbe435d81f05" Mar 19 10:45:22 crc kubenswrapper[4835]: I0319 10:45:22.652152 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bjtgn/must-gather-27lsm" event={"ID":"33211b2a-7ffa-44b3-b5ac-66891d2e56c1","Type":"ContainerStarted","Data":"9a38fc50fd9e830ff4e2b59f6aceed39137081a688a36bbec3e3318c70b416b9"} Mar 19 10:45:23 crc kubenswrapper[4835]: I0319 10:45:23.670845 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bjtgn/must-gather-27lsm" event={"ID":"33211b2a-7ffa-44b3-b5ac-66891d2e56c1","Type":"ContainerStarted","Data":"35d93b6fe5f41a17b129b88ff3aacddec40d6cda20329d7a0b8a5df82e906e25"} Mar 19 10:45:23 crc kubenswrapper[4835]: I0319 10:45:23.704471 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-bjtgn/must-gather-27lsm" podStartSLOduration=2.880283892 podStartE2EDuration="13.704449167s" podCreationTimestamp="2026-03-19 10:45:10 +0000 UTC" firstStartedPulling="2026-03-19 10:45:11.461638368 +0000 UTC m=+4966.310236955" lastFinishedPulling="2026-03-19 10:45:22.285803623 +0000 UTC m=+4977.134402230" observedRunningTime="2026-03-19 10:45:23.694206897 +0000 UTC m=+4978.542805484" watchObservedRunningTime="2026-03-19 10:45:23.704449167 
+0000 UTC m=+4978.553047754" Mar 19 10:45:28 crc kubenswrapper[4835]: I0319 10:45:28.901254 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bjtgn/crc-debug-fmvcc"] Mar 19 10:45:28 crc kubenswrapper[4835]: I0319 10:45:28.904119 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bjtgn/crc-debug-fmvcc" Mar 19 10:45:29 crc kubenswrapper[4835]: I0319 10:45:29.061804 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/917d428d-2b04-417e-be9e-13d5e250cbef-host\") pod \"crc-debug-fmvcc\" (UID: \"917d428d-2b04-417e-be9e-13d5e250cbef\") " pod="openshift-must-gather-bjtgn/crc-debug-fmvcc" Mar 19 10:45:29 crc kubenswrapper[4835]: I0319 10:45:29.062318 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b9rw\" (UniqueName: \"kubernetes.io/projected/917d428d-2b04-417e-be9e-13d5e250cbef-kube-api-access-9b9rw\") pod \"crc-debug-fmvcc\" (UID: \"917d428d-2b04-417e-be9e-13d5e250cbef\") " pod="openshift-must-gather-bjtgn/crc-debug-fmvcc" Mar 19 10:45:29 crc kubenswrapper[4835]: I0319 10:45:29.165397 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b9rw\" (UniqueName: \"kubernetes.io/projected/917d428d-2b04-417e-be9e-13d5e250cbef-kube-api-access-9b9rw\") pod \"crc-debug-fmvcc\" (UID: \"917d428d-2b04-417e-be9e-13d5e250cbef\") " pod="openshift-must-gather-bjtgn/crc-debug-fmvcc" Mar 19 10:45:29 crc kubenswrapper[4835]: I0319 10:45:29.165852 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/917d428d-2b04-417e-be9e-13d5e250cbef-host\") pod \"crc-debug-fmvcc\" (UID: \"917d428d-2b04-417e-be9e-13d5e250cbef\") " pod="openshift-must-gather-bjtgn/crc-debug-fmvcc" Mar 19 10:45:29 crc kubenswrapper[4835]: I0319 10:45:29.167241 
4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/917d428d-2b04-417e-be9e-13d5e250cbef-host\") pod \"crc-debug-fmvcc\" (UID: \"917d428d-2b04-417e-be9e-13d5e250cbef\") " pod="openshift-must-gather-bjtgn/crc-debug-fmvcc" Mar 19 10:45:29 crc kubenswrapper[4835]: I0319 10:45:29.189441 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b9rw\" (UniqueName: \"kubernetes.io/projected/917d428d-2b04-417e-be9e-13d5e250cbef-kube-api-access-9b9rw\") pod \"crc-debug-fmvcc\" (UID: \"917d428d-2b04-417e-be9e-13d5e250cbef\") " pod="openshift-must-gather-bjtgn/crc-debug-fmvcc" Mar 19 10:45:29 crc kubenswrapper[4835]: I0319 10:45:29.228291 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bjtgn/crc-debug-fmvcc" Mar 19 10:45:29 crc kubenswrapper[4835]: I0319 10:45:29.745381 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bjtgn/crc-debug-fmvcc" event={"ID":"917d428d-2b04-417e-be9e-13d5e250cbef","Type":"ContainerStarted","Data":"668f1d0d8374a23e0de2cf01631dc854c870c8ba774bb81dd13d1389221e7aa2"} Mar 19 10:45:30 crc kubenswrapper[4835]: E0319 10:45:30.839905 4835 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.116:39864->38.129.56.116:37913: write tcp 38.129.56.116:39864->38.129.56.116:37913: write: broken pipe Mar 19 10:45:42 crc kubenswrapper[4835]: I0319 10:45:42.972334 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bjtgn/crc-debug-fmvcc" event={"ID":"917d428d-2b04-417e-be9e-13d5e250cbef","Type":"ContainerStarted","Data":"aef605412de6df9fc282dff82e300dbc65385f644226bb771c97fec2df4250e5"} Mar 19 10:45:43 crc kubenswrapper[4835]: I0319 10:45:43.002183 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-bjtgn/crc-debug-fmvcc" podStartSLOduration=1.628068998 
podStartE2EDuration="15.002157354s" podCreationTimestamp="2026-03-19 10:45:28 +0000 UTC" firstStartedPulling="2026-03-19 10:45:29.280114856 +0000 UTC m=+4984.128713443" lastFinishedPulling="2026-03-19 10:45:42.654203212 +0000 UTC m=+4997.502801799" observedRunningTime="2026-03-19 10:45:42.985240222 +0000 UTC m=+4997.833838809" watchObservedRunningTime="2026-03-19 10:45:43.002157354 +0000 UTC m=+4997.850755941" Mar 19 10:46:00 crc kubenswrapper[4835]: I0319 10:46:00.159378 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565286-8q757"] Mar 19 10:46:00 crc kubenswrapper[4835]: I0319 10:46:00.178040 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565286-8q757" Mar 19 10:46:00 crc kubenswrapper[4835]: I0319 10:46:00.179773 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565286-8q757"] Mar 19 10:46:00 crc kubenswrapper[4835]: I0319 10:46:00.180430 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:46:00 crc kubenswrapper[4835]: I0319 10:46:00.185122 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 10:46:00 crc kubenswrapper[4835]: I0319 10:46:00.188846 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:46:00 crc kubenswrapper[4835]: I0319 10:46:00.225857 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbx6p\" (UniqueName: \"kubernetes.io/projected/a125d9a9-12c8-4c9c-8cbe-eefae2fd91e8-kube-api-access-hbx6p\") pod \"auto-csr-approver-29565286-8q757\" (UID: \"a125d9a9-12c8-4c9c-8cbe-eefae2fd91e8\") " pod="openshift-infra/auto-csr-approver-29565286-8q757" Mar 19 10:46:00 crc kubenswrapper[4835]: I0319 10:46:00.328650 4835 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbx6p\" (UniqueName: \"kubernetes.io/projected/a125d9a9-12c8-4c9c-8cbe-eefae2fd91e8-kube-api-access-hbx6p\") pod \"auto-csr-approver-29565286-8q757\" (UID: \"a125d9a9-12c8-4c9c-8cbe-eefae2fd91e8\") " pod="openshift-infra/auto-csr-approver-29565286-8q757" Mar 19 10:46:00 crc kubenswrapper[4835]: I0319 10:46:00.348720 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbx6p\" (UniqueName: \"kubernetes.io/projected/a125d9a9-12c8-4c9c-8cbe-eefae2fd91e8-kube-api-access-hbx6p\") pod \"auto-csr-approver-29565286-8q757\" (UID: \"a125d9a9-12c8-4c9c-8cbe-eefae2fd91e8\") " pod="openshift-infra/auto-csr-approver-29565286-8q757" Mar 19 10:46:00 crc kubenswrapper[4835]: I0319 10:46:00.804638 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565286-8q757" Mar 19 10:46:01 crc kubenswrapper[4835]: I0319 10:46:01.620580 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565286-8q757"] Mar 19 10:46:02 crc kubenswrapper[4835]: I0319 10:46:02.233669 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565286-8q757" event={"ID":"a125d9a9-12c8-4c9c-8cbe-eefae2fd91e8","Type":"ContainerStarted","Data":"d2cb55cc57af4c9744d0132d45c55cc303232293616ae4c40c4d32c9424d5ddf"} Mar 19 10:46:04 crc kubenswrapper[4835]: I0319 10:46:04.260064 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565286-8q757" event={"ID":"a125d9a9-12c8-4c9c-8cbe-eefae2fd91e8","Type":"ContainerStarted","Data":"31c49db545e80737889cd9983386a237245ecf3e2e70b117ec8a6da26b1a6fa4"} Mar 19 10:46:04 crc kubenswrapper[4835]: I0319 10:46:04.292945 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565286-8q757" podStartSLOduration=3.152424841 
podStartE2EDuration="4.29292447s" podCreationTimestamp="2026-03-19 10:46:00 +0000 UTC" firstStartedPulling="2026-03-19 10:46:01.638511833 +0000 UTC m=+5016.487110420" lastFinishedPulling="2026-03-19 10:46:02.779011462 +0000 UTC m=+5017.627610049" observedRunningTime="2026-03-19 10:46:04.27902435 +0000 UTC m=+5019.127622937" watchObservedRunningTime="2026-03-19 10:46:04.29292447 +0000 UTC m=+5019.141523057" Mar 19 10:46:06 crc kubenswrapper[4835]: I0319 10:46:06.286221 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565286-8q757" event={"ID":"a125d9a9-12c8-4c9c-8cbe-eefae2fd91e8","Type":"ContainerDied","Data":"31c49db545e80737889cd9983386a237245ecf3e2e70b117ec8a6da26b1a6fa4"} Mar 19 10:46:06 crc kubenswrapper[4835]: I0319 10:46:06.286910 4835 generic.go:334] "Generic (PLEG): container finished" podID="a125d9a9-12c8-4c9c-8cbe-eefae2fd91e8" containerID="31c49db545e80737889cd9983386a237245ecf3e2e70b117ec8a6da26b1a6fa4" exitCode=0 Mar 19 10:46:06 crc kubenswrapper[4835]: I0319 10:46:06.422120 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:46:06 crc kubenswrapper[4835]: I0319 10:46:06.422579 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:46:07 crc kubenswrapper[4835]: I0319 10:46:07.854242 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565286-8q757" Mar 19 10:46:08 crc kubenswrapper[4835]: I0319 10:46:08.037263 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbx6p\" (UniqueName: \"kubernetes.io/projected/a125d9a9-12c8-4c9c-8cbe-eefae2fd91e8-kube-api-access-hbx6p\") pod \"a125d9a9-12c8-4c9c-8cbe-eefae2fd91e8\" (UID: \"a125d9a9-12c8-4c9c-8cbe-eefae2fd91e8\") " Mar 19 10:46:08 crc kubenswrapper[4835]: I0319 10:46:08.053698 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a125d9a9-12c8-4c9c-8cbe-eefae2fd91e8-kube-api-access-hbx6p" (OuterVolumeSpecName: "kube-api-access-hbx6p") pod "a125d9a9-12c8-4c9c-8cbe-eefae2fd91e8" (UID: "a125d9a9-12c8-4c9c-8cbe-eefae2fd91e8"). InnerVolumeSpecName "kube-api-access-hbx6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:46:08 crc kubenswrapper[4835]: I0319 10:46:08.140495 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbx6p\" (UniqueName: \"kubernetes.io/projected/a125d9a9-12c8-4c9c-8cbe-eefae2fd91e8-kube-api-access-hbx6p\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:08 crc kubenswrapper[4835]: I0319 10:46:08.317115 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565286-8q757" event={"ID":"a125d9a9-12c8-4c9c-8cbe-eefae2fd91e8","Type":"ContainerDied","Data":"d2cb55cc57af4c9744d0132d45c55cc303232293616ae4c40c4d32c9424d5ddf"} Mar 19 10:46:08 crc kubenswrapper[4835]: I0319 10:46:08.317387 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2cb55cc57af4c9744d0132d45c55cc303232293616ae4c40c4d32c9424d5ddf" Mar 19 10:46:08 crc kubenswrapper[4835]: I0319 10:46:08.317150 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565286-8q757" Mar 19 10:46:08 crc kubenswrapper[4835]: I0319 10:46:08.423299 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565280-4mxsw"] Mar 19 10:46:08 crc kubenswrapper[4835]: I0319 10:46:08.423340 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565280-4mxsw"] Mar 19 10:46:10 crc kubenswrapper[4835]: I0319 10:46:10.417019 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc08d060-225f-4f8d-bb5e-2de67700397b" path="/var/lib/kubelet/pods/bc08d060-225f-4f8d-bb5e-2de67700397b/volumes" Mar 19 10:46:15 crc kubenswrapper[4835]: I0319 10:46:15.414892 4835 generic.go:334] "Generic (PLEG): container finished" podID="da878929-ea5e-40f0-8eaf-7f6b6e86f62c" containerID="4bde12fa97e46cd393be8e222a95cd64b25ba816714628be50c94aa41aae9643" exitCode=0 Mar 19 10:46:15 crc kubenswrapper[4835]: I0319 10:46:15.414944 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" event={"ID":"da878929-ea5e-40f0-8eaf-7f6b6e86f62c","Type":"ContainerDied","Data":"4bde12fa97e46cd393be8e222a95cd64b25ba816714628be50c94aa41aae9643"} Mar 19 10:46:15 crc kubenswrapper[4835]: I0319 10:46:15.415300 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" event={"ID":"da878929-ea5e-40f0-8eaf-7f6b6e86f62c","Type":"ContainerStarted","Data":"d74815d021c48a59b35d0f38402773fe39a057e9d0b4610645f81d9549212186"} Mar 19 10:46:22 crc kubenswrapper[4835]: I0319 10:46:22.388081 4835 scope.go:117] "RemoveContainer" containerID="efa3020b05cafbbec5d1c9e7177b5f54c6ed4f29b4789d0c40dd5de521bb6817" Mar 19 10:46:31 crc kubenswrapper[4835]: I0319 10:46:31.651453 4835 generic.go:334] "Generic (PLEG): container finished" podID="917d428d-2b04-417e-be9e-13d5e250cbef" 
containerID="aef605412de6df9fc282dff82e300dbc65385f644226bb771c97fec2df4250e5" exitCode=0 Mar 19 10:46:31 crc kubenswrapper[4835]: I0319 10:46:31.651527 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bjtgn/crc-debug-fmvcc" event={"ID":"917d428d-2b04-417e-be9e-13d5e250cbef","Type":"ContainerDied","Data":"aef605412de6df9fc282dff82e300dbc65385f644226bb771c97fec2df4250e5"} Mar 19 10:46:32 crc kubenswrapper[4835]: I0319 10:46:32.807758 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bjtgn/crc-debug-fmvcc" Mar 19 10:46:32 crc kubenswrapper[4835]: I0319 10:46:32.851049 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bjtgn/crc-debug-fmvcc"] Mar 19 10:46:32 crc kubenswrapper[4835]: I0319 10:46:32.863929 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bjtgn/crc-debug-fmvcc"] Mar 19 10:46:32 crc kubenswrapper[4835]: I0319 10:46:32.865800 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9b9rw\" (UniqueName: \"kubernetes.io/projected/917d428d-2b04-417e-be9e-13d5e250cbef-kube-api-access-9b9rw\") pod \"917d428d-2b04-417e-be9e-13d5e250cbef\" (UID: \"917d428d-2b04-417e-be9e-13d5e250cbef\") " Mar 19 10:46:32 crc kubenswrapper[4835]: I0319 10:46:32.866040 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/917d428d-2b04-417e-be9e-13d5e250cbef-host\") pod \"917d428d-2b04-417e-be9e-13d5e250cbef\" (UID: \"917d428d-2b04-417e-be9e-13d5e250cbef\") " Mar 19 10:46:32 crc kubenswrapper[4835]: I0319 10:46:32.866960 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/917d428d-2b04-417e-be9e-13d5e250cbef-host" (OuterVolumeSpecName: "host") pod "917d428d-2b04-417e-be9e-13d5e250cbef" (UID: "917d428d-2b04-417e-be9e-13d5e250cbef"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:46:32 crc kubenswrapper[4835]: I0319 10:46:32.874015 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/917d428d-2b04-417e-be9e-13d5e250cbef-kube-api-access-9b9rw" (OuterVolumeSpecName: "kube-api-access-9b9rw") pod "917d428d-2b04-417e-be9e-13d5e250cbef" (UID: "917d428d-2b04-417e-be9e-13d5e250cbef"). InnerVolumeSpecName "kube-api-access-9b9rw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:46:32 crc kubenswrapper[4835]: I0319 10:46:32.968942 4835 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/917d428d-2b04-417e-be9e-13d5e250cbef-host\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:32 crc kubenswrapper[4835]: I0319 10:46:32.969220 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9b9rw\" (UniqueName: \"kubernetes.io/projected/917d428d-2b04-417e-be9e-13d5e250cbef-kube-api-access-9b9rw\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:33 crc kubenswrapper[4835]: I0319 10:46:33.086510 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" Mar 19 10:46:33 crc kubenswrapper[4835]: I0319 10:46:33.086566 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" Mar 19 10:46:33 crc kubenswrapper[4835]: I0319 10:46:33.676002 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="668f1d0d8374a23e0de2cf01631dc854c870c8ba774bb81dd13d1389221e7aa2" Mar 19 10:46:33 crc kubenswrapper[4835]: I0319 10:46:33.676054 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bjtgn/crc-debug-fmvcc" Mar 19 10:46:34 crc kubenswrapper[4835]: I0319 10:46:34.152088 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bjtgn/crc-debug-czpm6"] Mar 19 10:46:34 crc kubenswrapper[4835]: E0319 10:46:34.170959 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a125d9a9-12c8-4c9c-8cbe-eefae2fd91e8" containerName="oc" Mar 19 10:46:34 crc kubenswrapper[4835]: I0319 10:46:34.171016 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a125d9a9-12c8-4c9c-8cbe-eefae2fd91e8" containerName="oc" Mar 19 10:46:34 crc kubenswrapper[4835]: E0319 10:46:34.171165 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="917d428d-2b04-417e-be9e-13d5e250cbef" containerName="container-00" Mar 19 10:46:34 crc kubenswrapper[4835]: I0319 10:46:34.171181 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="917d428d-2b04-417e-be9e-13d5e250cbef" containerName="container-00" Mar 19 10:46:34 crc kubenswrapper[4835]: I0319 10:46:34.173137 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="917d428d-2b04-417e-be9e-13d5e250cbef" containerName="container-00" Mar 19 10:46:34 crc kubenswrapper[4835]: I0319 10:46:34.173272 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="a125d9a9-12c8-4c9c-8cbe-eefae2fd91e8" containerName="oc" Mar 19 10:46:34 crc kubenswrapper[4835]: I0319 10:46:34.175706 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bjtgn/crc-debug-czpm6" Mar 19 10:46:34 crc kubenswrapper[4835]: I0319 10:46:34.301889 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnldv\" (UniqueName: \"kubernetes.io/projected/f9ab118c-4d88-498d-90ad-75bae4ebfb93-kube-api-access-qnldv\") pod \"crc-debug-czpm6\" (UID: \"f9ab118c-4d88-498d-90ad-75bae4ebfb93\") " pod="openshift-must-gather-bjtgn/crc-debug-czpm6" Mar 19 10:46:34 crc kubenswrapper[4835]: I0319 10:46:34.302292 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f9ab118c-4d88-498d-90ad-75bae4ebfb93-host\") pod \"crc-debug-czpm6\" (UID: \"f9ab118c-4d88-498d-90ad-75bae4ebfb93\") " pod="openshift-must-gather-bjtgn/crc-debug-czpm6" Mar 19 10:46:34 crc kubenswrapper[4835]: I0319 10:46:34.404442 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f9ab118c-4d88-498d-90ad-75bae4ebfb93-host\") pod \"crc-debug-czpm6\" (UID: \"f9ab118c-4d88-498d-90ad-75bae4ebfb93\") " pod="openshift-must-gather-bjtgn/crc-debug-czpm6" Mar 19 10:46:34 crc kubenswrapper[4835]: I0319 10:46:34.404599 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f9ab118c-4d88-498d-90ad-75bae4ebfb93-host\") pod \"crc-debug-czpm6\" (UID: \"f9ab118c-4d88-498d-90ad-75bae4ebfb93\") " pod="openshift-must-gather-bjtgn/crc-debug-czpm6" Mar 19 10:46:34 crc kubenswrapper[4835]: I0319 10:46:34.404637 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnldv\" (UniqueName: \"kubernetes.io/projected/f9ab118c-4d88-498d-90ad-75bae4ebfb93-kube-api-access-qnldv\") pod \"crc-debug-czpm6\" (UID: \"f9ab118c-4d88-498d-90ad-75bae4ebfb93\") " pod="openshift-must-gather-bjtgn/crc-debug-czpm6" Mar 19 10:46:34 crc 
kubenswrapper[4835]: I0319 10:46:34.419903 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="917d428d-2b04-417e-be9e-13d5e250cbef" path="/var/lib/kubelet/pods/917d428d-2b04-417e-be9e-13d5e250cbef/volumes" Mar 19 10:46:34 crc kubenswrapper[4835]: I0319 10:46:34.430255 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnldv\" (UniqueName: \"kubernetes.io/projected/f9ab118c-4d88-498d-90ad-75bae4ebfb93-kube-api-access-qnldv\") pod \"crc-debug-czpm6\" (UID: \"f9ab118c-4d88-498d-90ad-75bae4ebfb93\") " pod="openshift-must-gather-bjtgn/crc-debug-czpm6" Mar 19 10:46:34 crc kubenswrapper[4835]: I0319 10:46:34.504655 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bjtgn/crc-debug-czpm6" Mar 19 10:46:34 crc kubenswrapper[4835]: W0319 10:46:34.560274 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9ab118c_4d88_498d_90ad_75bae4ebfb93.slice/crio-2de8e463ebb66c88403ff9f8137b85f3d46a028d93f456f80df4ba557bef971d WatchSource:0}: Error finding container 2de8e463ebb66c88403ff9f8137b85f3d46a028d93f456f80df4ba557bef971d: Status 404 returned error can't find the container with id 2de8e463ebb66c88403ff9f8137b85f3d46a028d93f456f80df4ba557bef971d Mar 19 10:46:34 crc kubenswrapper[4835]: I0319 10:46:34.686545 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bjtgn/crc-debug-czpm6" event={"ID":"f9ab118c-4d88-498d-90ad-75bae4ebfb93","Type":"ContainerStarted","Data":"2de8e463ebb66c88403ff9f8137b85f3d46a028d93f456f80df4ba557bef971d"} Mar 19 10:46:35 crc kubenswrapper[4835]: I0319 10:46:35.698889 4835 generic.go:334] "Generic (PLEG): container finished" podID="f9ab118c-4d88-498d-90ad-75bae4ebfb93" containerID="84162b84376d5ec7317c4ed4ecfbfd20e4350399d0d59f84cf5116466c629aa8" exitCode=0 Mar 19 10:46:35 crc kubenswrapper[4835]: I0319 10:46:35.699155 4835 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bjtgn/crc-debug-czpm6" event={"ID":"f9ab118c-4d88-498d-90ad-75bae4ebfb93","Type":"ContainerDied","Data":"84162b84376d5ec7317c4ed4ecfbfd20e4350399d0d59f84cf5116466c629aa8"} Mar 19 10:46:36 crc kubenswrapper[4835]: I0319 10:46:36.422719 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:46:36 crc kubenswrapper[4835]: I0319 10:46:36.422785 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:46:36 crc kubenswrapper[4835]: I0319 10:46:36.827715 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bjtgn/crc-debug-czpm6" Mar 19 10:46:36 crc kubenswrapper[4835]: I0319 10:46:36.856787 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f9ab118c-4d88-498d-90ad-75bae4ebfb93-host\") pod \"f9ab118c-4d88-498d-90ad-75bae4ebfb93\" (UID: \"f9ab118c-4d88-498d-90ad-75bae4ebfb93\") " Mar 19 10:46:36 crc kubenswrapper[4835]: I0319 10:46:36.856897 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9ab118c-4d88-498d-90ad-75bae4ebfb93-host" (OuterVolumeSpecName: "host") pod "f9ab118c-4d88-498d-90ad-75bae4ebfb93" (UID: "f9ab118c-4d88-498d-90ad-75bae4ebfb93"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:46:36 crc kubenswrapper[4835]: I0319 10:46:36.856979 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnldv\" (UniqueName: \"kubernetes.io/projected/f9ab118c-4d88-498d-90ad-75bae4ebfb93-kube-api-access-qnldv\") pod \"f9ab118c-4d88-498d-90ad-75bae4ebfb93\" (UID: \"f9ab118c-4d88-498d-90ad-75bae4ebfb93\") " Mar 19 10:46:36 crc kubenswrapper[4835]: I0319 10:46:36.857708 4835 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f9ab118c-4d88-498d-90ad-75bae4ebfb93-host\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:36 crc kubenswrapper[4835]: I0319 10:46:36.867950 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9ab118c-4d88-498d-90ad-75bae4ebfb93-kube-api-access-qnldv" (OuterVolumeSpecName: "kube-api-access-qnldv") pod "f9ab118c-4d88-498d-90ad-75bae4ebfb93" (UID: "f9ab118c-4d88-498d-90ad-75bae4ebfb93"). InnerVolumeSpecName "kube-api-access-qnldv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:46:36 crc kubenswrapper[4835]: I0319 10:46:36.959670 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnldv\" (UniqueName: \"kubernetes.io/projected/f9ab118c-4d88-498d-90ad-75bae4ebfb93-kube-api-access-qnldv\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:37 crc kubenswrapper[4835]: I0319 10:46:37.434286 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bjtgn/crc-debug-czpm6"] Mar 19 10:46:37 crc kubenswrapper[4835]: I0319 10:46:37.445044 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bjtgn/crc-debug-czpm6"] Mar 19 10:46:37 crc kubenswrapper[4835]: I0319 10:46:37.725540 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2de8e463ebb66c88403ff9f8137b85f3d46a028d93f456f80df4ba557bef971d" Mar 19 10:46:37 crc kubenswrapper[4835]: I0319 10:46:37.725732 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bjtgn/crc-debug-czpm6" Mar 19 10:46:38 crc kubenswrapper[4835]: I0319 10:46:38.416870 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9ab118c-4d88-498d-90ad-75bae4ebfb93" path="/var/lib/kubelet/pods/f9ab118c-4d88-498d-90ad-75bae4ebfb93/volumes" Mar 19 10:46:38 crc kubenswrapper[4835]: I0319 10:46:38.623129 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bjtgn/crc-debug-jv9n9"] Mar 19 10:46:38 crc kubenswrapper[4835]: E0319 10:46:38.623780 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9ab118c-4d88-498d-90ad-75bae4ebfb93" containerName="container-00" Mar 19 10:46:38 crc kubenswrapper[4835]: I0319 10:46:38.623804 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9ab118c-4d88-498d-90ad-75bae4ebfb93" containerName="container-00" Mar 19 10:46:38 crc kubenswrapper[4835]: I0319 10:46:38.624079 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9ab118c-4d88-498d-90ad-75bae4ebfb93" containerName="container-00" Mar 19 10:46:38 crc kubenswrapper[4835]: I0319 10:46:38.625124 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bjtgn/crc-debug-jv9n9" Mar 19 10:46:38 crc kubenswrapper[4835]: I0319 10:46:38.696242 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb42g\" (UniqueName: \"kubernetes.io/projected/3f4f373e-60de-4d1b-bd2c-70041fc3dc78-kube-api-access-jb42g\") pod \"crc-debug-jv9n9\" (UID: \"3f4f373e-60de-4d1b-bd2c-70041fc3dc78\") " pod="openshift-must-gather-bjtgn/crc-debug-jv9n9" Mar 19 10:46:38 crc kubenswrapper[4835]: I0319 10:46:38.696418 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3f4f373e-60de-4d1b-bd2c-70041fc3dc78-host\") pod \"crc-debug-jv9n9\" (UID: \"3f4f373e-60de-4d1b-bd2c-70041fc3dc78\") " pod="openshift-must-gather-bjtgn/crc-debug-jv9n9" Mar 19 10:46:38 crc kubenswrapper[4835]: I0319 10:46:38.799589 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb42g\" (UniqueName: \"kubernetes.io/projected/3f4f373e-60de-4d1b-bd2c-70041fc3dc78-kube-api-access-jb42g\") pod \"crc-debug-jv9n9\" (UID: \"3f4f373e-60de-4d1b-bd2c-70041fc3dc78\") " pod="openshift-must-gather-bjtgn/crc-debug-jv9n9" Mar 19 10:46:38 crc kubenswrapper[4835]: I0319 10:46:38.799764 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3f4f373e-60de-4d1b-bd2c-70041fc3dc78-host\") pod \"crc-debug-jv9n9\" (UID: \"3f4f373e-60de-4d1b-bd2c-70041fc3dc78\") " pod="openshift-must-gather-bjtgn/crc-debug-jv9n9" Mar 19 10:46:38 crc kubenswrapper[4835]: I0319 10:46:38.799983 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3f4f373e-60de-4d1b-bd2c-70041fc3dc78-host\") pod \"crc-debug-jv9n9\" (UID: \"3f4f373e-60de-4d1b-bd2c-70041fc3dc78\") " pod="openshift-must-gather-bjtgn/crc-debug-jv9n9" Mar 19 10:46:38 crc 
kubenswrapper[4835]: I0319 10:46:38.825434 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb42g\" (UniqueName: \"kubernetes.io/projected/3f4f373e-60de-4d1b-bd2c-70041fc3dc78-kube-api-access-jb42g\") pod \"crc-debug-jv9n9\" (UID: \"3f4f373e-60de-4d1b-bd2c-70041fc3dc78\") " pod="openshift-must-gather-bjtgn/crc-debug-jv9n9" Mar 19 10:46:38 crc kubenswrapper[4835]: I0319 10:46:38.943969 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bjtgn/crc-debug-jv9n9" Mar 19 10:46:38 crc kubenswrapper[4835]: W0319 10:46:38.972876 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f4f373e_60de_4d1b_bd2c_70041fc3dc78.slice/crio-1eec701f0d2e91e36c43a593e773e3191c320f447560b48d09216a8e72bd3c13 WatchSource:0}: Error finding container 1eec701f0d2e91e36c43a593e773e3191c320f447560b48d09216a8e72bd3c13: Status 404 returned error can't find the container with id 1eec701f0d2e91e36c43a593e773e3191c320f447560b48d09216a8e72bd3c13 Mar 19 10:46:39 crc kubenswrapper[4835]: I0319 10:46:39.761977 4835 generic.go:334] "Generic (PLEG): container finished" podID="3f4f373e-60de-4d1b-bd2c-70041fc3dc78" containerID="d48b1aae804aad88c3ac95cce75fdea782736df830862b7e5ef2791fa7d806fe" exitCode=0 Mar 19 10:46:39 crc kubenswrapper[4835]: I0319 10:46:39.762076 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bjtgn/crc-debug-jv9n9" event={"ID":"3f4f373e-60de-4d1b-bd2c-70041fc3dc78","Type":"ContainerDied","Data":"d48b1aae804aad88c3ac95cce75fdea782736df830862b7e5ef2791fa7d806fe"} Mar 19 10:46:39 crc kubenswrapper[4835]: I0319 10:46:39.762546 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bjtgn/crc-debug-jv9n9" event={"ID":"3f4f373e-60de-4d1b-bd2c-70041fc3dc78","Type":"ContainerStarted","Data":"1eec701f0d2e91e36c43a593e773e3191c320f447560b48d09216a8e72bd3c13"} Mar 19 
10:46:39 crc kubenswrapper[4835]: I0319 10:46:39.831901 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bjtgn/crc-debug-jv9n9"] Mar 19 10:46:39 crc kubenswrapper[4835]: I0319 10:46:39.862771 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bjtgn/crc-debug-jv9n9"] Mar 19 10:46:40 crc kubenswrapper[4835]: I0319 10:46:40.891133 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bjtgn/crc-debug-jv9n9" Mar 19 10:46:40 crc kubenswrapper[4835]: I0319 10:46:40.950440 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3f4f373e-60de-4d1b-bd2c-70041fc3dc78-host\") pod \"3f4f373e-60de-4d1b-bd2c-70041fc3dc78\" (UID: \"3f4f373e-60de-4d1b-bd2c-70041fc3dc78\") " Mar 19 10:46:40 crc kubenswrapper[4835]: I0319 10:46:40.950616 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb42g\" (UniqueName: \"kubernetes.io/projected/3f4f373e-60de-4d1b-bd2c-70041fc3dc78-kube-api-access-jb42g\") pod \"3f4f373e-60de-4d1b-bd2c-70041fc3dc78\" (UID: \"3f4f373e-60de-4d1b-bd2c-70041fc3dc78\") " Mar 19 10:46:40 crc kubenswrapper[4835]: I0319 10:46:40.950633 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f4f373e-60de-4d1b-bd2c-70041fc3dc78-host" (OuterVolumeSpecName: "host") pod "3f4f373e-60de-4d1b-bd2c-70041fc3dc78" (UID: "3f4f373e-60de-4d1b-bd2c-70041fc3dc78"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:46:40 crc kubenswrapper[4835]: I0319 10:46:40.958512 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f4f373e-60de-4d1b-bd2c-70041fc3dc78-kube-api-access-jb42g" (OuterVolumeSpecName: "kube-api-access-jb42g") pod "3f4f373e-60de-4d1b-bd2c-70041fc3dc78" (UID: "3f4f373e-60de-4d1b-bd2c-70041fc3dc78"). InnerVolumeSpecName "kube-api-access-jb42g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:46:41 crc kubenswrapper[4835]: I0319 10:46:41.053734 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jb42g\" (UniqueName: \"kubernetes.io/projected/3f4f373e-60de-4d1b-bd2c-70041fc3dc78-kube-api-access-jb42g\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:41 crc kubenswrapper[4835]: I0319 10:46:41.053778 4835 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3f4f373e-60de-4d1b-bd2c-70041fc3dc78-host\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:41 crc kubenswrapper[4835]: I0319 10:46:41.788956 4835 scope.go:117] "RemoveContainer" containerID="d48b1aae804aad88c3ac95cce75fdea782736df830862b7e5ef2791fa7d806fe" Mar 19 10:46:41 crc kubenswrapper[4835]: I0319 10:46:41.789031 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bjtgn/crc-debug-jv9n9" Mar 19 10:46:42 crc kubenswrapper[4835]: I0319 10:46:42.418834 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f4f373e-60de-4d1b-bd2c-70041fc3dc78" path="/var/lib/kubelet/pods/3f4f373e-60de-4d1b-bd2c-70041fc3dc78/volumes" Mar 19 10:46:53 crc kubenswrapper[4835]: I0319 10:46:53.091639 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" Mar 19 10:46:53 crc kubenswrapper[4835]: I0319 10:46:53.096684 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-8646b978bb-zprxl" Mar 19 10:47:06 crc kubenswrapper[4835]: I0319 10:47:06.421645 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:47:06 crc kubenswrapper[4835]: I0319 10:47:06.422163 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:47:06 crc kubenswrapper[4835]: I0319 10:47:06.422211 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" Mar 19 10:47:06 crc kubenswrapper[4835]: I0319 10:47:06.426445 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2999600d8da067b17effdd03f999dbcb19e52bd95cdcf24fbdbbde5c1f05fc02"} 
pod="openshift-machine-config-operator/machine-config-daemon-bk84k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 10:47:06 crc kubenswrapper[4835]: I0319 10:47:06.426553 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" containerID="cri-o://2999600d8da067b17effdd03f999dbcb19e52bd95cdcf24fbdbbde5c1f05fc02" gracePeriod=600 Mar 19 10:47:06 crc kubenswrapper[4835]: E0319 10:47:06.559354 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:47:07 crc kubenswrapper[4835]: I0319 10:47:07.091339 4835 generic.go:334] "Generic (PLEG): container finished" podID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerID="2999600d8da067b17effdd03f999dbcb19e52bd95cdcf24fbdbbde5c1f05fc02" exitCode=0 Mar 19 10:47:07 crc kubenswrapper[4835]: I0319 10:47:07.091392 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerDied","Data":"2999600d8da067b17effdd03f999dbcb19e52bd95cdcf24fbdbbde5c1f05fc02"} Mar 19 10:47:07 crc kubenswrapper[4835]: I0319 10:47:07.091720 4835 scope.go:117] "RemoveContainer" containerID="60e0f97ba1cabbaebe6f78c7e3d6884cdd63695a581c1d99b4549f052fbd5350" Mar 19 10:47:07 crc kubenswrapper[4835]: I0319 10:47:07.092806 4835 scope.go:117] "RemoveContainer" containerID="2999600d8da067b17effdd03f999dbcb19e52bd95cdcf24fbdbbde5c1f05fc02" Mar 
19 10:47:07 crc kubenswrapper[4835]: E0319 10:47:07.093336 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:47:19 crc kubenswrapper[4835]: I0319 10:47:19.402016 4835 scope.go:117] "RemoveContainer" containerID="2999600d8da067b17effdd03f999dbcb19e52bd95cdcf24fbdbbde5c1f05fc02" Mar 19 10:47:19 crc kubenswrapper[4835]: E0319 10:47:19.403102 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:47:21 crc kubenswrapper[4835]: I0319 10:47:21.517953 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_d04558a6-e35c-4376-9893-1cf4865a711b/aodh-api/0.log" Mar 19 10:47:21 crc kubenswrapper[4835]: I0319 10:47:21.703897 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_d04558a6-e35c-4376-9893-1cf4865a711b/aodh-evaluator/0.log" Mar 19 10:47:21 crc kubenswrapper[4835]: I0319 10:47:21.759593 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_d04558a6-e35c-4376-9893-1cf4865a711b/aodh-notifier/0.log" Mar 19 10:47:21 crc kubenswrapper[4835]: I0319 10:47:21.790896 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_d04558a6-e35c-4376-9893-1cf4865a711b/aodh-listener/0.log" Mar 19 10:47:21 crc 
kubenswrapper[4835]: I0319 10:47:21.913814 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7b499b5bb-ft9bz_509b4973-14f1-43af-8a83-b4de55a65f5f/barbican-api/0.log" Mar 19 10:47:23 crc kubenswrapper[4835]: I0319 10:47:23.085149 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7b499b5bb-ft9bz_509b4973-14f1-43af-8a83-b4de55a65f5f/barbican-api-log/0.log" Mar 19 10:47:23 crc kubenswrapper[4835]: I0319 10:47:23.122787 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-768fcfbbc8-k7gmg_0f532ad6-0a68-4c59-93b7-5e393908c008/barbican-keystone-listener/0.log" Mar 19 10:47:23 crc kubenswrapper[4835]: I0319 10:47:23.244599 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-768fcfbbc8-k7gmg_0f532ad6-0a68-4c59-93b7-5e393908c008/barbican-keystone-listener-log/0.log" Mar 19 10:47:23 crc kubenswrapper[4835]: I0319 10:47:23.403975 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-88c674dc-srwpw_52481530-0f3c-48ac-abfe-2ca7b35d8b07/barbican-worker-log/0.log" Mar 19 10:47:23 crc kubenswrapper[4835]: I0319 10:47:23.413858 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-88c674dc-srwpw_52481530-0f3c-48ac-abfe-2ca7b35d8b07/barbican-worker/0.log" Mar 19 10:47:23 crc kubenswrapper[4835]: I0319 10:47:23.701490 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f2687a33-8ba7-4a21-9906-84de49945433/ceilometer-central-agent/1.log" Mar 19 10:47:23 crc kubenswrapper[4835]: I0319 10:47:23.722994 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-dfpj8_63daeb0b-2ace-4747-b331-44ed485faec8/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 10:47:23 crc kubenswrapper[4835]: I0319 10:47:23.849289 4835 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_ceilometer-0_f2687a33-8ba7-4a21-9906-84de49945433/ceilometer-central-agent/0.log" Mar 19 10:47:23 crc kubenswrapper[4835]: I0319 10:47:23.932715 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f2687a33-8ba7-4a21-9906-84de49945433/proxy-httpd/0.log" Mar 19 10:47:23 crc kubenswrapper[4835]: I0319 10:47:23.961034 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f2687a33-8ba7-4a21-9906-84de49945433/sg-core/0.log" Mar 19 10:47:23 crc kubenswrapper[4835]: I0319 10:47:23.983334 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f2687a33-8ba7-4a21-9906-84de49945433/ceilometer-notification-agent/0.log" Mar 19 10:47:24 crc kubenswrapper[4835]: I0319 10:47:24.196055 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d3ca55c1-1697-4203-bcec-3a9e5bd64c59/cinder-api/0.log" Mar 19 10:47:24 crc kubenswrapper[4835]: I0319 10:47:24.211644 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d3ca55c1-1697-4203-bcec-3a9e5bd64c59/cinder-api-log/0.log" Mar 19 10:47:24 crc kubenswrapper[4835]: I0319 10:47:24.419967 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc/cinder-scheduler/0.log" Mar 19 10:47:24 crc kubenswrapper[4835]: I0319 10:47:24.468248 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc/cinder-scheduler/1.log" Mar 19 10:47:24 crc kubenswrapper[4835]: I0319 10:47:24.513373 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a83f49e0-e79c-4a80-ad8b-b86b10c3f4dc/probe/0.log" Mar 19 10:47:25 crc kubenswrapper[4835]: I0319 10:47:25.447351 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-p6865_027e795d-3d9b-4c8f-b6ae-9b96cac6316e/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 10:47:25 crc kubenswrapper[4835]: I0319 10:47:25.588439 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-9mf4t_3d377e09-921d-4439-ad81-cd9b8230cf5e/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 10:47:25 crc kubenswrapper[4835]: I0319 10:47:25.658644 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-ln7sh_508dcf13-80b0-4ac1-ba8e-42070d0fe929/init/0.log" Mar 19 10:47:25 crc kubenswrapper[4835]: I0319 10:47:25.883393 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-ln7sh_508dcf13-80b0-4ac1-ba8e-42070d0fe929/init/0.log" Mar 19 10:47:25 crc kubenswrapper[4835]: I0319 10:47:25.947229 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-ln7sh_508dcf13-80b0-4ac1-ba8e-42070d0fe929/dnsmasq-dns/0.log" Mar 19 10:47:25 crc kubenswrapper[4835]: I0319 10:47:25.951992 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-jwjd2_1073365c-6688-438d-ad77-2c32ee7e6947/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 10:47:26 crc kubenswrapper[4835]: I0319 10:47:26.126121 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_48489f77-65d9-4e0d-88f5-951d6928dcb3/glance-httpd/0.log" Mar 19 10:47:26 crc kubenswrapper[4835]: I0319 10:47:26.163315 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_48489f77-65d9-4e0d-88f5-951d6928dcb3/glance-log/0.log" Mar 19 10:47:26 crc kubenswrapper[4835]: I0319 10:47:26.209855 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_ff2e9785-1044-4469-a034-9baeb46ff607/glance-httpd/0.log" Mar 19 10:47:26 crc kubenswrapper[4835]: I0319 10:47:26.451378 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_ff2e9785-1044-4469-a034-9baeb46ff607/glance-log/0.log" Mar 19 10:47:27 crc kubenswrapper[4835]: I0319 10:47:27.101671 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-tkkx5_efe0b48c-18f7-47aa-b470-5bd2fa11fb84/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 10:47:27 crc kubenswrapper[4835]: I0319 10:47:27.155302 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-7756684d79-4fxz4_58987da0-27e0-4534-9011-1c1cfd751b68/heat-api/0.log" Mar 19 10:47:27 crc kubenswrapper[4835]: I0319 10:47:27.258907 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-6d45648c65-jxdbn_1c2ee5e0-f8fe-4e29-a951-e9c8a5817cfa/heat-engine/0.log" Mar 19 10:47:27 crc kubenswrapper[4835]: I0319 10:47:27.265102 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-7556f74cf6-zw6cb_abd86360-252d-42c1-bdb5-814ccfca3bac/heat-cfnapi/0.log" Mar 19 10:47:27 crc kubenswrapper[4835]: I0319 10:47:27.505783 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29565241-dx4db_77cc85ba-0a70-4b9e-a5b5-23dc9681723a/keystone-cron/0.log" Mar 19 10:47:27 crc kubenswrapper[4835]: I0319 10:47:27.681407 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-qvj9t_d16f3bd5-80dd-4532-a60e-0a6d214ef185/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 10:47:27 crc kubenswrapper[4835]: I0319 10:47:27.799598 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_kube-state-metrics-0_84a60e76-5b98-4cdd-b2ea-849d4fcbc215/kube-state-metrics/0.log" Mar 19 10:47:28 crc kubenswrapper[4835]: I0319 10:47:28.166445 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_logging-edpm-deployment-openstack-edpm-ipam-hp9k6_7e28c8e5-047a-4ea3-8887-7fbca62a9104/logging-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 10:47:28 crc kubenswrapper[4835]: I0319 10:47:28.358207 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-f8d94d56-dsx7s_71b8fa5f-56e9-49ee-b55d-6136c30a460d/keystone-api/0.log" Mar 19 10:47:28 crc kubenswrapper[4835]: I0319 10:47:28.589067 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-t4gxx_fbf6d84b-d129-481a-80a4-b38c4eb9051b/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 10:47:28 crc kubenswrapper[4835]: I0319 10:47:28.595772 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-0_b815f8f1-d671-416f-adb1-4001dc6891a3/mysqld-exporter/0.log" Mar 19 10:47:28 crc kubenswrapper[4835]: I0319 10:47:28.903658 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5d99fb9659-b272b_decfbe60-e04e-46cc-9228-33bb1c831849/neutron-httpd/0.log" Mar 19 10:47:28 crc kubenswrapper[4835]: I0319 10:47:28.975960 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5d99fb9659-b272b_decfbe60-e04e-46cc-9228-33bb1c831849/neutron-api/0.log" Mar 19 10:47:29 crc kubenswrapper[4835]: I0319 10:47:29.002854 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-s2vzn_64fbb382-1146-4c37-ad0a-596ce57ae0f5/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 10:47:29 crc kubenswrapper[4835]: I0319 10:47:29.645770 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell0-conductor-0_55a6fe01-83c8-4e8a-a871-c94a13bc4bd3/nova-cell0-conductor-conductor/0.log" Mar 19 10:47:29 crc kubenswrapper[4835]: I0319 10:47:29.702197 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_48aa7e56-6bac-4cbd-be4b-13961be604b5/nova-api-log/0.log" Mar 19 10:47:30 crc kubenswrapper[4835]: I0319 10:47:30.124525 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_67089e56-05fd-46f8-b595-6ece6f03b14f/nova-cell1-conductor-conductor/0.log" Mar 19 10:47:30 crc kubenswrapper[4835]: I0319 10:47:30.129440 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_a392e53b-b7fb-4a1a-a93f-8c5bbe49abe2/nova-cell1-novncproxy-novncproxy/0.log" Mar 19 10:47:30 crc kubenswrapper[4835]: I0319 10:47:30.165839 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_48aa7e56-6bac-4cbd-be4b-13961be604b5/nova-api-api/0.log" Mar 19 10:47:30 crc kubenswrapper[4835]: I0319 10:47:30.408010 4835 scope.go:117] "RemoveContainer" containerID="2999600d8da067b17effdd03f999dbcb19e52bd95cdcf24fbdbbde5c1f05fc02" Mar 19 10:47:30 crc kubenswrapper[4835]: E0319 10:47:30.408343 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:47:30 crc kubenswrapper[4835]: I0319 10:47:30.608556 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_409c2465-a2ab-4f57-ba21-0854863bc543/memcached/0.log" Mar 19 10:47:30 crc kubenswrapper[4835]: I0319 10:47:30.788721 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-metadata-0_9459cf8a-fea8-4f9b-97bd-79333e608f41/nova-metadata-log/0.log" Mar 19 10:47:31 crc kubenswrapper[4835]: I0319 10:47:31.175045 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_ab42da98-c45d-473c-b86c-acba11787d21/nova-scheduler-scheduler/0.log" Mar 19 10:47:31 crc kubenswrapper[4835]: I0319 10:47:31.243990 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_fdee895b-c681-4d66-bd0d-08622d97cbea/mysql-bootstrap/0.log" Mar 19 10:47:31 crc kubenswrapper[4835]: I0319 10:47:31.333846 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_9459cf8a-fea8-4f9b-97bd-79333e608f41/nova-metadata-metadata/0.log" Mar 19 10:47:31 crc kubenswrapper[4835]: I0319 10:47:31.334154 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-gw97t_316e9619-05d7-4845-84e5-78700d3318d9/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 10:47:31 crc kubenswrapper[4835]: I0319 10:47:31.476563 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_fdee895b-c681-4d66-bd0d-08622d97cbea/galera/1.log" Mar 19 10:47:31 crc kubenswrapper[4835]: I0319 10:47:31.479696 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_fdee895b-c681-4d66-bd0d-08622d97cbea/mysql-bootstrap/0.log" Mar 19 10:47:31 crc kubenswrapper[4835]: I0319 10:47:31.516508 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_fdee895b-c681-4d66-bd0d-08622d97cbea/galera/0.log" Mar 19 10:47:31 crc kubenswrapper[4835]: I0319 10:47:31.612016 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8fd594ac-051b-4142-ba53-e974d9c5daa5/mysql-bootstrap/0.log" Mar 19 10:47:31 crc kubenswrapper[4835]: I0319 10:47:31.835598 4835 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8fd594ac-051b-4142-ba53-e974d9c5daa5/mysql-bootstrap/0.log" Mar 19 10:47:31 crc kubenswrapper[4835]: I0319 10:47:31.894008 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8fd594ac-051b-4142-ba53-e974d9c5daa5/galera/0.log" Mar 19 10:47:31 crc kubenswrapper[4835]: I0319 10:47:31.898772 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8fd594ac-051b-4142-ba53-e974d9c5daa5/galera/1.log" Mar 19 10:47:31 crc kubenswrapper[4835]: I0319 10:47:31.950383 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_636dd5f8-5cc2-46f4-84ac-a094b6881a4f/openstackclient/0.log" Mar 19 10:47:32 crc kubenswrapper[4835]: I0319 10:47:32.124280 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-nfn4c_fc6a1eab-3ebc-49e6-bb19-f4f127b416a6/ovn-controller/0.log" Mar 19 10:47:32 crc kubenswrapper[4835]: I0319 10:47:32.147355 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-nt4ph_edbbc028-d9c9-4aae-a3a0-76bead2b6738/openstack-network-exporter/0.log" Mar 19 10:47:32 crc kubenswrapper[4835]: I0319 10:47:32.311067 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dht6h_a176f0d5-28dd-4c7a-9558-8eaeffdec853/ovsdb-server-init/0.log" Mar 19 10:47:32 crc kubenswrapper[4835]: I0319 10:47:32.495967 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dht6h_a176f0d5-28dd-4c7a-9558-8eaeffdec853/ovsdb-server/0.log" Mar 19 10:47:32 crc kubenswrapper[4835]: I0319 10:47:32.565532 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dht6h_a176f0d5-28dd-4c7a-9558-8eaeffdec853/ovsdb-server-init/0.log" Mar 19 10:47:32 crc kubenswrapper[4835]: I0319 10:47:32.569600 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-dht6h_a176f0d5-28dd-4c7a-9558-8eaeffdec853/ovs-vswitchd/0.log" Mar 19 10:47:32 crc kubenswrapper[4835]: I0319 10:47:32.624007 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-s2d4g_3d96a831-67f3-468f-9c03-beeec7c529b4/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 10:47:32 crc kubenswrapper[4835]: I0319 10:47:32.803430 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_ebda93b4-fc03-4e5f-a63b-7945275df157/openstack-network-exporter/0.log" Mar 19 10:47:32 crc kubenswrapper[4835]: I0319 10:47:32.828226 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_ebda93b4-fc03-4e5f-a63b-7945275df157/ovn-northd/0.log" Mar 19 10:47:32 crc kubenswrapper[4835]: I0319 10:47:32.883296 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_011f3ac4-3c81-4a32-98a6-00522440cad4/openstack-network-exporter/0.log" Mar 19 10:47:32 crc kubenswrapper[4835]: I0319 10:47:32.989178 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_011f3ac4-3c81-4a32-98a6-00522440cad4/ovsdbserver-nb/0.log" Mar 19 10:47:33 crc kubenswrapper[4835]: I0319 10:47:33.043850 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8a340a11-0445-4522-bd9d-ff96f90a8f16/openstack-network-exporter/0.log" Mar 19 10:47:33 crc kubenswrapper[4835]: I0319 10:47:33.119596 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8a340a11-0445-4522-bd9d-ff96f90a8f16/ovsdbserver-sb/0.log" Mar 19 10:47:33 crc kubenswrapper[4835]: I0319 10:47:33.283190 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-74ff56c748-fkbqq_7c130368-e60e-45c8-ad42-45b89cf92cdb/placement-api/0.log" Mar 19 10:47:33 crc kubenswrapper[4835]: I0319 10:47:33.387924 4835 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_placement-74ff56c748-fkbqq_7c130368-e60e-45c8-ad42-45b89cf92cdb/placement-log/0.log" Mar 19 10:47:33 crc kubenswrapper[4835]: I0319 10:47:33.405361 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_5614bb9c-3907-4c29-b148-fbca6c6642ad/init-config-reloader/0.log" Mar 19 10:47:33 crc kubenswrapper[4835]: I0319 10:47:33.626546 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_5614bb9c-3907-4c29-b148-fbca6c6642ad/config-reloader/0.log" Mar 19 10:47:33 crc kubenswrapper[4835]: I0319 10:47:33.648324 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_5614bb9c-3907-4c29-b148-fbca6c6642ad/init-config-reloader/0.log" Mar 19 10:47:33 crc kubenswrapper[4835]: I0319 10:47:33.654908 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_5614bb9c-3907-4c29-b148-fbca6c6642ad/prometheus/0.log" Mar 19 10:47:33 crc kubenswrapper[4835]: I0319 10:47:33.674274 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_5614bb9c-3907-4c29-b148-fbca6c6642ad/thanos-sidecar/0.log" Mar 19 10:47:33 crc kubenswrapper[4835]: I0319 10:47:33.812027 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6f9fe928-2ee4-486d-a8c6-692169a02f42/setup-container/0.log" Mar 19 10:47:33 crc kubenswrapper[4835]: I0319 10:47:33.989569 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6f9fe928-2ee4-486d-a8c6-692169a02f42/rabbitmq/0.log" Mar 19 10:47:34 crc kubenswrapper[4835]: I0319 10:47:34.055590 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6f9fe928-2ee4-486d-a8c6-692169a02f42/setup-container/0.log" Mar 19 10:47:34 crc kubenswrapper[4835]: I0319 10:47:34.093934 4835 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_010a4cf7-6b62-4951-9ee4-1588c57d5c28/setup-container/0.log" Mar 19 10:47:34 crc kubenswrapper[4835]: I0319 10:47:34.246765 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_010a4cf7-6b62-4951-9ee4-1588c57d5c28/rabbitmq/0.log" Mar 19 10:47:34 crc kubenswrapper[4835]: I0319 10:47:34.251810 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_010a4cf7-6b62-4951-9ee4-1588c57d5c28/setup-container/0.log" Mar 19 10:47:34 crc kubenswrapper[4835]: I0319 10:47:34.294093 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_3a7e887a-3e51-467e-af3b-f26e6126c9e2/setup-container/0.log" Mar 19 10:47:34 crc kubenswrapper[4835]: I0319 10:47:34.476536 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_3a7e887a-3e51-467e-af3b-f26e6126c9e2/setup-container/0.log" Mar 19 10:47:34 crc kubenswrapper[4835]: I0319 10:47:34.536859 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_3a7e887a-3e51-467e-af3b-f26e6126c9e2/rabbitmq/0.log" Mar 19 10:47:34 crc kubenswrapper[4835]: I0319 10:47:34.541067 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_d3ab82b9-82ca-4e95-a3a0-1854edb16b7b/setup-container/0.log" Mar 19 10:47:34 crc kubenswrapper[4835]: I0319 10:47:34.747361 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_d3ab82b9-82ca-4e95-a3a0-1854edb16b7b/setup-container/0.log" Mar 19 10:47:34 crc kubenswrapper[4835]: I0319 10:47:34.814927 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-gxgs5_0e7e2b75-9097-496a-b7ba-be0340826564/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 10:47:34 crc kubenswrapper[4835]: I0319 10:47:34.832217 4835 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_rabbitmq-server-2_d3ab82b9-82ca-4e95-a3a0-1854edb16b7b/rabbitmq/0.log" Mar 19 10:47:35 crc kubenswrapper[4835]: I0319 10:47:35.227304 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-mhc4n_a0976232-95f7-45c1-8025-5adf6861e278/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 10:47:35 crc kubenswrapper[4835]: I0319 10:47:35.267584 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-42t6p_2fd23d4d-9b8c-4be8-a671-cefa01a4e341/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 10:47:35 crc kubenswrapper[4835]: I0319 10:47:35.479504 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-jpgpp_29e7446f-a78e-434c-a1e3-3046f98fad69/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 10:47:35 crc kubenswrapper[4835]: I0319 10:47:35.528296 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-p4j28_7481c32a-7096-4407-9b2e-f6c715ef33f6/ssh-known-hosts-edpm-deployment/0.log" Mar 19 10:47:35 crc kubenswrapper[4835]: I0319 10:47:35.708952 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5f95f9444c-phdqp_844aa06e-23ea-4908-9a95-c3bda3620d5c/proxy-server/0.log" Mar 19 10:47:35 crc kubenswrapper[4835]: I0319 10:47:35.727177 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5f95f9444c-phdqp_844aa06e-23ea-4908-9a95-c3bda3620d5c/proxy-httpd/0.log" Mar 19 10:47:35 crc kubenswrapper[4835]: I0319 10:47:35.812869 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-j4wd9_53929f33-eb5f-41e4-8845-d6be1087df58/swift-ring-rebalance/0.log" Mar 19 10:47:36 crc kubenswrapper[4835]: I0319 10:47:36.009701 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_99792633-55f8-4a37-b7d8-ae770406c69d/account-auditor/0.log" Mar 19 10:47:36 crc kubenswrapper[4835]: I0319 10:47:36.089497 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_99792633-55f8-4a37-b7d8-ae770406c69d/account-reaper/0.log" Mar 19 10:47:36 crc kubenswrapper[4835]: I0319 10:47:36.133504 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_99792633-55f8-4a37-b7d8-ae770406c69d/account-server/0.log" Mar 19 10:47:36 crc kubenswrapper[4835]: I0319 10:47:36.165858 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_99792633-55f8-4a37-b7d8-ae770406c69d/account-replicator/0.log" Mar 19 10:47:36 crc kubenswrapper[4835]: I0319 10:47:36.221086 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_99792633-55f8-4a37-b7d8-ae770406c69d/container-auditor/0.log" Mar 19 10:47:36 crc kubenswrapper[4835]: I0319 10:47:36.319715 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_99792633-55f8-4a37-b7d8-ae770406c69d/container-replicator/0.log" Mar 19 10:47:36 crc kubenswrapper[4835]: I0319 10:47:36.328446 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_99792633-55f8-4a37-b7d8-ae770406c69d/container-server/0.log" Mar 19 10:47:36 crc kubenswrapper[4835]: I0319 10:47:36.340164 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_99792633-55f8-4a37-b7d8-ae770406c69d/container-updater/0.log" Mar 19 10:47:36 crc kubenswrapper[4835]: I0319 10:47:36.373862 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_99792633-55f8-4a37-b7d8-ae770406c69d/object-auditor/0.log" Mar 19 10:47:36 crc kubenswrapper[4835]: I0319 10:47:36.446920 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_99792633-55f8-4a37-b7d8-ae770406c69d/object-expirer/0.log" Mar 19 10:47:36 crc kubenswrapper[4835]: I0319 10:47:36.542623 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_99792633-55f8-4a37-b7d8-ae770406c69d/object-updater/0.log" Mar 19 10:47:36 crc kubenswrapper[4835]: I0319 10:47:36.560222 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_99792633-55f8-4a37-b7d8-ae770406c69d/object-replicator/0.log" Mar 19 10:47:36 crc kubenswrapper[4835]: I0319 10:47:36.561853 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_99792633-55f8-4a37-b7d8-ae770406c69d/object-server/0.log" Mar 19 10:47:36 crc kubenswrapper[4835]: I0319 10:47:36.598917 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_99792633-55f8-4a37-b7d8-ae770406c69d/rsync/0.log" Mar 19 10:47:36 crc kubenswrapper[4835]: I0319 10:47:36.662826 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_99792633-55f8-4a37-b7d8-ae770406c69d/swift-recon-cron/0.log" Mar 19 10:47:37 crc kubenswrapper[4835]: I0319 10:47:37.203386 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_9ab97db4-5ea3-43de-9f19-b16ad72b24ce/test-operator-logs-container/0.log" Mar 19 10:47:37 crc kubenswrapper[4835]: I0319 10:47:37.250974 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-x4hdm_f507ac6e-c9ef-4e19-a67c-f5358fe950b2/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 10:47:37 crc kubenswrapper[4835]: I0319 10:47:37.309929 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_telemetry-power-monitoring-edpm-deployment-openstack-edpm-dx4wc_df6bbc12-578a-4c6e-a199-06baaf1ac6b8/telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 10:47:37 crc kubenswrapper[4835]: I0319 10:47:37.484996 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-lpj7c_1530e76d-2520-4d3a-9aff-ef734194267d/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 10:47:37 crc kubenswrapper[4835]: I0319 10:47:37.516105 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_9262f37d-2193-46c6-9706-5d039fa94926/tempest-tests-tempest-tests-runner/0.log" Mar 19 10:47:41 crc kubenswrapper[4835]: I0319 10:47:41.402385 4835 scope.go:117] "RemoveContainer" containerID="2999600d8da067b17effdd03f999dbcb19e52bd95cdcf24fbdbbde5c1f05fc02" Mar 19 10:47:41 crc kubenswrapper[4835]: E0319 10:47:41.403455 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:47:53 crc kubenswrapper[4835]: I0319 10:47:53.402325 4835 scope.go:117] "RemoveContainer" containerID="2999600d8da067b17effdd03f999dbcb19e52bd95cdcf24fbdbbde5c1f05fc02" Mar 19 10:47:53 crc kubenswrapper[4835]: E0319 10:47:53.404175 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:48:00 crc kubenswrapper[4835]: I0319 10:48:00.165222 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565288-tr2vs"] Mar 19 10:48:00 crc kubenswrapper[4835]: E0319 10:48:00.166557 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f4f373e-60de-4d1b-bd2c-70041fc3dc78" containerName="container-00" Mar 19 10:48:00 crc kubenswrapper[4835]: I0319 10:48:00.166577 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f4f373e-60de-4d1b-bd2c-70041fc3dc78" containerName="container-00" Mar 19 10:48:00 crc kubenswrapper[4835]: I0319 10:48:00.166871 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f4f373e-60de-4d1b-bd2c-70041fc3dc78" containerName="container-00" Mar 19 10:48:00 crc kubenswrapper[4835]: I0319 10:48:00.168548 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565288-tr2vs" Mar 19 10:48:00 crc kubenswrapper[4835]: I0319 10:48:00.171247 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:48:00 crc kubenswrapper[4835]: I0319 10:48:00.171592 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 10:48:00 crc kubenswrapper[4835]: I0319 10:48:00.171699 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:48:00 crc kubenswrapper[4835]: I0319 10:48:00.180331 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565288-tr2vs"] Mar 19 10:48:00 crc kubenswrapper[4835]: I0319 10:48:00.276121 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcn45\" (UniqueName: 
\"kubernetes.io/projected/2328b041-bba8-4288-973a-bc3b4f998713-kube-api-access-gcn45\") pod \"auto-csr-approver-29565288-tr2vs\" (UID: \"2328b041-bba8-4288-973a-bc3b4f998713\") " pod="openshift-infra/auto-csr-approver-29565288-tr2vs" Mar 19 10:48:00 crc kubenswrapper[4835]: I0319 10:48:00.379135 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcn45\" (UniqueName: \"kubernetes.io/projected/2328b041-bba8-4288-973a-bc3b4f998713-kube-api-access-gcn45\") pod \"auto-csr-approver-29565288-tr2vs\" (UID: \"2328b041-bba8-4288-973a-bc3b4f998713\") " pod="openshift-infra/auto-csr-approver-29565288-tr2vs" Mar 19 10:48:00 crc kubenswrapper[4835]: I0319 10:48:00.406527 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcn45\" (UniqueName: \"kubernetes.io/projected/2328b041-bba8-4288-973a-bc3b4f998713-kube-api-access-gcn45\") pod \"auto-csr-approver-29565288-tr2vs\" (UID: \"2328b041-bba8-4288-973a-bc3b4f998713\") " pod="openshift-infra/auto-csr-approver-29565288-tr2vs" Mar 19 10:48:00 crc kubenswrapper[4835]: I0319 10:48:00.495147 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565288-tr2vs" Mar 19 10:48:01 crc kubenswrapper[4835]: I0319 10:48:01.171681 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565288-tr2vs"] Mar 19 10:48:01 crc kubenswrapper[4835]: I0319 10:48:01.765010 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565288-tr2vs" event={"ID":"2328b041-bba8-4288-973a-bc3b4f998713","Type":"ContainerStarted","Data":"3484c9852322006436e0cd0305b16e439c123f932f30d160e76f9d10b693ff09"} Mar 19 10:48:03 crc kubenswrapper[4835]: I0319 10:48:03.789155 4835 generic.go:334] "Generic (PLEG): container finished" podID="2328b041-bba8-4288-973a-bc3b4f998713" containerID="1a1afb45e1c8b61862618c91e4b9ab157fd82066089d4926dc68843ca76a710c" exitCode=0 Mar 19 10:48:03 crc kubenswrapper[4835]: I0319 10:48:03.789216 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565288-tr2vs" event={"ID":"2328b041-bba8-4288-973a-bc3b4f998713","Type":"ContainerDied","Data":"1a1afb45e1c8b61862618c91e4b9ab157fd82066089d4926dc68843ca76a710c"} Mar 19 10:48:05 crc kubenswrapper[4835]: I0319 10:48:05.245585 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565288-tr2vs" Mar 19 10:48:05 crc kubenswrapper[4835]: I0319 10:48:05.304420 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcn45\" (UniqueName: \"kubernetes.io/projected/2328b041-bba8-4288-973a-bc3b4f998713-kube-api-access-gcn45\") pod \"2328b041-bba8-4288-973a-bc3b4f998713\" (UID: \"2328b041-bba8-4288-973a-bc3b4f998713\") " Mar 19 10:48:05 crc kubenswrapper[4835]: I0319 10:48:05.309904 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2328b041-bba8-4288-973a-bc3b4f998713-kube-api-access-gcn45" (OuterVolumeSpecName: "kube-api-access-gcn45") pod "2328b041-bba8-4288-973a-bc3b4f998713" (UID: "2328b041-bba8-4288-973a-bc3b4f998713"). InnerVolumeSpecName "kube-api-access-gcn45". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:48:05 crc kubenswrapper[4835]: I0319 10:48:05.407333 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcn45\" (UniqueName: \"kubernetes.io/projected/2328b041-bba8-4288-973a-bc3b4f998713-kube-api-access-gcn45\") on node \"crc\" DevicePath \"\"" Mar 19 10:48:05 crc kubenswrapper[4835]: I0319 10:48:05.816067 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565288-tr2vs" event={"ID":"2328b041-bba8-4288-973a-bc3b4f998713","Type":"ContainerDied","Data":"3484c9852322006436e0cd0305b16e439c123f932f30d160e76f9d10b693ff09"} Mar 19 10:48:05 crc kubenswrapper[4835]: I0319 10:48:05.816336 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3484c9852322006436e0cd0305b16e439c123f932f30d160e76f9d10b693ff09" Mar 19 10:48:05 crc kubenswrapper[4835]: I0319 10:48:05.816190 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565288-tr2vs" Mar 19 10:48:06 crc kubenswrapper[4835]: I0319 10:48:06.328997 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565282-llqff"] Mar 19 10:48:06 crc kubenswrapper[4835]: I0319 10:48:06.339648 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565282-llqff"] Mar 19 10:48:06 crc kubenswrapper[4835]: I0319 10:48:06.415805 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7551a433-d892-46ce-addb-cbeb079c9ad3" path="/var/lib/kubelet/pods/7551a433-d892-46ce-addb-cbeb079c9ad3/volumes" Mar 19 10:48:08 crc kubenswrapper[4835]: I0319 10:48:08.402233 4835 scope.go:117] "RemoveContainer" containerID="2999600d8da067b17effdd03f999dbcb19e52bd95cdcf24fbdbbde5c1f05fc02" Mar 19 10:48:08 crc kubenswrapper[4835]: E0319 10:48:08.403209 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:48:09 crc kubenswrapper[4835]: I0319 10:48:09.230570 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6_a8c35d20-da98-4fdc-9e0b-1d3dea603055/util/0.log" Mar 19 10:48:09 crc kubenswrapper[4835]: I0319 10:48:09.839827 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6_a8c35d20-da98-4fdc-9e0b-1d3dea603055/util/0.log" Mar 19 10:48:09 crc kubenswrapper[4835]: I0319 10:48:09.875679 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6_a8c35d20-da98-4fdc-9e0b-1d3dea603055/pull/0.log" Mar 19 10:48:09 crc kubenswrapper[4835]: I0319 10:48:09.913323 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6_a8c35d20-da98-4fdc-9e0b-1d3dea603055/pull/0.log" Mar 19 10:48:10 crc kubenswrapper[4835]: I0319 10:48:10.143084 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6_a8c35d20-da98-4fdc-9e0b-1d3dea603055/util/0.log" Mar 19 10:48:10 crc kubenswrapper[4835]: I0319 10:48:10.143777 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6_a8c35d20-da98-4fdc-9e0b-1d3dea603055/pull/0.log" Mar 19 10:48:10 crc kubenswrapper[4835]: I0319 10:48:10.155480 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b1a425611dcffea4bdc52468892221cbd081cf04996a6218aa1a191e1b9w9w6_a8c35d20-da98-4fdc-9e0b-1d3dea603055/extract/0.log" Mar 19 10:48:10 crc kubenswrapper[4835]: I0319 10:48:10.851916 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6cc65c69fc-bzhnq_5db66af1-4929-4163-b788-0320b0323a79/manager/0.log" Mar 19 10:48:11 crc kubenswrapper[4835]: I0319 10:48:11.294858 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7d559dcdbd-9rtgc_9eeda748-4895-4cbc-be03-d1edabf69758/manager/0.log" Mar 19 10:48:11 crc kubenswrapper[4835]: I0319 10:48:11.737955 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-66dd9d474d-bjnww_47fe8883-3cea-4985-8327-4ad721d4e128/manager/0.log" Mar 19 10:48:11 crc kubenswrapper[4835]: I0319 
10:48:11.782419 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-64dc66d669-msxkb_6dabbeda-38a8-4289-92b6-075d7c20f958/manager/0.log" Mar 19 10:48:12 crc kubenswrapper[4835]: I0319 10:48:12.453564 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6b77b7676d-mgjjt_aea377c2-d271-45c9-a574-0d1fe89caac5/manager/0.log" Mar 19 10:48:12 crc kubenswrapper[4835]: I0319 10:48:12.766251 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5595c7d6ff-zmwm8_5ea1b1ba-826f-4abe-9c56-caedc3a178f9/manager/0.log" Mar 19 10:48:13 crc kubenswrapper[4835]: I0319 10:48:13.088535 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-fbf7bbb96-v6mc6_a73c4956-ad63-40d9-907e-599172ac3771/manager/0.log" Mar 19 10:48:13 crc kubenswrapper[4835]: I0319 10:48:13.150327 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-76b87776c9-79r7z_0f2be57b-1212-4dc2-b772-3317bcd69aac/manager/0.log" Mar 19 10:48:13 crc kubenswrapper[4835]: I0319 10:48:13.447586 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6f5b7bcd4-d79pg_e86ab409-7643-4e6f-9129-6f90e5b6bf1c/manager/0.log" Mar 19 10:48:13 crc kubenswrapper[4835]: I0319 10:48:13.759047 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6d77645966-dhr9f_0b55921b-5660-44b9-abe9-09639766068c/manager/0.log" Mar 19 10:48:13 crc kubenswrapper[4835]: I0319 10:48:13.953691 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-bc5c78db9-wvb5n_2173f655-b411-4dfe-8247-59ac02942c25/manager/0.log" Mar 19 10:48:13 crc kubenswrapper[4835]: 
I0319 10:48:13.994009 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6744dd545c-4nqvq_2e429fe7-b053-43e9-a640-f14af7094e62/manager/0.log" Mar 19 10:48:14 crc kubenswrapper[4835]: I0319 10:48:14.267578 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-56f74467c6-v6bxs_02720ef5-c00c-48a4-af96-4d82f28bf051/manager/0.log" Mar 19 10:48:14 crc kubenswrapper[4835]: I0319 10:48:14.346423 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-c5677dc5d-575wd_017e5e75-5952-4240-8349-d16c367d4bed/manager/0.log" Mar 19 10:48:14 crc kubenswrapper[4835]: I0319 10:48:14.740319 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-7pjhg_d95e8e04-5d17-45e5-aeb8-d6857bfc9d01/registry-server/1.log" Mar 19 10:48:14 crc kubenswrapper[4835]: I0319 10:48:14.867188 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-7pjhg_d95e8e04-5d17-45e5-aeb8-d6857bfc9d01/registry-server/0.log" Mar 19 10:48:15 crc kubenswrapper[4835]: I0319 10:48:15.034275 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6674476dff-54wn4_2021ffd4-3c7f-477a-8816-c9b382d640c7/operator/0.log" Mar 19 10:48:15 crc kubenswrapper[4835]: I0319 10:48:15.221632 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-846c4cdcb7-7b7zt_74161dd4-df25-4f1b-8924-c9086688463d/manager/0.log" Mar 19 10:48:15 crc kubenswrapper[4835]: I0319 10:48:15.280510 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-659fb58c6b-vq45h_dd4067de-94b9-4d91-bdf4-e3e3af91b76f/manager/0.log" Mar 19 10:48:15 crc kubenswrapper[4835]: 
I0319 10:48:15.535264 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-z9wxq_5ab66f1b-4361-44ac-a885-93bbb8cd51d5/operator/0.log" Mar 19 10:48:15 crc kubenswrapper[4835]: I0319 10:48:15.753762 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-867f54bc44-sgjfn_637282e5-c56d-49ea-ac96-f299fb3661f2/manager/0.log" Mar 19 10:48:16 crc kubenswrapper[4835]: I0319 10:48:16.345606 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-8467ccb4c8-s94g8_91c2a119-b77e-4cf1-9be3-779c47d4643b/manager/0.log" Mar 19 10:48:16 crc kubenswrapper[4835]: I0319 10:48:16.563534 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-57b8dbd499-gz2nz_65b9e1e9-a183-4d37-a281-593752da6125/manager/0.log" Mar 19 10:48:16 crc kubenswrapper[4835]: I0319 10:48:16.634998 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-74d6f7b5c-64vf7_a59c67ae-bcf7-404f-ad32-233f38450f65/manager/0.log" Mar 19 10:48:17 crc kubenswrapper[4835]: I0319 10:48:17.387343 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-68f9d5b675-fbgr4_8a05d0dc-7092-45aa-b869-34a23cb0a1f9/manager/0.log" Mar 19 10:48:21 crc kubenswrapper[4835]: I0319 10:48:21.401675 4835 scope.go:117] "RemoveContainer" containerID="2999600d8da067b17effdd03f999dbcb19e52bd95cdcf24fbdbbde5c1f05fc02" Mar 19 10:48:21 crc kubenswrapper[4835]: E0319 10:48:21.403155 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:48:22 crc kubenswrapper[4835]: I0319 10:48:22.556831 4835 scope.go:117] "RemoveContainer" containerID="0dbd4ad611789d462a3c4ce7e7ea46855d6928423d6f8d7ef03bbd47852ba1c3" Mar 19 10:48:23 crc kubenswrapper[4835]: I0319 10:48:23.603966 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5cfd84c587-t9dbc_e01500c9-ffdf-42df-9017-26b813efed0e/manager/0.log" Mar 19 10:48:34 crc kubenswrapper[4835]: I0319 10:48:34.402389 4835 scope.go:117] "RemoveContainer" containerID="2999600d8da067b17effdd03f999dbcb19e52bd95cdcf24fbdbbde5c1f05fc02" Mar 19 10:48:34 crc kubenswrapper[4835]: E0319 10:48:34.403704 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:48:41 crc kubenswrapper[4835]: I0319 10:48:41.992889 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-v55z2_5fdabffe-d734-4b9a-8fea-d1608dcef7a2/control-plane-machine-set-operator/0.log" Mar 19 10:48:42 crc kubenswrapper[4835]: I0319 10:48:42.218414 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-pw4cg_4e16135b-13e3-4236-9fa7-94ceb99131e2/machine-api-operator/0.log" Mar 19 10:48:42 crc kubenswrapper[4835]: I0319 10:48:42.266330 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-pw4cg_4e16135b-13e3-4236-9fa7-94ceb99131e2/kube-rbac-proxy/0.log" Mar 19 10:48:48 crc kubenswrapper[4835]: I0319 10:48:48.402599 4835 scope.go:117] "RemoveContainer" containerID="2999600d8da067b17effdd03f999dbcb19e52bd95cdcf24fbdbbde5c1f05fc02" Mar 19 10:48:48 crc kubenswrapper[4835]: E0319 10:48:48.403482 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:48:57 crc kubenswrapper[4835]: I0319 10:48:57.118221 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-s8pcl_f0524d82-6920-4648-a4a8-d25a5c5a75f3/cert-manager-controller/0.log" Mar 19 10:48:57 crc kubenswrapper[4835]: I0319 10:48:57.243891 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-8pv2v_e4392ff6-094c-408b-aa06-c6b6133c9941/cert-manager-cainjector/0.log" Mar 19 10:48:57 crc kubenswrapper[4835]: I0319 10:48:57.344241 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-rd6zz_3cb1a290-5e83-4b84-80ab-a7d86e85ce98/cert-manager-webhook/0.log" Mar 19 10:49:02 crc kubenswrapper[4835]: I0319 10:49:02.404355 4835 scope.go:117] "RemoveContainer" containerID="2999600d8da067b17effdd03f999dbcb19e52bd95cdcf24fbdbbde5c1f05fc02" Mar 19 10:49:02 crc kubenswrapper[4835]: E0319 10:49:02.405419 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:49:12 crc kubenswrapper[4835]: I0319 10:49:12.994231 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-wtgrg_9b6aeb7d-73be-42b9-93ef-72be8a26f4f7/nmstate-console-plugin/0.log" Mar 19 10:49:13 crc kubenswrapper[4835]: I0319 10:49:13.054361 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-69n76_86182cb8-0ad6-40f7-ac1c-7ccf8cf5bc13/nmstate-handler/0.log" Mar 19 10:49:13 crc kubenswrapper[4835]: I0319 10:49:13.199659 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-8gq4d_514e58bd-bd21-45bf-8543-90439bf1c630/nmstate-metrics/0.log" Mar 19 10:49:13 crc kubenswrapper[4835]: I0319 10:49:13.230758 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-8gq4d_514e58bd-bd21-45bf-8543-90439bf1c630/kube-rbac-proxy/0.log" Mar 19 10:49:13 crc kubenswrapper[4835]: I0319 10:49:13.373068 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-m8zlt_74033176-5a6a-4727-9f45-9083cb1dbd3e/nmstate-operator/0.log" Mar 19 10:49:13 crc kubenswrapper[4835]: I0319 10:49:13.460491 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-zpbgp_a5ae9a03-b125-4072-9ec8-fddd19b7002e/nmstate-webhook/0.log" Mar 19 10:49:17 crc kubenswrapper[4835]: I0319 10:49:17.402009 4835 scope.go:117] "RemoveContainer" containerID="2999600d8da067b17effdd03f999dbcb19e52bd95cdcf24fbdbbde5c1f05fc02" Mar 19 10:49:17 crc kubenswrapper[4835]: E0319 10:49:17.403713 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:49:27 crc kubenswrapper[4835]: I0319 10:49:27.114294 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-668b645cb5-fhzgr_3a2991b5-2e25-4afa-9941-d955aad0dc37/manager/0.log" Mar 19 10:49:27 crc kubenswrapper[4835]: I0319 10:49:27.140689 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-668b645cb5-fhzgr_3a2991b5-2e25-4afa-9941-d955aad0dc37/kube-rbac-proxy/0.log" Mar 19 10:49:29 crc kubenswrapper[4835]: I0319 10:49:29.402308 4835 scope.go:117] "RemoveContainer" containerID="2999600d8da067b17effdd03f999dbcb19e52bd95cdcf24fbdbbde5c1f05fc02" Mar 19 10:49:29 crc kubenswrapper[4835]: E0319 10:49:29.403091 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:49:40 crc kubenswrapper[4835]: I0319 10:49:40.404490 4835 scope.go:117] "RemoveContainer" containerID="2999600d8da067b17effdd03f999dbcb19e52bd95cdcf24fbdbbde5c1f05fc02" Mar 19 10:49:40 crc kubenswrapper[4835]: E0319 10:49:40.406483 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:49:41 crc kubenswrapper[4835]: I0319 10:49:41.436052 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-mbzhh_210e8b64-36bc-4abd-8620-43093946855b/prometheus-operator/0.log" Mar 19 10:49:41 crc kubenswrapper[4835]: I0319 10:49:41.543980 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-c467569cb-4zx5b_73f088be-6bad-4a18-8353-10475ad7105d/prometheus-operator-admission-webhook/0.log" Mar 19 10:49:41 crc kubenswrapper[4835]: I0319 10:49:41.640687 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-c467569cb-l992d_f26c226a-c8fa-4ff7-bda6-7eece297dd86/prometheus-operator-admission-webhook/0.log" Mar 19 10:49:41 crc kubenswrapper[4835]: I0319 10:49:41.802188 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-qsz5n_95fe9c35-69f6-4b60-a725-c2f0d8a34c99/operator/1.log" Mar 19 10:49:41 crc kubenswrapper[4835]: I0319 10:49:41.815997 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-qsz5n_95fe9c35-69f6-4b60-a725-c2f0d8a34c99/operator/0.log" Mar 19 10:49:41 crc kubenswrapper[4835]: I0319 10:49:41.905512 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-7f87b9b85b-pjl7f_22c392c5-b6d9-42e8-bf59-122a846c26a4/observability-ui-dashboards/0.log" Mar 19 10:49:41 crc kubenswrapper[4835]: I0319 10:49:41.993855 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_perses-operator-54476d58cc-x6mtx_41d1090f-7ab4-4820-a742-dca791692d0f/perses-operator/0.log" Mar 19 10:49:51 crc kubenswrapper[4835]: I0319 10:49:51.402704 4835 scope.go:117] "RemoveContainer" containerID="2999600d8da067b17effdd03f999dbcb19e52bd95cdcf24fbdbbde5c1f05fc02" Mar 19 10:49:51 crc kubenswrapper[4835]: E0319 10:49:51.403556 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:49:59 crc kubenswrapper[4835]: I0319 10:49:59.910116 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-66689c4bbf-vjzsc_435b5515-e762-42eb-b71f-afaee5fddc4a/cluster-logging-operator/0.log" Mar 19 10:49:59 crc kubenswrapper[4835]: I0319 10:49:59.959756 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-txhkk_84ff8168-f128-476f-9eef-a5b976025bed/collector/0.log" Mar 19 10:50:00 crc kubenswrapper[4835]: I0319 10:50:00.111965 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-compactor-0_a0a07ea9-b7e5-4679-9b63-c6027e42f279/loki-compactor/0.log" Mar 19 10:50:00 crc kubenswrapper[4835]: I0319 10:50:00.155621 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565290-knmhk"] Mar 19 10:50:00 crc kubenswrapper[4835]: E0319 10:50:00.156277 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2328b041-bba8-4288-973a-bc3b4f998713" containerName="oc" Mar 19 10:50:00 crc kubenswrapper[4835]: I0319 10:50:00.156305 4835 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2328b041-bba8-4288-973a-bc3b4f998713" containerName="oc" Mar 19 10:50:00 crc kubenswrapper[4835]: I0319 10:50:00.156659 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="2328b041-bba8-4288-973a-bc3b4f998713" containerName="oc" Mar 19 10:50:00 crc kubenswrapper[4835]: I0319 10:50:00.157820 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565290-knmhk" Mar 19 10:50:00 crc kubenswrapper[4835]: I0319 10:50:00.160791 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 10:50:00 crc kubenswrapper[4835]: I0319 10:50:00.161209 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:50:00 crc kubenswrapper[4835]: I0319 10:50:00.161390 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:50:00 crc kubenswrapper[4835]: I0319 10:50:00.206585 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-9c6b6d984-rznlv_82e92071-6291-4ff2-971a-c658d2e001ed/loki-distributor/0.log" Mar 19 10:50:00 crc kubenswrapper[4835]: I0319 10:50:00.226974 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565290-knmhk"] Mar 19 10:50:00 crc kubenswrapper[4835]: I0319 10:50:00.309875 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv478\" (UniqueName: \"kubernetes.io/projected/0ee233ed-4515-45d1-bef3-7af33a67f4b2-kube-api-access-zv478\") pod \"auto-csr-approver-29565290-knmhk\" (UID: \"0ee233ed-4515-45d1-bef3-7af33a67f4b2\") " pod="openshift-infra/auto-csr-approver-29565290-knmhk" Mar 19 10:50:00 crc kubenswrapper[4835]: I0319 10:50:00.353989 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-logging_logging-loki-gateway-5d45f4dcf6-4f49c_2ff291e5-8364-4627-be0f-51c9532e46ee/gateway/0.log" Mar 19 10:50:00 crc kubenswrapper[4835]: I0319 10:50:00.391901 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-5d45f4dcf6-4f49c_2ff291e5-8364-4627-be0f-51c9532e46ee/opa/0.log" Mar 19 10:50:00 crc kubenswrapper[4835]: I0319 10:50:00.413110 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv478\" (UniqueName: \"kubernetes.io/projected/0ee233ed-4515-45d1-bef3-7af33a67f4b2-kube-api-access-zv478\") pod \"auto-csr-approver-29565290-knmhk\" (UID: \"0ee233ed-4515-45d1-bef3-7af33a67f4b2\") " pod="openshift-infra/auto-csr-approver-29565290-knmhk" Mar 19 10:50:00 crc kubenswrapper[4835]: I0319 10:50:00.434904 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv478\" (UniqueName: \"kubernetes.io/projected/0ee233ed-4515-45d1-bef3-7af33a67f4b2-kube-api-access-zv478\") pod \"auto-csr-approver-29565290-knmhk\" (UID: \"0ee233ed-4515-45d1-bef3-7af33a67f4b2\") " pod="openshift-infra/auto-csr-approver-29565290-knmhk" Mar 19 10:50:00 crc kubenswrapper[4835]: I0319 10:50:00.476955 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565290-knmhk" Mar 19 10:50:00 crc kubenswrapper[4835]: I0319 10:50:00.555094 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-5d45f4dcf6-hkx57_75ce358f-5f03-401f-bdf8-27a7e5309227/opa/0.log" Mar 19 10:50:00 crc kubenswrapper[4835]: I0319 10:50:00.569981 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-5d45f4dcf6-hkx57_75ce358f-5f03-401f-bdf8-27a7e5309227/gateway/0.log" Mar 19 10:50:00 crc kubenswrapper[4835]: I0319 10:50:00.677576 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_b54655c1-accb-4df5-98b0-f01dbcfe83f7/loki-index-gateway/0.log" Mar 19 10:50:00 crc kubenswrapper[4835]: I0319 10:50:00.875616 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_b6f256cf-c5bb-4e9b-aa0b-c1d1737c1d57/loki-ingester/0.log" Mar 19 10:50:00 crc kubenswrapper[4835]: I0319 10:50:00.944808 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-6dcbdf8bb8-cm5jv_9b90ec5a-5d56-4267-bb4e-1fbcdecff021/loki-querier/0.log" Mar 19 10:50:01 crc kubenswrapper[4835]: I0319 10:50:01.105097 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-ff66c4dc9-5qfh9_e0bc0a8b-4a2d-4460-b984-7a2279e1a424/loki-query-frontend/0.log" Mar 19 10:50:01 crc kubenswrapper[4835]: I0319 10:50:01.212555 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565290-knmhk"] Mar 19 10:50:01 crc kubenswrapper[4835]: I0319 10:50:01.217996 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 10:50:01 crc kubenswrapper[4835]: I0319 10:50:01.235463 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29565290-knmhk" event={"ID":"0ee233ed-4515-45d1-bef3-7af33a67f4b2","Type":"ContainerStarted","Data":"bc01d234eb254f22a0040cc996947a406bf8a2161499c7880eeca63b268f1b21"} Mar 19 10:50:03 crc kubenswrapper[4835]: I0319 10:50:03.300412 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565290-knmhk" event={"ID":"0ee233ed-4515-45d1-bef3-7af33a67f4b2","Type":"ContainerStarted","Data":"4bceae484cc675e16230845d6b9e60d8f94f7ff537823ea80718dcec33f2c127"} Mar 19 10:50:03 crc kubenswrapper[4835]: I0319 10:50:03.322876 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565290-knmhk" podStartSLOduration=2.196066835 podStartE2EDuration="3.32286066s" podCreationTimestamp="2026-03-19 10:50:00 +0000 UTC" firstStartedPulling="2026-03-19 10:50:01.213816613 +0000 UTC m=+5256.062415200" lastFinishedPulling="2026-03-19 10:50:02.340610428 +0000 UTC m=+5257.189209025" observedRunningTime="2026-03-19 10:50:03.319002025 +0000 UTC m=+5258.167600612" watchObservedRunningTime="2026-03-19 10:50:03.32286066 +0000 UTC m=+5258.171459247" Mar 19 10:50:04 crc kubenswrapper[4835]: I0319 10:50:04.311873 4835 generic.go:334] "Generic (PLEG): container finished" podID="0ee233ed-4515-45d1-bef3-7af33a67f4b2" containerID="4bceae484cc675e16230845d6b9e60d8f94f7ff537823ea80718dcec33f2c127" exitCode=0 Mar 19 10:50:04 crc kubenswrapper[4835]: I0319 10:50:04.311971 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565290-knmhk" event={"ID":"0ee233ed-4515-45d1-bef3-7af33a67f4b2","Type":"ContainerDied","Data":"4bceae484cc675e16230845d6b9e60d8f94f7ff537823ea80718dcec33f2c127"} Mar 19 10:50:05 crc kubenswrapper[4835]: I0319 10:50:05.402546 4835 scope.go:117] "RemoveContainer" containerID="2999600d8da067b17effdd03f999dbcb19e52bd95cdcf24fbdbbde5c1f05fc02" Mar 19 10:50:05 crc kubenswrapper[4835]: E0319 10:50:05.403139 4835 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:50:05 crc kubenswrapper[4835]: I0319 10:50:05.876075 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565290-knmhk" Mar 19 10:50:05 crc kubenswrapper[4835]: I0319 10:50:05.992142 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv478\" (UniqueName: \"kubernetes.io/projected/0ee233ed-4515-45d1-bef3-7af33a67f4b2-kube-api-access-zv478\") pod \"0ee233ed-4515-45d1-bef3-7af33a67f4b2\" (UID: \"0ee233ed-4515-45d1-bef3-7af33a67f4b2\") " Mar 19 10:50:06 crc kubenswrapper[4835]: I0319 10:50:06.008018 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ee233ed-4515-45d1-bef3-7af33a67f4b2-kube-api-access-zv478" (OuterVolumeSpecName: "kube-api-access-zv478") pod "0ee233ed-4515-45d1-bef3-7af33a67f4b2" (UID: "0ee233ed-4515-45d1-bef3-7af33a67f4b2"). InnerVolumeSpecName "kube-api-access-zv478". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:50:06 crc kubenswrapper[4835]: I0319 10:50:06.098486 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zv478\" (UniqueName: \"kubernetes.io/projected/0ee233ed-4515-45d1-bef3-7af33a67f4b2-kube-api-access-zv478\") on node \"crc\" DevicePath \"\"" Mar 19 10:50:06 crc kubenswrapper[4835]: I0319 10:50:06.337244 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565290-knmhk" event={"ID":"0ee233ed-4515-45d1-bef3-7af33a67f4b2","Type":"ContainerDied","Data":"bc01d234eb254f22a0040cc996947a406bf8a2161499c7880eeca63b268f1b21"} Mar 19 10:50:06 crc kubenswrapper[4835]: I0319 10:50:06.337570 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc01d234eb254f22a0040cc996947a406bf8a2161499c7880eeca63b268f1b21" Mar 19 10:50:06 crc kubenswrapper[4835]: I0319 10:50:06.337309 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565290-knmhk" Mar 19 10:50:06 crc kubenswrapper[4835]: I0319 10:50:06.421274 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565284-7pfb8"] Mar 19 10:50:06 crc kubenswrapper[4835]: I0319 10:50:06.426846 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565284-7pfb8"] Mar 19 10:50:08 crc kubenswrapper[4835]: I0319 10:50:08.414981 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9ea73bf-ce42-4431-a66c-867f50ddb0c5" path="/var/lib/kubelet/pods/f9ea73bf-ce42-4431-a66c-867f50ddb0c5/volumes" Mar 19 10:50:17 crc kubenswrapper[4835]: I0319 10:50:17.249769 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-47sfq_3d3ab7f2-b252-4ef7-8a11-5af68bf23fa6/kube-rbac-proxy/0.log" Mar 19 10:50:17 crc kubenswrapper[4835]: I0319 10:50:17.402797 4835 scope.go:117] 
"RemoveContainer" containerID="2999600d8da067b17effdd03f999dbcb19e52bd95cdcf24fbdbbde5c1f05fc02" Mar 19 10:50:17 crc kubenswrapper[4835]: E0319 10:50:17.403539 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:50:17 crc kubenswrapper[4835]: I0319 10:50:17.467799 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-47sfq_3d3ab7f2-b252-4ef7-8a11-5af68bf23fa6/controller/0.log" Mar 19 10:50:17 crc kubenswrapper[4835]: I0319 10:50:17.528648 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-44rxf_8173c77e-48ec-44fc-9be7-67381528f78a/cp-frr-files/0.log" Mar 19 10:50:17 crc kubenswrapper[4835]: I0319 10:50:17.760627 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-44rxf_8173c77e-48ec-44fc-9be7-67381528f78a/cp-frr-files/0.log" Mar 19 10:50:17 crc kubenswrapper[4835]: I0319 10:50:17.783384 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-44rxf_8173c77e-48ec-44fc-9be7-67381528f78a/cp-reloader/0.log" Mar 19 10:50:17 crc kubenswrapper[4835]: I0319 10:50:17.785750 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-44rxf_8173c77e-48ec-44fc-9be7-67381528f78a/cp-reloader/0.log" Mar 19 10:50:17 crc kubenswrapper[4835]: I0319 10:50:17.794474 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-44rxf_8173c77e-48ec-44fc-9be7-67381528f78a/cp-metrics/0.log" Mar 19 10:50:17 crc kubenswrapper[4835]: I0319 10:50:17.964416 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-44rxf_8173c77e-48ec-44fc-9be7-67381528f78a/cp-frr-files/0.log" Mar 19 10:50:18 crc kubenswrapper[4835]: I0319 10:50:18.008232 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-44rxf_8173c77e-48ec-44fc-9be7-67381528f78a/cp-metrics/0.log" Mar 19 10:50:18 crc kubenswrapper[4835]: I0319 10:50:18.010337 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-44rxf_8173c77e-48ec-44fc-9be7-67381528f78a/cp-reloader/0.log" Mar 19 10:50:18 crc kubenswrapper[4835]: I0319 10:50:18.050315 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-44rxf_8173c77e-48ec-44fc-9be7-67381528f78a/cp-metrics/0.log" Mar 19 10:50:18 crc kubenswrapper[4835]: I0319 10:50:18.215453 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-44rxf_8173c77e-48ec-44fc-9be7-67381528f78a/cp-metrics/0.log" Mar 19 10:50:18 crc kubenswrapper[4835]: I0319 10:50:18.256985 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-44rxf_8173c77e-48ec-44fc-9be7-67381528f78a/cp-reloader/0.log" Mar 19 10:50:18 crc kubenswrapper[4835]: I0319 10:50:18.279035 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-44rxf_8173c77e-48ec-44fc-9be7-67381528f78a/cp-frr-files/0.log" Mar 19 10:50:18 crc kubenswrapper[4835]: I0319 10:50:18.305184 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-44rxf_8173c77e-48ec-44fc-9be7-67381528f78a/controller/1.log" Mar 19 10:50:18 crc kubenswrapper[4835]: I0319 10:50:18.461649 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-44rxf_8173c77e-48ec-44fc-9be7-67381528f78a/controller/0.log" Mar 19 10:50:18 crc kubenswrapper[4835]: I0319 10:50:18.548842 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-44rxf_8173c77e-48ec-44fc-9be7-67381528f78a/frr-metrics/0.log" Mar 19 10:50:18 crc kubenswrapper[4835]: I0319 10:50:18.577943 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-44rxf_8173c77e-48ec-44fc-9be7-67381528f78a/frr/1.log" Mar 19 10:50:18 crc kubenswrapper[4835]: I0319 10:50:18.664184 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-44rxf_8173c77e-48ec-44fc-9be7-67381528f78a/kube-rbac-proxy/0.log" Mar 19 10:50:18 crc kubenswrapper[4835]: I0319 10:50:18.764224 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-44rxf_8173c77e-48ec-44fc-9be7-67381528f78a/kube-rbac-proxy-frr/0.log" Mar 19 10:50:18 crc kubenswrapper[4835]: I0319 10:50:18.841256 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-44rxf_8173c77e-48ec-44fc-9be7-67381528f78a/reloader/0.log" Mar 19 10:50:19 crc kubenswrapper[4835]: I0319 10:50:19.770783 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-g4bvm_9b388096-29e9-4547-b43a-ec3a7935572b/frr-k8s-webhook-server/1.log" Mar 19 10:50:19 crc kubenswrapper[4835]: I0319 10:50:19.803087 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-g4bvm_9b388096-29e9-4547-b43a-ec3a7935572b/frr-k8s-webhook-server/0.log" Mar 19 10:50:20 crc kubenswrapper[4835]: I0319 10:50:20.016162 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-85c8677745-k75vc_2a19bff3-1be0-44d7-b625-df3e46afa290/manager/0.log" Mar 19 10:50:20 crc kubenswrapper[4835]: I0319 10:50:20.124368 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6889f84cf4-bbfgf_817b88ed-bcd4-4702-bd72-7d04de779c86/webhook-server/1.log" Mar 19 10:50:20 crc kubenswrapper[4835]: I0319 
10:50:20.264478 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6889f84cf4-bbfgf_817b88ed-bcd4-4702-bd72-7d04de779c86/webhook-server/0.log" Mar 19 10:50:20 crc kubenswrapper[4835]: I0319 10:50:20.286364 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-44rxf_8173c77e-48ec-44fc-9be7-67381528f78a/frr/0.log" Mar 19 10:50:20 crc kubenswrapper[4835]: I0319 10:50:20.339991 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zfhzl_3f751b70-2b62-41da-b71c-ce7039840e3e/kube-rbac-proxy/0.log" Mar 19 10:50:20 crc kubenswrapper[4835]: I0319 10:50:20.784898 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zfhzl_3f751b70-2b62-41da-b71c-ce7039840e3e/speaker/1.log" Mar 19 10:50:21 crc kubenswrapper[4835]: I0319 10:50:21.057178 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zfhzl_3f751b70-2b62-41da-b71c-ce7039840e3e/speaker/0.log" Mar 19 10:50:22 crc kubenswrapper[4835]: I0319 10:50:22.704250 4835 scope.go:117] "RemoveContainer" containerID="fbbec1a85a8a16ad79debf3dba8b048c55c0f329af54936da178aec9f1f5e04b" Mar 19 10:50:31 crc kubenswrapper[4835]: I0319 10:50:31.402336 4835 scope.go:117] "RemoveContainer" containerID="2999600d8da067b17effdd03f999dbcb19e52bd95cdcf24fbdbbde5c1f05fc02" Mar 19 10:50:31 crc kubenswrapper[4835]: E0319 10:50:31.403305 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:50:35 crc kubenswrapper[4835]: I0319 10:50:35.514178 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk_86081666-b147-49d0-bed1-df369027ca65/util/0.log" Mar 19 10:50:35 crc kubenswrapper[4835]: I0319 10:50:35.765467 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk_86081666-b147-49d0-bed1-df369027ca65/pull/0.log" Mar 19 10:50:35 crc kubenswrapper[4835]: I0319 10:50:35.765983 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk_86081666-b147-49d0-bed1-df369027ca65/pull/0.log" Mar 19 10:50:35 crc kubenswrapper[4835]: I0319 10:50:35.771154 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk_86081666-b147-49d0-bed1-df369027ca65/util/0.log" Mar 19 10:50:35 crc kubenswrapper[4835]: I0319 10:50:35.953982 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk_86081666-b147-49d0-bed1-df369027ca65/pull/0.log" Mar 19 10:50:35 crc kubenswrapper[4835]: I0319 10:50:35.972664 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk_86081666-b147-49d0-bed1-df369027ca65/util/0.log" Mar 19 10:50:36 crc kubenswrapper[4835]: I0319 10:50:36.047358 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748vhlk_86081666-b147-49d0-bed1-df369027ca65/extract/0.log" Mar 19 10:50:36 crc kubenswrapper[4835]: I0319 10:50:36.171870 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r_f580e4fd-993c-4b3a-a95f-7b1e66ca13ac/util/0.log" Mar 19 
10:50:36 crc kubenswrapper[4835]: I0319 10:50:36.412905 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r_f580e4fd-993c-4b3a-a95f-7b1e66ca13ac/pull/0.log" Mar 19 10:50:36 crc kubenswrapper[4835]: I0319 10:50:36.413293 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r_f580e4fd-993c-4b3a-a95f-7b1e66ca13ac/util/0.log" Mar 19 10:50:36 crc kubenswrapper[4835]: I0319 10:50:36.440251 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r_f580e4fd-993c-4b3a-a95f-7b1e66ca13ac/pull/0.log" Mar 19 10:50:36 crc kubenswrapper[4835]: I0319 10:50:36.615485 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r_f580e4fd-993c-4b3a-a95f-7b1e66ca13ac/util/0.log" Mar 19 10:50:36 crc kubenswrapper[4835]: I0319 10:50:36.630725 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r_f580e4fd-993c-4b3a-a95f-7b1e66ca13ac/extract/0.log" Mar 19 10:50:36 crc kubenswrapper[4835]: I0319 10:50:36.665676 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1br58r_f580e4fd-993c-4b3a-a95f-7b1e66ca13ac/pull/0.log" Mar 19 10:50:36 crc kubenswrapper[4835]: I0319 10:50:36.824553 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx_c0252447-92bc-4bc8-a2fe-0be1262b6c77/util/0.log" Mar 19 10:50:36 crc kubenswrapper[4835]: I0319 10:50:36.994822 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx_c0252447-92bc-4bc8-a2fe-0be1262b6c77/util/0.log" Mar 19 10:50:37 crc kubenswrapper[4835]: I0319 10:50:37.052067 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx_c0252447-92bc-4bc8-a2fe-0be1262b6c77/pull/0.log" Mar 19 10:50:37 crc kubenswrapper[4835]: I0319 10:50:37.054748 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx_c0252447-92bc-4bc8-a2fe-0be1262b6c77/pull/0.log" Mar 19 10:50:37 crc kubenswrapper[4835]: I0319 10:50:37.204471 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx_c0252447-92bc-4bc8-a2fe-0be1262b6c77/util/0.log" Mar 19 10:50:37 crc kubenswrapper[4835]: I0319 10:50:37.216034 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx_c0252447-92bc-4bc8-a2fe-0be1262b6c77/pull/0.log" Mar 19 10:50:37 crc kubenswrapper[4835]: I0319 10:50:37.252253 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5xkbcx_c0252447-92bc-4bc8-a2fe-0be1262b6c77/extract/0.log" Mar 19 10:50:37 crc kubenswrapper[4835]: I0319 10:50:37.417061 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf_a59f270d-4331-4428-a35a-5426a9cb3676/util/0.log" Mar 19 10:50:37 crc kubenswrapper[4835]: I0319 10:50:37.592693 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf_a59f270d-4331-4428-a35a-5426a9cb3676/pull/0.log" Mar 19 
10:50:37 crc kubenswrapper[4835]: I0319 10:50:37.614479 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf_a59f270d-4331-4428-a35a-5426a9cb3676/pull/0.log" Mar 19 10:50:37 crc kubenswrapper[4835]: I0319 10:50:37.638324 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf_a59f270d-4331-4428-a35a-5426a9cb3676/util/0.log" Mar 19 10:50:37 crc kubenswrapper[4835]: I0319 10:50:37.808951 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf_a59f270d-4331-4428-a35a-5426a9cb3676/util/0.log" Mar 19 10:50:37 crc kubenswrapper[4835]: I0319 10:50:37.849529 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf_a59f270d-4331-4428-a35a-5426a9cb3676/pull/0.log" Mar 19 10:50:37 crc kubenswrapper[4835]: I0319 10:50:37.898337 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cm9ccf_a59f270d-4331-4428-a35a-5426a9cb3676/extract/0.log" Mar 19 10:50:37 crc kubenswrapper[4835]: I0319 10:50:37.986157 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq_88674731-ebd8-49fa-b947-3143524a738c/util/0.log" Mar 19 10:50:38 crc kubenswrapper[4835]: I0319 10:50:38.165596 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq_88674731-ebd8-49fa-b947-3143524a738c/util/0.log" Mar 19 10:50:38 crc kubenswrapper[4835]: I0319 10:50:38.187923 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq_88674731-ebd8-49fa-b947-3143524a738c/pull/0.log" Mar 19 10:50:38 crc kubenswrapper[4835]: I0319 10:50:38.194258 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq_88674731-ebd8-49fa-b947-3143524a738c/pull/0.log" Mar 19 10:50:38 crc kubenswrapper[4835]: I0319 10:50:38.436061 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq_88674731-ebd8-49fa-b947-3143524a738c/util/0.log" Mar 19 10:50:38 crc kubenswrapper[4835]: I0319 10:50:38.461302 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq_88674731-ebd8-49fa-b947-3143524a738c/extract/0.log" Mar 19 10:50:38 crc kubenswrapper[4835]: I0319 10:50:38.484514 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726jwhwq_88674731-ebd8-49fa-b947-3143524a738c/pull/0.log" Mar 19 10:50:38 crc kubenswrapper[4835]: I0319 10:50:38.646235 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bmh95_8b78cdf3-88ba-4ab2-9966-492863d9206c/extract-utilities/0.log" Mar 19 10:50:38 crc kubenswrapper[4835]: I0319 10:50:38.992602 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bmh95_8b78cdf3-88ba-4ab2-9966-492863d9206c/extract-content/0.log" Mar 19 10:50:39 crc kubenswrapper[4835]: I0319 10:50:39.016994 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bmh95_8b78cdf3-88ba-4ab2-9966-492863d9206c/extract-content/0.log" Mar 19 10:50:39 crc kubenswrapper[4835]: I0319 10:50:39.043126 4835 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bmh95_8b78cdf3-88ba-4ab2-9966-492863d9206c/extract-utilities/0.log" Mar 19 10:50:39 crc kubenswrapper[4835]: I0319 10:50:39.259814 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bmh95_8b78cdf3-88ba-4ab2-9966-492863d9206c/extract-utilities/0.log" Mar 19 10:50:39 crc kubenswrapper[4835]: I0319 10:50:39.267036 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bmh95_8b78cdf3-88ba-4ab2-9966-492863d9206c/extract-content/0.log" Mar 19 10:50:39 crc kubenswrapper[4835]: I0319 10:50:39.358206 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-bmh95_8b78cdf3-88ba-4ab2-9966-492863d9206c/registry-server/1.log" Mar 19 10:50:39 crc kubenswrapper[4835]: I0319 10:50:39.556328 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6mgg7_e8f17f81-d3ac-4c40-b346-c3eac9cc70d2/extract-utilities/0.log" Mar 19 10:50:39 crc kubenswrapper[4835]: I0319 10:50:39.748553 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6mgg7_e8f17f81-d3ac-4c40-b346-c3eac9cc70d2/extract-utilities/0.log" Mar 19 10:50:39 crc kubenswrapper[4835]: I0319 10:50:39.805143 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6mgg7_e8f17f81-d3ac-4c40-b346-c3eac9cc70d2/extract-content/0.log" Mar 19 10:50:39 crc kubenswrapper[4835]: I0319 10:50:39.817315 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6mgg7_e8f17f81-d3ac-4c40-b346-c3eac9cc70d2/extract-content/0.log" Mar 19 10:50:40 crc kubenswrapper[4835]: I0319 10:50:40.005834 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-bmh95_8b78cdf3-88ba-4ab2-9966-492863d9206c/registry-server/0.log" Mar 19 10:50:40 crc kubenswrapper[4835]: I0319 10:50:40.076125 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6mgg7_e8f17f81-d3ac-4c40-b346-c3eac9cc70d2/extract-utilities/0.log" Mar 19 10:50:40 crc kubenswrapper[4835]: I0319 10:50:40.134784 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6mgg7_e8f17f81-d3ac-4c40-b346-c3eac9cc70d2/extract-content/0.log" Mar 19 10:50:40 crc kubenswrapper[4835]: I0319 10:50:40.224189 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6mgg7_e8f17f81-d3ac-4c40-b346-c3eac9cc70d2/registry-server/1.log" Mar 19 10:50:40 crc kubenswrapper[4835]: I0319 10:50:40.361297 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-4bzw9_dd4d4255-8149-4b0e-b3f2-dc0951a043a5/marketplace-operator/1.log" Mar 19 10:50:40 crc kubenswrapper[4835]: I0319 10:50:40.394017 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-4bzw9_dd4d4255-8149-4b0e-b3f2-dc0951a043a5/marketplace-operator/0.log" Mar 19 10:50:40 crc kubenswrapper[4835]: I0319 10:50:40.570765 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6mgg7_e8f17f81-d3ac-4c40-b346-c3eac9cc70d2/registry-server/0.log" Mar 19 10:50:40 crc kubenswrapper[4835]: I0319 10:50:40.601634 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-l2vsx_5d586c5b-694e-4ac9-aa09-0d973cdad7e0/extract-utilities/0.log" Mar 19 10:50:40 crc kubenswrapper[4835]: I0319 10:50:40.798828 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-l2vsx_5d586c5b-694e-4ac9-aa09-0d973cdad7e0/extract-content/0.log" Mar 19 10:50:40 crc kubenswrapper[4835]: I0319 10:50:40.826675 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-l2vsx_5d586c5b-694e-4ac9-aa09-0d973cdad7e0/extract-content/0.log" Mar 19 10:50:40 crc kubenswrapper[4835]: I0319 10:50:40.826848 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-l2vsx_5d586c5b-694e-4ac9-aa09-0d973cdad7e0/extract-utilities/0.log" Mar 19 10:50:41 crc kubenswrapper[4835]: I0319 10:50:41.024528 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-l2vsx_5d586c5b-694e-4ac9-aa09-0d973cdad7e0/extract-content/0.log" Mar 19 10:50:41 crc kubenswrapper[4835]: I0319 10:50:41.062925 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-l2vsx_5d586c5b-694e-4ac9-aa09-0d973cdad7e0/extract-utilities/0.log" Mar 19 10:50:41 crc kubenswrapper[4835]: I0319 10:50:41.259512 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n8kcm_2e9b0984-cab2-4aab-bad4-b4ad7040a40f/extract-utilities/0.log" Mar 19 10:50:41 crc kubenswrapper[4835]: I0319 10:50:41.278019 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-l2vsx_5d586c5b-694e-4ac9-aa09-0d973cdad7e0/registry-server/1.log" Mar 19 10:50:41 crc kubenswrapper[4835]: I0319 10:50:41.332028 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-l2vsx_5d586c5b-694e-4ac9-aa09-0d973cdad7e0/registry-server/0.log" Mar 19 10:50:41 crc kubenswrapper[4835]: I0319 10:50:41.458501 4835 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-n8kcm_2e9b0984-cab2-4aab-bad4-b4ad7040a40f/extract-content/0.log" Mar 19 10:50:41 crc kubenswrapper[4835]: I0319 10:50:41.499424 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n8kcm_2e9b0984-cab2-4aab-bad4-b4ad7040a40f/extract-content/0.log" Mar 19 10:50:41 crc kubenswrapper[4835]: I0319 10:50:41.506152 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n8kcm_2e9b0984-cab2-4aab-bad4-b4ad7040a40f/extract-utilities/0.log" Mar 19 10:50:41 crc kubenswrapper[4835]: I0319 10:50:41.695423 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n8kcm_2e9b0984-cab2-4aab-bad4-b4ad7040a40f/extract-utilities/0.log" Mar 19 10:50:41 crc kubenswrapper[4835]: I0319 10:50:41.716941 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n8kcm_2e9b0984-cab2-4aab-bad4-b4ad7040a40f/extract-content/0.log" Mar 19 10:50:41 crc kubenswrapper[4835]: I0319 10:50:41.917802 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n8kcm_2e9b0984-cab2-4aab-bad4-b4ad7040a40f/registry-server/0.log" Mar 19 10:50:41 crc kubenswrapper[4835]: I0319 10:50:41.937324 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-n8kcm_2e9b0984-cab2-4aab-bad4-b4ad7040a40f/registry-server/1.log" Mar 19 10:50:42 crc kubenswrapper[4835]: I0319 10:50:42.403100 4835 scope.go:117] "RemoveContainer" containerID="2999600d8da067b17effdd03f999dbcb19e52bd95cdcf24fbdbbde5c1f05fc02" Mar 19 10:50:42 crc kubenswrapper[4835]: E0319 10:50:42.403413 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:50:56 crc kubenswrapper[4835]: I0319 10:50:56.311659 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-mbzhh_210e8b64-36bc-4abd-8620-43093946855b/prometheus-operator/0.log" Mar 19 10:50:56 crc kubenswrapper[4835]: I0319 10:50:56.329979 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-c467569cb-4zx5b_73f088be-6bad-4a18-8353-10475ad7105d/prometheus-operator-admission-webhook/0.log" Mar 19 10:50:56 crc kubenswrapper[4835]: I0319 10:50:56.330405 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-c467569cb-l992d_f26c226a-c8fa-4ff7-bda6-7eece297dd86/prometheus-operator-admission-webhook/0.log" Mar 19 10:50:56 crc kubenswrapper[4835]: I0319 10:50:56.417087 4835 scope.go:117] "RemoveContainer" containerID="2999600d8da067b17effdd03f999dbcb19e52bd95cdcf24fbdbbde5c1f05fc02" Mar 19 10:50:56 crc kubenswrapper[4835]: E0319 10:50:56.417489 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:50:56 crc kubenswrapper[4835]: I0319 10:50:56.522029 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-qsz5n_95fe9c35-69f6-4b60-a725-c2f0d8a34c99/operator/0.log" Mar 19 10:50:56 crc kubenswrapper[4835]: 
I0319 10:50:56.524347 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54476d58cc-x6mtx_41d1090f-7ab4-4820-a742-dca791692d0f/perses-operator/0.log" Mar 19 10:50:56 crc kubenswrapper[4835]: I0319 10:50:56.528840 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-qsz5n_95fe9c35-69f6-4b60-a725-c2f0d8a34c99/operator/1.log" Mar 19 10:50:56 crc kubenswrapper[4835]: I0319 10:50:56.566953 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-7f87b9b85b-pjl7f_22c392c5-b6d9-42e8-bf59-122a846c26a4/observability-ui-dashboards/0.log" Mar 19 10:51:11 crc kubenswrapper[4835]: I0319 10:51:11.402467 4835 scope.go:117] "RemoveContainer" containerID="2999600d8da067b17effdd03f999dbcb19e52bd95cdcf24fbdbbde5c1f05fc02" Mar 19 10:51:11 crc kubenswrapper[4835]: E0319 10:51:11.403379 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:51:12 crc kubenswrapper[4835]: I0319 10:51:12.677754 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-668b645cb5-fhzgr_3a2991b5-2e25-4afa-9941-d955aad0dc37/kube-rbac-proxy/0.log" Mar 19 10:51:12 crc kubenswrapper[4835]: I0319 10:51:12.688136 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-668b645cb5-fhzgr_3a2991b5-2e25-4afa-9941-d955aad0dc37/manager/0.log" Mar 19 10:51:25 crc kubenswrapper[4835]: I0319 10:51:25.403107 4835 scope.go:117] "RemoveContainer" 
containerID="2999600d8da067b17effdd03f999dbcb19e52bd95cdcf24fbdbbde5c1f05fc02" Mar 19 10:51:25 crc kubenswrapper[4835]: E0319 10:51:25.403985 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:51:37 crc kubenswrapper[4835]: I0319 10:51:37.402567 4835 scope.go:117] "RemoveContainer" containerID="2999600d8da067b17effdd03f999dbcb19e52bd95cdcf24fbdbbde5c1f05fc02" Mar 19 10:51:37 crc kubenswrapper[4835]: E0319 10:51:37.403515 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:51:50 crc kubenswrapper[4835]: I0319 10:51:50.401899 4835 scope.go:117] "RemoveContainer" containerID="2999600d8da067b17effdd03f999dbcb19e52bd95cdcf24fbdbbde5c1f05fc02" Mar 19 10:51:50 crc kubenswrapper[4835]: E0319 10:51:50.402888 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:52:00 crc kubenswrapper[4835]: I0319 10:52:00.148366 4835 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565292-8lc78"] Mar 19 10:52:00 crc kubenswrapper[4835]: E0319 10:52:00.149545 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ee233ed-4515-45d1-bef3-7af33a67f4b2" containerName="oc" Mar 19 10:52:00 crc kubenswrapper[4835]: I0319 10:52:00.149563 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ee233ed-4515-45d1-bef3-7af33a67f4b2" containerName="oc" Mar 19 10:52:00 crc kubenswrapper[4835]: I0319 10:52:00.149890 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ee233ed-4515-45d1-bef3-7af33a67f4b2" containerName="oc" Mar 19 10:52:00 crc kubenswrapper[4835]: I0319 10:52:00.150925 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565292-8lc78" Mar 19 10:52:00 crc kubenswrapper[4835]: I0319 10:52:00.154898 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:52:00 crc kubenswrapper[4835]: I0319 10:52:00.155032 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:52:00 crc kubenswrapper[4835]: I0319 10:52:00.155192 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 10:52:00 crc kubenswrapper[4835]: I0319 10:52:00.165892 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565292-8lc78"] Mar 19 10:52:00 crc kubenswrapper[4835]: I0319 10:52:00.210431 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6gg2\" (UniqueName: \"kubernetes.io/projected/3448cfa6-ed67-4b4d-883d-6b6ebad9ef7c-kube-api-access-v6gg2\") pod \"auto-csr-approver-29565292-8lc78\" (UID: \"3448cfa6-ed67-4b4d-883d-6b6ebad9ef7c\") " pod="openshift-infra/auto-csr-approver-29565292-8lc78" 
Mar 19 10:52:00 crc kubenswrapper[4835]: I0319 10:52:00.312029 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6gg2\" (UniqueName: \"kubernetes.io/projected/3448cfa6-ed67-4b4d-883d-6b6ebad9ef7c-kube-api-access-v6gg2\") pod \"auto-csr-approver-29565292-8lc78\" (UID: \"3448cfa6-ed67-4b4d-883d-6b6ebad9ef7c\") " pod="openshift-infra/auto-csr-approver-29565292-8lc78" Mar 19 10:52:00 crc kubenswrapper[4835]: I0319 10:52:00.346688 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6gg2\" (UniqueName: \"kubernetes.io/projected/3448cfa6-ed67-4b4d-883d-6b6ebad9ef7c-kube-api-access-v6gg2\") pod \"auto-csr-approver-29565292-8lc78\" (UID: \"3448cfa6-ed67-4b4d-883d-6b6ebad9ef7c\") " pod="openshift-infra/auto-csr-approver-29565292-8lc78" Mar 19 10:52:00 crc kubenswrapper[4835]: I0319 10:52:00.477205 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565292-8lc78" Mar 19 10:52:01 crc kubenswrapper[4835]: I0319 10:52:01.228358 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565292-8lc78"] Mar 19 10:52:01 crc kubenswrapper[4835]: W0319 10:52:01.240336 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3448cfa6_ed67_4b4d_883d_6b6ebad9ef7c.slice/crio-7fe5f3eb20c6e63d16bf937efe6b9efaf928fa1c37c0ff4e9c39331335b73b86 WatchSource:0}: Error finding container 7fe5f3eb20c6e63d16bf937efe6b9efaf928fa1c37c0ff4e9c39331335b73b86: Status 404 returned error can't find the container with id 7fe5f3eb20c6e63d16bf937efe6b9efaf928fa1c37c0ff4e9c39331335b73b86 Mar 19 10:52:01 crc kubenswrapper[4835]: I0319 10:52:01.822765 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565292-8lc78" 
event={"ID":"3448cfa6-ed67-4b4d-883d-6b6ebad9ef7c","Type":"ContainerStarted","Data":"7fe5f3eb20c6e63d16bf937efe6b9efaf928fa1c37c0ff4e9c39331335b73b86"} Mar 19 10:52:02 crc kubenswrapper[4835]: I0319 10:52:02.838041 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565292-8lc78" event={"ID":"3448cfa6-ed67-4b4d-883d-6b6ebad9ef7c","Type":"ContainerStarted","Data":"aa78f5d6cbabd721a39ceb4667d4ce9e3704dd1c3dd9608faaaff1ff3813c5d3"} Mar 19 10:52:02 crc kubenswrapper[4835]: I0319 10:52:02.856568 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565292-8lc78" podStartSLOduration=1.94696738 podStartE2EDuration="2.856549644s" podCreationTimestamp="2026-03-19 10:52:00 +0000 UTC" firstStartedPulling="2026-03-19 10:52:01.241592314 +0000 UTC m=+5376.090190901" lastFinishedPulling="2026-03-19 10:52:02.151174568 +0000 UTC m=+5376.999773165" observedRunningTime="2026-03-19 10:52:02.855119696 +0000 UTC m=+5377.703718313" watchObservedRunningTime="2026-03-19 10:52:02.856549644 +0000 UTC m=+5377.705148241" Mar 19 10:52:03 crc kubenswrapper[4835]: I0319 10:52:03.404008 4835 scope.go:117] "RemoveContainer" containerID="2999600d8da067b17effdd03f999dbcb19e52bd95cdcf24fbdbbde5c1f05fc02" Mar 19 10:52:03 crc kubenswrapper[4835]: E0319 10:52:03.404507 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:52:04 crc kubenswrapper[4835]: I0319 10:52:04.895462 4835 generic.go:334] "Generic (PLEG): container finished" podID="3448cfa6-ed67-4b4d-883d-6b6ebad9ef7c" 
containerID="aa78f5d6cbabd721a39ceb4667d4ce9e3704dd1c3dd9608faaaff1ff3813c5d3" exitCode=0 Mar 19 10:52:04 crc kubenswrapper[4835]: I0319 10:52:04.895572 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565292-8lc78" event={"ID":"3448cfa6-ed67-4b4d-883d-6b6ebad9ef7c","Type":"ContainerDied","Data":"aa78f5d6cbabd721a39ceb4667d4ce9e3704dd1c3dd9608faaaff1ff3813c5d3"} Mar 19 10:52:06 crc kubenswrapper[4835]: I0319 10:52:06.944138 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565292-8lc78" event={"ID":"3448cfa6-ed67-4b4d-883d-6b6ebad9ef7c","Type":"ContainerDied","Data":"7fe5f3eb20c6e63d16bf937efe6b9efaf928fa1c37c0ff4e9c39331335b73b86"} Mar 19 10:52:06 crc kubenswrapper[4835]: I0319 10:52:06.944918 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7fe5f3eb20c6e63d16bf937efe6b9efaf928fa1c37c0ff4e9c39331335b73b86" Mar 19 10:52:07 crc kubenswrapper[4835]: I0319 10:52:07.055777 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565292-8lc78" Mar 19 10:52:07 crc kubenswrapper[4835]: I0319 10:52:07.151909 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6gg2\" (UniqueName: \"kubernetes.io/projected/3448cfa6-ed67-4b4d-883d-6b6ebad9ef7c-kube-api-access-v6gg2\") pod \"3448cfa6-ed67-4b4d-883d-6b6ebad9ef7c\" (UID: \"3448cfa6-ed67-4b4d-883d-6b6ebad9ef7c\") " Mar 19 10:52:07 crc kubenswrapper[4835]: I0319 10:52:07.166355 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3448cfa6-ed67-4b4d-883d-6b6ebad9ef7c-kube-api-access-v6gg2" (OuterVolumeSpecName: "kube-api-access-v6gg2") pod "3448cfa6-ed67-4b4d-883d-6b6ebad9ef7c" (UID: "3448cfa6-ed67-4b4d-883d-6b6ebad9ef7c"). InnerVolumeSpecName "kube-api-access-v6gg2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:52:07 crc kubenswrapper[4835]: I0319 10:52:07.256651 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6gg2\" (UniqueName: \"kubernetes.io/projected/3448cfa6-ed67-4b4d-883d-6b6ebad9ef7c-kube-api-access-v6gg2\") on node \"crc\" DevicePath \"\"" Mar 19 10:52:07 crc kubenswrapper[4835]: I0319 10:52:07.953624 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565292-8lc78" Mar 19 10:52:08 crc kubenswrapper[4835]: I0319 10:52:08.153444 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565286-8q757"] Mar 19 10:52:08 crc kubenswrapper[4835]: I0319 10:52:08.173578 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565286-8q757"] Mar 19 10:52:08 crc kubenswrapper[4835]: I0319 10:52:08.431619 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a125d9a9-12c8-4c9c-8cbe-eefae2fd91e8" path="/var/lib/kubelet/pods/a125d9a9-12c8-4c9c-8cbe-eefae2fd91e8/volumes" Mar 19 10:52:16 crc kubenswrapper[4835]: I0319 10:52:16.414427 4835 scope.go:117] "RemoveContainer" containerID="2999600d8da067b17effdd03f999dbcb19e52bd95cdcf24fbdbbde5c1f05fc02" Mar 19 10:52:17 crc kubenswrapper[4835]: I0319 10:52:17.086381 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerStarted","Data":"9a2e2737a0b29536ee9ca76417bf67df18e5965e00950ba619c67e9ef25d8f01"} Mar 19 10:52:22 crc kubenswrapper[4835]: I0319 10:52:22.836060 4835 scope.go:117] "RemoveContainer" containerID="31c49db545e80737889cd9983386a237245ecf3e2e70b117ec8a6da26b1a6fa4" Mar 19 10:52:22 crc kubenswrapper[4835]: I0319 10:52:22.901119 4835 scope.go:117] "RemoveContainer" 
containerID="aef605412de6df9fc282dff82e300dbc65385f644226bb771c97fec2df4250e5" Mar 19 10:53:17 crc kubenswrapper[4835]: I0319 10:53:17.799660 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-27c99"] Mar 19 10:53:17 crc kubenswrapper[4835]: E0319 10:53:17.800890 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3448cfa6-ed67-4b4d-883d-6b6ebad9ef7c" containerName="oc" Mar 19 10:53:17 crc kubenswrapper[4835]: I0319 10:53:17.800906 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="3448cfa6-ed67-4b4d-883d-6b6ebad9ef7c" containerName="oc" Mar 19 10:53:17 crc kubenswrapper[4835]: I0319 10:53:17.801202 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="3448cfa6-ed67-4b4d-883d-6b6ebad9ef7c" containerName="oc" Mar 19 10:53:17 crc kubenswrapper[4835]: I0319 10:53:17.813073 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-27c99" Mar 19 10:53:17 crc kubenswrapper[4835]: I0319 10:53:17.834893 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-27c99"] Mar 19 10:53:17 crc kubenswrapper[4835]: I0319 10:53:17.934709 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53a76423-3313-4b62-96fc-9a98f45fbb6a-utilities\") pod \"community-operators-27c99\" (UID: \"53a76423-3313-4b62-96fc-9a98f45fbb6a\") " pod="openshift-marketplace/community-operators-27c99" Mar 19 10:53:17 crc kubenswrapper[4835]: I0319 10:53:17.934950 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spt9d\" (UniqueName: \"kubernetes.io/projected/53a76423-3313-4b62-96fc-9a98f45fbb6a-kube-api-access-spt9d\") pod \"community-operators-27c99\" (UID: \"53a76423-3313-4b62-96fc-9a98f45fbb6a\") " 
pod="openshift-marketplace/community-operators-27c99" Mar 19 10:53:17 crc kubenswrapper[4835]: I0319 10:53:17.936727 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53a76423-3313-4b62-96fc-9a98f45fbb6a-catalog-content\") pod \"community-operators-27c99\" (UID: \"53a76423-3313-4b62-96fc-9a98f45fbb6a\") " pod="openshift-marketplace/community-operators-27c99" Mar 19 10:53:18 crc kubenswrapper[4835]: I0319 10:53:18.038644 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53a76423-3313-4b62-96fc-9a98f45fbb6a-catalog-content\") pod \"community-operators-27c99\" (UID: \"53a76423-3313-4b62-96fc-9a98f45fbb6a\") " pod="openshift-marketplace/community-operators-27c99" Mar 19 10:53:18 crc kubenswrapper[4835]: I0319 10:53:18.038750 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53a76423-3313-4b62-96fc-9a98f45fbb6a-utilities\") pod \"community-operators-27c99\" (UID: \"53a76423-3313-4b62-96fc-9a98f45fbb6a\") " pod="openshift-marketplace/community-operators-27c99" Mar 19 10:53:18 crc kubenswrapper[4835]: I0319 10:53:18.038836 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spt9d\" (UniqueName: \"kubernetes.io/projected/53a76423-3313-4b62-96fc-9a98f45fbb6a-kube-api-access-spt9d\") pod \"community-operators-27c99\" (UID: \"53a76423-3313-4b62-96fc-9a98f45fbb6a\") " pod="openshift-marketplace/community-operators-27c99" Mar 19 10:53:18 crc kubenswrapper[4835]: I0319 10:53:18.039167 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53a76423-3313-4b62-96fc-9a98f45fbb6a-catalog-content\") pod \"community-operators-27c99\" (UID: \"53a76423-3313-4b62-96fc-9a98f45fbb6a\") " 
pod="openshift-marketplace/community-operators-27c99" Mar 19 10:53:18 crc kubenswrapper[4835]: I0319 10:53:18.039686 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53a76423-3313-4b62-96fc-9a98f45fbb6a-utilities\") pod \"community-operators-27c99\" (UID: \"53a76423-3313-4b62-96fc-9a98f45fbb6a\") " pod="openshift-marketplace/community-operators-27c99" Mar 19 10:53:18 crc kubenswrapper[4835]: I0319 10:53:18.065095 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spt9d\" (UniqueName: \"kubernetes.io/projected/53a76423-3313-4b62-96fc-9a98f45fbb6a-kube-api-access-spt9d\") pod \"community-operators-27c99\" (UID: \"53a76423-3313-4b62-96fc-9a98f45fbb6a\") " pod="openshift-marketplace/community-operators-27c99" Mar 19 10:53:18 crc kubenswrapper[4835]: I0319 10:53:18.132635 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-27c99" Mar 19 10:53:18 crc kubenswrapper[4835]: I0319 10:53:18.657471 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-27c99"] Mar 19 10:53:18 crc kubenswrapper[4835]: I0319 10:53:18.875401 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-27c99" event={"ID":"53a76423-3313-4b62-96fc-9a98f45fbb6a","Type":"ContainerStarted","Data":"d6b6837c853be187a2663a1f5ff494107ef0a3728bddd81c0f42413e70f24150"} Mar 19 10:53:19 crc kubenswrapper[4835]: I0319 10:53:19.886481 4835 generic.go:334] "Generic (PLEG): container finished" podID="53a76423-3313-4b62-96fc-9a98f45fbb6a" containerID="eb2822be1e242a65a4490783e665823f996f7524358a52374aae455f8c32a8f0" exitCode=0 Mar 19 10:53:19 crc kubenswrapper[4835]: I0319 10:53:19.886551 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-27c99" 
event={"ID":"53a76423-3313-4b62-96fc-9a98f45fbb6a","Type":"ContainerDied","Data":"eb2822be1e242a65a4490783e665823f996f7524358a52374aae455f8c32a8f0"} Mar 19 10:53:20 crc kubenswrapper[4835]: I0319 10:53:20.904760 4835 generic.go:334] "Generic (PLEG): container finished" podID="33211b2a-7ffa-44b3-b5ac-66891d2e56c1" containerID="9a38fc50fd9e830ff4e2b59f6aceed39137081a688a36bbec3e3318c70b416b9" exitCode=0 Mar 19 10:53:20 crc kubenswrapper[4835]: I0319 10:53:20.904848 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bjtgn/must-gather-27lsm" event={"ID":"33211b2a-7ffa-44b3-b5ac-66891d2e56c1","Type":"ContainerDied","Data":"9a38fc50fd9e830ff4e2b59f6aceed39137081a688a36bbec3e3318c70b416b9"} Mar 19 10:53:20 crc kubenswrapper[4835]: I0319 10:53:20.905931 4835 scope.go:117] "RemoveContainer" containerID="9a38fc50fd9e830ff4e2b59f6aceed39137081a688a36bbec3e3318c70b416b9" Mar 19 10:53:21 crc kubenswrapper[4835]: I0319 10:53:21.785356 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bjtgn_must-gather-27lsm_33211b2a-7ffa-44b3-b5ac-66891d2e56c1/gather/0.log" Mar 19 10:53:21 crc kubenswrapper[4835]: I0319 10:53:21.917647 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-27c99" event={"ID":"53a76423-3313-4b62-96fc-9a98f45fbb6a","Type":"ContainerStarted","Data":"e0fce8bc005d08f8620e4856b76bb8f96bbe7f1f81a9096399375d85d4a35cfc"} Mar 19 10:53:22 crc kubenswrapper[4835]: I0319 10:53:22.933920 4835 generic.go:334] "Generic (PLEG): container finished" podID="53a76423-3313-4b62-96fc-9a98f45fbb6a" containerID="e0fce8bc005d08f8620e4856b76bb8f96bbe7f1f81a9096399375d85d4a35cfc" exitCode=0 Mar 19 10:53:22 crc kubenswrapper[4835]: I0319 10:53:22.933972 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-27c99" 
event={"ID":"53a76423-3313-4b62-96fc-9a98f45fbb6a","Type":"ContainerDied","Data":"e0fce8bc005d08f8620e4856b76bb8f96bbe7f1f81a9096399375d85d4a35cfc"} Mar 19 10:53:23 crc kubenswrapper[4835]: I0319 10:53:23.010670 4835 scope.go:117] "RemoveContainer" containerID="84162b84376d5ec7317c4ed4ecfbfd20e4350399d0d59f84cf5116466c629aa8" Mar 19 10:53:24 crc kubenswrapper[4835]: I0319 10:53:24.143840 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x6x7s"] Mar 19 10:53:24 crc kubenswrapper[4835]: I0319 10:53:24.147641 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x6x7s" Mar 19 10:53:24 crc kubenswrapper[4835]: I0319 10:53:24.173674 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x6x7s"] Mar 19 10:53:24 crc kubenswrapper[4835]: I0319 10:53:24.218514 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9f5cf63-8df4-4e8d-a603-3c6839de5248-utilities\") pod \"certified-operators-x6x7s\" (UID: \"e9f5cf63-8df4-4e8d-a603-3c6839de5248\") " pod="openshift-marketplace/certified-operators-x6x7s" Mar 19 10:53:24 crc kubenswrapper[4835]: I0319 10:53:24.218584 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87n5v\" (UniqueName: \"kubernetes.io/projected/e9f5cf63-8df4-4e8d-a603-3c6839de5248-kube-api-access-87n5v\") pod \"certified-operators-x6x7s\" (UID: \"e9f5cf63-8df4-4e8d-a603-3c6839de5248\") " pod="openshift-marketplace/certified-operators-x6x7s" Mar 19 10:53:24 crc kubenswrapper[4835]: I0319 10:53:24.218909 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9f5cf63-8df4-4e8d-a603-3c6839de5248-catalog-content\") pod 
\"certified-operators-x6x7s\" (UID: \"e9f5cf63-8df4-4e8d-a603-3c6839de5248\") " pod="openshift-marketplace/certified-operators-x6x7s" Mar 19 10:53:24 crc kubenswrapper[4835]: I0319 10:53:24.320346 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9f5cf63-8df4-4e8d-a603-3c6839de5248-utilities\") pod \"certified-operators-x6x7s\" (UID: \"e9f5cf63-8df4-4e8d-a603-3c6839de5248\") " pod="openshift-marketplace/certified-operators-x6x7s" Mar 19 10:53:24 crc kubenswrapper[4835]: I0319 10:53:24.320436 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87n5v\" (UniqueName: \"kubernetes.io/projected/e9f5cf63-8df4-4e8d-a603-3c6839de5248-kube-api-access-87n5v\") pod \"certified-operators-x6x7s\" (UID: \"e9f5cf63-8df4-4e8d-a603-3c6839de5248\") " pod="openshift-marketplace/certified-operators-x6x7s" Mar 19 10:53:24 crc kubenswrapper[4835]: I0319 10:53:24.320624 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9f5cf63-8df4-4e8d-a603-3c6839de5248-catalog-content\") pod \"certified-operators-x6x7s\" (UID: \"e9f5cf63-8df4-4e8d-a603-3c6839de5248\") " pod="openshift-marketplace/certified-operators-x6x7s" Mar 19 10:53:24 crc kubenswrapper[4835]: I0319 10:53:24.320993 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9f5cf63-8df4-4e8d-a603-3c6839de5248-utilities\") pod \"certified-operators-x6x7s\" (UID: \"e9f5cf63-8df4-4e8d-a603-3c6839de5248\") " pod="openshift-marketplace/certified-operators-x6x7s" Mar 19 10:53:24 crc kubenswrapper[4835]: I0319 10:53:24.321095 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9f5cf63-8df4-4e8d-a603-3c6839de5248-catalog-content\") pod \"certified-operators-x6x7s\" (UID: 
\"e9f5cf63-8df4-4e8d-a603-3c6839de5248\") " pod="openshift-marketplace/certified-operators-x6x7s" Mar 19 10:53:24 crc kubenswrapper[4835]: I0319 10:53:24.342947 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87n5v\" (UniqueName: \"kubernetes.io/projected/e9f5cf63-8df4-4e8d-a603-3c6839de5248-kube-api-access-87n5v\") pod \"certified-operators-x6x7s\" (UID: \"e9f5cf63-8df4-4e8d-a603-3c6839de5248\") " pod="openshift-marketplace/certified-operators-x6x7s" Mar 19 10:53:24 crc kubenswrapper[4835]: I0319 10:53:24.477452 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x6x7s" Mar 19 10:53:24 crc kubenswrapper[4835]: I0319 10:53:24.960031 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-27c99" event={"ID":"53a76423-3313-4b62-96fc-9a98f45fbb6a","Type":"ContainerStarted","Data":"1f68740ed7aabe6f9ea48e10f37f03fac31f9e4ca4c1238f405707b943f74d10"} Mar 19 10:53:24 crc kubenswrapper[4835]: W0319 10:53:24.981732 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9f5cf63_8df4_4e8d_a603_3c6839de5248.slice/crio-c7251cb1693e093e91d6eed61be61577e22eac52398ee7c15f793639d43a6551 WatchSource:0}: Error finding container c7251cb1693e093e91d6eed61be61577e22eac52398ee7c15f793639d43a6551: Status 404 returned error can't find the container with id c7251cb1693e093e91d6eed61be61577e22eac52398ee7c15f793639d43a6551 Mar 19 10:53:24 crc kubenswrapper[4835]: I0319 10:53:24.989869 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x6x7s"] Mar 19 10:53:25 crc kubenswrapper[4835]: I0319 10:53:25.000283 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-27c99" podStartSLOduration=3.186392283 podStartE2EDuration="8.00026489s" 
podCreationTimestamp="2026-03-19 10:53:17 +0000 UTC" firstStartedPulling="2026-03-19 10:53:19.88934498 +0000 UTC m=+5454.737943577" lastFinishedPulling="2026-03-19 10:53:24.703217597 +0000 UTC m=+5459.551816184" observedRunningTime="2026-03-19 10:53:24.985496498 +0000 UTC m=+5459.834095085" watchObservedRunningTime="2026-03-19 10:53:25.00026489 +0000 UTC m=+5459.848863477" Mar 19 10:53:25 crc kubenswrapper[4835]: I0319 10:53:25.975226 4835 generic.go:334] "Generic (PLEG): container finished" podID="e9f5cf63-8df4-4e8d-a603-3c6839de5248" containerID="729f9f5ba69fdb7bb29e1795bd2a0eb02881c80340fd8291ea1013b081e065b4" exitCode=0 Mar 19 10:53:25 crc kubenswrapper[4835]: I0319 10:53:25.975288 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6x7s" event={"ID":"e9f5cf63-8df4-4e8d-a603-3c6839de5248","Type":"ContainerDied","Data":"729f9f5ba69fdb7bb29e1795bd2a0eb02881c80340fd8291ea1013b081e065b4"} Mar 19 10:53:25 crc kubenswrapper[4835]: I0319 10:53:25.976007 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6x7s" event={"ID":"e9f5cf63-8df4-4e8d-a603-3c6839de5248","Type":"ContainerStarted","Data":"c7251cb1693e093e91d6eed61be61577e22eac52398ee7c15f793639d43a6551"} Mar 19 10:53:26 crc kubenswrapper[4835]: I0319 10:53:26.993378 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6x7s" event={"ID":"e9f5cf63-8df4-4e8d-a603-3c6839de5248","Type":"ContainerStarted","Data":"ac32ef57959b64a4114e4040dfe0760f26ae715dc52285241ec4603a6d6b18b8"} Mar 19 10:53:28 crc kubenswrapper[4835]: I0319 10:53:28.134337 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-27c99" Mar 19 10:53:28 crc kubenswrapper[4835]: I0319 10:53:28.134705 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-27c99" Mar 19 10:53:29 crc 
kubenswrapper[4835]: I0319 10:53:29.017525 4835 generic.go:334] "Generic (PLEG): container finished" podID="e9f5cf63-8df4-4e8d-a603-3c6839de5248" containerID="ac32ef57959b64a4114e4040dfe0760f26ae715dc52285241ec4603a6d6b18b8" exitCode=0 Mar 19 10:53:29 crc kubenswrapper[4835]: I0319 10:53:29.017627 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6x7s" event={"ID":"e9f5cf63-8df4-4e8d-a603-3c6839de5248","Type":"ContainerDied","Data":"ac32ef57959b64a4114e4040dfe0760f26ae715dc52285241ec4603a6d6b18b8"} Mar 19 10:53:29 crc kubenswrapper[4835]: I0319 10:53:29.188159 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-27c99" podUID="53a76423-3313-4b62-96fc-9a98f45fbb6a" containerName="registry-server" probeResult="failure" output=< Mar 19 10:53:29 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:53:29 crc kubenswrapper[4835]: > Mar 19 10:53:30 crc kubenswrapper[4835]: I0319 10:53:30.032832 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6x7s" event={"ID":"e9f5cf63-8df4-4e8d-a603-3c6839de5248","Type":"ContainerStarted","Data":"2464315b1b71fad584d6c2ec5e8181796d3272cda2a713f5233a1e905d702505"} Mar 19 10:53:30 crc kubenswrapper[4835]: I0319 10:53:30.067427 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x6x7s" podStartSLOduration=2.61478878 podStartE2EDuration="6.067377458s" podCreationTimestamp="2026-03-19 10:53:24 +0000 UTC" firstStartedPulling="2026-03-19 10:53:25.977800924 +0000 UTC m=+5460.826399511" lastFinishedPulling="2026-03-19 10:53:29.430389602 +0000 UTC m=+5464.278988189" observedRunningTime="2026-03-19 10:53:30.062100154 +0000 UTC m=+5464.910698731" watchObservedRunningTime="2026-03-19 10:53:30.067377458 +0000 UTC m=+5464.915976065" Mar 19 10:53:32 crc kubenswrapper[4835]: I0319 10:53:32.156175 
4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bjtgn/must-gather-27lsm"]
Mar 19 10:53:32 crc kubenswrapper[4835]: I0319 10:53:32.158132 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-bjtgn/must-gather-27lsm" podUID="33211b2a-7ffa-44b3-b5ac-66891d2e56c1" containerName="copy" containerID="cri-o://35d93b6fe5f41a17b129b88ff3aacddec40d6cda20329d7a0b8a5df82e906e25" gracePeriod=2
Mar 19 10:53:32 crc kubenswrapper[4835]: I0319 10:53:32.172072 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bjtgn/must-gather-27lsm"]
Mar 19 10:53:32 crc kubenswrapper[4835]: I0319 10:53:32.666328 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bjtgn_must-gather-27lsm_33211b2a-7ffa-44b3-b5ac-66891d2e56c1/copy/0.log"
Mar 19 10:53:32 crc kubenswrapper[4835]: I0319 10:53:32.666993 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bjtgn/must-gather-27lsm"
Mar 19 10:53:32 crc kubenswrapper[4835]: I0319 10:53:32.728351 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/33211b2a-7ffa-44b3-b5ac-66891d2e56c1-must-gather-output\") pod \"33211b2a-7ffa-44b3-b5ac-66891d2e56c1\" (UID: \"33211b2a-7ffa-44b3-b5ac-66891d2e56c1\") "
Mar 19 10:53:32 crc kubenswrapper[4835]: I0319 10:53:32.728648 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9z48h\" (UniqueName: \"kubernetes.io/projected/33211b2a-7ffa-44b3-b5ac-66891d2e56c1-kube-api-access-9z48h\") pod \"33211b2a-7ffa-44b3-b5ac-66891d2e56c1\" (UID: \"33211b2a-7ffa-44b3-b5ac-66891d2e56c1\") "
Mar 19 10:53:32 crc kubenswrapper[4835]: I0319 10:53:32.734629 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33211b2a-7ffa-44b3-b5ac-66891d2e56c1-kube-api-access-9z48h" (OuterVolumeSpecName: "kube-api-access-9z48h") pod "33211b2a-7ffa-44b3-b5ac-66891d2e56c1" (UID: "33211b2a-7ffa-44b3-b5ac-66891d2e56c1"). InnerVolumeSpecName "kube-api-access-9z48h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 10:53:32 crc kubenswrapper[4835]: I0319 10:53:32.832558 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9z48h\" (UniqueName: \"kubernetes.io/projected/33211b2a-7ffa-44b3-b5ac-66891d2e56c1-kube-api-access-9z48h\") on node \"crc\" DevicePath \"\""
Mar 19 10:53:32 crc kubenswrapper[4835]: I0319 10:53:32.870541 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33211b2a-7ffa-44b3-b5ac-66891d2e56c1-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "33211b2a-7ffa-44b3-b5ac-66891d2e56c1" (UID: "33211b2a-7ffa-44b3-b5ac-66891d2e56c1"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 10:53:32 crc kubenswrapper[4835]: I0319 10:53:32.935404 4835 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/33211b2a-7ffa-44b3-b5ac-66891d2e56c1-must-gather-output\") on node \"crc\" DevicePath \"\""
Mar 19 10:53:33 crc kubenswrapper[4835]: I0319 10:53:33.069947 4835 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bjtgn_must-gather-27lsm_33211b2a-7ffa-44b3-b5ac-66891d2e56c1/copy/0.log"
Mar 19 10:53:33 crc kubenswrapper[4835]: I0319 10:53:33.070391 4835 generic.go:334] "Generic (PLEG): container finished" podID="33211b2a-7ffa-44b3-b5ac-66891d2e56c1" containerID="35d93b6fe5f41a17b129b88ff3aacddec40d6cda20329d7a0b8a5df82e906e25" exitCode=143
Mar 19 10:53:33 crc kubenswrapper[4835]: I0319 10:53:33.070449 4835 scope.go:117] "RemoveContainer" containerID="35d93b6fe5f41a17b129b88ff3aacddec40d6cda20329d7a0b8a5df82e906e25"
Mar 19 10:53:33 crc kubenswrapper[4835]: I0319 10:53:33.070464 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bjtgn/must-gather-27lsm"
Mar 19 10:53:33 crc kubenswrapper[4835]: I0319 10:53:33.110797 4835 scope.go:117] "RemoveContainer" containerID="9a38fc50fd9e830ff4e2b59f6aceed39137081a688a36bbec3e3318c70b416b9"
Mar 19 10:53:33 crc kubenswrapper[4835]: I0319 10:53:33.252309 4835 scope.go:117] "RemoveContainer" containerID="35d93b6fe5f41a17b129b88ff3aacddec40d6cda20329d7a0b8a5df82e906e25"
Mar 19 10:53:33 crc kubenswrapper[4835]: E0319 10:53:33.283726 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35d93b6fe5f41a17b129b88ff3aacddec40d6cda20329d7a0b8a5df82e906e25\": container with ID starting with 35d93b6fe5f41a17b129b88ff3aacddec40d6cda20329d7a0b8a5df82e906e25 not found: ID does not exist" containerID="35d93b6fe5f41a17b129b88ff3aacddec40d6cda20329d7a0b8a5df82e906e25"
Mar 19 10:53:33 crc kubenswrapper[4835]: I0319 10:53:33.283805 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35d93b6fe5f41a17b129b88ff3aacddec40d6cda20329d7a0b8a5df82e906e25"} err="failed to get container status \"35d93b6fe5f41a17b129b88ff3aacddec40d6cda20329d7a0b8a5df82e906e25\": rpc error: code = NotFound desc = could not find container \"35d93b6fe5f41a17b129b88ff3aacddec40d6cda20329d7a0b8a5df82e906e25\": container with ID starting with 35d93b6fe5f41a17b129b88ff3aacddec40d6cda20329d7a0b8a5df82e906e25 not found: ID does not exist"
Mar 19 10:53:33 crc kubenswrapper[4835]: I0319 10:53:33.283833 4835 scope.go:117] "RemoveContainer" containerID="9a38fc50fd9e830ff4e2b59f6aceed39137081a688a36bbec3e3318c70b416b9"
Mar 19 10:53:33 crc kubenswrapper[4835]: E0319 10:53:33.284334 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a38fc50fd9e830ff4e2b59f6aceed39137081a688a36bbec3e3318c70b416b9\": container with ID starting with 9a38fc50fd9e830ff4e2b59f6aceed39137081a688a36bbec3e3318c70b416b9 not found: ID does not exist" containerID="9a38fc50fd9e830ff4e2b59f6aceed39137081a688a36bbec3e3318c70b416b9"
Mar 19 10:53:33 crc kubenswrapper[4835]: I0319 10:53:33.284358 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a38fc50fd9e830ff4e2b59f6aceed39137081a688a36bbec3e3318c70b416b9"} err="failed to get container status \"9a38fc50fd9e830ff4e2b59f6aceed39137081a688a36bbec3e3318c70b416b9\": rpc error: code = NotFound desc = could not find container \"9a38fc50fd9e830ff4e2b59f6aceed39137081a688a36bbec3e3318c70b416b9\": container with ID starting with 9a38fc50fd9e830ff4e2b59f6aceed39137081a688a36bbec3e3318c70b416b9 not found: ID does not exist"
Mar 19 10:53:34 crc kubenswrapper[4835]: I0319 10:53:34.428079 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33211b2a-7ffa-44b3-b5ac-66891d2e56c1" path="/var/lib/kubelet/pods/33211b2a-7ffa-44b3-b5ac-66891d2e56c1/volumes"
Mar 19 10:53:34 crc kubenswrapper[4835]: I0319 10:53:34.478190 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x6x7s"
Mar 19 10:53:34 crc kubenswrapper[4835]: I0319 10:53:34.478229 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x6x7s"
Mar 19 10:53:35 crc kubenswrapper[4835]: I0319 10:53:35.526262 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-x6x7s" podUID="e9f5cf63-8df4-4e8d-a603-3c6839de5248" containerName="registry-server" probeResult="failure" output=<
Mar 19 10:53:35 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s
Mar 19 10:53:35 crc kubenswrapper[4835]: >
Mar 19 10:53:39 crc kubenswrapper[4835]: I0319 10:53:39.254580 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-27c99" podUID="53a76423-3313-4b62-96fc-9a98f45fbb6a" containerName="registry-server" probeResult="failure" output=<
Mar 19 10:53:39 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s
Mar 19 10:53:39 crc kubenswrapper[4835]: >
Mar 19 10:53:45 crc kubenswrapper[4835]: I0319 10:53:45.240583 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x6x7s"
Mar 19 10:53:45 crc kubenswrapper[4835]: I0319 10:53:45.310120 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x6x7s"
Mar 19 10:53:45 crc kubenswrapper[4835]: I0319 10:53:45.490713 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x6x7s"]
Mar 19 10:53:47 crc kubenswrapper[4835]: I0319 10:53:47.288126 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-x6x7s" podUID="e9f5cf63-8df4-4e8d-a603-3c6839de5248" containerName="registry-server" containerID="cri-o://2464315b1b71fad584d6c2ec5e8181796d3272cda2a713f5233a1e905d702505" gracePeriod=2
Mar 19 10:53:47 crc kubenswrapper[4835]: I0319 10:53:47.920691 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x6x7s"
Mar 19 10:53:47 crc kubenswrapper[4835]: I0319 10:53:47.937368 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87n5v\" (UniqueName: \"kubernetes.io/projected/e9f5cf63-8df4-4e8d-a603-3c6839de5248-kube-api-access-87n5v\") pod \"e9f5cf63-8df4-4e8d-a603-3c6839de5248\" (UID: \"e9f5cf63-8df4-4e8d-a603-3c6839de5248\") "
Mar 19 10:53:47 crc kubenswrapper[4835]: I0319 10:53:47.937482 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9f5cf63-8df4-4e8d-a603-3c6839de5248-catalog-content\") pod \"e9f5cf63-8df4-4e8d-a603-3c6839de5248\" (UID: \"e9f5cf63-8df4-4e8d-a603-3c6839de5248\") "
Mar 19 10:53:47 crc kubenswrapper[4835]: I0319 10:53:47.937779 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9f5cf63-8df4-4e8d-a603-3c6839de5248-utilities\") pod \"e9f5cf63-8df4-4e8d-a603-3c6839de5248\" (UID: \"e9f5cf63-8df4-4e8d-a603-3c6839de5248\") "
Mar 19 10:53:47 crc kubenswrapper[4835]: I0319 10:53:47.939593 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9f5cf63-8df4-4e8d-a603-3c6839de5248-utilities" (OuterVolumeSpecName: "utilities") pod "e9f5cf63-8df4-4e8d-a603-3c6839de5248" (UID: "e9f5cf63-8df4-4e8d-a603-3c6839de5248"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 10:53:47 crc kubenswrapper[4835]: I0319 10:53:47.949616 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9f5cf63-8df4-4e8d-a603-3c6839de5248-kube-api-access-87n5v" (OuterVolumeSpecName: "kube-api-access-87n5v") pod "e9f5cf63-8df4-4e8d-a603-3c6839de5248" (UID: "e9f5cf63-8df4-4e8d-a603-3c6839de5248"). InnerVolumeSpecName "kube-api-access-87n5v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 10:53:48 crc kubenswrapper[4835]: I0319 10:53:48.044119 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9f5cf63-8df4-4e8d-a603-3c6839de5248-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 10:53:48 crc kubenswrapper[4835]: I0319 10:53:48.044154 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87n5v\" (UniqueName: \"kubernetes.io/projected/e9f5cf63-8df4-4e8d-a603-3c6839de5248-kube-api-access-87n5v\") on node \"crc\" DevicePath \"\""
Mar 19 10:53:48 crc kubenswrapper[4835]: I0319 10:53:48.072910 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9f5cf63-8df4-4e8d-a603-3c6839de5248-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e9f5cf63-8df4-4e8d-a603-3c6839de5248" (UID: "e9f5cf63-8df4-4e8d-a603-3c6839de5248"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 10:53:48 crc kubenswrapper[4835]: I0319 10:53:48.147205 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9f5cf63-8df4-4e8d-a603-3c6839de5248-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 10:53:48 crc kubenswrapper[4835]: I0319 10:53:48.220543 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-27c99"
Mar 19 10:53:48 crc kubenswrapper[4835]: I0319 10:53:48.300023 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-27c99"
Mar 19 10:53:48 crc kubenswrapper[4835]: I0319 10:53:48.313332 4835 generic.go:334] "Generic (PLEG): container finished" podID="e9f5cf63-8df4-4e8d-a603-3c6839de5248" containerID="2464315b1b71fad584d6c2ec5e8181796d3272cda2a713f5233a1e905d702505" exitCode=0
Mar 19 10:53:48 crc kubenswrapper[4835]: I0319 10:53:48.313440 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6x7s" event={"ID":"e9f5cf63-8df4-4e8d-a603-3c6839de5248","Type":"ContainerDied","Data":"2464315b1b71fad584d6c2ec5e8181796d3272cda2a713f5233a1e905d702505"}
Mar 19 10:53:48 crc kubenswrapper[4835]: I0319 10:53:48.313458 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x6x7s"
Mar 19 10:53:48 crc kubenswrapper[4835]: I0319 10:53:48.313494 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6x7s" event={"ID":"e9f5cf63-8df4-4e8d-a603-3c6839de5248","Type":"ContainerDied","Data":"c7251cb1693e093e91d6eed61be61577e22eac52398ee7c15f793639d43a6551"}
Mar 19 10:53:48 crc kubenswrapper[4835]: I0319 10:53:48.313517 4835 scope.go:117] "RemoveContainer" containerID="2464315b1b71fad584d6c2ec5e8181796d3272cda2a713f5233a1e905d702505"
Mar 19 10:53:48 crc kubenswrapper[4835]: I0319 10:53:48.338546 4835 scope.go:117] "RemoveContainer" containerID="ac32ef57959b64a4114e4040dfe0760f26ae715dc52285241ec4603a6d6b18b8"
Mar 19 10:53:48 crc kubenswrapper[4835]: I0319 10:53:48.386161 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x6x7s"]
Mar 19 10:53:48 crc kubenswrapper[4835]: I0319 10:53:48.397233 4835 scope.go:117] "RemoveContainer" containerID="729f9f5ba69fdb7bb29e1795bd2a0eb02881c80340fd8291ea1013b081e065b4"
Mar 19 10:53:48 crc kubenswrapper[4835]: I0319 10:53:48.430151 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x6x7s"]
Mar 19 10:53:48 crc kubenswrapper[4835]: I0319 10:53:48.453382 4835 scope.go:117] "RemoveContainer" containerID="2464315b1b71fad584d6c2ec5e8181796d3272cda2a713f5233a1e905d702505"
Mar 19 10:53:48 crc kubenswrapper[4835]: E0319 10:53:48.454056 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2464315b1b71fad584d6c2ec5e8181796d3272cda2a713f5233a1e905d702505\": container with ID starting with 2464315b1b71fad584d6c2ec5e8181796d3272cda2a713f5233a1e905d702505 not found: ID does not exist" containerID="2464315b1b71fad584d6c2ec5e8181796d3272cda2a713f5233a1e905d702505"
Mar 19 10:53:48 crc kubenswrapper[4835]: I0319 10:53:48.454096 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2464315b1b71fad584d6c2ec5e8181796d3272cda2a713f5233a1e905d702505"} err="failed to get container status \"2464315b1b71fad584d6c2ec5e8181796d3272cda2a713f5233a1e905d702505\": rpc error: code = NotFound desc = could not find container \"2464315b1b71fad584d6c2ec5e8181796d3272cda2a713f5233a1e905d702505\": container with ID starting with 2464315b1b71fad584d6c2ec5e8181796d3272cda2a713f5233a1e905d702505 not found: ID does not exist"
Mar 19 10:53:48 crc kubenswrapper[4835]: I0319 10:53:48.454123 4835 scope.go:117] "RemoveContainer" containerID="ac32ef57959b64a4114e4040dfe0760f26ae715dc52285241ec4603a6d6b18b8"
Mar 19 10:53:48 crc kubenswrapper[4835]: E0319 10:53:48.454419 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac32ef57959b64a4114e4040dfe0760f26ae715dc52285241ec4603a6d6b18b8\": container with ID starting with ac32ef57959b64a4114e4040dfe0760f26ae715dc52285241ec4603a6d6b18b8 not found: ID does not exist" containerID="ac32ef57959b64a4114e4040dfe0760f26ae715dc52285241ec4603a6d6b18b8"
Mar 19 10:53:48 crc kubenswrapper[4835]: I0319 10:53:48.454447 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac32ef57959b64a4114e4040dfe0760f26ae715dc52285241ec4603a6d6b18b8"} err="failed to get container status \"ac32ef57959b64a4114e4040dfe0760f26ae715dc52285241ec4603a6d6b18b8\": rpc error: code = NotFound desc = could not find container \"ac32ef57959b64a4114e4040dfe0760f26ae715dc52285241ec4603a6d6b18b8\": container with ID starting with ac32ef57959b64a4114e4040dfe0760f26ae715dc52285241ec4603a6d6b18b8 not found: ID does not exist"
Mar 19 10:53:48 crc kubenswrapper[4835]: I0319 10:53:48.454460 4835 scope.go:117] "RemoveContainer" containerID="729f9f5ba69fdb7bb29e1795bd2a0eb02881c80340fd8291ea1013b081e065b4"
Mar 19 10:53:48 crc kubenswrapper[4835]: E0319 10:53:48.454757 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"729f9f5ba69fdb7bb29e1795bd2a0eb02881c80340fd8291ea1013b081e065b4\": container with ID starting with 729f9f5ba69fdb7bb29e1795bd2a0eb02881c80340fd8291ea1013b081e065b4 not found: ID does not exist" containerID="729f9f5ba69fdb7bb29e1795bd2a0eb02881c80340fd8291ea1013b081e065b4"
Mar 19 10:53:48 crc kubenswrapper[4835]: I0319 10:53:48.454776 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"729f9f5ba69fdb7bb29e1795bd2a0eb02881c80340fd8291ea1013b081e065b4"} err="failed to get container status \"729f9f5ba69fdb7bb29e1795bd2a0eb02881c80340fd8291ea1013b081e065b4\": rpc error: code = NotFound desc = could not find container \"729f9f5ba69fdb7bb29e1795bd2a0eb02881c80340fd8291ea1013b081e065b4\": container with ID starting with 729f9f5ba69fdb7bb29e1795bd2a0eb02881c80340fd8291ea1013b081e065b4 not found: ID does not exist"
Mar 19 10:53:49 crc kubenswrapper[4835]: I0319 10:53:49.287484 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-27c99"]
Mar 19 10:53:49 crc kubenswrapper[4835]: I0319 10:53:49.323083 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-27c99" podUID="53a76423-3313-4b62-96fc-9a98f45fbb6a" containerName="registry-server" containerID="cri-o://1f68740ed7aabe6f9ea48e10f37f03fac31f9e4ca4c1238f405707b943f74d10" gracePeriod=2
Mar 19 10:53:49 crc kubenswrapper[4835]: I0319 10:53:49.918865 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-27c99"
Mar 19 10:53:50 crc kubenswrapper[4835]: I0319 10:53:50.098931 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53a76423-3313-4b62-96fc-9a98f45fbb6a-utilities\") pod \"53a76423-3313-4b62-96fc-9a98f45fbb6a\" (UID: \"53a76423-3313-4b62-96fc-9a98f45fbb6a\") "
Mar 19 10:53:50 crc kubenswrapper[4835]: I0319 10:53:50.099016 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53a76423-3313-4b62-96fc-9a98f45fbb6a-catalog-content\") pod \"53a76423-3313-4b62-96fc-9a98f45fbb6a\" (UID: \"53a76423-3313-4b62-96fc-9a98f45fbb6a\") "
Mar 19 10:53:50 crc kubenswrapper[4835]: I0319 10:53:50.099144 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spt9d\" (UniqueName: \"kubernetes.io/projected/53a76423-3313-4b62-96fc-9a98f45fbb6a-kube-api-access-spt9d\") pod \"53a76423-3313-4b62-96fc-9a98f45fbb6a\" (UID: \"53a76423-3313-4b62-96fc-9a98f45fbb6a\") "
Mar 19 10:53:50 crc kubenswrapper[4835]: I0319 10:53:50.102130 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53a76423-3313-4b62-96fc-9a98f45fbb6a-utilities" (OuterVolumeSpecName: "utilities") pod "53a76423-3313-4b62-96fc-9a98f45fbb6a" (UID: "53a76423-3313-4b62-96fc-9a98f45fbb6a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 10:53:50 crc kubenswrapper[4835]: I0319 10:53:50.112451 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53a76423-3313-4b62-96fc-9a98f45fbb6a-kube-api-access-spt9d" (OuterVolumeSpecName: "kube-api-access-spt9d") pod "53a76423-3313-4b62-96fc-9a98f45fbb6a" (UID: "53a76423-3313-4b62-96fc-9a98f45fbb6a"). InnerVolumeSpecName "kube-api-access-spt9d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 10:53:50 crc kubenswrapper[4835]: I0319 10:53:50.164551 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53a76423-3313-4b62-96fc-9a98f45fbb6a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "53a76423-3313-4b62-96fc-9a98f45fbb6a" (UID: "53a76423-3313-4b62-96fc-9a98f45fbb6a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 10:53:50 crc kubenswrapper[4835]: I0319 10:53:50.204370 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spt9d\" (UniqueName: \"kubernetes.io/projected/53a76423-3313-4b62-96fc-9a98f45fbb6a-kube-api-access-spt9d\") on node \"crc\" DevicePath \"\""
Mar 19 10:53:50 crc kubenswrapper[4835]: I0319 10:53:50.204415 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53a76423-3313-4b62-96fc-9a98f45fbb6a-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 10:53:50 crc kubenswrapper[4835]: I0319 10:53:50.204429 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53a76423-3313-4b62-96fc-9a98f45fbb6a-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 10:53:50 crc kubenswrapper[4835]: I0319 10:53:50.341894 4835 generic.go:334] "Generic (PLEG): container finished" podID="53a76423-3313-4b62-96fc-9a98f45fbb6a" containerID="1f68740ed7aabe6f9ea48e10f37f03fac31f9e4ca4c1238f405707b943f74d10" exitCode=0
Mar 19 10:53:50 crc kubenswrapper[4835]: I0319 10:53:50.341994 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-27c99" event={"ID":"53a76423-3313-4b62-96fc-9a98f45fbb6a","Type":"ContainerDied","Data":"1f68740ed7aabe6f9ea48e10f37f03fac31f9e4ca4c1238f405707b943f74d10"}
Mar 19 10:53:50 crc kubenswrapper[4835]: I0319 10:53:50.342031 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-27c99" event={"ID":"53a76423-3313-4b62-96fc-9a98f45fbb6a","Type":"ContainerDied","Data":"d6b6837c853be187a2663a1f5ff494107ef0a3728bddd81c0f42413e70f24150"}
Mar 19 10:53:50 crc kubenswrapper[4835]: I0319 10:53:50.342071 4835 scope.go:117] "RemoveContainer" containerID="1f68740ed7aabe6f9ea48e10f37f03fac31f9e4ca4c1238f405707b943f74d10"
Mar 19 10:53:50 crc kubenswrapper[4835]: I0319 10:53:50.342357 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-27c99"
Mar 19 10:53:50 crc kubenswrapper[4835]: I0319 10:53:50.370173 4835 scope.go:117] "RemoveContainer" containerID="e0fce8bc005d08f8620e4856b76bb8f96bbe7f1f81a9096399375d85d4a35cfc"
Mar 19 10:53:50 crc kubenswrapper[4835]: I0319 10:53:50.420363 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9f5cf63-8df4-4e8d-a603-3c6839de5248" path="/var/lib/kubelet/pods/e9f5cf63-8df4-4e8d-a603-3c6839de5248/volumes"
Mar 19 10:53:50 crc kubenswrapper[4835]: I0319 10:53:50.421352 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-27c99"]
Mar 19 10:53:50 crc kubenswrapper[4835]: I0319 10:53:50.421378 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-27c99"]
Mar 19 10:53:50 crc kubenswrapper[4835]: I0319 10:53:50.434983 4835 scope.go:117] "RemoveContainer" containerID="eb2822be1e242a65a4490783e665823f996f7524358a52374aae455f8c32a8f0"
Mar 19 10:53:50 crc kubenswrapper[4835]: I0319 10:53:50.489559 4835 scope.go:117] "RemoveContainer" containerID="1f68740ed7aabe6f9ea48e10f37f03fac31f9e4ca4c1238f405707b943f74d10"
Mar 19 10:53:50 crc kubenswrapper[4835]: E0319 10:53:50.490146 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f68740ed7aabe6f9ea48e10f37f03fac31f9e4ca4c1238f405707b943f74d10\": container with ID starting with 1f68740ed7aabe6f9ea48e10f37f03fac31f9e4ca4c1238f405707b943f74d10 not found: ID does not exist" containerID="1f68740ed7aabe6f9ea48e10f37f03fac31f9e4ca4c1238f405707b943f74d10"
Mar 19 10:53:50 crc kubenswrapper[4835]: I0319 10:53:50.490183 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f68740ed7aabe6f9ea48e10f37f03fac31f9e4ca4c1238f405707b943f74d10"} err="failed to get container status \"1f68740ed7aabe6f9ea48e10f37f03fac31f9e4ca4c1238f405707b943f74d10\": rpc error: code = NotFound desc = could not find container \"1f68740ed7aabe6f9ea48e10f37f03fac31f9e4ca4c1238f405707b943f74d10\": container with ID starting with 1f68740ed7aabe6f9ea48e10f37f03fac31f9e4ca4c1238f405707b943f74d10 not found: ID does not exist"
Mar 19 10:53:50 crc kubenswrapper[4835]: I0319 10:53:50.490219 4835 scope.go:117] "RemoveContainer" containerID="e0fce8bc005d08f8620e4856b76bb8f96bbe7f1f81a9096399375d85d4a35cfc"
Mar 19 10:53:50 crc kubenswrapper[4835]: E0319 10:53:50.490535 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0fce8bc005d08f8620e4856b76bb8f96bbe7f1f81a9096399375d85d4a35cfc\": container with ID starting with e0fce8bc005d08f8620e4856b76bb8f96bbe7f1f81a9096399375d85d4a35cfc not found: ID does not exist" containerID="e0fce8bc005d08f8620e4856b76bb8f96bbe7f1f81a9096399375d85d4a35cfc"
Mar 19 10:53:50 crc kubenswrapper[4835]: I0319 10:53:50.490563 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0fce8bc005d08f8620e4856b76bb8f96bbe7f1f81a9096399375d85d4a35cfc"} err="failed to get container status \"e0fce8bc005d08f8620e4856b76bb8f96bbe7f1f81a9096399375d85d4a35cfc\": rpc error: code = NotFound desc = could not find container \"e0fce8bc005d08f8620e4856b76bb8f96bbe7f1f81a9096399375d85d4a35cfc\": container with ID starting with e0fce8bc005d08f8620e4856b76bb8f96bbe7f1f81a9096399375d85d4a35cfc not found: ID does not exist"
Mar 19 10:53:50 crc kubenswrapper[4835]: I0319 10:53:50.490580 4835 scope.go:117] "RemoveContainer" containerID="eb2822be1e242a65a4490783e665823f996f7524358a52374aae455f8c32a8f0"
Mar 19 10:53:50 crc kubenswrapper[4835]: E0319 10:53:50.490922 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb2822be1e242a65a4490783e665823f996f7524358a52374aae455f8c32a8f0\": container with ID starting with eb2822be1e242a65a4490783e665823f996f7524358a52374aae455f8c32a8f0 not found: ID does not exist" containerID="eb2822be1e242a65a4490783e665823f996f7524358a52374aae455f8c32a8f0"
Mar 19 10:53:50 crc kubenswrapper[4835]: I0319 10:53:50.490955 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb2822be1e242a65a4490783e665823f996f7524358a52374aae455f8c32a8f0"} err="failed to get container status \"eb2822be1e242a65a4490783e665823f996f7524358a52374aae455f8c32a8f0\": rpc error: code = NotFound desc = could not find container \"eb2822be1e242a65a4490783e665823f996f7524358a52374aae455f8c32a8f0\": container with ID starting with eb2822be1e242a65a4490783e665823f996f7524358a52374aae455f8c32a8f0 not found: ID does not exist"
Mar 19 10:53:52 crc kubenswrapper[4835]: I0319 10:53:52.424785 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53a76423-3313-4b62-96fc-9a98f45fbb6a" path="/var/lib/kubelet/pods/53a76423-3313-4b62-96fc-9a98f45fbb6a/volumes"
Mar 19 10:54:00 crc kubenswrapper[4835]: I0319 10:54:00.167987 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565294-rr6l2"]
Mar 19 10:54:00 crc kubenswrapper[4835]: E0319 10:54:00.169487 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53a76423-3313-4b62-96fc-9a98f45fbb6a" containerName="registry-server"
Mar 19 10:54:00 crc kubenswrapper[4835]: I0319 10:54:00.169506 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="53a76423-3313-4b62-96fc-9a98f45fbb6a" containerName="registry-server"
Mar 19 10:54:00 crc kubenswrapper[4835]: E0319 10:54:00.169542 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53a76423-3313-4b62-96fc-9a98f45fbb6a" containerName="extract-utilities"
Mar 19 10:54:00 crc kubenswrapper[4835]: I0319 10:54:00.169552 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="53a76423-3313-4b62-96fc-9a98f45fbb6a" containerName="extract-utilities"
Mar 19 10:54:00 crc kubenswrapper[4835]: E0319 10:54:00.169579 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53a76423-3313-4b62-96fc-9a98f45fbb6a" containerName="extract-content"
Mar 19 10:54:00 crc kubenswrapper[4835]: I0319 10:54:00.169588 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="53a76423-3313-4b62-96fc-9a98f45fbb6a" containerName="extract-content"
Mar 19 10:54:00 crc kubenswrapper[4835]: E0319 10:54:00.169613 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9f5cf63-8df4-4e8d-a603-3c6839de5248" containerName="extract-utilities"
Mar 19 10:54:00 crc kubenswrapper[4835]: I0319 10:54:00.169622 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9f5cf63-8df4-4e8d-a603-3c6839de5248" containerName="extract-utilities"
Mar 19 10:54:00 crc kubenswrapper[4835]: E0319 10:54:00.169652 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9f5cf63-8df4-4e8d-a603-3c6839de5248" containerName="extract-content"
Mar 19 10:54:00 crc kubenswrapper[4835]: I0319 10:54:00.169661 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9f5cf63-8df4-4e8d-a603-3c6839de5248" containerName="extract-content"
Mar 19 10:54:00 crc kubenswrapper[4835]: E0319 10:54:00.169681 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33211b2a-7ffa-44b3-b5ac-66891d2e56c1" containerName="gather"
Mar 19 10:54:00 crc kubenswrapper[4835]: I0319 10:54:00.169691 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="33211b2a-7ffa-44b3-b5ac-66891d2e56c1" containerName="gather"
Mar 19 10:54:00 crc kubenswrapper[4835]: E0319 10:54:00.169708 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9f5cf63-8df4-4e8d-a603-3c6839de5248" containerName="registry-server"
Mar 19 10:54:00 crc kubenswrapper[4835]: I0319 10:54:00.169717 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9f5cf63-8df4-4e8d-a603-3c6839de5248" containerName="registry-server"
Mar 19 10:54:00 crc kubenswrapper[4835]: E0319 10:54:00.169763 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33211b2a-7ffa-44b3-b5ac-66891d2e56c1" containerName="copy"
Mar 19 10:54:00 crc kubenswrapper[4835]: I0319 10:54:00.169773 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="33211b2a-7ffa-44b3-b5ac-66891d2e56c1" containerName="copy"
Mar 19 10:54:00 crc kubenswrapper[4835]: I0319 10:54:00.170083 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="33211b2a-7ffa-44b3-b5ac-66891d2e56c1" containerName="copy"
Mar 19 10:54:00 crc kubenswrapper[4835]: I0319 10:54:00.170184 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="53a76423-3313-4b62-96fc-9a98f45fbb6a" containerName="registry-server"
Mar 19 10:54:00 crc kubenswrapper[4835]: I0319 10:54:00.170210 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="33211b2a-7ffa-44b3-b5ac-66891d2e56c1" containerName="gather"
Mar 19 10:54:00 crc kubenswrapper[4835]: I0319 10:54:00.170237 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9f5cf63-8df4-4e8d-a603-3c6839de5248" containerName="registry-server"
Mar 19 10:54:00 crc kubenswrapper[4835]: I0319 10:54:00.171472 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565294-rr6l2"
Mar 19 10:54:00 crc kubenswrapper[4835]: I0319 10:54:00.175363 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 10:54:00 crc kubenswrapper[4835]: I0319 10:54:00.175502 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw"
Mar 19 10:54:00 crc kubenswrapper[4835]: I0319 10:54:00.175579 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 10:54:00 crc kubenswrapper[4835]: I0319 10:54:00.178237 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565294-rr6l2"]
Mar 19 10:54:00 crc kubenswrapper[4835]: I0319 10:54:00.269619 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fnmb\" (UniqueName: \"kubernetes.io/projected/4ffb3e93-cb55-4408-bdd3-716858ecf5df-kube-api-access-5fnmb\") pod \"auto-csr-approver-29565294-rr6l2\" (UID: \"4ffb3e93-cb55-4408-bdd3-716858ecf5df\") " pod="openshift-infra/auto-csr-approver-29565294-rr6l2"
Mar 19 10:54:00 crc kubenswrapper[4835]: I0319 10:54:00.371658 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fnmb\" (UniqueName: \"kubernetes.io/projected/4ffb3e93-cb55-4408-bdd3-716858ecf5df-kube-api-access-5fnmb\") pod \"auto-csr-approver-29565294-rr6l2\" (UID: \"4ffb3e93-cb55-4408-bdd3-716858ecf5df\") " pod="openshift-infra/auto-csr-approver-29565294-rr6l2"
Mar 19 10:54:00 crc kubenswrapper[4835]: I0319 10:54:00.388883 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fnmb\" (UniqueName: \"kubernetes.io/projected/4ffb3e93-cb55-4408-bdd3-716858ecf5df-kube-api-access-5fnmb\") pod \"auto-csr-approver-29565294-rr6l2\" (UID: \"4ffb3e93-cb55-4408-bdd3-716858ecf5df\") " pod="openshift-infra/auto-csr-approver-29565294-rr6l2"
Mar 19 10:54:00 crc kubenswrapper[4835]: I0319 10:54:00.567066 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565294-rr6l2"
Mar 19 10:54:01 crc kubenswrapper[4835]: I0319 10:54:01.075439 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565294-rr6l2"]
Mar 19 10:54:01 crc kubenswrapper[4835]: W0319 10:54:01.085143 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ffb3e93_cb55_4408_bdd3_716858ecf5df.slice/crio-6c3b415dd84dd9b31cdf3799b079b66e4a815be59cba40059aa34ad7f6af71f8 WatchSource:0}: Error finding container 6c3b415dd84dd9b31cdf3799b079b66e4a815be59cba40059aa34ad7f6af71f8: Status 404 returned error can't find the container with id 6c3b415dd84dd9b31cdf3799b079b66e4a815be59cba40059aa34ad7f6af71f8
Mar 19 10:54:01 crc kubenswrapper[4835]: I0319 10:54:01.460452 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565294-rr6l2" event={"ID":"4ffb3e93-cb55-4408-bdd3-716858ecf5df","Type":"ContainerStarted","Data":"6c3b415dd84dd9b31cdf3799b079b66e4a815be59cba40059aa34ad7f6af71f8"}
Mar 19 10:54:02 crc kubenswrapper[4835]: I0319 10:54:02.473907 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565294-rr6l2" event={"ID":"4ffb3e93-cb55-4408-bdd3-716858ecf5df","Type":"ContainerStarted","Data":"6e8e2c8cf7c57d842da8280b247c5322ba0912a94c7642297556938940a5f816"}
Mar 19 10:54:02 crc kubenswrapper[4835]: I0319 10:54:02.498230 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565294-rr6l2" podStartSLOduration=1.614466473 podStartE2EDuration="2.498209111s" podCreationTimestamp="2026-03-19 10:54:00 +0000 UTC" firstStartedPulling="2026-03-19 10:54:01.089383868 +0000 UTC m=+5495.937982455" lastFinishedPulling="2026-03-19 10:54:01.973126506 +0000 UTC m=+5496.821725093" observedRunningTime="2026-03-19 10:54:02.488632511 +0000 UTC m=+5497.337231098" watchObservedRunningTime="2026-03-19 10:54:02.498209111 +0000 UTC m=+5497.346807698"
Mar 19 10:54:03 crc kubenswrapper[4835]: I0319 10:54:03.494929 4835 generic.go:334] "Generic (PLEG): container finished" podID="4ffb3e93-cb55-4408-bdd3-716858ecf5df" containerID="6e8e2c8cf7c57d842da8280b247c5322ba0912a94c7642297556938940a5f816" exitCode=0
Mar 19 10:54:03 crc kubenswrapper[4835]: I0319 10:54:03.494993 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565294-rr6l2" event={"ID":"4ffb3e93-cb55-4408-bdd3-716858ecf5df","Type":"ContainerDied","Data":"6e8e2c8cf7c57d842da8280b247c5322ba0912a94c7642297556938940a5f816"}
Mar 19 10:54:05 crc kubenswrapper[4835]: I0319 10:54:05.014772 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565294-rr6l2"
Mar 19 10:54:05 crc kubenswrapper[4835]: I0319 10:54:05.106985 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fnmb\" (UniqueName: \"kubernetes.io/projected/4ffb3e93-cb55-4408-bdd3-716858ecf5df-kube-api-access-5fnmb\") pod \"4ffb3e93-cb55-4408-bdd3-716858ecf5df\" (UID: \"4ffb3e93-cb55-4408-bdd3-716858ecf5df\") "
Mar 19 10:54:05 crc kubenswrapper[4835]: I0319 10:54:05.119660 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ffb3e93-cb55-4408-bdd3-716858ecf5df-kube-api-access-5fnmb" (OuterVolumeSpecName: "kube-api-access-5fnmb") pod "4ffb3e93-cb55-4408-bdd3-716858ecf5df" (UID: "4ffb3e93-cb55-4408-bdd3-716858ecf5df"). InnerVolumeSpecName "kube-api-access-5fnmb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:54:05 crc kubenswrapper[4835]: I0319 10:54:05.210693 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fnmb\" (UniqueName: \"kubernetes.io/projected/4ffb3e93-cb55-4408-bdd3-716858ecf5df-kube-api-access-5fnmb\") on node \"crc\" DevicePath \"\"" Mar 19 10:54:05 crc kubenswrapper[4835]: I0319 10:54:05.521119 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565294-rr6l2" event={"ID":"4ffb3e93-cb55-4408-bdd3-716858ecf5df","Type":"ContainerDied","Data":"6c3b415dd84dd9b31cdf3799b079b66e4a815be59cba40059aa34ad7f6af71f8"} Mar 19 10:54:05 crc kubenswrapper[4835]: I0319 10:54:05.521164 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c3b415dd84dd9b31cdf3799b079b66e4a815be59cba40059aa34ad7f6af71f8" Mar 19 10:54:05 crc kubenswrapper[4835]: I0319 10:54:05.521225 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565294-rr6l2" Mar 19 10:54:05 crc kubenswrapper[4835]: I0319 10:54:05.588143 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565288-tr2vs"] Mar 19 10:54:05 crc kubenswrapper[4835]: I0319 10:54:05.603529 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565288-tr2vs"] Mar 19 10:54:05 crc kubenswrapper[4835]: I0319 10:54:05.686131 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m6nv5"] Mar 19 10:54:05 crc kubenswrapper[4835]: E0319 10:54:05.688690 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ffb3e93-cb55-4408-bdd3-716858ecf5df" containerName="oc" Mar 19 10:54:05 crc kubenswrapper[4835]: I0319 10:54:05.688767 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ffb3e93-cb55-4408-bdd3-716858ecf5df" containerName="oc" Mar 19 10:54:05 crc 
kubenswrapper[4835]: I0319 10:54:05.689144 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ffb3e93-cb55-4408-bdd3-716858ecf5df" containerName="oc" Mar 19 10:54:05 crc kubenswrapper[4835]: I0319 10:54:05.691518 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m6nv5" Mar 19 10:54:05 crc kubenswrapper[4835]: I0319 10:54:05.721419 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m6nv5"] Mar 19 10:54:05 crc kubenswrapper[4835]: I0319 10:54:05.825908 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fc55486-5a3d-417b-aa8e-a1cec1a60ae5-catalog-content\") pod \"redhat-operators-m6nv5\" (UID: \"1fc55486-5a3d-417b-aa8e-a1cec1a60ae5\") " pod="openshift-marketplace/redhat-operators-m6nv5" Mar 19 10:54:05 crc kubenswrapper[4835]: I0319 10:54:05.825972 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qsh9\" (UniqueName: \"kubernetes.io/projected/1fc55486-5a3d-417b-aa8e-a1cec1a60ae5-kube-api-access-6qsh9\") pod \"redhat-operators-m6nv5\" (UID: \"1fc55486-5a3d-417b-aa8e-a1cec1a60ae5\") " pod="openshift-marketplace/redhat-operators-m6nv5" Mar 19 10:54:05 crc kubenswrapper[4835]: I0319 10:54:05.826140 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fc55486-5a3d-417b-aa8e-a1cec1a60ae5-utilities\") pod \"redhat-operators-m6nv5\" (UID: \"1fc55486-5a3d-417b-aa8e-a1cec1a60ae5\") " pod="openshift-marketplace/redhat-operators-m6nv5" Mar 19 10:54:05 crc kubenswrapper[4835]: I0319 10:54:05.928462 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1fc55486-5a3d-417b-aa8e-a1cec1a60ae5-utilities\") pod \"redhat-operators-m6nv5\" (UID: \"1fc55486-5a3d-417b-aa8e-a1cec1a60ae5\") " pod="openshift-marketplace/redhat-operators-m6nv5" Mar 19 10:54:05 crc kubenswrapper[4835]: I0319 10:54:05.928587 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fc55486-5a3d-417b-aa8e-a1cec1a60ae5-catalog-content\") pod \"redhat-operators-m6nv5\" (UID: \"1fc55486-5a3d-417b-aa8e-a1cec1a60ae5\") " pod="openshift-marketplace/redhat-operators-m6nv5" Mar 19 10:54:05 crc kubenswrapper[4835]: I0319 10:54:05.928621 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qsh9\" (UniqueName: \"kubernetes.io/projected/1fc55486-5a3d-417b-aa8e-a1cec1a60ae5-kube-api-access-6qsh9\") pod \"redhat-operators-m6nv5\" (UID: \"1fc55486-5a3d-417b-aa8e-a1cec1a60ae5\") " pod="openshift-marketplace/redhat-operators-m6nv5" Mar 19 10:54:05 crc kubenswrapper[4835]: I0319 10:54:05.929089 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fc55486-5a3d-417b-aa8e-a1cec1a60ae5-utilities\") pod \"redhat-operators-m6nv5\" (UID: \"1fc55486-5a3d-417b-aa8e-a1cec1a60ae5\") " pod="openshift-marketplace/redhat-operators-m6nv5" Mar 19 10:54:05 crc kubenswrapper[4835]: I0319 10:54:05.929327 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fc55486-5a3d-417b-aa8e-a1cec1a60ae5-catalog-content\") pod \"redhat-operators-m6nv5\" (UID: \"1fc55486-5a3d-417b-aa8e-a1cec1a60ae5\") " pod="openshift-marketplace/redhat-operators-m6nv5" Mar 19 10:54:05 crc kubenswrapper[4835]: I0319 10:54:05.953267 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qsh9\" (UniqueName: 
\"kubernetes.io/projected/1fc55486-5a3d-417b-aa8e-a1cec1a60ae5-kube-api-access-6qsh9\") pod \"redhat-operators-m6nv5\" (UID: \"1fc55486-5a3d-417b-aa8e-a1cec1a60ae5\") " pod="openshift-marketplace/redhat-operators-m6nv5" Mar 19 10:54:06 crc kubenswrapper[4835]: I0319 10:54:06.017558 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m6nv5" Mar 19 10:54:06 crc kubenswrapper[4835]: I0319 10:54:06.417082 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2328b041-bba8-4288-973a-bc3b4f998713" path="/var/lib/kubelet/pods/2328b041-bba8-4288-973a-bc3b4f998713/volumes" Mar 19 10:54:06 crc kubenswrapper[4835]: I0319 10:54:06.612399 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m6nv5"] Mar 19 10:54:07 crc kubenswrapper[4835]: I0319 10:54:07.547397 4835 generic.go:334] "Generic (PLEG): container finished" podID="1fc55486-5a3d-417b-aa8e-a1cec1a60ae5" containerID="0e712011819805940c035a1be9b066f63f27b085b5fab3683f3bc2c3e44bd7e2" exitCode=0 Mar 19 10:54:07 crc kubenswrapper[4835]: I0319 10:54:07.547495 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6nv5" event={"ID":"1fc55486-5a3d-417b-aa8e-a1cec1a60ae5","Type":"ContainerDied","Data":"0e712011819805940c035a1be9b066f63f27b085b5fab3683f3bc2c3e44bd7e2"} Mar 19 10:54:07 crc kubenswrapper[4835]: I0319 10:54:07.547815 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6nv5" event={"ID":"1fc55486-5a3d-417b-aa8e-a1cec1a60ae5","Type":"ContainerStarted","Data":"2a4ef495a42fd1b827799bb6a28de49a6913f081cd1ef35125f846aef2e14019"} Mar 19 10:54:09 crc kubenswrapper[4835]: I0319 10:54:09.590612 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6nv5" 
event={"ID":"1fc55486-5a3d-417b-aa8e-a1cec1a60ae5","Type":"ContainerStarted","Data":"eb2aee765be82c1bdf4553a32e657eb615a3419a9ffc1b1b148737180c24fc88"} Mar 19 10:54:13 crc kubenswrapper[4835]: I0319 10:54:13.725068 4835 generic.go:334] "Generic (PLEG): container finished" podID="1fc55486-5a3d-417b-aa8e-a1cec1a60ae5" containerID="eb2aee765be82c1bdf4553a32e657eb615a3419a9ffc1b1b148737180c24fc88" exitCode=0 Mar 19 10:54:13 crc kubenswrapper[4835]: I0319 10:54:13.725399 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6nv5" event={"ID":"1fc55486-5a3d-417b-aa8e-a1cec1a60ae5","Type":"ContainerDied","Data":"eb2aee765be82c1bdf4553a32e657eb615a3419a9ffc1b1b148737180c24fc88"} Mar 19 10:54:14 crc kubenswrapper[4835]: I0319 10:54:14.748544 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6nv5" event={"ID":"1fc55486-5a3d-417b-aa8e-a1cec1a60ae5","Type":"ContainerStarted","Data":"4805b8c16e71839ca0e94ace2a4e62fc00a1b6ba1b2395e0203e1dabf47a6255"} Mar 19 10:54:14 crc kubenswrapper[4835]: I0319 10:54:14.770184 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m6nv5" podStartSLOduration=3.104671435 podStartE2EDuration="9.770155261s" podCreationTimestamp="2026-03-19 10:54:05 +0000 UTC" firstStartedPulling="2026-03-19 10:54:07.550101615 +0000 UTC m=+5502.398700222" lastFinishedPulling="2026-03-19 10:54:14.215585461 +0000 UTC m=+5509.064184048" observedRunningTime="2026-03-19 10:54:14.763898791 +0000 UTC m=+5509.612497388" watchObservedRunningTime="2026-03-19 10:54:14.770155261 +0000 UTC m=+5509.618753868" Mar 19 10:54:16 crc kubenswrapper[4835]: I0319 10:54:16.019245 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m6nv5" Mar 19 10:54:16 crc kubenswrapper[4835]: I0319 10:54:16.019605 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-operators-m6nv5" Mar 19 10:54:17 crc kubenswrapper[4835]: I0319 10:54:17.076081 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m6nv5" podUID="1fc55486-5a3d-417b-aa8e-a1cec1a60ae5" containerName="registry-server" probeResult="failure" output=< Mar 19 10:54:17 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:54:17 crc kubenswrapper[4835]: > Mar 19 10:54:18 crc kubenswrapper[4835]: I0319 10:54:18.118815 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gvhmw"] Mar 19 10:54:18 crc kubenswrapper[4835]: I0319 10:54:18.121567 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gvhmw" Mar 19 10:54:18 crc kubenswrapper[4835]: I0319 10:54:18.145364 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gvhmw"] Mar 19 10:54:18 crc kubenswrapper[4835]: I0319 10:54:18.174583 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/070bb6eb-db06-4636-87bf-3a2c7a8086e6-utilities\") pod \"redhat-marketplace-gvhmw\" (UID: \"070bb6eb-db06-4636-87bf-3a2c7a8086e6\") " pod="openshift-marketplace/redhat-marketplace-gvhmw" Mar 19 10:54:18 crc kubenswrapper[4835]: I0319 10:54:18.174806 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc2nj\" (UniqueName: \"kubernetes.io/projected/070bb6eb-db06-4636-87bf-3a2c7a8086e6-kube-api-access-sc2nj\") pod \"redhat-marketplace-gvhmw\" (UID: \"070bb6eb-db06-4636-87bf-3a2c7a8086e6\") " pod="openshift-marketplace/redhat-marketplace-gvhmw" Mar 19 10:54:18 crc kubenswrapper[4835]: I0319 10:54:18.174879 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/070bb6eb-db06-4636-87bf-3a2c7a8086e6-catalog-content\") pod \"redhat-marketplace-gvhmw\" (UID: \"070bb6eb-db06-4636-87bf-3a2c7a8086e6\") " pod="openshift-marketplace/redhat-marketplace-gvhmw" Mar 19 10:54:18 crc kubenswrapper[4835]: I0319 10:54:18.277918 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc2nj\" (UniqueName: \"kubernetes.io/projected/070bb6eb-db06-4636-87bf-3a2c7a8086e6-kube-api-access-sc2nj\") pod \"redhat-marketplace-gvhmw\" (UID: \"070bb6eb-db06-4636-87bf-3a2c7a8086e6\") " pod="openshift-marketplace/redhat-marketplace-gvhmw" Mar 19 10:54:18 crc kubenswrapper[4835]: I0319 10:54:18.278071 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/070bb6eb-db06-4636-87bf-3a2c7a8086e6-catalog-content\") pod \"redhat-marketplace-gvhmw\" (UID: \"070bb6eb-db06-4636-87bf-3a2c7a8086e6\") " pod="openshift-marketplace/redhat-marketplace-gvhmw" Mar 19 10:54:18 crc kubenswrapper[4835]: I0319 10:54:18.278583 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/070bb6eb-db06-4636-87bf-3a2c7a8086e6-utilities\") pod \"redhat-marketplace-gvhmw\" (UID: \"070bb6eb-db06-4636-87bf-3a2c7a8086e6\") " pod="openshift-marketplace/redhat-marketplace-gvhmw" Mar 19 10:54:18 crc kubenswrapper[4835]: I0319 10:54:18.279211 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/070bb6eb-db06-4636-87bf-3a2c7a8086e6-utilities\") pod \"redhat-marketplace-gvhmw\" (UID: \"070bb6eb-db06-4636-87bf-3a2c7a8086e6\") " pod="openshift-marketplace/redhat-marketplace-gvhmw" Mar 19 10:54:18 crc kubenswrapper[4835]: I0319 10:54:18.279259 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/070bb6eb-db06-4636-87bf-3a2c7a8086e6-catalog-content\") pod \"redhat-marketplace-gvhmw\" (UID: \"070bb6eb-db06-4636-87bf-3a2c7a8086e6\") " pod="openshift-marketplace/redhat-marketplace-gvhmw" Mar 19 10:54:18 crc kubenswrapper[4835]: I0319 10:54:18.306961 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc2nj\" (UniqueName: \"kubernetes.io/projected/070bb6eb-db06-4636-87bf-3a2c7a8086e6-kube-api-access-sc2nj\") pod \"redhat-marketplace-gvhmw\" (UID: \"070bb6eb-db06-4636-87bf-3a2c7a8086e6\") " pod="openshift-marketplace/redhat-marketplace-gvhmw" Mar 19 10:54:18 crc kubenswrapper[4835]: I0319 10:54:18.462889 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gvhmw" Mar 19 10:54:19 crc kubenswrapper[4835]: I0319 10:54:19.086953 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gvhmw"] Mar 19 10:54:19 crc kubenswrapper[4835]: I0319 10:54:19.836398 4835 generic.go:334] "Generic (PLEG): container finished" podID="070bb6eb-db06-4636-87bf-3a2c7a8086e6" containerID="2978051d796889d0e8eb4c2ea777adca7c927405da249c7cb178b82b6f5a44f0" exitCode=0 Mar 19 10:54:19 crc kubenswrapper[4835]: I0319 10:54:19.836444 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gvhmw" event={"ID":"070bb6eb-db06-4636-87bf-3a2c7a8086e6","Type":"ContainerDied","Data":"2978051d796889d0e8eb4c2ea777adca7c927405da249c7cb178b82b6f5a44f0"} Mar 19 10:54:19 crc kubenswrapper[4835]: I0319 10:54:19.836474 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gvhmw" event={"ID":"070bb6eb-db06-4636-87bf-3a2c7a8086e6","Type":"ContainerStarted","Data":"d2d2417b796108b9cd67643a7739be08af4a59bf3aa19455ed0ca8536d989b3f"} Mar 19 10:54:21 crc kubenswrapper[4835]: I0319 10:54:21.869915 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-gvhmw" event={"ID":"070bb6eb-db06-4636-87bf-3a2c7a8086e6","Type":"ContainerStarted","Data":"421b4a50b74a670de6d8ff4c73fb4cdece4bc0bc83715add4dd92e0d94a71e9a"} Mar 19 10:54:22 crc kubenswrapper[4835]: I0319 10:54:22.882837 4835 generic.go:334] "Generic (PLEG): container finished" podID="070bb6eb-db06-4636-87bf-3a2c7a8086e6" containerID="421b4a50b74a670de6d8ff4c73fb4cdece4bc0bc83715add4dd92e0d94a71e9a" exitCode=0 Mar 19 10:54:22 crc kubenswrapper[4835]: I0319 10:54:22.882894 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gvhmw" event={"ID":"070bb6eb-db06-4636-87bf-3a2c7a8086e6","Type":"ContainerDied","Data":"421b4a50b74a670de6d8ff4c73fb4cdece4bc0bc83715add4dd92e0d94a71e9a"} Mar 19 10:54:23 crc kubenswrapper[4835]: I0319 10:54:23.084489 4835 scope.go:117] "RemoveContainer" containerID="1a1afb45e1c8b61862618c91e4b9ab157fd82066089d4926dc68843ca76a710c" Mar 19 10:54:24 crc kubenswrapper[4835]: I0319 10:54:24.905245 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gvhmw" event={"ID":"070bb6eb-db06-4636-87bf-3a2c7a8086e6","Type":"ContainerStarted","Data":"a2bf8fc56becbc534abcfa2ac1e6d6a1d8a78d09141aa07ee52c862e2c69abd8"} Mar 19 10:54:24 crc kubenswrapper[4835]: I0319 10:54:24.931263 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gvhmw" podStartSLOduration=2.975828139 podStartE2EDuration="6.931242788s" podCreationTimestamp="2026-03-19 10:54:18 +0000 UTC" firstStartedPulling="2026-03-19 10:54:19.83843643 +0000 UTC m=+5514.687035037" lastFinishedPulling="2026-03-19 10:54:23.793851099 +0000 UTC m=+5518.642449686" observedRunningTime="2026-03-19 10:54:24.919507078 +0000 UTC m=+5519.768105665" watchObservedRunningTime="2026-03-19 10:54:24.931242788 +0000 UTC m=+5519.779841375" Mar 19 10:54:27 crc kubenswrapper[4835]: I0319 10:54:27.093616 4835 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m6nv5" podUID="1fc55486-5a3d-417b-aa8e-a1cec1a60ae5" containerName="registry-server" probeResult="failure" output=< Mar 19 10:54:27 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:54:27 crc kubenswrapper[4835]: > Mar 19 10:54:28 crc kubenswrapper[4835]: I0319 10:54:28.463061 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gvhmw" Mar 19 10:54:28 crc kubenswrapper[4835]: I0319 10:54:28.463721 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gvhmw" Mar 19 10:54:29 crc kubenswrapper[4835]: I0319 10:54:29.567254 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-gvhmw" podUID="070bb6eb-db06-4636-87bf-3a2c7a8086e6" containerName="registry-server" probeResult="failure" output=< Mar 19 10:54:29 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:54:29 crc kubenswrapper[4835]: > Mar 19 10:54:36 crc kubenswrapper[4835]: I0319 10:54:36.422850 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:54:36 crc kubenswrapper[4835]: I0319 10:54:36.423609 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:54:37 crc kubenswrapper[4835]: I0319 10:54:37.077089 4835 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-m6nv5" podUID="1fc55486-5a3d-417b-aa8e-a1cec1a60ae5" containerName="registry-server" probeResult="failure" output=< Mar 19 10:54:37 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:54:37 crc kubenswrapper[4835]: > Mar 19 10:54:38 crc kubenswrapper[4835]: I0319 10:54:38.657143 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gvhmw" Mar 19 10:54:38 crc kubenswrapper[4835]: I0319 10:54:38.705873 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gvhmw" Mar 19 10:54:38 crc kubenswrapper[4835]: I0319 10:54:38.914849 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gvhmw"] Mar 19 10:54:40 crc kubenswrapper[4835]: I0319 10:54:40.090334 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gvhmw" podUID="070bb6eb-db06-4636-87bf-3a2c7a8086e6" containerName="registry-server" containerID="cri-o://a2bf8fc56becbc534abcfa2ac1e6d6a1d8a78d09141aa07ee52c862e2c69abd8" gracePeriod=2 Mar 19 10:54:41 crc kubenswrapper[4835]: I0319 10:54:41.105629 4835 generic.go:334] "Generic (PLEG): container finished" podID="070bb6eb-db06-4636-87bf-3a2c7a8086e6" containerID="a2bf8fc56becbc534abcfa2ac1e6d6a1d8a78d09141aa07ee52c862e2c69abd8" exitCode=0 Mar 19 10:54:41 crc kubenswrapper[4835]: I0319 10:54:41.105756 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gvhmw" event={"ID":"070bb6eb-db06-4636-87bf-3a2c7a8086e6","Type":"ContainerDied","Data":"a2bf8fc56becbc534abcfa2ac1e6d6a1d8a78d09141aa07ee52c862e2c69abd8"} Mar 19 10:54:41 crc kubenswrapper[4835]: I0319 10:54:41.106016 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gvhmw" 
event={"ID":"070bb6eb-db06-4636-87bf-3a2c7a8086e6","Type":"ContainerDied","Data":"d2d2417b796108b9cd67643a7739be08af4a59bf3aa19455ed0ca8536d989b3f"} Mar 19 10:54:41 crc kubenswrapper[4835]: I0319 10:54:41.106043 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2d2417b796108b9cd67643a7739be08af4a59bf3aa19455ed0ca8536d989b3f" Mar 19 10:54:41 crc kubenswrapper[4835]: I0319 10:54:41.207083 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gvhmw" Mar 19 10:54:41 crc kubenswrapper[4835]: I0319 10:54:41.294528 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/070bb6eb-db06-4636-87bf-3a2c7a8086e6-catalog-content\") pod \"070bb6eb-db06-4636-87bf-3a2c7a8086e6\" (UID: \"070bb6eb-db06-4636-87bf-3a2c7a8086e6\") " Mar 19 10:54:41 crc kubenswrapper[4835]: I0319 10:54:41.294783 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sc2nj\" (UniqueName: \"kubernetes.io/projected/070bb6eb-db06-4636-87bf-3a2c7a8086e6-kube-api-access-sc2nj\") pod \"070bb6eb-db06-4636-87bf-3a2c7a8086e6\" (UID: \"070bb6eb-db06-4636-87bf-3a2c7a8086e6\") " Mar 19 10:54:41 crc kubenswrapper[4835]: I0319 10:54:41.294811 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/070bb6eb-db06-4636-87bf-3a2c7a8086e6-utilities\") pod \"070bb6eb-db06-4636-87bf-3a2c7a8086e6\" (UID: \"070bb6eb-db06-4636-87bf-3a2c7a8086e6\") " Mar 19 10:54:41 crc kubenswrapper[4835]: I0319 10:54:41.295670 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/070bb6eb-db06-4636-87bf-3a2c7a8086e6-utilities" (OuterVolumeSpecName: "utilities") pod "070bb6eb-db06-4636-87bf-3a2c7a8086e6" (UID: "070bb6eb-db06-4636-87bf-3a2c7a8086e6"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:54:41 crc kubenswrapper[4835]: I0319 10:54:41.310434 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/070bb6eb-db06-4636-87bf-3a2c7a8086e6-kube-api-access-sc2nj" (OuterVolumeSpecName: "kube-api-access-sc2nj") pod "070bb6eb-db06-4636-87bf-3a2c7a8086e6" (UID: "070bb6eb-db06-4636-87bf-3a2c7a8086e6"). InnerVolumeSpecName "kube-api-access-sc2nj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:54:41 crc kubenswrapper[4835]: I0319 10:54:41.324265 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/070bb6eb-db06-4636-87bf-3a2c7a8086e6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "070bb6eb-db06-4636-87bf-3a2c7a8086e6" (UID: "070bb6eb-db06-4636-87bf-3a2c7a8086e6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:54:41 crc kubenswrapper[4835]: I0319 10:54:41.397966 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/070bb6eb-db06-4636-87bf-3a2c7a8086e6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 10:54:41 crc kubenswrapper[4835]: I0319 10:54:41.397995 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sc2nj\" (UniqueName: \"kubernetes.io/projected/070bb6eb-db06-4636-87bf-3a2c7a8086e6-kube-api-access-sc2nj\") on node \"crc\" DevicePath \"\"" Mar 19 10:54:41 crc kubenswrapper[4835]: I0319 10:54:41.398005 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/070bb6eb-db06-4636-87bf-3a2c7a8086e6-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 10:54:42 crc kubenswrapper[4835]: I0319 10:54:42.117064 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gvhmw" Mar 19 10:54:42 crc kubenswrapper[4835]: I0319 10:54:42.161587 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gvhmw"] Mar 19 10:54:42 crc kubenswrapper[4835]: I0319 10:54:42.172566 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gvhmw"] Mar 19 10:54:42 crc kubenswrapper[4835]: I0319 10:54:42.425476 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="070bb6eb-db06-4636-87bf-3a2c7a8086e6" path="/var/lib/kubelet/pods/070bb6eb-db06-4636-87bf-3a2c7a8086e6/volumes" Mar 19 10:54:47 crc kubenswrapper[4835]: I0319 10:54:47.081851 4835 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m6nv5" podUID="1fc55486-5a3d-417b-aa8e-a1cec1a60ae5" containerName="registry-server" probeResult="failure" output=< Mar 19 10:54:47 crc kubenswrapper[4835]: timeout: failed to connect service ":50051" within 1s Mar 19 10:54:47 crc kubenswrapper[4835]: > Mar 19 10:54:56 crc kubenswrapper[4835]: I0319 10:54:56.086956 4835 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m6nv5" Mar 19 10:54:56 crc kubenswrapper[4835]: I0319 10:54:56.162344 4835 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m6nv5" Mar 19 10:54:56 crc kubenswrapper[4835]: I0319 10:54:56.346843 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m6nv5"] Mar 19 10:54:57 crc kubenswrapper[4835]: I0319 10:54:57.296169 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-m6nv5" podUID="1fc55486-5a3d-417b-aa8e-a1cec1a60ae5" containerName="registry-server" containerID="cri-o://4805b8c16e71839ca0e94ace2a4e62fc00a1b6ba1b2395e0203e1dabf47a6255" 
gracePeriod=2 Mar 19 10:54:57 crc kubenswrapper[4835]: I0319 10:54:57.837877 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m6nv5" Mar 19 10:54:57 crc kubenswrapper[4835]: I0319 10:54:57.950344 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qsh9\" (UniqueName: \"kubernetes.io/projected/1fc55486-5a3d-417b-aa8e-a1cec1a60ae5-kube-api-access-6qsh9\") pod \"1fc55486-5a3d-417b-aa8e-a1cec1a60ae5\" (UID: \"1fc55486-5a3d-417b-aa8e-a1cec1a60ae5\") " Mar 19 10:54:57 crc kubenswrapper[4835]: I0319 10:54:57.950667 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fc55486-5a3d-417b-aa8e-a1cec1a60ae5-utilities\") pod \"1fc55486-5a3d-417b-aa8e-a1cec1a60ae5\" (UID: \"1fc55486-5a3d-417b-aa8e-a1cec1a60ae5\") " Mar 19 10:54:57 crc kubenswrapper[4835]: I0319 10:54:57.950889 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fc55486-5a3d-417b-aa8e-a1cec1a60ae5-catalog-content\") pod \"1fc55486-5a3d-417b-aa8e-a1cec1a60ae5\" (UID: \"1fc55486-5a3d-417b-aa8e-a1cec1a60ae5\") " Mar 19 10:54:57 crc kubenswrapper[4835]: I0319 10:54:57.951186 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fc55486-5a3d-417b-aa8e-a1cec1a60ae5-utilities" (OuterVolumeSpecName: "utilities") pod "1fc55486-5a3d-417b-aa8e-a1cec1a60ae5" (UID: "1fc55486-5a3d-417b-aa8e-a1cec1a60ae5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:54:57 crc kubenswrapper[4835]: I0319 10:54:57.955679 4835 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fc55486-5a3d-417b-aa8e-a1cec1a60ae5-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 10:54:57 crc kubenswrapper[4835]: I0319 10:54:57.980975 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fc55486-5a3d-417b-aa8e-a1cec1a60ae5-kube-api-access-6qsh9" (OuterVolumeSpecName: "kube-api-access-6qsh9") pod "1fc55486-5a3d-417b-aa8e-a1cec1a60ae5" (UID: "1fc55486-5a3d-417b-aa8e-a1cec1a60ae5"). InnerVolumeSpecName "kube-api-access-6qsh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:54:58 crc kubenswrapper[4835]: I0319 10:54:58.059277 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qsh9\" (UniqueName: \"kubernetes.io/projected/1fc55486-5a3d-417b-aa8e-a1cec1a60ae5-kube-api-access-6qsh9\") on node \"crc\" DevicePath \"\"" Mar 19 10:54:58 crc kubenswrapper[4835]: I0319 10:54:58.123831 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fc55486-5a3d-417b-aa8e-a1cec1a60ae5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1fc55486-5a3d-417b-aa8e-a1cec1a60ae5" (UID: "1fc55486-5a3d-417b-aa8e-a1cec1a60ae5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:54:58 crc kubenswrapper[4835]: I0319 10:54:58.161795 4835 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fc55486-5a3d-417b-aa8e-a1cec1a60ae5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 10:54:58 crc kubenswrapper[4835]: I0319 10:54:58.314211 4835 generic.go:334] "Generic (PLEG): container finished" podID="1fc55486-5a3d-417b-aa8e-a1cec1a60ae5" containerID="4805b8c16e71839ca0e94ace2a4e62fc00a1b6ba1b2395e0203e1dabf47a6255" exitCode=0 Mar 19 10:54:58 crc kubenswrapper[4835]: I0319 10:54:58.314258 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6nv5" event={"ID":"1fc55486-5a3d-417b-aa8e-a1cec1a60ae5","Type":"ContainerDied","Data":"4805b8c16e71839ca0e94ace2a4e62fc00a1b6ba1b2395e0203e1dabf47a6255"} Mar 19 10:54:58 crc kubenswrapper[4835]: I0319 10:54:58.314294 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6nv5" event={"ID":"1fc55486-5a3d-417b-aa8e-a1cec1a60ae5","Type":"ContainerDied","Data":"2a4ef495a42fd1b827799bb6a28de49a6913f081cd1ef35125f846aef2e14019"} Mar 19 10:54:58 crc kubenswrapper[4835]: I0319 10:54:58.314298 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m6nv5" Mar 19 10:54:58 crc kubenswrapper[4835]: I0319 10:54:58.314313 4835 scope.go:117] "RemoveContainer" containerID="4805b8c16e71839ca0e94ace2a4e62fc00a1b6ba1b2395e0203e1dabf47a6255" Mar 19 10:54:58 crc kubenswrapper[4835]: I0319 10:54:58.343440 4835 scope.go:117] "RemoveContainer" containerID="eb2aee765be82c1bdf4553a32e657eb615a3419a9ffc1b1b148737180c24fc88" Mar 19 10:54:58 crc kubenswrapper[4835]: I0319 10:54:58.356570 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m6nv5"] Mar 19 10:54:58 crc kubenswrapper[4835]: I0319 10:54:58.367044 4835 scope.go:117] "RemoveContainer" containerID="0e712011819805940c035a1be9b066f63f27b085b5fab3683f3bc2c3e44bd7e2" Mar 19 10:54:58 crc kubenswrapper[4835]: I0319 10:54:58.368124 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m6nv5"] Mar 19 10:54:58 crc kubenswrapper[4835]: I0319 10:54:58.420185 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fc55486-5a3d-417b-aa8e-a1cec1a60ae5" path="/var/lib/kubelet/pods/1fc55486-5a3d-417b-aa8e-a1cec1a60ae5/volumes" Mar 19 10:54:58 crc kubenswrapper[4835]: I0319 10:54:58.429478 4835 scope.go:117] "RemoveContainer" containerID="4805b8c16e71839ca0e94ace2a4e62fc00a1b6ba1b2395e0203e1dabf47a6255" Mar 19 10:54:58 crc kubenswrapper[4835]: E0319 10:54:58.429974 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4805b8c16e71839ca0e94ace2a4e62fc00a1b6ba1b2395e0203e1dabf47a6255\": container with ID starting with 4805b8c16e71839ca0e94ace2a4e62fc00a1b6ba1b2395e0203e1dabf47a6255 not found: ID does not exist" containerID="4805b8c16e71839ca0e94ace2a4e62fc00a1b6ba1b2395e0203e1dabf47a6255" Mar 19 10:54:58 crc kubenswrapper[4835]: I0319 10:54:58.430011 4835 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4805b8c16e71839ca0e94ace2a4e62fc00a1b6ba1b2395e0203e1dabf47a6255"} err="failed to get container status \"4805b8c16e71839ca0e94ace2a4e62fc00a1b6ba1b2395e0203e1dabf47a6255\": rpc error: code = NotFound desc = could not find container \"4805b8c16e71839ca0e94ace2a4e62fc00a1b6ba1b2395e0203e1dabf47a6255\": container with ID starting with 4805b8c16e71839ca0e94ace2a4e62fc00a1b6ba1b2395e0203e1dabf47a6255 not found: ID does not exist" Mar 19 10:54:58 crc kubenswrapper[4835]: I0319 10:54:58.430033 4835 scope.go:117] "RemoveContainer" containerID="eb2aee765be82c1bdf4553a32e657eb615a3419a9ffc1b1b148737180c24fc88" Mar 19 10:54:58 crc kubenswrapper[4835]: E0319 10:54:58.430397 4835 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb2aee765be82c1bdf4553a32e657eb615a3419a9ffc1b1b148737180c24fc88\": container with ID starting with eb2aee765be82c1bdf4553a32e657eb615a3419a9ffc1b1b148737180c24fc88 not found: ID does not exist" containerID="eb2aee765be82c1bdf4553a32e657eb615a3419a9ffc1b1b148737180c24fc88" Mar 19 10:54:58 crc kubenswrapper[4835]: I0319 10:54:58.430438 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb2aee765be82c1bdf4553a32e657eb615a3419a9ffc1b1b148737180c24fc88"} err="failed to get container status \"eb2aee765be82c1bdf4553a32e657eb615a3419a9ffc1b1b148737180c24fc88\": rpc error: code = NotFound desc = could not find container \"eb2aee765be82c1bdf4553a32e657eb615a3419a9ffc1b1b148737180c24fc88\": container with ID starting with eb2aee765be82c1bdf4553a32e657eb615a3419a9ffc1b1b148737180c24fc88 not found: ID does not exist" Mar 19 10:54:58 crc kubenswrapper[4835]: I0319 10:54:58.430464 4835 scope.go:117] "RemoveContainer" containerID="0e712011819805940c035a1be9b066f63f27b085b5fab3683f3bc2c3e44bd7e2" Mar 19 10:54:58 crc kubenswrapper[4835]: E0319 10:54:58.430721 4835 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"0e712011819805940c035a1be9b066f63f27b085b5fab3683f3bc2c3e44bd7e2\": container with ID starting with 0e712011819805940c035a1be9b066f63f27b085b5fab3683f3bc2c3e44bd7e2 not found: ID does not exist" containerID="0e712011819805940c035a1be9b066f63f27b085b5fab3683f3bc2c3e44bd7e2" Mar 19 10:54:58 crc kubenswrapper[4835]: I0319 10:54:58.430764 4835 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e712011819805940c035a1be9b066f63f27b085b5fab3683f3bc2c3e44bd7e2"} err="failed to get container status \"0e712011819805940c035a1be9b066f63f27b085b5fab3683f3bc2c3e44bd7e2\": rpc error: code = NotFound desc = could not find container \"0e712011819805940c035a1be9b066f63f27b085b5fab3683f3bc2c3e44bd7e2\": container with ID starting with 0e712011819805940c035a1be9b066f63f27b085b5fab3683f3bc2c3e44bd7e2 not found: ID does not exist" Mar 19 10:55:06 crc kubenswrapper[4835]: I0319 10:55:06.422490 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:55:06 crc kubenswrapper[4835]: I0319 10:55:06.423117 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:55:36 crc kubenswrapper[4835]: I0319 10:55:36.422236 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Mar 19 10:55:36 crc kubenswrapper[4835]: I0319 10:55:36.423120 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:55:36 crc kubenswrapper[4835]: I0319 10:55:36.425841 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" Mar 19 10:55:36 crc kubenswrapper[4835]: I0319 10:55:36.427874 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9a2e2737a0b29536ee9ca76417bf67df18e5965e00950ba619c67e9ef25d8f01"} pod="openshift-machine-config-operator/machine-config-daemon-bk84k" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 10:55:36 crc kubenswrapper[4835]: I0319 10:55:36.428051 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" containerID="cri-o://9a2e2737a0b29536ee9ca76417bf67df18e5965e00950ba619c67e9ef25d8f01" gracePeriod=600 Mar 19 10:55:36 crc kubenswrapper[4835]: I0319 10:55:36.805138 4835 generic.go:334] "Generic (PLEG): container finished" podID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerID="9a2e2737a0b29536ee9ca76417bf67df18e5965e00950ba619c67e9ef25d8f01" exitCode=0 Mar 19 10:55:36 crc kubenswrapper[4835]: I0319 10:55:36.805397 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" 
event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerDied","Data":"9a2e2737a0b29536ee9ca76417bf67df18e5965e00950ba619c67e9ef25d8f01"} Mar 19 10:55:36 crc kubenswrapper[4835]: I0319 10:55:36.805710 4835 scope.go:117] "RemoveContainer" containerID="2999600d8da067b17effdd03f999dbcb19e52bd95cdcf24fbdbbde5c1f05fc02" Mar 19 10:55:37 crc kubenswrapper[4835]: I0319 10:55:37.824219 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerStarted","Data":"bca2c33ce268fd8b5411d31a88904c02008726df361c229dfafcadd5783c9508"} Mar 19 10:56:00 crc kubenswrapper[4835]: I0319 10:56:00.149329 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565296-q6cqz"] Mar 19 10:56:00 crc kubenswrapper[4835]: E0319 10:56:00.150316 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fc55486-5a3d-417b-aa8e-a1cec1a60ae5" containerName="extract-content" Mar 19 10:56:00 crc kubenswrapper[4835]: I0319 10:56:00.150330 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fc55486-5a3d-417b-aa8e-a1cec1a60ae5" containerName="extract-content" Mar 19 10:56:00 crc kubenswrapper[4835]: E0319 10:56:00.150350 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="070bb6eb-db06-4636-87bf-3a2c7a8086e6" containerName="extract-utilities" Mar 19 10:56:00 crc kubenswrapper[4835]: I0319 10:56:00.150358 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="070bb6eb-db06-4636-87bf-3a2c7a8086e6" containerName="extract-utilities" Mar 19 10:56:00 crc kubenswrapper[4835]: E0319 10:56:00.150382 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fc55486-5a3d-417b-aa8e-a1cec1a60ae5" containerName="extract-utilities" Mar 19 10:56:00 crc kubenswrapper[4835]: I0319 10:56:00.150388 4835 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1fc55486-5a3d-417b-aa8e-a1cec1a60ae5" containerName="extract-utilities" Mar 19 10:56:00 crc kubenswrapper[4835]: E0319 10:56:00.150404 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="070bb6eb-db06-4636-87bf-3a2c7a8086e6" containerName="extract-content" Mar 19 10:56:00 crc kubenswrapper[4835]: I0319 10:56:00.150410 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="070bb6eb-db06-4636-87bf-3a2c7a8086e6" containerName="extract-content" Mar 19 10:56:00 crc kubenswrapper[4835]: E0319 10:56:00.150434 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="070bb6eb-db06-4636-87bf-3a2c7a8086e6" containerName="registry-server" Mar 19 10:56:00 crc kubenswrapper[4835]: I0319 10:56:00.150440 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="070bb6eb-db06-4636-87bf-3a2c7a8086e6" containerName="registry-server" Mar 19 10:56:00 crc kubenswrapper[4835]: E0319 10:56:00.150456 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fc55486-5a3d-417b-aa8e-a1cec1a60ae5" containerName="registry-server" Mar 19 10:56:00 crc kubenswrapper[4835]: I0319 10:56:00.150462 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fc55486-5a3d-417b-aa8e-a1cec1a60ae5" containerName="registry-server" Mar 19 10:56:00 crc kubenswrapper[4835]: I0319 10:56:00.150891 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="070bb6eb-db06-4636-87bf-3a2c7a8086e6" containerName="registry-server" Mar 19 10:56:00 crc kubenswrapper[4835]: I0319 10:56:00.150942 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fc55486-5a3d-417b-aa8e-a1cec1a60ae5" containerName="registry-server" Mar 19 10:56:00 crc kubenswrapper[4835]: I0319 10:56:00.151810 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565296-q6cqz" Mar 19 10:56:00 crc kubenswrapper[4835]: I0319 10:56:00.154513 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 10:56:00 crc kubenswrapper[4835]: I0319 10:56:00.155511 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:56:00 crc kubenswrapper[4835]: I0319 10:56:00.156401 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:56:00 crc kubenswrapper[4835]: I0319 10:56:00.159059 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565296-q6cqz"] Mar 19 10:56:00 crc kubenswrapper[4835]: I0319 10:56:00.269478 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4bjm\" (UniqueName: \"kubernetes.io/projected/49d0f316-07b4-4d43-ae1f-ce15262eb275-kube-api-access-d4bjm\") pod \"auto-csr-approver-29565296-q6cqz\" (UID: \"49d0f316-07b4-4d43-ae1f-ce15262eb275\") " pod="openshift-infra/auto-csr-approver-29565296-q6cqz" Mar 19 10:56:00 crc kubenswrapper[4835]: I0319 10:56:00.373554 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4bjm\" (UniqueName: \"kubernetes.io/projected/49d0f316-07b4-4d43-ae1f-ce15262eb275-kube-api-access-d4bjm\") pod \"auto-csr-approver-29565296-q6cqz\" (UID: \"49d0f316-07b4-4d43-ae1f-ce15262eb275\") " pod="openshift-infra/auto-csr-approver-29565296-q6cqz" Mar 19 10:56:00 crc kubenswrapper[4835]: I0319 10:56:00.394325 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4bjm\" (UniqueName: \"kubernetes.io/projected/49d0f316-07b4-4d43-ae1f-ce15262eb275-kube-api-access-d4bjm\") pod \"auto-csr-approver-29565296-q6cqz\" (UID: \"49d0f316-07b4-4d43-ae1f-ce15262eb275\") " 
pod="openshift-infra/auto-csr-approver-29565296-q6cqz" Mar 19 10:56:00 crc kubenswrapper[4835]: I0319 10:56:00.506208 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565296-q6cqz" Mar 19 10:56:01 crc kubenswrapper[4835]: I0319 10:56:01.085883 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565296-q6cqz"] Mar 19 10:56:01 crc kubenswrapper[4835]: I0319 10:56:01.102526 4835 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 10:56:02 crc kubenswrapper[4835]: I0319 10:56:02.117082 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565296-q6cqz" event={"ID":"49d0f316-07b4-4d43-ae1f-ce15262eb275","Type":"ContainerStarted","Data":"b33f6ef12c49ddd69ebab78324a40dcde0f67415369ef3177903083387ef9dfb"} Mar 19 10:56:04 crc kubenswrapper[4835]: I0319 10:56:04.143146 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565296-q6cqz" event={"ID":"49d0f316-07b4-4d43-ae1f-ce15262eb275","Type":"ContainerStarted","Data":"5ce9ca80bc04e896e891f0f59376f4d01efec0023dc33537c9bf56d49b3bcb6c"} Mar 19 10:56:04 crc kubenswrapper[4835]: I0319 10:56:04.160306 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565296-q6cqz" podStartSLOduration=3.255939606 podStartE2EDuration="4.160281866s" podCreationTimestamp="2026-03-19 10:56:00 +0000 UTC" firstStartedPulling="2026-03-19 10:56:01.10016431 +0000 UTC m=+5615.948762907" lastFinishedPulling="2026-03-19 10:56:02.00450654 +0000 UTC m=+5616.853105167" observedRunningTime="2026-03-19 10:56:04.157986473 +0000 UTC m=+5619.006585110" watchObservedRunningTime="2026-03-19 10:56:04.160281866 +0000 UTC m=+5619.008880453" Mar 19 10:56:05 crc kubenswrapper[4835]: I0319 10:56:05.158652 4835 generic.go:334] "Generic (PLEG): container finished" 
podID="49d0f316-07b4-4d43-ae1f-ce15262eb275" containerID="5ce9ca80bc04e896e891f0f59376f4d01efec0023dc33537c9bf56d49b3bcb6c" exitCode=0 Mar 19 10:56:05 crc kubenswrapper[4835]: I0319 10:56:05.158771 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565296-q6cqz" event={"ID":"49d0f316-07b4-4d43-ae1f-ce15262eb275","Type":"ContainerDied","Data":"5ce9ca80bc04e896e891f0f59376f4d01efec0023dc33537c9bf56d49b3bcb6c"} Mar 19 10:56:06 crc kubenswrapper[4835]: I0319 10:56:06.622039 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565296-q6cqz" Mar 19 10:56:06 crc kubenswrapper[4835]: I0319 10:56:06.658325 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4bjm\" (UniqueName: \"kubernetes.io/projected/49d0f316-07b4-4d43-ae1f-ce15262eb275-kube-api-access-d4bjm\") pod \"49d0f316-07b4-4d43-ae1f-ce15262eb275\" (UID: \"49d0f316-07b4-4d43-ae1f-ce15262eb275\") " Mar 19 10:56:06 crc kubenswrapper[4835]: I0319 10:56:06.663894 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49d0f316-07b4-4d43-ae1f-ce15262eb275-kube-api-access-d4bjm" (OuterVolumeSpecName: "kube-api-access-d4bjm") pod "49d0f316-07b4-4d43-ae1f-ce15262eb275" (UID: "49d0f316-07b4-4d43-ae1f-ce15262eb275"). InnerVolumeSpecName "kube-api-access-d4bjm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:56:06 crc kubenswrapper[4835]: I0319 10:56:06.761019 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4bjm\" (UniqueName: \"kubernetes.io/projected/49d0f316-07b4-4d43-ae1f-ce15262eb275-kube-api-access-d4bjm\") on node \"crc\" DevicePath \"\"" Mar 19 10:56:07 crc kubenswrapper[4835]: I0319 10:56:07.192980 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565296-q6cqz" event={"ID":"49d0f316-07b4-4d43-ae1f-ce15262eb275","Type":"ContainerDied","Data":"b33f6ef12c49ddd69ebab78324a40dcde0f67415369ef3177903083387ef9dfb"} Mar 19 10:56:07 crc kubenswrapper[4835]: I0319 10:56:07.193441 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b33f6ef12c49ddd69ebab78324a40dcde0f67415369ef3177903083387ef9dfb" Mar 19 10:56:07 crc kubenswrapper[4835]: I0319 10:56:07.193015 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565296-q6cqz" Mar 19 10:56:07 crc kubenswrapper[4835]: I0319 10:56:07.265932 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565290-knmhk"] Mar 19 10:56:07 crc kubenswrapper[4835]: I0319 10:56:07.282188 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565290-knmhk"] Mar 19 10:56:08 crc kubenswrapper[4835]: I0319 10:56:08.416671 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ee233ed-4515-45d1-bef3-7af33a67f4b2" path="/var/lib/kubelet/pods/0ee233ed-4515-45d1-bef3-7af33a67f4b2/volumes" Mar 19 10:56:23 crc kubenswrapper[4835]: I0319 10:56:23.337615 4835 scope.go:117] "RemoveContainer" containerID="4bceae484cc675e16230845d6b9e60d8f94f7ff537823ea80718dcec33f2c127" Mar 19 10:57:36 crc kubenswrapper[4835]: I0319 10:57:36.422733 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:57:36 crc kubenswrapper[4835]: I0319 10:57:36.423335 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:58:00 crc kubenswrapper[4835]: I0319 10:58:00.148590 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565298-dxw66"] Mar 19 10:58:00 crc kubenswrapper[4835]: E0319 10:58:00.149846 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49d0f316-07b4-4d43-ae1f-ce15262eb275" containerName="oc" Mar 19 10:58:00 crc kubenswrapper[4835]: I0319 10:58:00.149930 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="49d0f316-07b4-4d43-ae1f-ce15262eb275" containerName="oc" Mar 19 10:58:00 crc kubenswrapper[4835]: I0319 10:58:00.150187 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="49d0f316-07b4-4d43-ae1f-ce15262eb275" containerName="oc" Mar 19 10:58:00 crc kubenswrapper[4835]: I0319 10:58:00.151146 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565298-dxw66" Mar 19 10:58:00 crc kubenswrapper[4835]: I0319 10:58:00.160295 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:58:00 crc kubenswrapper[4835]: I0319 10:58:00.160511 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:58:00 crc kubenswrapper[4835]: I0319 10:58:00.160672 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 10:58:00 crc kubenswrapper[4835]: I0319 10:58:00.164583 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565298-dxw66"] Mar 19 10:58:00 crc kubenswrapper[4835]: I0319 10:58:00.227941 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76vsm\" (UniqueName: \"kubernetes.io/projected/a8bca798-c8cb-45c4-b275-f9a0d0af5a45-kube-api-access-76vsm\") pod \"auto-csr-approver-29565298-dxw66\" (UID: \"a8bca798-c8cb-45c4-b275-f9a0d0af5a45\") " pod="openshift-infra/auto-csr-approver-29565298-dxw66" Mar 19 10:58:00 crc kubenswrapper[4835]: I0319 10:58:00.330311 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76vsm\" (UniqueName: \"kubernetes.io/projected/a8bca798-c8cb-45c4-b275-f9a0d0af5a45-kube-api-access-76vsm\") pod \"auto-csr-approver-29565298-dxw66\" (UID: \"a8bca798-c8cb-45c4-b275-f9a0d0af5a45\") " pod="openshift-infra/auto-csr-approver-29565298-dxw66" Mar 19 10:58:00 crc kubenswrapper[4835]: I0319 10:58:00.364701 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76vsm\" (UniqueName: \"kubernetes.io/projected/a8bca798-c8cb-45c4-b275-f9a0d0af5a45-kube-api-access-76vsm\") pod \"auto-csr-approver-29565298-dxw66\" (UID: \"a8bca798-c8cb-45c4-b275-f9a0d0af5a45\") " 
pod="openshift-infra/auto-csr-approver-29565298-dxw66" Mar 19 10:58:00 crc kubenswrapper[4835]: I0319 10:58:00.495808 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565298-dxw66" Mar 19 10:58:01 crc kubenswrapper[4835]: I0319 10:58:01.021089 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565298-dxw66"] Mar 19 10:58:01 crc kubenswrapper[4835]: I0319 10:58:01.597603 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565298-dxw66" event={"ID":"a8bca798-c8cb-45c4-b275-f9a0d0af5a45","Type":"ContainerStarted","Data":"fd132938eb0e17cad828fe4e9b13ff32123dd56920fd54eefa35bde222d12760"} Mar 19 10:58:02 crc kubenswrapper[4835]: I0319 10:58:02.608574 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565298-dxw66" event={"ID":"a8bca798-c8cb-45c4-b275-f9a0d0af5a45","Type":"ContainerStarted","Data":"dc71f6bdb470b7025f98746a668e03e2f07f19d4000beb568ed7c588c20ce9df"} Mar 19 10:58:02 crc kubenswrapper[4835]: I0319 10:58:02.656967 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565298-dxw66" podStartSLOduration=1.383426034 podStartE2EDuration="2.656943331s" podCreationTimestamp="2026-03-19 10:58:00 +0000 UTC" firstStartedPulling="2026-03-19 10:58:01.026859532 +0000 UTC m=+5735.875458119" lastFinishedPulling="2026-03-19 10:58:02.300376829 +0000 UTC m=+5737.148975416" observedRunningTime="2026-03-19 10:58:02.638218882 +0000 UTC m=+5737.486817469" watchObservedRunningTime="2026-03-19 10:58:02.656943331 +0000 UTC m=+5737.505541908" Mar 19 10:58:03 crc kubenswrapper[4835]: I0319 10:58:03.626360 4835 generic.go:334] "Generic (PLEG): container finished" podID="a8bca798-c8cb-45c4-b275-f9a0d0af5a45" containerID="dc71f6bdb470b7025f98746a668e03e2f07f19d4000beb568ed7c588c20ce9df" exitCode=0 Mar 19 10:58:03 crc 
kubenswrapper[4835]: I0319 10:58:03.626492 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565298-dxw66" event={"ID":"a8bca798-c8cb-45c4-b275-f9a0d0af5a45","Type":"ContainerDied","Data":"dc71f6bdb470b7025f98746a668e03e2f07f19d4000beb568ed7c588c20ce9df"} Mar 19 10:58:05 crc kubenswrapper[4835]: I0319 10:58:05.133330 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565298-dxw66" Mar 19 10:58:05 crc kubenswrapper[4835]: I0319 10:58:05.269507 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76vsm\" (UniqueName: \"kubernetes.io/projected/a8bca798-c8cb-45c4-b275-f9a0d0af5a45-kube-api-access-76vsm\") pod \"a8bca798-c8cb-45c4-b275-f9a0d0af5a45\" (UID: \"a8bca798-c8cb-45c4-b275-f9a0d0af5a45\") " Mar 19 10:58:05 crc kubenswrapper[4835]: I0319 10:58:05.274487 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8bca798-c8cb-45c4-b275-f9a0d0af5a45-kube-api-access-76vsm" (OuterVolumeSpecName: "kube-api-access-76vsm") pod "a8bca798-c8cb-45c4-b275-f9a0d0af5a45" (UID: "a8bca798-c8cb-45c4-b275-f9a0d0af5a45"). InnerVolumeSpecName "kube-api-access-76vsm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:58:05 crc kubenswrapper[4835]: I0319 10:58:05.372537 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76vsm\" (UniqueName: \"kubernetes.io/projected/a8bca798-c8cb-45c4-b275-f9a0d0af5a45-kube-api-access-76vsm\") on node \"crc\" DevicePath \"\"" Mar 19 10:58:05 crc kubenswrapper[4835]: I0319 10:58:05.648103 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565298-dxw66" event={"ID":"a8bca798-c8cb-45c4-b275-f9a0d0af5a45","Type":"ContainerDied","Data":"fd132938eb0e17cad828fe4e9b13ff32123dd56920fd54eefa35bde222d12760"} Mar 19 10:58:05 crc kubenswrapper[4835]: I0319 10:58:05.648149 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd132938eb0e17cad828fe4e9b13ff32123dd56920fd54eefa35bde222d12760" Mar 19 10:58:05 crc kubenswrapper[4835]: I0319 10:58:05.648235 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565298-dxw66" Mar 19 10:58:05 crc kubenswrapper[4835]: I0319 10:58:05.716498 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565292-8lc78"] Mar 19 10:58:05 crc kubenswrapper[4835]: I0319 10:58:05.726766 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565292-8lc78"] Mar 19 10:58:06 crc kubenswrapper[4835]: I0319 10:58:06.421964 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:58:06 crc kubenswrapper[4835]: I0319 10:58:06.422261 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" 
podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:58:06 crc kubenswrapper[4835]: I0319 10:58:06.430085 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3448cfa6-ed67-4b4d-883d-6b6ebad9ef7c" path="/var/lib/kubelet/pods/3448cfa6-ed67-4b4d-883d-6b6ebad9ef7c/volumes" Mar 19 10:58:23 crc kubenswrapper[4835]: I0319 10:58:23.485832 4835 scope.go:117] "RemoveContainer" containerID="aa78f5d6cbabd721a39ceb4667d4ce9e3704dd1c3dd9608faaaff1ff3813c5d3" Mar 19 10:58:36 crc kubenswrapper[4835]: I0319 10:58:36.422344 4835 patch_prober.go:28] interesting pod/machine-config-daemon-bk84k container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:58:36 crc kubenswrapper[4835]: I0319 10:58:36.422999 4835 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:58:36 crc kubenswrapper[4835]: I0319 10:58:36.423037 4835 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" Mar 19 10:58:36 crc kubenswrapper[4835]: I0319 10:58:36.423967 4835 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bca2c33ce268fd8b5411d31a88904c02008726df361c229dfafcadd5783c9508"} pod="openshift-machine-config-operator/machine-config-daemon-bk84k" containerMessage="Container machine-config-daemon failed liveness 
probe, will be restarted" Mar 19 10:58:36 crc kubenswrapper[4835]: I0319 10:58:36.424021 4835 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerName="machine-config-daemon" containerID="cri-o://bca2c33ce268fd8b5411d31a88904c02008726df361c229dfafcadd5783c9508" gracePeriod=600 Mar 19 10:58:36 crc kubenswrapper[4835]: E0319 10:58:36.548082 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:58:37 crc kubenswrapper[4835]: I0319 10:58:37.082411 4835 generic.go:334] "Generic (PLEG): container finished" podID="adf367e5-fedd-4d9e-a7af-345df1f08353" containerID="bca2c33ce268fd8b5411d31a88904c02008726df361c229dfafcadd5783c9508" exitCode=0 Mar 19 10:58:37 crc kubenswrapper[4835]: I0319 10:58:37.082527 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" event={"ID":"adf367e5-fedd-4d9e-a7af-345df1f08353","Type":"ContainerDied","Data":"bca2c33ce268fd8b5411d31a88904c02008726df361c229dfafcadd5783c9508"} Mar 19 10:58:37 crc kubenswrapper[4835]: I0319 10:58:37.082962 4835 scope.go:117] "RemoveContainer" containerID="9a2e2737a0b29536ee9ca76417bf67df18e5965e00950ba619c67e9ef25d8f01" Mar 19 10:58:37 crc kubenswrapper[4835]: I0319 10:58:37.084908 4835 scope.go:117] "RemoveContainer" containerID="bca2c33ce268fd8b5411d31a88904c02008726df361c229dfafcadd5783c9508" Mar 19 10:58:37 crc kubenswrapper[4835]: E0319 10:58:37.086000 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:58:52 crc kubenswrapper[4835]: I0319 10:58:52.402542 4835 scope.go:117] "RemoveContainer" containerID="bca2c33ce268fd8b5411d31a88904c02008726df361c229dfafcadd5783c9508" Mar 19 10:58:52 crc kubenswrapper[4835]: E0319 10:58:52.403149 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:58:56 crc kubenswrapper[4835]: I0319 10:58:56.092761 4835 trace.go:236] Trace[239138534]: "Calculate volume metrics of mysql-db for pod openstack/openstack-cell1-galera-0" (19-Mar-2026 10:58:54.975) (total time: 1117ms): Mar 19 10:58:56 crc kubenswrapper[4835]: Trace[239138534]: [1.117482605s] [1.117482605s] END Mar 19 10:59:07 crc kubenswrapper[4835]: I0319 10:59:07.402450 4835 scope.go:117] "RemoveContainer" containerID="bca2c33ce268fd8b5411d31a88904c02008726df361c229dfafcadd5783c9508" Mar 19 10:59:07 crc kubenswrapper[4835]: E0319 10:59:07.403724 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" 
podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:59:22 crc kubenswrapper[4835]: I0319 10:59:22.402832 4835 scope.go:117] "RemoveContainer" containerID="bca2c33ce268fd8b5411d31a88904c02008726df361c229dfafcadd5783c9508" Mar 19 10:59:22 crc kubenswrapper[4835]: E0319 10:59:22.403612 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:59:34 crc kubenswrapper[4835]: I0319 10:59:34.402566 4835 scope.go:117] "RemoveContainer" containerID="bca2c33ce268fd8b5411d31a88904c02008726df361c229dfafcadd5783c9508" Mar 19 10:59:34 crc kubenswrapper[4835]: E0319 10:59:34.403283 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 10:59:49 crc kubenswrapper[4835]: I0319 10:59:49.402335 4835 scope.go:117] "RemoveContainer" containerID="bca2c33ce268fd8b5411d31a88904c02008726df361c229dfafcadd5783c9508" Mar 19 10:59:49 crc kubenswrapper[4835]: E0319 10:59:49.403350 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 11:00:00 crc kubenswrapper[4835]: I0319 11:00:00.165072 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565300-nsrbc"] Mar 19 11:00:00 crc kubenswrapper[4835]: E0319 11:00:00.166484 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8bca798-c8cb-45c4-b275-f9a0d0af5a45" containerName="oc" Mar 19 11:00:00 crc kubenswrapper[4835]: I0319 11:00:00.166518 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8bca798-c8cb-45c4-b275-f9a0d0af5a45" containerName="oc" Mar 19 11:00:00 crc kubenswrapper[4835]: I0319 11:00:00.166992 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8bca798-c8cb-45c4-b275-f9a0d0af5a45" containerName="oc" Mar 19 11:00:00 crc kubenswrapper[4835]: I0319 11:00:00.168359 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565300-nsrbc" Mar 19 11:00:00 crc kubenswrapper[4835]: I0319 11:00:00.171144 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 11:00:00 crc kubenswrapper[4835]: I0319 11:00:00.171380 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 11:00:00 crc kubenswrapper[4835]: I0319 11:00:00.171542 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4qgw" Mar 19 11:00:00 crc kubenswrapper[4835]: I0319 11:00:00.183017 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565300-v7xbt"] Mar 19 11:00:00 crc kubenswrapper[4835]: I0319 11:00:00.184677 4835 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565300-v7xbt" Mar 19 11:00:00 crc kubenswrapper[4835]: I0319 11:00:00.186134 4835 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 11:00:00 crc kubenswrapper[4835]: I0319 11:00:00.186474 4835 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 11:00:00 crc kubenswrapper[4835]: I0319 11:00:00.200167 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565300-nsrbc"] Mar 19 11:00:00 crc kubenswrapper[4835]: I0319 11:00:00.226292 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565300-v7xbt"] Mar 19 11:00:00 crc kubenswrapper[4835]: I0319 11:00:00.286210 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2gmw\" (UniqueName: \"kubernetes.io/projected/d7b89ad1-470b-461d-88f4-aa9886535e32-kube-api-access-m2gmw\") pod \"collect-profiles-29565300-v7xbt\" (UID: \"d7b89ad1-470b-461d-88f4-aa9886535e32\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565300-v7xbt" Mar 19 11:00:00 crc kubenswrapper[4835]: I0319 11:00:00.286341 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7b89ad1-470b-461d-88f4-aa9886535e32-config-volume\") pod \"collect-profiles-29565300-v7xbt\" (UID: \"d7b89ad1-470b-461d-88f4-aa9886535e32\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565300-v7xbt" Mar 19 11:00:00 crc kubenswrapper[4835]: I0319 11:00:00.286407 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtlj2\" (UniqueName: 
\"kubernetes.io/projected/4ab05505-ce46-49e9-94b9-231d72500c38-kube-api-access-qtlj2\") pod \"auto-csr-approver-29565300-nsrbc\" (UID: \"4ab05505-ce46-49e9-94b9-231d72500c38\") " pod="openshift-infra/auto-csr-approver-29565300-nsrbc" Mar 19 11:00:00 crc kubenswrapper[4835]: I0319 11:00:00.286515 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7b89ad1-470b-461d-88f4-aa9886535e32-secret-volume\") pod \"collect-profiles-29565300-v7xbt\" (UID: \"d7b89ad1-470b-461d-88f4-aa9886535e32\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565300-v7xbt" Mar 19 11:00:00 crc kubenswrapper[4835]: I0319 11:00:00.389591 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7b89ad1-470b-461d-88f4-aa9886535e32-secret-volume\") pod \"collect-profiles-29565300-v7xbt\" (UID: \"d7b89ad1-470b-461d-88f4-aa9886535e32\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565300-v7xbt" Mar 19 11:00:00 crc kubenswrapper[4835]: I0319 11:00:00.389735 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2gmw\" (UniqueName: \"kubernetes.io/projected/d7b89ad1-470b-461d-88f4-aa9886535e32-kube-api-access-m2gmw\") pod \"collect-profiles-29565300-v7xbt\" (UID: \"d7b89ad1-470b-461d-88f4-aa9886535e32\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565300-v7xbt" Mar 19 11:00:00 crc kubenswrapper[4835]: I0319 11:00:00.389894 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7b89ad1-470b-461d-88f4-aa9886535e32-config-volume\") pod \"collect-profiles-29565300-v7xbt\" (UID: \"d7b89ad1-470b-461d-88f4-aa9886535e32\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565300-v7xbt" Mar 19 11:00:00 crc kubenswrapper[4835]: I0319 
11:00:00.389977 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtlj2\" (UniqueName: \"kubernetes.io/projected/4ab05505-ce46-49e9-94b9-231d72500c38-kube-api-access-qtlj2\") pod \"auto-csr-approver-29565300-nsrbc\" (UID: \"4ab05505-ce46-49e9-94b9-231d72500c38\") " pod="openshift-infra/auto-csr-approver-29565300-nsrbc" Mar 19 11:00:00 crc kubenswrapper[4835]: I0319 11:00:00.391624 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7b89ad1-470b-461d-88f4-aa9886535e32-config-volume\") pod \"collect-profiles-29565300-v7xbt\" (UID: \"d7b89ad1-470b-461d-88f4-aa9886535e32\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565300-v7xbt" Mar 19 11:00:00 crc kubenswrapper[4835]: I0319 11:00:00.398520 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7b89ad1-470b-461d-88f4-aa9886535e32-secret-volume\") pod \"collect-profiles-29565300-v7xbt\" (UID: \"d7b89ad1-470b-461d-88f4-aa9886535e32\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565300-v7xbt" Mar 19 11:00:00 crc kubenswrapper[4835]: I0319 11:00:00.406866 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2gmw\" (UniqueName: \"kubernetes.io/projected/d7b89ad1-470b-461d-88f4-aa9886535e32-kube-api-access-m2gmw\") pod \"collect-profiles-29565300-v7xbt\" (UID: \"d7b89ad1-470b-461d-88f4-aa9886535e32\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565300-v7xbt" Mar 19 11:00:00 crc kubenswrapper[4835]: I0319 11:00:00.410442 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtlj2\" (UniqueName: \"kubernetes.io/projected/4ab05505-ce46-49e9-94b9-231d72500c38-kube-api-access-qtlj2\") pod \"auto-csr-approver-29565300-nsrbc\" (UID: \"4ab05505-ce46-49e9-94b9-231d72500c38\") " 
pod="openshift-infra/auto-csr-approver-29565300-nsrbc" Mar 19 11:00:00 crc kubenswrapper[4835]: I0319 11:00:00.491445 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565300-nsrbc" Mar 19 11:00:00 crc kubenswrapper[4835]: I0319 11:00:00.510776 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565300-v7xbt" Mar 19 11:00:01 crc kubenswrapper[4835]: I0319 11:00:01.114389 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565300-v7xbt"] Mar 19 11:00:01 crc kubenswrapper[4835]: I0319 11:00:01.127092 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565300-nsrbc"] Mar 19 11:00:01 crc kubenswrapper[4835]: W0319 11:00:01.131905 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ab05505_ce46_49e9_94b9_231d72500c38.slice/crio-072d45dc25575c5db04855ab43c6de260bcee0793e12bc5cfc393cca86119717 WatchSource:0}: Error finding container 072d45dc25575c5db04855ab43c6de260bcee0793e12bc5cfc393cca86119717: Status 404 returned error can't find the container with id 072d45dc25575c5db04855ab43c6de260bcee0793e12bc5cfc393cca86119717 Mar 19 11:00:01 crc kubenswrapper[4835]: W0319 11:00:01.139663 4835 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7b89ad1_470b_461d_88f4_aa9886535e32.slice/crio-2d1619d0910c0a437b4d40e44a95362aeebc61b98410dbf5f91041494d9a5f6a WatchSource:0}: Error finding container 2d1619d0910c0a437b4d40e44a95362aeebc61b98410dbf5f91041494d9a5f6a: Status 404 returned error can't find the container with id 2d1619d0910c0a437b4d40e44a95362aeebc61b98410dbf5f91041494d9a5f6a Mar 19 11:00:01 crc kubenswrapper[4835]: I0319 11:00:01.561214 4835 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565300-nsrbc" event={"ID":"4ab05505-ce46-49e9-94b9-231d72500c38","Type":"ContainerStarted","Data":"072d45dc25575c5db04855ab43c6de260bcee0793e12bc5cfc393cca86119717"} Mar 19 11:00:01 crc kubenswrapper[4835]: I0319 11:00:01.565782 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565300-v7xbt" event={"ID":"d7b89ad1-470b-461d-88f4-aa9886535e32","Type":"ContainerStarted","Data":"38f888d3f102afb79526880daebf79ef54000408694653aff5de9825d37ede20"} Mar 19 11:00:01 crc kubenswrapper[4835]: I0319 11:00:01.565813 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565300-v7xbt" event={"ID":"d7b89ad1-470b-461d-88f4-aa9886535e32","Type":"ContainerStarted","Data":"2d1619d0910c0a437b4d40e44a95362aeebc61b98410dbf5f91041494d9a5f6a"} Mar 19 11:00:01 crc kubenswrapper[4835]: I0319 11:00:01.595621 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29565300-v7xbt" podStartSLOduration=1.595591134 podStartE2EDuration="1.595591134s" podCreationTimestamp="2026-03-19 11:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:00:01.58144406 +0000 UTC m=+5856.430042657" watchObservedRunningTime="2026-03-19 11:00:01.595591134 +0000 UTC m=+5856.444189751" Mar 19 11:00:02 crc kubenswrapper[4835]: I0319 11:00:02.580387 4835 generic.go:334] "Generic (PLEG): container finished" podID="d7b89ad1-470b-461d-88f4-aa9886535e32" containerID="38f888d3f102afb79526880daebf79ef54000408694653aff5de9825d37ede20" exitCode=0 Mar 19 11:00:02 crc kubenswrapper[4835]: I0319 11:00:02.580467 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565300-v7xbt" 
event={"ID":"d7b89ad1-470b-461d-88f4-aa9886535e32","Type":"ContainerDied","Data":"38f888d3f102afb79526880daebf79ef54000408694653aff5de9825d37ede20"} Mar 19 11:00:04 crc kubenswrapper[4835]: I0319 11:00:04.170757 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565300-v7xbt" Mar 19 11:00:04 crc kubenswrapper[4835]: I0319 11:00:04.315108 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7b89ad1-470b-461d-88f4-aa9886535e32-config-volume\") pod \"d7b89ad1-470b-461d-88f4-aa9886535e32\" (UID: \"d7b89ad1-470b-461d-88f4-aa9886535e32\") " Mar 19 11:00:04 crc kubenswrapper[4835]: I0319 11:00:04.315493 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2gmw\" (UniqueName: \"kubernetes.io/projected/d7b89ad1-470b-461d-88f4-aa9886535e32-kube-api-access-m2gmw\") pod \"d7b89ad1-470b-461d-88f4-aa9886535e32\" (UID: \"d7b89ad1-470b-461d-88f4-aa9886535e32\") " Mar 19 11:00:04 crc kubenswrapper[4835]: I0319 11:00:04.316040 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7b89ad1-470b-461d-88f4-aa9886535e32-secret-volume\") pod \"d7b89ad1-470b-461d-88f4-aa9886535e32\" (UID: \"d7b89ad1-470b-461d-88f4-aa9886535e32\") " Mar 19 11:00:04 crc kubenswrapper[4835]: I0319 11:00:04.316196 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7b89ad1-470b-461d-88f4-aa9886535e32-config-volume" (OuterVolumeSpecName: "config-volume") pod "d7b89ad1-470b-461d-88f4-aa9886535e32" (UID: "d7b89ad1-470b-461d-88f4-aa9886535e32"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:00:04 crc kubenswrapper[4835]: I0319 11:00:04.317281 4835 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7b89ad1-470b-461d-88f4-aa9886535e32-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 11:00:04 crc kubenswrapper[4835]: I0319 11:00:04.322705 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7b89ad1-470b-461d-88f4-aa9886535e32-kube-api-access-m2gmw" (OuterVolumeSpecName: "kube-api-access-m2gmw") pod "d7b89ad1-470b-461d-88f4-aa9886535e32" (UID: "d7b89ad1-470b-461d-88f4-aa9886535e32"). InnerVolumeSpecName "kube-api-access-m2gmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:00:04 crc kubenswrapper[4835]: I0319 11:00:04.322819 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7b89ad1-470b-461d-88f4-aa9886535e32-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d7b89ad1-470b-461d-88f4-aa9886535e32" (UID: "d7b89ad1-470b-461d-88f4-aa9886535e32"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:00:04 crc kubenswrapper[4835]: I0319 11:00:04.404854 4835 scope.go:117] "RemoveContainer" containerID="bca2c33ce268fd8b5411d31a88904c02008726df361c229dfafcadd5783c9508" Mar 19 11:00:04 crc kubenswrapper[4835]: E0319 11:00:04.405286 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 11:00:04 crc kubenswrapper[4835]: I0319 11:00:04.419880 4835 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7b89ad1-470b-461d-88f4-aa9886535e32-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 11:00:04 crc kubenswrapper[4835]: I0319 11:00:04.419912 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2gmw\" (UniqueName: \"kubernetes.io/projected/d7b89ad1-470b-461d-88f4-aa9886535e32-kube-api-access-m2gmw\") on node \"crc\" DevicePath \"\"" Mar 19 11:00:04 crc kubenswrapper[4835]: I0319 11:00:04.631903 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565300-v7xbt" event={"ID":"d7b89ad1-470b-461d-88f4-aa9886535e32","Type":"ContainerDied","Data":"2d1619d0910c0a437b4d40e44a95362aeebc61b98410dbf5f91041494d9a5f6a"} Mar 19 11:00:04 crc kubenswrapper[4835]: I0319 11:00:04.632280 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d1619d0910c0a437b4d40e44a95362aeebc61b98410dbf5f91041494d9a5f6a" Mar 19 11:00:04 crc kubenswrapper[4835]: I0319 11:00:04.631967 4835 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565300-v7xbt" Mar 19 11:00:04 crc kubenswrapper[4835]: I0319 11:00:04.694482 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565255-8v442"] Mar 19 11:00:04 crc kubenswrapper[4835]: I0319 11:00:04.711181 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565255-8v442"] Mar 19 11:00:06 crc kubenswrapper[4835]: I0319 11:00:06.416959 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6" path="/var/lib/kubelet/pods/2b1d6b8a-418d-4c44-9ba7-561d1b6cb7d6/volumes" Mar 19 11:00:10 crc kubenswrapper[4835]: I0319 11:00:10.708051 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565300-nsrbc" event={"ID":"4ab05505-ce46-49e9-94b9-231d72500c38","Type":"ContainerStarted","Data":"95d4d5e8350a1e1f9c586a961ec86824bb0daefab6e650c70ff589764b10e98a"} Mar 19 11:00:10 crc kubenswrapper[4835]: I0319 11:00:10.736283 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565300-nsrbc" podStartSLOduration=1.909130977 podStartE2EDuration="10.736255658s" podCreationTimestamp="2026-03-19 11:00:00 +0000 UTC" firstStartedPulling="2026-03-19 11:00:01.14404282 +0000 UTC m=+5855.992641407" lastFinishedPulling="2026-03-19 11:00:09.971167501 +0000 UTC m=+5864.819766088" observedRunningTime="2026-03-19 11:00:10.722961286 +0000 UTC m=+5865.571559883" watchObservedRunningTime="2026-03-19 11:00:10.736255658 +0000 UTC m=+5865.584854245" Mar 19 11:00:11 crc kubenswrapper[4835]: I0319 11:00:11.723888 4835 generic.go:334] "Generic (PLEG): container finished" podID="4ab05505-ce46-49e9-94b9-231d72500c38" containerID="95d4d5e8350a1e1f9c586a961ec86824bb0daefab6e650c70ff589764b10e98a" exitCode=0 Mar 19 11:00:11 crc kubenswrapper[4835]: 
I0319 11:00:11.723943 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565300-nsrbc" event={"ID":"4ab05505-ce46-49e9-94b9-231d72500c38","Type":"ContainerDied","Data":"95d4d5e8350a1e1f9c586a961ec86824bb0daefab6e650c70ff589764b10e98a"} Mar 19 11:00:13 crc kubenswrapper[4835]: I0319 11:00:13.145644 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565300-nsrbc" Mar 19 11:00:13 crc kubenswrapper[4835]: I0319 11:00:13.266657 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtlj2\" (UniqueName: \"kubernetes.io/projected/4ab05505-ce46-49e9-94b9-231d72500c38-kube-api-access-qtlj2\") pod \"4ab05505-ce46-49e9-94b9-231d72500c38\" (UID: \"4ab05505-ce46-49e9-94b9-231d72500c38\") " Mar 19 11:00:13 crc kubenswrapper[4835]: I0319 11:00:13.289937 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ab05505-ce46-49e9-94b9-231d72500c38-kube-api-access-qtlj2" (OuterVolumeSpecName: "kube-api-access-qtlj2") pod "4ab05505-ce46-49e9-94b9-231d72500c38" (UID: "4ab05505-ce46-49e9-94b9-231d72500c38"). InnerVolumeSpecName "kube-api-access-qtlj2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:00:13 crc kubenswrapper[4835]: I0319 11:00:13.369552 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtlj2\" (UniqueName: \"kubernetes.io/projected/4ab05505-ce46-49e9-94b9-231d72500c38-kube-api-access-qtlj2\") on node \"crc\" DevicePath \"\"" Mar 19 11:00:13 crc kubenswrapper[4835]: I0319 11:00:13.751424 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565300-nsrbc" event={"ID":"4ab05505-ce46-49e9-94b9-231d72500c38","Type":"ContainerDied","Data":"072d45dc25575c5db04855ab43c6de260bcee0793e12bc5cfc393cca86119717"} Mar 19 11:00:13 crc kubenswrapper[4835]: I0319 11:00:13.751472 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="072d45dc25575c5db04855ab43c6de260bcee0793e12bc5cfc393cca86119717" Mar 19 11:00:13 crc kubenswrapper[4835]: I0319 11:00:13.751531 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565300-nsrbc" Mar 19 11:00:13 crc kubenswrapper[4835]: I0319 11:00:13.808448 4835 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565294-rr6l2"] Mar 19 11:00:13 crc kubenswrapper[4835]: I0319 11:00:13.822088 4835 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565294-rr6l2"] Mar 19 11:00:14 crc kubenswrapper[4835]: I0319 11:00:14.419861 4835 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ffb3e93-cb55-4408-bdd3-716858ecf5df" path="/var/lib/kubelet/pods/4ffb3e93-cb55-4408-bdd3-716858ecf5df/volumes" Mar 19 11:00:16 crc kubenswrapper[4835]: I0319 11:00:16.409242 4835 scope.go:117] "RemoveContainer" containerID="bca2c33ce268fd8b5411d31a88904c02008726df361c229dfafcadd5783c9508" Mar 19 11:00:16 crc kubenswrapper[4835]: E0319 11:00:16.410456 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 11:00:23 crc kubenswrapper[4835]: I0319 11:00:23.612955 4835 scope.go:117] "RemoveContainer" containerID="421b4a50b74a670de6d8ff4c73fb4cdece4bc0bc83715add4dd92e0d94a71e9a" Mar 19 11:00:23 crc kubenswrapper[4835]: I0319 11:00:23.646857 4835 scope.go:117] "RemoveContainer" containerID="2978051d796889d0e8eb4c2ea777adca7c927405da249c7cb178b82b6f5a44f0" Mar 19 11:00:23 crc kubenswrapper[4835]: I0319 11:00:23.719801 4835 scope.go:117] "RemoveContainer" containerID="6e8e2c8cf7c57d842da8280b247c5322ba0912a94c7642297556938940a5f816" Mar 19 11:00:23 crc kubenswrapper[4835]: I0319 11:00:23.807734 4835 scope.go:117] "RemoveContainer" containerID="887c71a0c50755c1c2c5bc9ae0e6a404a3f50cc6c8492f90ccd210dc9a8865a7" Mar 19 11:00:28 crc kubenswrapper[4835]: I0319 11:00:28.402423 4835 scope.go:117] "RemoveContainer" containerID="bca2c33ce268fd8b5411d31a88904c02008726df361c229dfafcadd5783c9508" Mar 19 11:00:28 crc kubenswrapper[4835]: E0319 11:00:28.403537 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353" Mar 19 11:00:39 crc kubenswrapper[4835]: I0319 11:00:39.402554 4835 scope.go:117] "RemoveContainer" containerID="bca2c33ce268fd8b5411d31a88904c02008726df361c229dfafcadd5783c9508" Mar 19 11:00:39 crc kubenswrapper[4835]: E0319 11:00:39.404056 4835 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353"
Mar 19 11:00:51 crc kubenswrapper[4835]: I0319 11:00:51.402675 4835 scope.go:117] "RemoveContainer" containerID="bca2c33ce268fd8b5411d31a88904c02008726df361c229dfafcadd5783c9508"
Mar 19 11:00:51 crc kubenswrapper[4835]: E0319 11:00:51.403728 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353"
Mar 19 11:01:00 crc kubenswrapper[4835]: I0319 11:01:00.163490 4835 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29565301-b8cmv"]
Mar 19 11:01:00 crc kubenswrapper[4835]: E0319 11:01:00.164525 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ab05505-ce46-49e9-94b9-231d72500c38" containerName="oc"
Mar 19 11:01:00 crc kubenswrapper[4835]: I0319 11:01:00.164539 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ab05505-ce46-49e9-94b9-231d72500c38" containerName="oc"
Mar 19 11:01:00 crc kubenswrapper[4835]: E0319 11:01:00.164571 4835 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7b89ad1-470b-461d-88f4-aa9886535e32" containerName="collect-profiles"
Mar 19 11:01:00 crc kubenswrapper[4835]: I0319 11:01:00.164577 4835 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7b89ad1-470b-461d-88f4-aa9886535e32" containerName="collect-profiles"
Mar 19 11:01:00 crc kubenswrapper[4835]: I0319 11:01:00.164835 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ab05505-ce46-49e9-94b9-231d72500c38" containerName="oc"
Mar 19 11:01:00 crc kubenswrapper[4835]: I0319 11:01:00.164850 4835 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7b89ad1-470b-461d-88f4-aa9886535e32" containerName="collect-profiles"
Mar 19 11:01:00 crc kubenswrapper[4835]: I0319 11:01:00.165657 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29565301-b8cmv"
Mar 19 11:01:00 crc kubenswrapper[4835]: I0319 11:01:00.175152 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29565301-b8cmv"]
Mar 19 11:01:00 crc kubenswrapper[4835]: I0319 11:01:00.199064 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppqmv\" (UniqueName: \"kubernetes.io/projected/229bcc51-fc0f-4008-9d9b-d50fb2cd1c45-kube-api-access-ppqmv\") pod \"keystone-cron-29565301-b8cmv\" (UID: \"229bcc51-fc0f-4008-9d9b-d50fb2cd1c45\") " pod="openstack/keystone-cron-29565301-b8cmv"
Mar 19 11:01:00 crc kubenswrapper[4835]: I0319 11:01:00.199516 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/229bcc51-fc0f-4008-9d9b-d50fb2cd1c45-fernet-keys\") pod \"keystone-cron-29565301-b8cmv\" (UID: \"229bcc51-fc0f-4008-9d9b-d50fb2cd1c45\") " pod="openstack/keystone-cron-29565301-b8cmv"
Mar 19 11:01:00 crc kubenswrapper[4835]: I0319 11:01:00.200153 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/229bcc51-fc0f-4008-9d9b-d50fb2cd1c45-combined-ca-bundle\") pod \"keystone-cron-29565301-b8cmv\" (UID: \"229bcc51-fc0f-4008-9d9b-d50fb2cd1c45\") " pod="openstack/keystone-cron-29565301-b8cmv"
Mar 19 11:01:00 crc kubenswrapper[4835]: I0319 11:01:00.200846 4835 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/229bcc51-fc0f-4008-9d9b-d50fb2cd1c45-config-data\") pod \"keystone-cron-29565301-b8cmv\" (UID: \"229bcc51-fc0f-4008-9d9b-d50fb2cd1c45\") " pod="openstack/keystone-cron-29565301-b8cmv"
Mar 19 11:01:00 crc kubenswrapper[4835]: I0319 11:01:00.303316 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppqmv\" (UniqueName: \"kubernetes.io/projected/229bcc51-fc0f-4008-9d9b-d50fb2cd1c45-kube-api-access-ppqmv\") pod \"keystone-cron-29565301-b8cmv\" (UID: \"229bcc51-fc0f-4008-9d9b-d50fb2cd1c45\") " pod="openstack/keystone-cron-29565301-b8cmv"
Mar 19 11:01:00 crc kubenswrapper[4835]: I0319 11:01:00.303383 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/229bcc51-fc0f-4008-9d9b-d50fb2cd1c45-fernet-keys\") pod \"keystone-cron-29565301-b8cmv\" (UID: \"229bcc51-fc0f-4008-9d9b-d50fb2cd1c45\") " pod="openstack/keystone-cron-29565301-b8cmv"
Mar 19 11:01:00 crc kubenswrapper[4835]: I0319 11:01:00.303500 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/229bcc51-fc0f-4008-9d9b-d50fb2cd1c45-combined-ca-bundle\") pod \"keystone-cron-29565301-b8cmv\" (UID: \"229bcc51-fc0f-4008-9d9b-d50fb2cd1c45\") " pod="openstack/keystone-cron-29565301-b8cmv"
Mar 19 11:01:00 crc kubenswrapper[4835]: I0319 11:01:00.303557 4835 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/229bcc51-fc0f-4008-9d9b-d50fb2cd1c45-config-data\") pod \"keystone-cron-29565301-b8cmv\" (UID: \"229bcc51-fc0f-4008-9d9b-d50fb2cd1c45\") " pod="openstack/keystone-cron-29565301-b8cmv"
Mar 19 11:01:00 crc kubenswrapper[4835]: I0319 11:01:00.310447 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/229bcc51-fc0f-4008-9d9b-d50fb2cd1c45-config-data\") pod \"keystone-cron-29565301-b8cmv\" (UID: \"229bcc51-fc0f-4008-9d9b-d50fb2cd1c45\") " pod="openstack/keystone-cron-29565301-b8cmv"
Mar 19 11:01:00 crc kubenswrapper[4835]: I0319 11:01:00.310480 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/229bcc51-fc0f-4008-9d9b-d50fb2cd1c45-combined-ca-bundle\") pod \"keystone-cron-29565301-b8cmv\" (UID: \"229bcc51-fc0f-4008-9d9b-d50fb2cd1c45\") " pod="openstack/keystone-cron-29565301-b8cmv"
Mar 19 11:01:00 crc kubenswrapper[4835]: I0319 11:01:00.313697 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/229bcc51-fc0f-4008-9d9b-d50fb2cd1c45-fernet-keys\") pod \"keystone-cron-29565301-b8cmv\" (UID: \"229bcc51-fc0f-4008-9d9b-d50fb2cd1c45\") " pod="openstack/keystone-cron-29565301-b8cmv"
Mar 19 11:01:00 crc kubenswrapper[4835]: I0319 11:01:00.325891 4835 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppqmv\" (UniqueName: \"kubernetes.io/projected/229bcc51-fc0f-4008-9d9b-d50fb2cd1c45-kube-api-access-ppqmv\") pod \"keystone-cron-29565301-b8cmv\" (UID: \"229bcc51-fc0f-4008-9d9b-d50fb2cd1c45\") " pod="openstack/keystone-cron-29565301-b8cmv"
Mar 19 11:01:00 crc kubenswrapper[4835]: I0319 11:01:00.489002 4835 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29565301-b8cmv"
Mar 19 11:01:01 crc kubenswrapper[4835]: I0319 11:01:00.998544 4835 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29565301-b8cmv"]
Mar 19 11:01:01 crc kubenswrapper[4835]: I0319 11:01:01.429693 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565301-b8cmv" event={"ID":"229bcc51-fc0f-4008-9d9b-d50fb2cd1c45","Type":"ContainerStarted","Data":"02c9d1c2fe8f65e281309e788ab750ec4efd46cd9e30672ad3aa17a4cce22a99"}
Mar 19 11:01:01 crc kubenswrapper[4835]: I0319 11:01:01.430060 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565301-b8cmv" event={"ID":"229bcc51-fc0f-4008-9d9b-d50fb2cd1c45","Type":"ContainerStarted","Data":"d112d4d8a36de51f905c086bda9790971bf2fd24b21645291d252b3bf438c754"}
Mar 19 11:01:01 crc kubenswrapper[4835]: I0319 11:01:01.452767 4835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29565301-b8cmv" podStartSLOduration=1.452728113 podStartE2EDuration="1.452728113s" podCreationTimestamp="2026-03-19 11:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:01:01.448327364 +0000 UTC m=+5916.296925971" watchObservedRunningTime="2026-03-19 11:01:01.452728113 +0000 UTC m=+5916.301326690"
Mar 19 11:01:02 crc kubenswrapper[4835]: I0319 11:01:02.402364 4835 scope.go:117] "RemoveContainer" containerID="bca2c33ce268fd8b5411d31a88904c02008726df361c229dfafcadd5783c9508"
Mar 19 11:01:02 crc kubenswrapper[4835]: E0319 11:01:02.403077 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353"
Mar 19 11:01:05 crc kubenswrapper[4835]: I0319 11:01:05.478136 4835 generic.go:334] "Generic (PLEG): container finished" podID="229bcc51-fc0f-4008-9d9b-d50fb2cd1c45" containerID="02c9d1c2fe8f65e281309e788ab750ec4efd46cd9e30672ad3aa17a4cce22a99" exitCode=0
Mar 19 11:01:05 crc kubenswrapper[4835]: I0319 11:01:05.478204 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565301-b8cmv" event={"ID":"229bcc51-fc0f-4008-9d9b-d50fb2cd1c45","Type":"ContainerDied","Data":"02c9d1c2fe8f65e281309e788ab750ec4efd46cd9e30672ad3aa17a4cce22a99"}
Mar 19 11:01:07 crc kubenswrapper[4835]: I0319 11:01:07.002671 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29565301-b8cmv"
Mar 19 11:01:07 crc kubenswrapper[4835]: I0319 11:01:07.117990 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppqmv\" (UniqueName: \"kubernetes.io/projected/229bcc51-fc0f-4008-9d9b-d50fb2cd1c45-kube-api-access-ppqmv\") pod \"229bcc51-fc0f-4008-9d9b-d50fb2cd1c45\" (UID: \"229bcc51-fc0f-4008-9d9b-d50fb2cd1c45\") "
Mar 19 11:01:07 crc kubenswrapper[4835]: I0319 11:01:07.118064 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/229bcc51-fc0f-4008-9d9b-d50fb2cd1c45-fernet-keys\") pod \"229bcc51-fc0f-4008-9d9b-d50fb2cd1c45\" (UID: \"229bcc51-fc0f-4008-9d9b-d50fb2cd1c45\") "
Mar 19 11:01:07 crc kubenswrapper[4835]: I0319 11:01:07.118184 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/229bcc51-fc0f-4008-9d9b-d50fb2cd1c45-combined-ca-bundle\") pod \"229bcc51-fc0f-4008-9d9b-d50fb2cd1c45\" (UID: \"229bcc51-fc0f-4008-9d9b-d50fb2cd1c45\") "
Mar 19 11:01:07 crc kubenswrapper[4835]: I0319 11:01:07.118208 4835 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/229bcc51-fc0f-4008-9d9b-d50fb2cd1c45-config-data\") pod \"229bcc51-fc0f-4008-9d9b-d50fb2cd1c45\" (UID: \"229bcc51-fc0f-4008-9d9b-d50fb2cd1c45\") "
Mar 19 11:01:07 crc kubenswrapper[4835]: I0319 11:01:07.135313 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/229bcc51-fc0f-4008-9d9b-d50fb2cd1c45-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "229bcc51-fc0f-4008-9d9b-d50fb2cd1c45" (UID: "229bcc51-fc0f-4008-9d9b-d50fb2cd1c45"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 11:01:07 crc kubenswrapper[4835]: I0319 11:01:07.135446 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/229bcc51-fc0f-4008-9d9b-d50fb2cd1c45-kube-api-access-ppqmv" (OuterVolumeSpecName: "kube-api-access-ppqmv") pod "229bcc51-fc0f-4008-9d9b-d50fb2cd1c45" (UID: "229bcc51-fc0f-4008-9d9b-d50fb2cd1c45"). InnerVolumeSpecName "kube-api-access-ppqmv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 11:01:07 crc kubenswrapper[4835]: I0319 11:01:07.159872 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/229bcc51-fc0f-4008-9d9b-d50fb2cd1c45-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "229bcc51-fc0f-4008-9d9b-d50fb2cd1c45" (UID: "229bcc51-fc0f-4008-9d9b-d50fb2cd1c45"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 11:01:07 crc kubenswrapper[4835]: I0319 11:01:07.210641 4835 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/229bcc51-fc0f-4008-9d9b-d50fb2cd1c45-config-data" (OuterVolumeSpecName: "config-data") pod "229bcc51-fc0f-4008-9d9b-d50fb2cd1c45" (UID: "229bcc51-fc0f-4008-9d9b-d50fb2cd1c45"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 11:01:07 crc kubenswrapper[4835]: I0319 11:01:07.221081 4835 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppqmv\" (UniqueName: \"kubernetes.io/projected/229bcc51-fc0f-4008-9d9b-d50fb2cd1c45-kube-api-access-ppqmv\") on node \"crc\" DevicePath \"\""
Mar 19 11:01:07 crc kubenswrapper[4835]: I0319 11:01:07.221113 4835 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/229bcc51-fc0f-4008-9d9b-d50fb2cd1c45-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 19 11:01:07 crc kubenswrapper[4835]: I0319 11:01:07.221124 4835 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/229bcc51-fc0f-4008-9d9b-d50fb2cd1c45-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 11:01:07 crc kubenswrapper[4835]: I0319 11:01:07.221133 4835 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/229bcc51-fc0f-4008-9d9b-d50fb2cd1c45-config-data\") on node \"crc\" DevicePath \"\""
Mar 19 11:01:07 crc kubenswrapper[4835]: I0319 11:01:07.649920 4835 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565301-b8cmv" event={"ID":"229bcc51-fc0f-4008-9d9b-d50fb2cd1c45","Type":"ContainerDied","Data":"d112d4d8a36de51f905c086bda9790971bf2fd24b21645291d252b3bf438c754"}
Mar 19 11:01:07 crc kubenswrapper[4835]: I0319 11:01:07.650294 4835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d112d4d8a36de51f905c086bda9790971bf2fd24b21645291d252b3bf438c754"
Mar 19 11:01:07 crc kubenswrapper[4835]: I0319 11:01:07.650026 4835 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29565301-b8cmv"
Mar 19 11:01:15 crc kubenswrapper[4835]: I0319 11:01:15.403282 4835 scope.go:117] "RemoveContainer" containerID="bca2c33ce268fd8b5411d31a88904c02008726df361c229dfafcadd5783c9508"
Mar 19 11:01:15 crc kubenswrapper[4835]: E0319 11:01:15.407644 4835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bk84k_openshift-machine-config-operator(adf367e5-fedd-4d9e-a7af-345df1f08353)\"" pod="openshift-machine-config-operator/machine-config-daemon-bk84k" podUID="adf367e5-fedd-4d9e-a7af-345df1f08353"
Mar 19 11:01:23 crc kubenswrapper[4835]: I0319 11:01:23.920840 4835 scope.go:117] "RemoveContainer" containerID="a2bf8fc56becbc534abcfa2ac1e6d6a1d8a78d09141aa07ee52c862e2c69abd8"